Data Driven Decisions and Other Illusions

This Friday marks my 30th college reunion, which means I’ve been working for 30 years. I have spent time at great companies like Y&R, Pfizer, J&J, Bausch & Lomb and The New England Consulting Group. I’ve worked with many smart people along the way, and now, as a Consultant at NECG, I get to see smart people from leading companies all making the same mistakes I made over and over again. Everywhere I look, people think the data they are working with is real, precise, true. To be fair, Nielsen, BASES, IRI, IQVIA, Symphony, Ipsos, Millward Brown, Kantar, etc., all provide good products, but don’t be fooled: only a tiny fraction of this data is precise enough to call real or true. Think of it as the equivalent of fat-free half and half. It gets the job done, but it ain’t exactly half and half.

The data provides directional guidance and a foundation for analytics, but if you are hanging your hat, career and bonus on finding an absolute truth rather than a relative likelihood, beware of the following five pitfalls:

1) Beware the data “refresh”: When I was a young marketer, our operating plan process ran June through October, and almost every year, sometime in August or September, our data suppliers (both IRI and Nielsen) would do a database refresh, resulting in changes to the data that could have a big impact on plans. More importantly, it could have a big impact on our thinking about where to look for volume, etc.

2) Beware the vendor change: Take the same category and two vendors and you will get different numbers. They will also be happy to explain the differences, and it’s not their fault if you are not listening. We just finished a Pharma project where the client is in the process of switching between the two leading vendors, and the differences in several estimates were over 20%.

3) Beware of the * (small sample size): I once heard a comedian tell a joke about eating in Italy. One of my favorite twists in the story is when the restaurant owner says, “The bread is so fresh, you are going to want to slap the baker.” For different reasons, I have always wanted to slap MRD people and vendors who include unreliable sample sizes in the findings but asterisk them. Don’t they know we can’t un-see a number that supports our preconceived notions? Further, when is the last time you heard someone pause mid-argument to say “asterisk”?
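To put a number on why those asterisks matter, here is a quick back-of-the-envelope sketch using the standard normal-approximation margin of error for a survey proportion. The 40% score and the sample sizes are made up for illustration; the point is how much the error bar swells when the base shrinks.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for an observed proportion p
    from a simple random sample of n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# The same 40% "preference" score, read off two very different bases
small = margin_of_error(0.40, 50)    # asterisked cell: roughly +/- 14 points
large = margin_of_error(0.40, 1000)  # healthy base: roughly +/- 3 points
print(f"n=50:   40% +/- {small:.1%}")
print(f"n=1000: 40% +/- {large:.1%}")
```

On the small base, that 40% is statistically indistinguishable from anything between the high 20s and the mid 50s, which is exactly the kind of spread the asterisk is quietly trying to warn you about.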

4) Beware of the visual: If you are old enough to remember Network and Cable television, then you are old enough to remember the magnificent media plan flow chart with its simple elegance. It captured discrete boxes of activity neatly separated by daypart, with flights separated by clean off-air periods. Anyone who has seen a flow chart of a post-buy analysis can tell you it looks like you gave a drunk a crayon and said, “have at it.”

5) Beware of the data quilt: Most models are built from more than one data source, sewn together into the quilt that is the model. This means that pitfalls #1-#4 are compounded, and you have to trust whoever built the model to really understand your business.

At J&J Canada, I had a company car and, as a result, I had to participate in a SafeFleet defensive driving class. The instructor asked someone what it meant when another driver turns their signal on. The person answered that the driver was going to turn. The instructor smiled and said: “No. It means that they may or may not be turning. Your job is to think of the turn signal as indicating a potential outcome not a definite one.” Think of your data in the same way.
