A lot of companies and salespeople rely on data. I’m fine with that, because ultimately, basing your approach and tactics on data beats every other option available. But data is just data, and it is easy to misread. The conclusions you draw from data need to be tested, prodded, and validated, because the story behind the data matters more than the data itself.
During World War II, Allied commanders noticed that returning aircraft tended to have bullet holes concentrated in the same areas. They concluded that those areas needed extra armor, since the enemy seemed to be hitting them most often. Why not just add extra armor to the whole airplane? Because flight demands strict weight management, so they had to be smart about which sections to strengthen. Armoring the spots where bullet holes were concentrated, they reasoned, would reduce the number of planes shot down; why add armor where there were never any bullet holes? They reinforced the areas with lots of bullet holes, but when they ran the numbers again a little while later, the number of downed planes was NOT going down. It wasn’t working.
It took a mathematician named Abraham Wald to point out that a different conclusion could be drawn from the same data. Perhaps, he suggested, the reason certain areas of the returning planes rarely showed bullet holes was that the planes hit in those areas were the ones not making it back. That question, that insight, that different way of looking at the data led the commanders to reinforce the areas with no visible bullet holes on the returning planes. Once that change was made, there was a drastic reduction in the number of planes shot down.
So we can rely on data, but we still have to draw the correct conclusions from it. Context matters!
I often work with companies over longer stretches of time; one- to two-year contracts are not unusual in my business. One of the things I always warn my clients about is activity levels. When we come in, install a proper sales process, train reps how to use it, and reset their mental positioning around activity, rejection, and qualification, we almost always see a huge rise in activity.
This makes sense: when reps have a better mindset, a better approach, and a better attitude, they make more calls. Management is usually chuffed, and we get slaps on the back and tons of praise. But over time, with continued training and skills enhancement, activity numbers peak and then start to recede from that high-water mark. Sales continue to rise, but front-end pipeline activity goes down. Management will often panic, assuming this is a harbinger of a drop in sales. A close look at the data, however, usually shows that conversions are up so much that less activity is needed to drive better end results. A medium-skilled rep makes more calls than a low-skilled rep, but a highly skilled rep actually needs to make fewer calls than a medium-skilled one, because they convert better and take more prospects deeper into the proposal and closed-won stages.
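The arithmetic behind this is simple enough to sketch. Here is a minimal illustration with entirely made-up numbers (the 5% and 15% conversion rates are hypothetical, not figures from any client): at a higher conversion rate, the same sales target requires far fewer calls.

```python
import math

def calls_needed(target_sales: int, conversion_rate: float) -> int:
    """Calls required to close target_sales deals at a given call-to-close rate."""
    return math.ceil(target_sales / conversion_rate)

# Both reps are chasing the same target of 12 closed deals per period.
mid_skill_calls = calls_needed(12, 0.05)   # converts 5% of calls -> 240 calls
top_skill_calls = calls_needed(12, 0.15)   # converts 15% of calls -> 80 calls
```

Measured on raw activity alone, the top rep looks like a slacker at a third of the call volume, even though both reps land on the same result.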
Bad conclusions from good data will lead management to push the best reps to make as many calls as the middle-of-the-pack reps, when instead they should be training the middle of the pack to convert as well as the top reps.
My favorite fictional character of all time, Francisco d’Anconia, makes a great statement:
“There is no such thing as a contradiction. If you think you are facing a contradiction, check your premises. One of them is wrong.”
In other words, look at the data again and ask different questions. Valid data can drive invalid conclusions.