What we’ve noticed working in Nepal
Over the past two years, our team has been working with companies in Nepal to connect the dots between their data and their decisions. Along the way, we've seen three cognitive biases that appear consistently across enterprises and hinder their ability to make better decisions.
1. Dunning-Kruger Effect
When you-know-who gave the world “alternative facts”, I laughed it off like everyone else. Then I encountered a few C-suite executives in Nepal who said: “I’ve been in this industry for x years, and in this company for y, so I know how this business works; and I know that data doesn’t work in this business”.
So, data doesn’t work for this business, but your experience, which is also data, works? That’s some alternative facts right there.
While there is no denying that domain knowledge is paramount when it comes to making better decisions with data, to deny data’s relevance without even trying to implement data-driven decisions is a clear example of the Dunning-Kruger effect at work. For the uninitiated, the Dunning-Kruger effect is when someone with low ability, expertise, or experience regarding a certain type of task or area of knowledge tends to overestimate their ability or knowledge (Source: Wikipedia).
What happens when executives underestimate the power of data while overestimating their own knowledge about what works in the business and what doesn’t? Companies miss out on insights that might lead to better decisions simply because they never even looked for those insights in the first place.
2. Ambiguity Effect + Loss Aversion + Status Quo Bias Cocktail
- The ambiguity effect is the tendency to select an option for which the probability of a favorable outcome is known over one for which the probability of a favorable outcome is unknown (Source: Wikipedia).
- Loss aversion is the tendency to prefer avoiding losses to acquiring equivalent gains (Source: Wikipedia).
- Status quo bias is the tendency to prefer things to stay relatively the same (Source: Wikipedia).
To illustrate the potency of this cocktail, let’s think about this question: What’s the probability of a favorable outcome if you implement some decision derived from an analytic process that you’ve never tried before?
- The answer if you’re somewhat data aware: We hope it’s positive, but it’s unknown until it’s tested.
- The answer we hear most often: Unknown, but let’s not test until we know we won’t lose anything.
If anyone can tell me how it’s possible to know the consequences of a decision before it’s taken, we’ll immediately shut down Xabit and come work for you!
This logical fallacy, however, seems to escape companies in Nepal; the result is complete decision paralysis and a continuation of the status quo. What these companies fail to understand is that decision paralysis leads to even worse outcomes in the future. How? Say a company with 1 lakh in sales today takes a data-driven decision that leads to a sales decline of 5%. The impact is a loss of Rs 5,000. Isn’t it better to bear the 5,000 now than when the company’s sales are 1 crore and the expected loss is 5,00,000? Once again, if you’re data aware, you understand the virtuous circle of data-driven decision-making; if you’re not, you’ll छलफल (discuss) your way out of doing anything.
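To make that arithmetic concrete, here is a minimal, purely illustrative sketch in Python. The 1 lakh and 1 crore sales figures and the 5% decline are the hypothetical numbers from the example above, not real client data:

```python
# Back-of-the-envelope cost of testing a decision now vs. later (illustrative only).
# Amounts are in Nepali rupees; 1 lakh = 100,000 and 1 crore = 10,000,000.

def worst_case_loss(sales: float, decline_rate: float = 0.05) -> float:
    """Loss if a tested decision turns out bad and sales decline by `decline_rate`."""
    return sales * decline_rate

loss_now = worst_case_loss(100_000)        # test while sales are 1 lakh
loss_later = worst_case_loss(10_000_000)   # same mistake made at 1 crore

print(f"Loss if tested now:   Rs {loss_now:,.0f}")    # Rs 5,000
print(f"Loss if tested later: Rs {loss_later:,.0f}")  # Rs 500,000
```

The only point the sketch makes is that the cost of learning from a bad decision scales with the size of the business, so the cheapest time to experiment is now.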
3. Courtesy Bias
When we first started Xabit, a friend of mine gave me some timeless advice: नेपालमा जे पनि गर तर चित्त नदुखाउ (Do anything in Nepal, but don’t hurt feelings). Being a psychologist, he had great insight into the social psyche of our country, and his advice captures courtesy bias perfectly: the tendency to give an opinion that is more socially correct than one’s true opinion, so as to avoid offending anyone (Source: Wikipedia).
It’s perhaps no news to you that courtesy bias is deep-rooted in our culture, so naturally it has seeped into corporate culture as well. I’m guilty of this bias too: I sometimes find myself nodding along with a potential client who clearly has little idea about the topic being discussed, or I let the employees at one of our clients take the reins of data operationalisation knowing full well that they don’t have the capability to execute it.
My job as a data analyst is to follow the data, and in both cases the data said the same thing: one lacked knowledge, the other lacked the ability to execute. Instead, in both cases I fell victim to courtesy bias, which led to bad decisions: a potential client with high expectations that could not be met, and a client who lost faith in data-driven decisions because they were never operationalised well.
If only I’d dealt with my courtesy bias, perhaps the outcomes would have been different. Alas, the decision to deal with it was never taken.
It’s not very courteous of me to end this post with a rant about what’s wrong without giving an idea of what could be done to fix things. With this post, my hope is to make you aware of some of these biases and to get you thinking about how one might counter them. Post your ideas in the comments, and we’ll put up our own ideas in a separate post very soon.
Keep Data. Decisions. Repeat-ing
Anup