World War II. American planes are taking heavy fire from German guns, and to protect your planes you armour them. But armour makes a plane heavier, harder to manoeuvre and costlier on fuel, so it must be applied sparingly, only where it matters most. Now, which part of the plane do you armour? You examine the returning planes and map where the bullet marks are on the body. The picture below shows the result – plenty of bullet holes in the fuselage, hardly any in the engines.

Immediate conclusion – armour the fuselage and minimize the damage. Wait! Let’s take a closer look. The picture is drawn from the planes that returned to base – what about the ones that did not? They were hit in the engines, crashed, and never made it into the data set! This new insight upends the earlier conclusion: the armour belongs exactly where the returning planes show no damage.
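This is the classic survivorship-bias problem, and a minimal simulation makes the distortion concrete. The Python sketch below uses invented numbers purely for illustration: hits land uniformly across a plane’s sections, but a hit to the engine is far more likely to bring the plane down, so the hit map compiled from returning planes under-counts engine hits.

```python
import random

# A hypothetical simulation of survivorship bias. The sections and
# lethality figures below are invented purely for illustration.
SECTIONS = ["fuselage", "engine", "fuel_tank", "wings"]
LETHALITY = {"fuselage": 0.05, "engine": 0.60, "fuel_tank": 0.40, "wings": 0.10}

def simulate(n_planes=100_000, hits_per_plane=3, seed=42):
    rng = random.Random(seed)
    all_hits = {s: 0 for s in SECTIONS}        # true distribution of hits
    returned_hits = {s: 0 for s in SECTIONS}   # what analysts at the base see
    for _ in range(n_planes):
        # Hits land uniformly at random across the plane's sections.
        hits = [rng.choice(SECTIONS) for _ in range(hits_per_plane)]
        # The plane returns to base only if it survives every hit it took.
        survived = all(rng.random() > LETHALITY[h] for h in hits)
        for h in hits:
            all_hits[h] += 1
            if survived:
                returned_hits[h] += 1
    return all_hits, returned_hits

all_hits, returned_hits = simulate()
print(f"{'section':<12}{'all hits':>10}{'hits on returners':>20}")
for s in SECTIONS:
    print(f"{s:<12}{all_hits[s]:>10}{returned_hits[s]:>20}")
```

With these made-up lethality figures, the true hit counts come out roughly equal across sections, yet the returning planes show only a small fraction of the engine hits – exactly the pattern in the story. Armouring according to the observed map would protect the places that matter least.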
Consider another scenario. Project A runs into multiple challenges during execution, is flagged as high risk, attracts top-management visibility and oversight (naturally), and is finally delivered with fanfare. It becomes a celebrated project thanks to the turnaround. Then there’s Project B – of similar complexity and size, but with a team that planned proactively and meticulously to preclude escalations and the need for senior-leadership oversight during execution. As a result, Project B is delivered smoothly and efficiently, with little external attention. When it’s time for recognition, the “best managed project” award often goes to Project A, the one with the dramatic recovery.
These scenarios exemplify the principle of What You See Is All There Is (WYSIATI), discussed by Daniel Kahneman in his award-winning book, “Thinking, Fast and Slow.” When faced with a problem, our natural tendency is to analyse the readily available data and reach a solution. We might sometimes validate the accuracy of that data, but is that sufficient to guarantee the right decision? The step often overlooked is validating the completeness of the data and scenarios: is there missing information or insight that could significantly skew the results?
Why are we prone to making decisions based on WYSIATI? Here are three key reasons:
- Availability bias: The human mind tends to form judgments and make decisions based on easily accessible information, often neglecting deeper, less obvious factors. Our perceptions are strongly shaped by immediate, salient details, leading us to premature conclusions without thorough analysis. This is a typical System 1 response (see the companion piece on System 1 and System 2 thinking).
- Incorrect assumptions: Decisions can be based on flawed or incomplete assumptions that are not adequately questioned or challenged.
- Incomplete data gathering: The data collected might not be comprehensive and may fail to consider all relevant inputs, parameters or scenarios.
The human brain, despite accounting for less than 2% of our body weight, consumes approximately 20% of our energy. It constantly seeks to optimize energy expenditure. Left to itself, it will often default to a System 1 response, which requires less cognitive effort and thus less energy.
So, how can we prevent, or at least minimize, WYSIATI bias when making non-trivial decisions?
- Actively seek out missing data and potential scenarios. Question the data and information you have, and consider what might be absent.
- Adopt a critical thinking approach. Engage in deliberate, slow thinking (System 2 thinking). Systematically consider all relevant and feasible use cases.
- Gather multiple perspectives, including expert opinions, before making the final decision. This helps identify blind spots and surface information you might have overlooked.
Consciously applying these strategies will improve the quality of our decisions and reduce the bias that WYSIATI introduces.
