Three Cognitive Biases Sabotaging Your Project Delivery

Your brain is working against you. Not maliciously—efficiently. The human brain takes up less than 2% of body mass but can consume up to 20% of the body’s energy. To manage that cost, it defaults to System 1 thinking—fast, automatic, pattern-based. Daniel Kahneman documented this extensively in Thinking, Fast and Slow. System 1 is brilliant for routine decisions. It is terrible for the kind of nuanced, ambiguous, high-stakes work that defines project delivery.

There are three cognitive biases that I see tripping up knowledge workers—business analysts, project managers, consultants—over and over again. Understanding them will not eliminate them, but it will give you a fighting chance.

1. Affective Forecasting Bias

Affective forecasting, a concept from behavioral science, is our attempt to predict how we will feel about future events. The problem is that we are systematically bad at it: we overestimate both the duration and the intensity of negative emotional outcomes.

The components are straightforward: we predict emotional valence (positive or negative), specific emotions, and their duration and intensity. The trouble is that we consistently overestimate duration and intensity, and our accuracy is skewed: we remember past events more favorably than they felt at the time, yet we catastrophize about events that have not happened.

In project delivery, this shows up as change resistance rooted in imagined catastrophe. The new system will be a nightmare. The reorganization will destroy our team. The process change will double my workload. These forecasts feel real and urgent, but research consistently shows they are inflated. The actual experience is almost always less painful than the prediction. This has a major impact on scope definition and forecasting in projects, because the people providing estimates are filtering their projections through this bias.

The antidote is not to dismiss people’s concerns—they are genuine—but to help them distinguish between a forecast and a fact. Stories from teams who have already gone through the change are powerful here. Concrete data on actual outcomes versus predicted outcomes can recalibrate the emotional forecast. And sometimes you just need to name it: we are probably overestimating how bad this will be, and here is why.

2. Loss Aversion

Kahneman and Tversky’s prospect theory established that losses loom larger than gains. When the same choice is framed as a loss rather than a gain, people make fundamentally different decisions. The fear of losing what you have is psychologically more powerful than the potential of gaining something better.
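Kahneman and Tversky formalized this asymmetry as a value function. A common textbook form looks like the following (the exact parameters vary by study; the figures here come from their 1992 cumulative prospect theory estimates and are illustrative, not definitive):

```latex
v(x) =
\begin{cases}
x^{\alpha} & x \ge 0 \quad \text{(gains)} \\
-\lambda \, (-x)^{\beta} & x < 0 \quad \text{(losses)}
\end{cases}
\qquad \alpha \approx \beta \approx 0.88, \quad \lambda \approx 2.25
```

The loss-aversion coefficient \(\lambda > 1\) is the crux: a loss of a given size is weighted roughly twice as heavily as an equal-sized gain, which is why the same proposal lands so differently depending on whether it is framed as giving something up or getting something new.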

This is why people cling to deprecated processes and outdated tools. The loss of the familiar feels worse than the potential benefit of the new. It is why experienced professionals resist reskilling even when the data clearly shows their current skills are depreciating. The pain of letting go of mastery outweighs the abstract promise of future relevance.

For project leaders, the implication is framing. Present a change as “here is what you are losing” and you will get resistance. Present the same change as “here is what you are gaining” and you have a better shot. And if you can tell a story about someone who made the transition and what it meant for them, you have the best shot of all. Stories tied to value are what shift a fixed mindset around a specific fear. You cannot train somebody out of fear; you have to story-tell them out of it first, breaking the loss-regret cycle so they become open to learning.

3. Confirmation Bias

This is the one that does the most damage in knowledge work. We have a tendency toward what Jonathan Haidt calls the righteous mind: holding firmly to what we believe and selectively processing information that confirms those beliefs. Reading in that state, our brain catches only what connects back to what we already believe.

For business analysts, this is dangerous. If you walk into a requirements session with preconceived notions about how the process works, you will hear the things that confirm your assumptions and miss the things that contradict them. You will produce requirements that reflect your model of the world, not the stakeholder’s reality.

I have seen analysts who deliberately avoid absorbing too much of a business area’s domain knowledge. They learn at a high level what the area does but resist getting deep, because the moment they become a de facto subject matter expert, they start imposing their previous business area’s processes onto the new one. That is not analysis. That is confirmation bias with a methodology certification attached to it.

The countermeasures are structural, not motivational. You cannot willpower your way out of confirmation bias. You need accountability partners who challenge your thinking. You need retrospective practices—a weekly check-in asking what you got wrong, where your assumptions were off, what surprised you. You need the discipline to seek dissenting perspectives rather than avoiding them. Being receptive to opinions that conflict with your own is the critical learning mechanism. And the reality is, a lot of people never learn because they lack that receptiveness.

Coaches are probably the best mechanism to protect yourself from these biases. Mentors can be a great tool as well. A really good manager can help you break habits when you get stuck in confirmation bias around tasks and decisions you are making on autopilot. Or you can wait until you get hit in the face, and then you will have to stop anyway.

“People almost always find what they’re expecting to find if they allow their expectations to guide their search.” — Bart Ehrman

These three biases are not character flaws. They are features of human cognition. The professionals who outperform their peers are not the ones who are immune to them—no one is—but the ones who have built systems to catch them before they do damage.
