Blind Spots & Bad Decisions: Why We Fall for Financial Traps
Why do smart people make dumb financial choices? This lecture explores the surprising link between our psychology and money mistakes. We will see how fear, overconfidence, and even our desire to be liked can cloud our judgment, especially when dealing with financial "experts". Learn how these psychological blind spots worsen conflicts of interest, and how to make smarter financial decisions, free from emotional influence.
Blind Spots & Bad Decisions: Why We Fall for Financial Traps
Raghavendra Rau, University of Cambridge
Monday 02 June 2025
Introduction
My name is Raghavendra Rau and I’m a professor at the University of Cambridge. This is the fifth lecture in our series on the human side of finance. In the past four lectures, we’ve explored how conflicts of interest and asymmetric information shape financial decision-making. We’ve looked at how advisors, managers, and insurers—what we might call the “supply side” of finance—sometimes act in their own interests rather than those of the people they’re supposed to serve. But today, we’re going to flip the lens.
Because sometimes, the problem isn’t just with the people giving the advice. Sometimes, the problem is with the people taking it.
We like to think we make financial decisions logically. That if someone gives us good information, we’ll use it well. But in reality, we’re often our own worst enemies. We anchor on irrelevant numbers. We avoid selling losing stocks because it feels like admitting defeat. We follow the crowd into bubbles and buy high, then panic and sell low. And then we tell ourselves it made sense all along.
This lecture is about those mental blind spots: the predictable psychological traps that trip us up, even when we have the information we need. These aren’t just random errors. They’re systematic. And they’re well-documented in decades of research in behavioral finance.
We’ll start by looking at two kinds of psychological bias: belief biases, which affect how we interpret information, and preference biases, which affect how we feel about outcomes. We’ll see how things like overconfidence, loss aversion, and mental accounting lead people to make choices that seem irrational in hindsight—but are entirely human in the moment.
Then we’ll connect these ideas back to earlier lectures. This is important because these behavioral biases don’t just make us vulnerable to bad decisions. They also make it easier for others to exploit us. When you’re already anchored to a high price, it’s easier for a broker to justify a bad deal. When you hate losses more than you like gains, you’re an easy target for insurance up-selling.
This lecture is about learning to spot those traps—not just in the market, but in our own heads.
How Classical Finance Assumes We Think
Let’s start with the basics. Classical finance, what you’ll find in most textbooks and MBA classrooms, assumes that people are rational. This means two things: first, that we form beliefs based on all available information, and second, that we make decisions by weighing costs and benefits to maximize our well-being.
The first part—rational beliefs—means we’re supposed to update our views logically when new data comes in. If you find out that inflation is rising faster than expected, you should lower your estimate of future bond returns. This is called Bayesian updating, and it’s a fancy way of saying, “change your mind when the facts change.”
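For reference, the formal rule behind that phrase is Bayes’ theorem. The lecture itself stays informal, so this is just the standard textbook statement:

```latex
% Standard statement of Bayes' theorem
P(H \mid D) = \frac{P(D \mid H)\, P(H)}{P(D)}
```

Read H as a hypothesis (“future bond returns will be high”) and D as the new data (the inflation surprise): the updated belief P(H | D) scales the prior belief P(H) by how well the hypothesis predicts the data.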
The second part—rational preferences—means we should choose consistently. If you prefer £10 today over £11 tomorrow, you should also prefer £10 in a year over £11 in a year and a day: adding the same delay to both options shouldn’t flip your choice. That’s called time-consistent decision-making. Economists call the whole package “Subjective Expected Utility” theory—basically, the idea that people behave like well-calibrated decision engines.
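In symbols, the well-calibrated decision engine ranks any gamble by the probability-weighted utility of its outcomes. Again, this is the standard textbook formula rather than anything specific to this lecture:

```latex
% Subjective expected utility of a gamble with outcomes x_i
U = \sum_i p_i \, u(x_i)
```

Here the p_i are the decision-maker’s subjective probabilities for the outcomes x_i, and u is a stable utility function. The biases in Parts I and II of this lecture are, in effect, systematic departures from the p’s and the u in this formula.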
The entire field of modern finance was built on these assumptions. Asset pricing models like the Capital Asset Pricing Model (CAPM), portfolio theory, and even the idea of market efficiency—all of them depend on the belief that people interpret information correctly and make optimal choices.
Now, to be fair, these models aren’t wrong. They’re just incomplete. They’re useful simplifications, like assuming no friction in physics. They help us understand how markets should work. But when you look at how people actually behave, the cracks start to show.
Because in practice, we don’t always update our beliefs correctly. We’re overconfident. We anchor on irrelevant information. We see patterns in noise. And we don’t always choose consistently either. We hate losses more than we enjoy gains. We mentally separate our money into imaginary “accounts.” And we let emotions—like fear and regret—drive our choices.
That’s where behavioral finance comes in. It doesn’t throw out classical models. Instead, it tries to understand where and why real people deviate from them. It brings in ideas from psychology and decision science to explain why markets sometimes look irrational—not because they are broken, but because they are human.
In the next part of the lecture, we’ll dig into these biases—starting with the ones that affect our beliefs.
Part I – Belief Biases: When We Misinterpret the World
Let’s begin with the ways we misjudge the world around us. These are belief biases—distortions in how we process information. The result? We misread signals, overweight the wrong cues, and feel far more confident than we should.
Overconfidence: I Know More Than I Actually Do
One of the most well-documented behavioral biases is overconfidence. Most people think they’re above average—at driving, at managing money, even at predicting the stock market. We can’t all be, of course. But it’s a deeply human trait.
In finance, overconfidence shows up most clearly in trading behavior. In a classic study, finance economist Terrance Odean found that investors who traded more frequently actually earned lower returns than those who traded less. Why? Because they were too confident in their ability to time the market or pick winners. They bought high, sold low, and paid too much in transaction costs. In a follow-up study with Brad Barber, male investors—who tend to be more overconfident—underperformed female investors because they traded more often.
Overconfidence doesn’t just hurt individual portfolios. It creates excessive trading volume across the market. It contributes to bubbles. And it leads investors to ignore the advice of professionals—not because they’ve assessed the advice, but because they’re convinced they already know better.
Representativeness and Availability: Seeing Patterns That Aren’t There
Another major bias is representativeness—the tendency to judge probability based on similarity. If something looks like a tech stock, we assume it will behave like other tech stocks. If a company had strong earnings growth last quarter, we project that growth will continue—without checking if that’s realistic.
This is closely linked to availability bias—our tendency to judge things based on what’s easy to remember. If you’ve recently heard about a market crash, you’ll probably overestimate the chances of another one. If your friend just made a killing on crypto, you’ll believe that’s a likely outcome—even if most investors lost money.
Both of these biases are rooted in how our brains evolved. We’re wired to spot patterns and jump to conclusions. That’s useful when you’re trying to avoid predators in the wild. But in financial markets, it leads us to overreact to short-term news and underreact to slow, structural change.
Anchoring: Getting Stuck on Irrelevant Numbers
Have you ever hesitated to sell a stock because it’s below the price you paid for it? That’s anchoring. You fixate on a number—like your purchase price—even though it has no bearing on what the stock is worth today.
Anchoring affects all kinds of financial decisions. Real estate investors anchor on previous valuations. Consumers anchor on list prices. And analysts sometimes issue target prices that are influenced by arbitrary reference points—like the stock’s recent high.
In a famous experiment, MBA students were first asked whether they would buy a bottle of wine for a price equal to the last two digits of their Social Security number, and then how much they were actually willing to pay. The result? Students whose numbers ended in higher digits gave higher valuations. The anchor—completely unrelated to wine—skewed their judgment.
Confirmation Bias: Seeing What We Want to See
Once we’ve formed a belief, we tend to stick with it—even in the face of contradictory evidence. This is called confirmation bias. We selectively gather information that supports our view and ignore or downplay anything that challenges it.
If you believe a company is undervalued, you’ll focus on news that reinforces that belief. You’ll dismiss negative earnings reports as temporary. And you’ll spend more time reading bullish commentary than bearish analysis.
This bias can be especially dangerous in the age of social media and algorithm-driven newsfeeds. Once you click on one article, you get shown more of the same. Before long, you’re in an echo chamber, surrounded by information that confirms what you already believed.
These belief biases are powerful—and they’re hard to spot in ourselves. But they shape how we interpret the world, how we respond to financial news, and ultimately, how we invest. In the next section, we’ll look at preference biases—the emotional side of decision-making: how we feel about gains, losses, and risk.
Part II – Preference Biases: When Emotion Overrides Logic
Now let’s turn to the second category of traps: preference biases. These aren’t about how we interpret information — they’re about how we value outcomes. Even when we correctly understand a situation, we can still make poor decisions because of how we feel about the possible gains and losses.
Loss Aversion: Losses Hurt More Than Gains Feel Good
Imagine I offer you a coin toss: if it’s heads, you win £100; if it’s tails, you lose £100. Most people reject that bet—even though the expected value is zero. In fact, studies show people generally need to be offered a potential gain of £200 to accept a 50/50 bet where they could lose £100. That’s because, psychologically, losses hurt about twice as much as equivalent gains feel good.
This idea comes from Prospect Theory, developed by Daniel Kahneman and Amos Tversky. It explains why we take excessive risks to avoid losses and why we’re overly cautious when we’re ahead. For example, investors often hold on to losing stocks for too long, hoping they’ll bounce back, and sell winners too early to “lock in gains.” That’s known as the disposition effect.
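A stylised version of the theory’s value function shows where the 2:1 figure comes from. Take the loss-aversion coefficient to be exactly 2 for simplicity (Kahneman and Tversky’s empirical estimate is a little higher, around 2.25):

```latex
% Piecewise-linear value function with loss aversion of 2
v(x) = \begin{cases} x, & x \ge 0 \\ 2x, & x < 0 \end{cases}

% The symmetric coin toss has negative value and is rejected:
\tfrac{1}{2} v(100) + \tfrac{1}{2} v(-100) = 50 - 100 = -50

% A 2:1 payoff is exactly break-even:
\tfrac{1}{2} v(200) + \tfrac{1}{2} v(-100) = 100 - 100 = 0
```

Anything short of a £200 upside leaves the coin toss with negative value, which is why a possible £100 loss needs roughly a £200 gain to feel worth taking.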
This bias is especially powerful during market downturns. After a crash, the fear of further losses often leads people to sell at the bottom—just when long-term investors should be buying.
Mental Accounting: Not All Money Feels the Same
Another deeply human trait is mental accounting. This is our tendency to treat money differently depending on where it comes from or how we label it.
Let’s say you get a £1,000 bonus at work. You’re likely to treat it as “fun money” and spend it more freely than your regular salary, even though it’s exactly the same in financial terms. Inheritances, gambling winnings, and tax refunds often go into different “mental accounts” than savings or salary.
This can lead to irrational decisions. For example, many investors view dividend income as safer or more appropriate to spend than selling an equivalent amount of shares. Others refuse to touch an inheritance for emotional reasons, even when it would help with urgent expenses.
A striking real-world example comes from a study of New York City taxi drivers. They tend to set daily income targets and quit for the day once they hit that goal—even if demand is still high and they could earn more. On slow days, they work longer to reach the target. Economically, this makes no sense: you should work longer on high-demand days. But mentally, they’ve budgeted how much they need to “win” that day.
Narrow Framing: Evaluating Risks in Isolation
Related to mental accounting is narrow framing — the tendency to evaluate decisions one at a time, rather than in the context of a broader strategy.
A famous example comes from Paul Samuelson. He once offered a colleague a bet: win $200 on heads, lose $100 on tails. The colleague refused—but said he’d accept the same bet if allowed to play 100 times. Statistically, the risk becomes negligible over many rounds. But because most people focus on each bet in isolation, they reject it.
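To see just how negligible, here is a quick calculation. This is a sketch in Python; the stakes follow Samuelson’s bet, and the function name is invented for illustration:

```python
import math

def prob_net_loss(n=100, win=200, lose=100, p=0.5):
    """Probability of ending n independent plays of the bet with less
    money than you started: w wins pay w*win, each loss costs lose."""
    # The net result is negative when w*win - (n - w)*lose < 0,
    # i.e. when the number of wins w < n*lose / (win + lose).
    max_losing_wins = math.ceil(n * lose / (win + lose)) - 1
    return sum(math.comb(n, w) * p**w * (1 - p) ** (n - w)
               for w in range(max_losing_wins + 1))

print(prob_net_loss(n=1))    # 0.5 -- a single play loses half the time
print(prob_net_loss(n=100))  # ~0.0004 -- about four chances in 10,000
```

Half the time you lose money on a single play; over 100 plays, the chance of coming out behind at all is around 0.04%.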
This kind of thinking leads investors to avoid diversification, because they’re afraid of individual losses, even when the overall portfolio is strong. It also explains why people are reluctant to rebalance—even though shifting from winners to losers is often the rational move.
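As a concrete picture of what that rational move looks like, here is a minimal sketch of rebalancing to fixed target weights. The 60/40 portfolio, the figures, and the function name are all invented for the example:

```python
def rebalance_trades(values, target_weights):
    """Currency amounts to trade so each asset returns to its target
    weight: positive numbers are buys, negative numbers are sells."""
    total = sum(values.values())
    return {asset: target_weights[asset] * total - value
            for asset, value in values.items()}

# After a strong year for equities, a 60/40 portfolio has drifted to 70/30.
holdings = {"equities": 70_000, "bonds": 30_000}
targets = {"equities": 0.60, "bonds": 0.40}
print(rebalance_trades(holdings, targets))
# {'equities': -10000.0, 'bonds': 10000.0} -- trim the winner, top up the rest
```

Narrow framing makes that first trade feel like punishing a winner; viewed at the portfolio level, it simply restores the risk profile you originally chose.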
Ambiguity Aversion: Preferring the Known to the Unknown
Finally, there’s ambiguity aversion—our discomfort with uncertainty when we don’t even know the probabilities.
If I offer you a choice between:
- A: A 50% chance to win £100, or
- B: A chance to win £100, but you don’t know the odds…
Most people choose A, even if the actual chance in B could be the same or better. We prefer known risks over unknown ones—even when the unknown might be favorable.
In finance, this explains things like home bias, where investors overweight domestic stocks because they feel more familiar, even though international diversification might reduce risk. It also explains why people avoid newer investment products or alternative asset classes, simply because they seem unfamiliar.
So to sum up: belief biases distort how we see the world, and preference biases distort how we respond to it. Together, they help explain why people make financial decisions that look irrational from the outside—but feel entirely justified at the time.
In the next section, we’ll see how these biases not only lead us astray but also make us easier to exploit—especially by financial professionals who know how to press the right psychological buttons.
Psychological Bias Meets Agency Conflict
Now let’s take a step back. In the earlier lectures, we focused on conflicts of interest. We looked at how fund managers, CEOs, and financial advisors might not always act in the best interests of the people they’re supposed to serve. These were what we called agency problems—situations where one party has more information or control than the other, and can exploit that advantage.
But here’s where it gets worse. Even if we know there’s a potential conflict, our own psychological biases make us more likely to fall into the trap anyway. We’re not just passive victims of asymmetric information—we’re active participants in our own misjudgment.
Let’s take a few examples.
Start with overconfidence. Investors often believe they’re better at evaluating funds or financial advice than they actually are. So they might ignore red flags—like excessive fees, high turnover, or vague explanations—because they think they can “see through the sales pitch.” A good advisor will help you stay disciplined. A clever one, who’s not acting in your best interest, will exploit that confidence to upsell you structured products, variable annuities, or actively managed funds that generate high commissions.
Then there’s framing. Many complex financial products are sold not on their actual merits, but on how they’re presented. A structured note might be advertised as “principal protected,” even though the fine print reveals significant risks. A fund might report “average annual returns” that exclude poor-performing periods—exploiting our tendency to see performance in a narrow frame. You saw earlier how even changing a label from “rebate” to “bonus” can affect how likely people are to spend the money. Now apply that logic to product disclosures.
Loss aversion plays a huge role, too. Insurance sales rely heavily on our discomfort with loss. We buy extended warranties not because they’re cost-effective, but because the idea of having to pay out of pocket for a rare breakdown feels worse than simply overpaying now. Similarly, advisors may discourage selling a losing investment—not because it’s the right strategy, but because they know you’ll hesitate to realize a loss.
And then there’s mental accounting. Advisors who understand how we mentally bucket our money can nudge us toward suboptimal decisions. For example, they might recommend dividend-paying stocks for “income” portfolios, even when a total return strategy would be better. Or they might use “capital preservation” language to frame high-fee products aimed at retirees, knowing that older investors often treat retirement savings as sacrosanct.
The common thread here is this: psychological biases don’t just affect how we manage our own money. They also shape how we interact with financial professionals, especially in situations where the interests of advisor and client aren’t perfectly aligned.
That’s what makes these traps so dangerous. We’re not just misled by others — we’re primed to mislead ourselves. And that’s a big reason why behavioral finance matters: it doesn’t just explain our mistakes. It helps us understand why those mistakes are so easy to exploit.
Ethical and Social Dimensions: Why We Don’t Push Back
At this point, you might be wondering: if the financial system is full of conflicts, and we know our own biases make us vulnerable, why don’t we just stop? Why don’t more people walk away from bad advice? Why don’t more investors ask hard questions or switch advisors?
The answer isn’t just cognitive; it’s emotional and social.
One reason is that we want to think well of ourselves. That means we’re highly skilled at rationalizing. When something feels wrong—like a confusing product pitch or a strangely high fee—we often ignore the discomfort, tell ourselves it’s fine, and move on. This is what behavioral ethics researchers call self-serving bias: the tendency to interpret ambiguous situations in a way that protects our self-image.
Another reason is conformity. People look to social cues when making decisions. If your friends are all chasing meme stocks or investing in crypto, you may feel pressure to follow along—even if you have doubts. The classic Asch experiments showed that people will go along with clearly incorrect answers just to fit in. In financial decisions, this kind of social pressure is subtle but powerful.
Then there’s obedience to authority. We’re often reluctant to challenge perceived experts. In Milgram’s famous experiments, participants delivered what they believed were dangerous electric shocks simply because they were told to. In finance, a confident advisor—especially one with credentials or social status—can easily override a client’s gut instincts.
There’s also transparency—or the lack of it. In many cases, the structure of the system itself makes it hard to spot misconduct. You might not realize you're paying high fees or taking on hidden risks. Even when you suspect something’s wrong, the fog of complexity makes it easy to look away. If no one else seems alarmed, you might assume it’s fine.
Finally, our desire to avoid social friction plays a role. It’s awkward to ask someone, especially a long-time advisor, “Are you really acting in my best interest?” So we stay quiet. We defer. We nod. And sometimes, we regret it later.
In short, financial decisions aren’t made in a vacuum. They’re made in a social environment, under pressure, with limited information—and a powerful internal desire to avoid conflict, discomfort, and shame.
This makes us vulnerable—not just to bad advice, but to staying silent when we know better.
Takeaways
Let’s step back and take stock.
So far in this series, we’ve talked about how the financial system often fails investors—not because of bad luck, but because of built-in conflicts of interest. Banks, fund managers, insurers—they don’t always act in ways that align with your best interests. We’ve seen how asymmetric information and misaligned incentives lead to problems like hidden fees, poor product design, and outright exploitation.
But in this lecture, we’ve seen a different side of the problem: the part that lives inside our own heads.
We don’t make decisions like robots. We anchor on irrelevant numbers. We fear losses more than we value gains. We trust our instincts too much, follow the crowd, and avoid uncomfortable conversations. These psychological blind spots are not rare exceptions—they’re the norm. And they don’t just cause honest mistakes. They make it easier for others to manipulate us—even when we have all the information in front of us.
That doesn’t mean we’re doomed. It just means we need to approach financial decisions with more humility and self-awareness. We need to understand not just the structure of the market, but the structure of our own thinking.
Next time, in the final lecture of this series, we’ll talk about what to do with this knowledge. We’ll look at how behavioral insights can actually help us make better financial decisions. Not just by avoiding bad ones—but by designing systems, habits, and environments that make better choices easier.
Because while our brains may come with built-in bugs, we can still build guardrails. And with the right tools, we can learn to outsmart the system—and even outsmart ourselves.
© Professor Raghavendra Rau, 2025
References and Further Reading
- Daniel Kahneman – Thinking, Fast and Slow
The foundational text on cognitive biases and decision-making, including Prospect Theory and System 1 vs. System 2 thinking.
- Richard Thaler – Misbehaving: The Making of Behavioral Economics
A personal and entertaining account of how behavioral economics developed and how real human behavior diverges from textbook models.
- Dan Ariely – Predictably Irrational
A readable introduction to common behavioral quirks in everyday and financial decisions.
- Robert Shiller – Irrational Exuberance
A classic examination of market bubbles and investor psychology, especially relevant to meme stock dynamics and crowd behavior.