Humans like to think of themselves as rational, forward-looking creatures. But when it comes to forecasting the future—even our own—we’re often laughably wrong. From personal choices to global crises, our brains are wired with cognitive shortcuts and emotional biases that lead us to consistently underestimate, overestimate, or misjudge reality. Sometimes, the error is small. Other times, it’s devastating.
Here are 10 things humans are notoriously bad at predicting despite centuries of trying.
10 How Long Tasks Will Take
Whether writing a paper, building a house, or just cleaning the garage, people almost always underestimate how long a task will take, even if they’ve done it before. This mental blind spot is called the planning fallacy, a term coined by Daniel Kahneman and Amos Tversky in 1979. What makes it especially strange is that even when we’re fully aware of our past mistakes, we tend to believe “this time will be different.” It’s a form of irrational optimism that causes individuals, governments, and corporations to overpromise and underdeliver on everything from personal projects to billion-dollar infrastructure.
The fallacy is behind major fiascoes like Berlin Brandenburg Airport, which opened nine years late and billions over budget, and the Sydney Opera House, which took ten years longer than expected. Even software engineers, who often work in iterative cycles and use time-tracking tools, still suffer from it—leading to chronic project overruns in the tech world. The human brain tends to visualize ideal conditions and ignores things like delays, interruptions, or burnout. We imagine best-case scenarios and make plans accordingly.[1]
9 What Will Make Us Happy
Affective forecasting—the ability to predict how we’ll feel in the future—is something humans consistently get wrong. We assume getting a raise will bring long-term joy, a breakup will destroy us, or a dream vacation will permanently boost our mood. In reality, people quickly adapt to both good and bad events. This is known as the hedonic treadmill: the tendency to return to a baseline level of happiness after emotional highs or lows. That promotion, new car, or move to a new city may boost happiness in the short term, but it usually fades faster than expected.
One classic study found that lottery winners and accident victims left paraplegic reported far more similar levels of happiness than anyone expected just a year after their life-changing events. Even marriages, often viewed as a long-term happiness booster, show only a temporary increase in subjective well-being before people revert to baseline. We’re also bad at predicting which things will bring us lasting satisfaction—we focus on status and novelty when research shows that relationships, purpose, and health are better long-term drivers. Yet we keep chasing the wrong carrots.[2]
8 Randomness and Probability
When asked to generate a random sequence or assess chance, humans tend to see patterns where none exist. The gambler’s fallacy—the belief that a run of losses makes a win “due”—has been observed across casinos, sports betting, and even judicial decisions. Judges, for instance, are more likely to rule favorably after a string of negative rulings, as if fairness somehow balances itself out. But random events don’t care about streaks. We’re wired to search for causality, even in pure noise.
This bias has huge consequences. In financial markets, investors chase trends, believe in hot hands, and panic at “corrections” that may be statistical noise. People confuse correlation with causation in health and science, creating pseudoscientific beliefs about vaccines, diets, or weather. Entire industries profit from our inability to process randomness correctly—lotteries, slot machines, and even influencer culture rely on false perceptions of skill or destiny in what are often pure luck distributions. Despite understanding probability intellectually, humans revert to magical thinking under pressure.[3]
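The independence claim at the heart of the gambler’s fallacy is easy to check for yourself. Here is a minimal Python sketch (the helper name and parameters are illustrative, not from any particular study) that simulates a fair coin and compares the overall heads rate with the heads rate immediately after a long run of tails. If past flips mattered, the two numbers would diverge; they don’t.

```python
import random

def heads_rate_after_tail_streak(n_flips=1_000_000, streak=5, seed=42):
    """Simulate a fair coin and compare the overall heads rate with the
    heads rate on flips that immediately follow `streak` tails in a row.
    A fair coin has no memory, so both rates should sit near 0.5."""
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(n_flips)]  # True = heads

    after_streak_heads = 0   # heads observed right after a tail streak
    after_streak_total = 0   # flips observed right after a tail streak
    run_of_tails = 0         # current count of consecutive tails

    for flip in flips:
        if run_of_tails >= streak:  # the previous `streak` flips were all tails
            after_streak_total += 1
            after_streak_heads += flip
        run_of_tails = 0 if flip else run_of_tails + 1

    overall = sum(flips) / n_flips
    conditional = after_streak_heads / after_streak_total
    return overall, conditional
```

With a million simulated flips, both rates land within a fraction of a percent of 0.5: a win is never “due,” no matter how long the losing streak.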
7 How We’ll React Under Pressure
Ask someone how they’d behave in a crisis—say, a plane emergency, a public speech, or a violent confrontation—and most will give a confident, idealized version of themselves. In practice, our stress responses are highly unpredictable. Adrenaline floods the brain, narrowing focus and impairing decision-making. During real disasters, many people freeze, forget training, or follow crowd behavior even when it’s irrational. In aviation psychology, this is called the startle effect, and it’s one of the leading reasons trained pilots make fatal errors during unexpected events.
A tragic example: During the 2010 Love Parade disaster in Germany, thousands of festivalgoers became trapped in a narrow tunnel. Despite signage and crowd-control plans, panic overtook logic, and 21 people were crushed to death. Bystanders who thought they would stay calm did not. Even in fire drills, people often ignore alarms or freeze unless directly told to evacuate. Police officers and soldiers are trained to rehearse responses specifically because instinct is rarely reliable under duress. Imagining heroism is easy; executing it through foggy panic is not.[4]
6 How Much Stuff We’ll Need or Use
We often imagine our future selves as vastly more productive, organized, and varied than they turn out to be. That’s why we stock up on groceries we don’t finish, overpack for trips, and buy workout gear we never touch. Psychologists call this projection bias—the assumption that our current preferences and feelings will continue into the future. We believe we’ll crave variety in meals, want multiple outfit choices on vacation, or be motivated to rotate through books or hobbies. Reality almost never reflects that.
This overestimation gets weaponized by retailers and advertisers. “Buy two, get one free” deals encourage bulk purchasing of perishables. Subscription services count on people forgetting to cancel. Even digital hoarding—saving dozens of tabs, notes, or courses—is rooted in the illusion that we’ll return to them later. Yet, over and over, we discover that our daily routines are simpler and more repetitive than we expect, and the future version of ourselves doesn’t exist as we imagine. We don’t prepare like realists—we prepare like characters in an idealized version of our lives.[5]
5 Our Own Future Behavior
We’re strangely confident in our ability to predict how we’ll act tomorrow, next week, or a year from now—even though we routinely get it wrong. This is driven by what psychologists call the empathy gap: We assume our future selves will have the same motivation, energy, and values as we do in the present. That’s why we sign up for gym memberships we won’t use, promise to stop procrastinating, or think we’ll meal-prep every Sunday starting now. Our future self, in our mind, is practically a different species: more disciplined, less distracted, and immune to craving.
In behavioral economics, this gap explains why people make unrealistic commitments, such as taking on large student loans with little planning or setting aggressive savings goals they can’t maintain. Even dieters and smokers consistently underestimate how strong their future impulses will be. We also misjudge how we’ll respond emotionally—believing we’ll be less angry or more forgiving than we end up being. Our past behavior, ironically, is a much better predictor of our future than any idealized version of ourselves—but we almost never believe it.[6]
4 The Pace of Long-Term Change
Humans operate with an immediacy bias—we expect big change to happen fast and gradual change to be irrelevant. This leads to two simultaneous errors: overhyping short-term technologies and underestimating long-term transformations. Tech futurists once predicted flying cars and household robots by 2000 but missed how the internet would reshape language, politics, and relationships. We get distracted by the flashiest invention without noticing that the truly transformative stuff arrives slowly, almost invisibly.
This also explains our failure to act on creeping problems like climate change, antibiotic resistance, or infrastructure decay. Because the damage is incremental, we push solutions down the road. Meanwhile, trends like aging populations or rising sea levels quietly build up pressure until they cross a threshold. We’re evolutionarily tuned to respond to immediate threats, not data curves. This cognitive lag leaves us constantly surprised by events that were entirely predictable over time.[7]
3 What Others Are Thinking
Most people assume they have a pretty good sense of what others think of them—whether they seem likable, annoying, confident, or smart. However, social psychology research has repeatedly shown that people are terrible at reading minds. This includes friends and partners, not just strangers. We overestimate how much people understand our jokes, intentions, or emotions, a phenomenon known as the illusion of transparency. We also project our own knowledge onto others, assuming they see the same context we do.
These assumptions cause frequent miscommunication in everything from relationships to workplace dynamics. Leaders think they’ve explained their expectations clearly, but team members misinterpret them. Couples think their signals are obvious when they’re not. One study even found that people can’t reliably predict how happy or upset someone will be from a gift—despite being confident they can. The constant mismatch between perception and reality fuels everything from awkward conversations to diplomatic failures, yet we rarely notice it happening.[8]
2 Global Crises and Their Impacts
Despite historical precedent, expert warnings, and risk models, governments and institutions consistently fail to predict or prepare for major global crises. The COVID-19 pandemic is a prime example. Public health agencies had simulated coronavirus-like outbreaks for years. Yet most countries lacked the stockpiles, testing capacity, and communication infrastructure to respond effectively. In 2006, a U.S. Homeland Security report outlined almost exactly what happened in 2020—and was largely ignored.
The same applies to financial meltdowns. The 2008 housing crash caught Wall Street and regulators by surprise despite clear signs like subprime lending and ballooning debt ratios. Normalcy bias—the belief that tomorrow will look like today—blinds decision-makers to fast-building systemic risks. Even now, experts warn about cybersecurity threats, food system fragility, and climate-related displacement. However, most plans remain underfunded or theoretical. When disaster does strike, the response is treated as unforeseeable, even if it was outlined in detail years prior.[9]
1 Our Own Mortality
The ultimate blind spot in human prediction is death. While we intellectually accept that we’ll die, we often act like it’s something that happens to other people. This denial shapes everything from financial planning to end-of-life care. Surveys show most people want to die at home surrounded by loved ones—but never document that legally. Fewer than 1 in 3 adults in many countries have a will, and even fewer have advance medical directives. Our inability to confront mortality leaves family members guessing in crisis moments.
Psychologists refer to this as terror management theory—the idea that our fear of death shapes behavior so deeply that we construct elaborate systems (religious, cultural, or personal) to avoid thinking about it. Ironically, people who do face mortality—through illness, loss, or crisis—often report higher clarity and peace afterward. But for most, it remains a topic to be delayed, downplayed, or joked about. Even when death is statistically predictable, we’re too busy believing we’ll be the exception.[10]
fact checked by Darci Heikkinen