People often mistake luck for skill without realizing it. We perceive patterns and plans in randomness and believe coincidences must have a cause. We confuse noise with actual signals, forecasts with sure predictions, and guesses with certainties. We might overlook the role chance plays and credit someone as a skilled investor when luck played a big part in their success. Consequently, we expose ourselves to risks and randomness that we could have seen but didn't.
In Fooled by Randomness, bestselling author and former options trader Nassim Nicholas Taleb explores the significant role luck plays in success. He explains why people often misinterpret luck. He also discusses how we can handle randomness in our lives once we recognize it. This book is the starting point of the five-book series entitled Incerto, which explores various facets of randomness.
While Taleb mainly discusses the world of investing, his insights apply broadly to fields ruled by unpredictability like economics and politics. He sheds light on how randomness deceives us in many aspects of our lives.
Now, let's delve into the core insights from Fooled by Randomness by breaking down the book into four parts:
- In Part 1, we'll explore how luck and rare events influence success and failure.
- In Part 2, we'll look at the incorrect ways we often think about randomness, risk, and probability.
- In Part 3, we'll examine why we tend to have these misconceptions.
- In Part 4, we'll discuss how to navigate through life and work while acknowledging the presence of randomness.
Part 1: Luck Plays a Big Role in Extreme Success
When we talk about luck, we are referring to randomness—more specifically, to "rare events": infrequent and usually unpredictable events that bring huge gains or severe losses. Rare events can significantly impact success, but not always in a fair or predictable manner. Sometimes, they hand success to less competent individuals. At other times, they snatch it away from those who’ve had long winning streaks.
This notion holds especially true in industries that rely heavily on chance, such as investing, compared to professions like carpentry or medicine, which depend more on perseverance and skill. People often fail to recognize the fundamental difference between these two types of jobs: chance-based jobs may lead to success in an unpredictable manner, while skill-driven jobs offer a steadier pathway to success. This misunderstanding, along with the tendency to attribute success to skill instead of luck, can lead people to make poor decisions.
In Part 1, we’ll explore how people confuse randomness with skill, and how rare events can influence success and failure.
We Often Confuse Luck with Skill
When we witness someone achieving incredible success, we often credit that success to a mix of skill, hard work, intelligence, and some unique traits that mold millionaires. However, skill and hard work generally lead to average success. Remarkable success, such as earning millions of dollars or gaining lasting fame, usually comes from luck: a favorable unexpected event or the absence of unfavorable ones.
This doesn't mean luck is the only factor in a successful career. To make the most of luck, you need a solid foundation including skill, experience, and a good presentation. These are crucial steps: being punctual, dressing appropriately, and working hard. Yet, they don't explain why some people achieve runaway career success while others don't. Similarly, buying a lottery ticket is a necessary step, but it doesn’t explain why one person wins over another.
When people confuse luck with skill, they confuse what's necessary with what's causal. For example, it’s necessary to dress neatly for work, but that didn’t cause you to make a huge profit on your last trade. Millionaires necessarily work hard and take risks, but not all hardworking risk-takers become millionaires.
History provides ample evidence. Though Julius Caesar was undoubtedly smart, noble, and brave, many others with similar traits never reached his level of success. His personal traits were necessary for his achievements but not sufficient to explain his lasting fame. His repeated luck of being in the right place at the right time played a significant role.
Survivorship Bias Skews Our Perception
One reason people mistake luck for skill is the "survivorship bias." We usually see only the people who have "survived" or thrived in a certain situation, and draw lessons from their success. As a result, we start to believe that extraordinary success is common in that specific industry or venture. Survivorship bias misleads us partly because the hugely successful examples are more visible, while failures often fade away unnoticed. When we don't see the failures, we forget they're possible.
This bias leads people to view examples of huge success as typical of what anyone can achieve in that industry. For instance, seeing a very wealthy stockbroker might make someone think, "Trading is very profitable." Or, seeing a bestselling author might lead to the thought, "Writing is a great way to get rich."
However, to gauge the real potential for success in any venture, you need to look beyond the visible success stories. Consider the unseen failures: what could have happened if luck hadn’t favored these successful individuals.
For instance, understanding the chances of getting rich as a trader requires considering the many people who tried and failed. Similarly, to judge your chances of wealth through writing, think about all the authors who couldn’t find a publisher or whose books sold poorly.
Thinking about these unseen failures helps you explore different scenarios. What if the market hadn’t soared 500 points that morning? What if the large investment you made in a particular stock hadn’t paid off?
The Evolution of Possible Worlds Thinking
The concept of exploring possible worlds has gained traction in various fields of study:
- In philosophy, theories around possible worlds delve into the idea that God might have created an endless array of possible worlds, but chose to actualize just one.
- In physics, the many-world theory in quantum mechanics suggests that the universe branches into multiple paths at every decision point, with us residing in just one of these multiple universes.
- In economics, the "state-space" method tackles economic uncertainty by exploring the "what-ifs" that could arise under varying market or global conditions.
- In mathematics, a discipline often linked to investing, Monte Carlo methods generate hypothetical series of alternative paths: the sequences of events that might have unfolded after a certain starting point. A computer program called a Monte Carlo generator creates a vast range of these possible paths, letting users explore thousands of potential outcomes under specified conditions. These simulations are used in diverse areas, from war planning to financial markets, helping policymakers and business professionals calculate probabilities without complex mathematical formulas.
Of course, you don't need an actual Monte Carlo generator to ponder over possible worlds. When faced with a decision, envision the possible outcomes. Then, take it a step further by imagining the subsequent possibilities each outcome could lead to. This practice helps you recognize that the desired outcome is merely one among a myriad of potential scenarios.
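You can sketch a toy Monte Carlo generator in a few lines. Everything here is an illustrative assumption (the starting value, the 252 steps, the plus-or-minus 1 percent moves); the point is only that any single realized history sits inside a wide spread of possible ones:

```python
import random

def sample_paths(start=100.0, steps=252, n_paths=1000, seed=42):
    """Generate alternative histories: each path is one 'possible world'
    produced by the same underlying random process."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_paths):
        value = start
        path = [value]
        for _ in range(steps):
            # each step: a small random gain or loss (an assumed +/-1% move)
            value *= 1 + rng.choice([-0.01, 0.01])
            path.append(value)
        paths.append(path)
    return paths

paths = sample_paths()
finals = sorted(p[-1] for p in paths)
print(f"best outcome:  {finals[-1]:.1f}")
print(f"median:        {finals[len(finals) // 2]:.1f}")
print(f"worst outcome: {finals[0]:.1f}")
```

Running this shows the spread directly: the same process that produces the best final value also produces the worst, and the outcome you happen to observe is just one draw from that distribution.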
It’s Hard to Spot Possible Worlds in Real Life
It's straightforward to see the randomness in a hypothetical game like Russian roulette. In this game, you could earn fifty million dollars by firing a gun loaded with one bullet and five empty chambers at your own head. The conditions here are simple, well-defined, and undeniably random.
However, spotting randomness is trickier in professions that produce mainstream incomes but lack such clear parameters, like trading. Despite this, the random nature of these professions is no less influential. Studies have shown that randomly selecting stocks by throwing darts at a list can be as successful as selections made by seasoned professionals.
Recognizing random influences is crucial, making it essential to make a conscious effort to do so. In the real world, the fatal "bullet" is less frequent, and its risks are more vague and challenging to identify. People often find themselves playing a sort of Russian roulette without realizing it—they engage in luck-based activities, mistakenly believing they are skill-based. When the recipients of this random luck become role models, others are drawn into the game, overestimating their chances of success.
By developing a habit of considering possible worlds and alternate paths, you can broaden your perspective. Every choice presented to you will unfold a myriad of possibilities, enabling you to "learn from the future." This newfound awareness will aid in better risk assessment and reduce the allure of the roulette wheel.
Rare Events Have an Outsized Influence
When we discuss luck's role in success or failure, we are specifically referring to rare events and their significant influences on any given path. A person can be propelled to great success if she catches a positive rare event, like an unlikely and highly profitable trade. On the other hand, she might spend every day making winning bets, only to lose everything in a few minutes when a rare bad bet wipes out all her previous winnings.
A notable example in the trading world is when a trader makes a trade that erases her capital, leading to her dismissal or exit from the investment field. This scenario is termed "blowing up." It implies losing more money than anticipated or prepared for, often enough to end a trader's career. Most traders experience a blow-up at some point. In fact, the 10-year survival rate for traders is under 10 percent.
History is full of rare events, yet it's impossible to pinpoint exactly when and where they will occur. This unpredictability often leads people to overlook the probability of rare events. It's challenging to plan for something unpredictable and to understand occurrences that don't follow usual patterns.
However, lacking an understanding of how rare events work and their significant impact means you won't be able to assess risks and opportunities accurately or perceive what's shaping the world around you. Rare events and randomness:
- Determine track records.
- Allow less qualified competitors to win.
- Influence the evolution of businesses.
- Reveal that wild success built on luck isn't sustainable.
These ideas are explored further below.
Randomness Shapes Track Records
A strong track record doesn't guarantee future success in luck-driven professions like investing. The nature of randomness and rare events means that, from a large initial group, some individuals will emerge as extraordinarily fortunate, thus possessing impressive track records, regardless of their actual skills or competence.
Consider a Monte Carlo simulation involving 10,000 traders. Each trader has a 50 percent chance of making or losing money annually. If a trader loses money, she exits the game. After year one, 5,000 traders remain. By year five, just over 300 traders remain. The success of these 300 could be purely due to luck, yet in real life, they would be praised for their skill.
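This thought experiment is easy to reproduce. Here is a minimal sketch, assuming nothing beyond the 50 percent coin flip described above:

```python
import random

def surviving_traders(n_traders=10_000, years=5, p_win=0.5, seed=1):
    """Count traders who make money every single year by pure chance."""
    rng = random.Random(seed)
    survivors = n_traders
    for year in range(1, years + 1):
        # every surviving trader flips the same fair coin; losers exit
        survivors = sum(1 for _ in range(survivors) if rng.random() < p_win)
        print(f"after year {year}: {survivors} traders remain")
    return survivors

lucky = surviving_traders()
# the expected value agrees with the text: 10,000 * 0.5**5 = 312.5,
# i.e. "just over 300" coin-flippers with flawless five-year records
print(f"expected by pure probability: {10_000 * 0.5 ** 5:.1f}")
```

Every one of those survivors has a perfect track record, and not one of them has any skill at all.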
To accurately assess someone's track record, knowing the size of the original group they emerged from is crucial. Success derived from a large initial set might be purely down to luck as some individuals will avoid negative rare events longer. Conversely, success from a smaller set is likely more indicative of skill. In a simulation with just 10 traders, for instance, most or all would be eliminated within three to four years. So, if one of them succeeds beyond five years, it's likely due to skill rather than luck.
Further emphasizing this point, consider a scam exploiting this statistical concept to fake a winning track record. Suppose someone sends letters to 10,000 random individuals in January. Half predict a market rise next month, and half predict a fall. In February, only those who received a correct prediction (5,000) get a new letter, again with split predictions. This process repeats monthly. By June, around 300 recipients remain, likely impressed by the sender's supposed predictive prowess, not realizing the underlying randomness. Consequently, they might be swayed into making ill-advised decisions, like entrusting their funds to this individual.
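The scam's arithmetic is deterministic: each month's market outcome halves the pool of recipients who have seen only correct predictions.

```python
# 10,000 people receive the January letter; each month, only the half
# who were sent the correct prediction get the next letter
recipients = 10_000
for month in ["February", "March", "April", "May", "June"]:
    recipients //= 2
    print(f"{month}: {recipients} people have seen only correct predictions")
```

After five halvings, 312 people remain, the "around 300" recipients who have watched the sender call the market correctly five times in a row without a single miss.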
Randomness Allows Less Qualified Competitors to Win
Because of the nature of randomness, life can be unfair: the most capable companies and the most skilled people are not always the ones who achieve success. Let's extend the earlier simulation, but this time, each trader has only a 45 percent chance of making money each year. Despite being less competent, nearly 200 traders would still be standing at the end of five years, each celebrated for their "success."
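The expected count follows directly from the probabilities; a quick check using the figures from the simulation described above:

```python
# even traders with worse-than-coin-flip odds survive in large numbers:
# expected survivors after five years of 45 percent annual win rates
n_traders = 10_000
p_win = 0.45
survivors = n_traders * p_win ** 5
print(f"expected survivors: {survivors:.0f}")  # roughly 185, "nearly 200"
```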
This scenario becomes more interesting when considering that in the real world, unlike in simulations, early success often lays the groundwork for subsequent success. For instance, in a coin flip, your odds remain the same with each flip. However, in real life, an early stroke of luck can set you up for more favorable chances down the line, while early misfortunes can lead to a string of losses.
Take the journey of two comparable companies. Their paths might diverge significantly if one company has early chance encounters with key stakeholders or secures unexpected product orders. Microsoft's success illustrates this phenomenon well. Bill Gates, while undeniably diligent and smart, faced many rivals with similar traits. Yet, it wasn’t necessarily superior software that propelled Microsoft ahead; it was more likely early advantages from chance events and positive feedback loops.
This phenomenon is known as a "path-dependent outcome." It highlights why we witness a handful of remarkable successes amidst a vast sea of failures. The story of the QWERTY keyboard further embodies this effect. It didn't become the standard due to its efficient letter layout; in fact, it was designed to slow typists down. However, being the first design commercially introduced, people got accustomed to it. Attempts to change it later failed, not because it was the best, but because it came first.
Randomness Shapes Evolution
Many people mistakenly believe that evolution, whether in nature or business, is a steady, forward-moving process. However, this overlooks the significant impact of rare events on evolutionary trajectories.
Darwinian evolution highlights a species' ability to endure across extensive time frames. Yet, in shorter time spans, the evolutionary journey isn't always straightforward. At times, animals may develop unfavorable genetic mutations. While these might not be detrimental initially—sometimes even beneficial—they could pose threats in the long run. These mutations linger until a rare event exposes their inadequacy, potentially leading to the species' decline. Over generations, such "genetic noise" usually gets weeded out.
Similarly, traders might adopt strategies that thrive under specific favorable conditions but could spell disaster under others. Success in predictable market scenarios doesn’t guarantee the same outcome in tumultuous times. The traders who stand out are those whose strategies align well with the prevailing market conditions. For instance, buying during market dips works well if a rally follows, but if the market keeps tumbling, this strategy backfires. A case in point is the fate of traders who invested in foreign currencies in the 1980s versus those who did so a few years later. The former faced losses due to an overpriced US dollar, while the latter enjoyed profits as the dollar value dropped.
To sum up, luck might favor short-term survival, but long-term endurance leans towards those not solely reliant on fortune.
Randomness Guarantees That Wild Success Is Not Sustainable
The fleeting nature of success borne out of luck brings us to a critical realization: such success isn't worth the gamble. Eventually, luck wanes, and when it does, one might lose all they've gained. In the grand scheme of things, the frequency of success is inconsequential if a single failure can lead to devastating losses. Over time, securing modest success while shielding oneself from major setbacks is more beneficial than chasing monumental success, only to risk larger downfalls that could wipe out all gains.
Consider the hypothetical Russian roulette scenario discussed earlier. Out of six outcomes, five lead to immense wealth, but one results in death. Despite the higher probability of success, the catastrophic cost of failure makes this a dangerous way to amass wealth. Similarly, if your bets bring in money consistently but expose you to a loss that could ruin you, longevity is unattainable. The lesser-likely, yet detrimental event will occur eventually, nullifying all previous wins.
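Using the numbers from the hypothetical game (a fifty-million-dollar prize, one bullet in six chambers), a quick sketch shows why the attractive expected value of a single round is misleading once the game is repeated:

```python
p_survive = 5 / 6       # five empty chambers out of six
payout = 50_000_000     # the prize from the hypothetical game

# one round looks attractive: expected value is about $41.7 million
expected_one_round = p_survive * payout
print(f"expected value of one round: ${expected_one_round:,.0f}")

# but the chance of still being alive decays with every repeat
for rounds in (1, 5, 10, 20):
    p = p_survive ** rounds
    print(f"probability of surviving {rounds:2d} rounds: {p:.1%}")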
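Using the numbers from the hypothetical game (a fifty-million-dollar prize, one bullet in six chambers), a quick sketch shows why the attractive expected value of a single round is misleading once the game is repeated:

```python
p_survive = 5 / 6       # five empty chambers out of six
payout = 50_000_000     # the prize from the hypothetical game

# one round looks attractive: expected value is about $41.7 million
expected_one_round = p_survive * payout
print(f"expected value of one round: ${expected_one_round:,.0f}")

# but the chance of still being alive decays with every repeat
for rounds in (1, 5, 10, 20):
    p = p_survive ** rounds
    print(f"probability of surviving {rounds:2d} rounds: {p:.1%}")
```

By round ten, the survival probability has fallen to about 16 percent; by round twenty, below 3 percent. Repetition turns a favorable single bet into near-certain ruin, which is exactly why frequency of success is irrelevant when one failure is terminal.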
On the flip side, success attained without relying on luck is less susceptible to the unpredictable nature of randomness. Although the rewards may be humbler, the stability it offers is invaluable. For instance, pursuing success diligently and intelligently in a field like cardiology may not lead to immense wealth, but it promises a degree of consistent success. Unlike traders who might face ruin from bad trades, or waiters whose lottery win is a one-off, enduring success is often molded by one’s inherent qualities rather than an extended run of good luck.
Part 2: We Don’t Think About Probability Correctly
Now that we’ve explored some of the ways in which randomness affects success and failure, we’ll start to examine our difficulty in understanding and anticipating it. In Part 2, we’ll explore three concepts that reflect this difficulty:
- We don’t properly interpret the past.
- We can’t predict the future.
- We don’t insure against risk properly.
We Don’t Properly Interpret the Past
One reason we’re bad at assessing and preparing for risk and random events is that we are not good at learning from the past. We mistakenly believe that because something has never happened before, it can’t happen now. We then defend our lack of planning accordingly: “That had never happened before!”
A longer-term examination of history shows that rare events of all kinds do, indeed, happen. Unpredictability is built into the very definition of a rare event. History is littered with events that had never happened before; if the past kept surprising the people who lived through it, why shouldn't the future surprise us?
Even when we do remember a past rare event, we tend to falsely believe that we now understand the events that led up to it, and we therefore think we can “predict” it; that is, if it were to happen again, we wouldn’t be taken by surprise. We’d be more prepared for, and therefore less exposed to, any negative fallout from a similar rare event.
We also tend to falsely believe that mistakes of the past that led to these events have been resolved, making it even more unlikely that they would happen again. For example, people know that 1929 proved that stock markets can crash, but they often chalk that up to specific causes of that time. They believe, in other words, that the event is contained and non-repeatable.
We thus like to imagine that if we were to live through certain historical events, such as the stock market crash of 1929, we would recognize the signs and wouldn’t be taken by surprise in the way that people at the time were. This is the “hindsight bias,” otherwise known as the “I knew it all along” claim. However, seeing something clearly after the fact is much easier than seeing it clearly in real time.
In the same way, a manager taking over a trading department might do an analysis and find that only a small percentage of the trades made that past year were profitable. She might then point out that the solution is to simply make more of the profitable trades and fewer of the losers. Unfortunately, such a statement of the obvious doesn’t provide any usable guidance for future trading decisions.
We Can’t Understand the World Through Observations Alone
A major hurdle in grasping the risk of rare events is our tendency to focus on what actually happened in the past, overlooking what could have happened. This exposes a key flaw in inductive reasoning. Unlike deductive reasoning, which forms a theory first then seeks data for support, inductive reasoning starts with data observation, identifies patterns, and then forms a theory.
The drawback here is, relying solely on personal observations can cause us to miss out on rare events. Take swans for example. If you see thousands of them and they are all white, you might conclude that all swans are white. But black swans do exist, though they are rare. You might never see one unless you're in Australia, but their existence shatters your conclusion.
Observations can disprove a theory, but they aren't sufficient to prove one. A single counterexample can debunk a theory crafted from millions of observations. This concept extends to historical observations as well. You can use historical events to challenge a conclusion, but deriving a solid conclusion from the absence of certain historical events is faulty. For instance, before September 11, 2001, the idea of airplanes intentionally crashing into skyscrapers might have seemed impossible. Yet, the events of that day invalidated such an assumption.
Thus, basing your understanding of, say, the market solely on past observations is perilous. You may think, “The market never plummets 30 percent within a four-month span,” citing historical data as proof. However, “It has never gone down” and “It never goes down” convey vastly different meanings.
We Misinterpret the Lack of Evidence
A common error in inductive reasoning arises from mixing up “absence of evidence” with “evidence of absence.” This flawed thinking appears in various fields. For instance, consider a drug trial that reveals a 2 percent improvement. However, the researcher, thinking her sample size too small and the improvement too minor, states that she hasn't found evidence of significant benefits yet. Other doctors might misinterpret this as, “The medicine doesn't work.”
Similarly, viewing history and saying “this hasn’t happened” often turns into “this can’t happen,” a mindset that overlooks potential risks.
We Can’t Predict the Future
Not only do we incorrectly interpret the past, but we also have trouble using lessons of the past to predict future rare events—no matter how hard we try—for three primary reasons:
- Noise prevents clarity: Only the passage of time allows us to properly judge which pieces of information end up being consequential.
- Structures change: The structure of the past can be so different from the structure of the future that lessons learned from history might fit today’s world only clumsily.
- The future is changeable: If everyone could predict the future, the future would then change, and would once again become unpredictable.
These three reasons are explored further below.
Reason #1: Noise Prevents Clarity
People often believe that if they could have predicted past rare events, they can also predict current or future rare events. However, clarity on an event usually comes with time; real-time situations often have too much "noise" to discern what's important.
Here, "noise" refers to the bombardment of information from newspapers, television, online outlets, and more. It includes minute-to-minute stock fluctuations, daily market move explanations, and endless company analyses—many of which won't last a decade.
Most small market changes are likely random. Focusing on them can mislead you into thinking that minor things have bigger consequences than they do. Noise isn’t just useless; it can be harmful if it prompts bad decisions. For instance, selling a stock hastily due to small market changes, when holding onto it might have been wiser. Only time can filter out what's crucial from what's noise: it shows which changes were fleeting and which shifted history's course.
This is particularly true for stock prices. Getting caught up in the continuous price changes makes you focus on variance, not returns. It’s unhelpful, and negative movements may prompt premature actions. Consider an investor with a portfolio that has a 90 percent chance of increasing over a year. If she checks stock prices every minute, she might have 250 joyful minutes and 240 distressing minutes each day.
People tend to react more strongly to negative news. So, she’ll end each day drained, with 240 moments of doubting her strategy. However, if she checks her balance yearly, the 90 percent chance of increase implies good news 9 out of 10 years, leading to 9 joyful moments for every distressing one. Time would have filtered out the unhelpful noise.
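The only figure the text gives is the 90 percent yearly chance of a gain, so as a rough sketch we can assume normally distributed, independent returns calibrated to that figure (the trading-calendar constants below are also assumptions) and compute how the odds of seeing good news shrink with the checking interval:

```python
from math import erf, sqrt

def p_gain(horizon_years, annual_z=1.2816):
    """P(the portfolio is up over the horizon), assuming i.i.d. normal
    returns calibrated so the one-year probability of a gain is 90 percent
    (z = 1.2816 is the standard-normal quantile for 0.90)."""
    z = annual_z * sqrt(horizon_years)
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF at z

# assumed trading calendar: 250 days of 480 minutes each
minute = 1 / (250 * 480)
for label, t in [("1 minute", minute), ("1 day", 1 / 250),
                 ("1 month", 1 / 12), ("1 year", 1.0)]:
    print(f"chance of seeing a gain over {label}: {p_gain(t):.1%}")
```

Under these assumptions, a minute-by-minute check is barely better than a coin flip, while the yearly check delivers good news 90 percent of the time: the signal is the same, but the shorter the interval, the more of what you observe is noise.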
The same goes for world events. Daily news covers everything, important or not, while historians, with hindsight, can identify the transformative events.
Reason #2: Structures Change
When we study history to apply its lessons today, we often focus on a few specific past events. We then assume these events can guide us for the future, especially for similar types of situations. However, we may overlook the fundamental changes that continuously shape our world.
The way things operate changes so much that it challenges the value of studying history, except for learning to expect surprises. Lessons from bygone eras may not always suit today's scenarios. For instance, the Asian markets of the 1990s have evolved significantly due to shifts in the global economy, making old market strategies potentially outdated.
Moreover, the structural differences between past and future can blur the connections. We might only spot similarities between the two in retrospect, as real-time judgments are often too close to the situation. This holds true for linking specific past and future events, as well as understanding broader landscapes over time.
Reason #3: The Future Is Changeable
The future eludes us in another way, too: It is affected not just by outside influences but also by our own understanding of those influences. Our predictions of the future can themselves change it.
Take traders for example. If they notice the market consistently rises in March, they might buy stocks in February expecting a repeat performance. However, this collective action would shift the market rise from March to February.
Even if we could fully grasp the past, if everyone did, our predictions could become self-defeating. People preparing for a future event based on past patterns could prevent the event from unfolding as anticipated. The cycle of the future mirroring the past relies on unseen forces driving events in a similar fashion. Rare events like stock sell-offs occur because they are unexpected; if anticipated, people would brace for them, altering the scenario entirely.
We Don’t Insure Against Rare Events Properly
Misunderstanding past rare events often leads to underestimating the likelihood of future rare events, which in turn leads to inadequate risk planning. This is evident in people’s attitude towards insurance.
Many hesitate to buy insurance for highly unlikely events. If they do buy it and the event doesn't occur, they feel they wasted money. Here, people mix up "forecast" with "prophecy," criticizing someone for failing to predict accurately rather than for estimating risk correctly.
A clear example of this is seen in stock market risk warnings. For instance, journalist George Will once interviewed Robert Shiller, the author of Irrational Exuberance, a book about the stock market's mathematical randomness. Will argued that following Shiller's advice at one point about an overpriced market could have led to losses since the market rose instead. He failed to see that one incorrect prediction didn’t render Shiller’s overall caution invalid. It’s easy to spot incorrect predictions in hindsight.
It's akin to criticizing someone for avoiding Russian roulette, when, looking back, those who played got lucky and won big. In the long run, those who are vigilant about risks fare better as they are well-prepared for rare events.
The Dilemma of the Risk Manager
Predicting the future is tough, and dismissing past risks that didn’t materialize makes the job of risk managers at investment funds quite challenging. They are tasked with spotting potential catastrophic risks to investors’ portfolios. They must balance advising investors to steer clear of certain risks while handling the backlash if the risk doesn’t materialize, causing the investor to miss out on profits.
Caught between a rock and a hard place, many risk managers often just highlight potentially risky moves, without outright warning against them. As a result, they may end up giving a semblance of risk reduction rather than actually reducing risk. This scenario is known as an “epiphenomenon”: a blurry link between cause and effect, where merely monitoring risks is mistaken for minimizing them.
Part 3: How Our Human Nature Steers Us Wrong
In Part 1, we examined why and how rare events affect success and failure. In Part 2, we discussed some ways in which we fail to understand and anticipate randomness. We’ll now look at why we have such trouble comprehending randomness, and how our brain’s wiring makes it difficult for us to understand probability.
Overall, these reasons include:
- We are guided by our primitive brain, which likes simplicity and dislikes abstraction, and makes decisions emotionally.
- We don’t understand how probabilities work.
- We see meaning where there is none.
We Are Guided by Our Primitive Brain
The first hurdle in accurately anticipating randomness stems from the guidance of our ancient brain. Our brains have evolved with thinking shortcuts to enable swift and decisive reactions to threats, aiding our survival. These shortcuts help save time and mental energy; if we were to deliberate extensively on every interaction we encounter daily, we might either miss out on opportunities or fall prey to threats.
This system served us well throughout much of our history, when our lives were localized and simple. We could enhance our survival chances without needing to factor in rare events. However, in today's complex world, we need to assess probabilities, and our brains' shortcuts often lead us to accept many things without thorough contemplation. As a result, we find ourselves wielding primitive tools against modern challenges, and our perceptions of the world frequently stem from misunderstandings and biases we unknowingly harbor. Some of these are highlighted below.
Our Emotions Shape Our Choices
Neurologists point out that our brain has evolved into three broad sections: the primitive brain, the emotional brain, and the rational brain. While the rational brain serves as an advisor, it's the primitive and emotional parts that play the main role in decision-making.
This isn't necessarily negative. Though our thoughts can guide us, it's the feelings that help us choose one course of action over another, preventing us from getting stuck in endless rational deliberations. This phenomenon is evident in patients with brain trauma that impairs their emotional capability while leaving their intelligence intact. Such individuals struggle with decisions as simple as whether to get out of bed in the morning due to their overly rational mindset.
On the flip side, emotions can misguide us and lead to mistakes. Emotions can obscure our judgment, overriding rational thinking and misguiding our risk assessment, thus driving poor decisions. For instance, we might buy a stock from a company we adore and get emotionally attached to its prospects, even if it's not a financially savvy move.
Our emotions particularly mislead us because we tend to be more affected by negative events than positive ones. This bias also colors how we perceive market volatility: it's more alarming when associated with falling prices than with rising ones. Similarly, market volatility during adverse global events is perceived as worse than during peaceful times. For example, in the eighteen months leading up to September 11, 2001, the market saw more volatility than in the same span after the event, yet the latter period attracted more media attention. Consequently, people are inclined to act during stressful times, often making unwise strategic moves.
We Favor Simplicity
Our primitive and emotional brain segments have evolved to value speed when scanning our surroundings for potential threats. As a result, we shy away from complexity and gravitate towards simplicity. We find it easier to digest simple, straightforward concepts. Often, complex ideas make us uneasy, leading us to assume there might be misleading or incorrect information involved.
This tendency drives us to sidestep challenging concepts, even when they offer more insightful understandings. As a result, we often overlook the nuances of probabilities, which are not only challenging to grasp but also frequently counterintuitive.
For instance, a study exploring how medical professionals understand probabilities revealed surprising gaps in knowledge among those we'd expect to have a firm grasp on the concept. The doctors were posed with a question: If a disease affects one in 1,000 people in a certain population, and individuals are randomly tested with a test that has a 5 percent false positive rate and no false negatives, what's the likelihood that someone with a positive test result actually has the disease?
A majority of doctors responded with a 95 percent likelihood, aligning the answer with the test's accuracy rate. However, under these conditions, a person with a positive test result would only have a 2 percent chance of actually having the disease. (Out of 1,000 people tested, only one will have the disease, but an additional 50 will yield false positive results, resulting in a total of 51 positive tests with only 1 actual case of the disease. Therefore, the probability is one divided by 51, roughly 2 percent.) Fewer than one in five respondents got the answer right, as the correct answer is counterintuitive.
(Sidenote: This doesn't imply that people are frequently treated for diseases they don't have. This scenario overlooks the human aspect of testing: typically, individuals only get tested for a disease when they exhibit symptoms, which heightens the probability that a positive result is indicative of the disease.
Nonetheless, the mathematical principle holds true in real-world scenarios concerning diseases that are rare but for which asymptomatic individuals are regularly tested—like breast cancer, for instance. Mammograms tend to have a relatively high false positive rate, yet a vast majority of those testing positive do not have the disease. Additional testing helps filter out these false alarms.)
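The arithmetic behind the doctors' question can be sketched in a few lines of Python. The numbers are those given in the study question above; the variable names are mine:

```python
# Sketch of the doctors' test question: a disease affecting 1 in 1,000,
# a test with a 5 percent false positive rate and no false negatives.
population = 1000
true_positives = population * (1 / 1000)                # 1 actual case
false_positives = (population - true_positives) * 0.05  # roughly 50 false alarms

# Probability that a positive result indicates actual disease:
p_disease_given_positive = true_positives / (true_positives + false_positives)
print(round(p_disease_given_positive, 2))  # 0.02: about 2 percent, not 95
```

Note that this computes 50 false positives from the 999 healthy people exactly (49.95), where the text rounds to 50; either way the answer is roughly 2 percent.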
We Are Drawn to Surprises
Our primitive and emotional brain regions are particularly attuned to surprises, paying more heed to them than to ordinary occurrences. We tend to assign more importance to startling events, even if they're not truly significant, and believe that events which come to mind easily are more likely to happen. This leads us to overestimate the likelihood of rare events while overlooking the risks associated with common ones.
This bias is evident in media coverage, which often highlights unusual yet less threatening news while glossing over more common and potentially dangerous issues. For instance, in the 1990s, the media extensively covered mad cow disease, even though it only resulted in a few hundred deaths over a decade. In reality, one had a higher chance of dying in a car accident en route to a restaurant than from consuming contaminated meat at the venue. However, due to the disproportionate media attention, people feared the (rare) threat of mad cow disease more than the more probable risks they faced daily.
We Dislike Abstraction
For most of human history, people faced tangible threats rather than theoretical probabilities, so our brains evolved to understand concrete ideas better than abstract ones. As a consequence, we have trouble assessing the risks of abstract circumstances. Studies have shown that when presented with two sets of risks, people are more concerned about the one that describes specific threats, even when the more general description would also include those specific threats.
For example, travelers are more likely to insure against death from a terrorist attack during their trip than against death from any cause (which includes, but does not specify, terrorism). In another example, a study found that people judged an earthquake in California to be more likely than an earthquake in North America (which, again, includes but does not specify California).
We Don’t Understand How Probabilities Work
The second reason we fail to anticipate randomness is that we don’t inherently understand how probabilities—the likelihood of rare events—work. Researchers have found that people have a lot of difficulty comprehending concepts that feel counterintuitive. Mathematical probabilities and random outcomes frequently fall into this category.
Our inability to correctly judge probabilities shows up in many ways, including the following:
- We underestimate the likelihood of a rare event.
- We don’t understand how probabilities compound.
- We misunderstand sample size.
- We don’t understand skewness: the unevenness of randomness.
- We misunderstand how probabilities change over time.
These ideas are explored further in the sections below.
We Underestimate the Likelihood of Rare Events
Rare events don't occur often, which can lead us to think they're even less likely than they actually are. So, when they do happen, we find ourselves more surprised than we should be.
This misunderstanding arises because we tend to assess the probability of rare events based on them happening in a very specific way—like envisioning a particular type of market correction within a certain timeframe. However, this narrow view makes us overlook the likelihood of any unexpected event occurring in any manner—for instance, any kind of market event over the next ten years.
Here's an example: Your chance of sharing a birthday with a random person you meet is one in 365. In a group of twenty-two other people, the chance that you'd share a birthday with any one of them is 22 in 365—a still relatively small chance. Yet, the probability that any two people in that group share any birthday is around fifty percent. This broader scenario isn't focused on one person and one date, so the odds are higher. But discovering such a shared birthday feels like an extraordinarily unlikely event, often leading to exclamations like, "What a small world!"
Similarly, your chance of winning the next lottery might be one in 25 million. However, the chance of someone winning is a sure thing: one hundred percent.
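The birthday arithmetic above can be checked with a short sketch (function name is mine). The chance that no one in a group shares a birthday is the product of each new person avoiding every birthday already taken; one minus that is the chance of any match:

```python
# Probability that at least two people in a group share a birthday,
# assuming 365 equally likely birthdays.
def p_any_shared_birthday(group_size):
    p_no_match = 1.0
    for i in range(group_size):
        p_no_match *= (365 - i) / 365
    return 1 - p_no_match

# You versus 22 specific others: still a small chance.
print(1 - (364 / 365) ** 22)      # roughly 0.059, close to the 22-in-365 figure
# Any two of 23 people: roughly a coin flip.
print(p_any_shared_birthday(23))  # roughly 0.507
```

The narrow question (your birthday against 22 others) stays near 6 percent, while the broad question (any pair among 23) crosses 50 percent, which is why the coincidence feels so improbable when it happens.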
We Don’t Understand How Probabilities Compound
On the flip side, we sometimes overestimate the likelihood of rare events when they are linked to another probability, leading us to brace for risks with extremely slim chances of occurring. When two or more probabilities intertwine, we multiply the individual probabilities. For instance, the chance of you getting diagnosed with a specific rare disease in any given year might be one in 1,000,000. Similarly, the chance of you being in a plane crash in any given year might also be one in 1,000,000. Now, the probability of both these events happening in the same year is found by multiplying the odds of either happening independently: one in 1,000,000,000,000.
While this is straightforward in theory, in practice, we often fixate only on the individual probabilities, overlooking their compounded likelihood. This difficulty in grasping compound probabilities was showcased during the O.J. Simpson trial in 1995.
Simpson's lawyers argued that the DNA evidence was irrelevant because as many as four other individuals in Los Angeles might share his DNA characteristics. While technically possible, the already slim chance that the blood evidence pointed to someone else (about 1 in 500,000), compounded with the fact that he was the victim's spouse and with the other supporting evidence, pushes the likelihood of his innocence into the one-in-several-trillion range. Despite this near-impossible chance of innocence, the jury focused on the individual probabilities, like the blood evidence alone, and acquitted him.
We Misunderstand Sample Size
Often, we misjudge situations, risks, and chances because we assess them from too limited a sample. We tend to learn from a few instances and extend those lessons to broader scenarios, which usually leads to misguided strategies.
Self-help books claiming to unveil the secrets of millionaires frequently fall into this trap. They analyze a small group of millionaires, identify common traits among them, and announce that adopting these traits will lead to wealth. Their sampling issue is twofold: Firstly, they only examine a small number of millionaires, and secondly, they ignore a larger group of people who possess these traits but aren’t wealthy.
For instance, one bestselling book suggests accumulating investments since the studied millionaires do so. However, the small sample size doesn’t prove that all millionaires accumulate investments. Moreover, many non-millionaires collect investments but in the wrong assets: stocks of soon-to-fail companies or devaluing foreign currencies.
These books might also focus on a narrow time span, like the 1980s and 1990s, when the average stock soared almost twenty-fold. Those who invested during this era were more likely to get wealthy, not due to skill, but timing. Lessons from this period are nearly worthless; the best advice from this sample would be, “purchase a time machine and invest in the late twentieth century.”
Similarly, if a trader makes a single correct prediction, people might expect her to be right again in the future. Conversely, one wrong prediction can tarnish her reputation as a skilled trader. This becomes evident when journalists ask successful traders to forecast the market on any given day. A single incorrect prediction can lead journalists to cast doubt on a trader's entire career, even though they're basing their judgment on a minuscule sample of her predictions.
Often, this sampling error occurs because we focus on the data that is immediately available or visible to us. For example, a lawyer earning $500,000 annually might out-earn 99 percent of Americans, but if she resides in a posh neighborhood in a pricey city among multimillionaires, she might feel less successful. This skewed perception arises not just from the limited sample size of comparisons, but also from the self-selecting nature of the sample: Only the extraordinarily wealthy can afford to live in her vicinity, so she only observes the ultra-successful individuals in her sample.
We Don’t Understand Skewness
When assessing the merit of a strategy, people often focus on the likelihood of winning, overlooking the more crucial aspect of how much they might win or lose. This disparity in outcomes is termed “skewness,” depicting a scenario where there’s a high chance of small gains but a low chance of large losses.
Individuals may prefer a strategy with frequent small wins, despite the possibility of a rare large loss negating all those gains. The thrill of winning drives them to focus on the regularity of small wins and the rarity of losses, rather than aiming to maximize the overall outcome. However, commonplace events matter less compared to rare events since the impact of rare events can be significantly larger.
For instance, a trader might feel elated if her portfolio increases by 1,000 dollars for eleven consecutive months but then plummets by 15,000 dollars in the twelfth month. She may overlook the annual result (a loss of 4,000 dollars) and fixate on the eleven profitable months.
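The trader's year from the example above can be sketched out to show how a high win rate coexists with a losing outcome:

```python
# Eleven small monthly gains followed by one large loss.
monthly_results = [1_000] * 11 + [-15_000]

annual_result = sum(monthly_results)
winning_months = sum(1 for r in monthly_results if r > 0)

print(winning_months)  # 11 of 12 months were profitable...
print(annual_result)   # -4000: ...yet the year is a net loss
```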
On the flip side, someone who earns substantial amounts sporadically, by predicting rare events (often negative ones), tends to amass wealth more than someone who earns modestly but steadily, ignoring rare events. Such traders are labeled “crisis hunters.” They might incur losses frequently, albeit small ones. When they profit, it’s infrequent but sizable. This tactic extends beyond trading too. For example, TV production companies and book publishers churn out a vast amount of work, expected to be marginally profitable or slightly unprofitable, while banking on the occasional blockbuster.
In daily life, apart from business pursuits, we often assess risks with high skewness more accurately as they are less abstract. For example, when packing for a week-long mountain trip with expected temperatures of 65 degrees but potential swings of 30 degrees either way, you would prepare for both extremes alongside the expected weather. You'd pack light and warm clothing, readying for the risk in both directions.
Similarly, investing should embody this mindset. Prepare for the most probable scenarios while also planning for deviations.
We Overlook the Value of Options
Our inability to grasp skewness manifests in the common reluctance among traders to buy options. Purchasing an option allows a trader to pay a small fee for the chance to buy a stock at a specified price before a certain date. If the option isn't exercised by the deadline, the trader doesn’t buy the stock, losing only the money spent on the option.
For instance, suppose stock XYZ is trading at 39 dollars. A trader spends 1 dollar on an option to buy XYZ’s stock at 40 dollars by month's end. If the stock climbs to 50 dollars, the trader can exercise the option, buying the stock at 40 dollars and immediately selling it at 50 dollars. Having spent 1 dollar on the option, the trader enjoys a net profit of 9 dollars. However, if the stock price doesn’t ascend, the trader won’t exercise the option, allowing it to expire, and incurs a loss of 1 dollar.
The dislike for losing money, even in small sums, often leads to undervaluing options. Regular losses of 1 dollar from expiring options can be offset by an occasional gain of 9 dollars, potentially resulting in an overall profit. However, the emotional sting of frequent small losses deters many from taking this gamble. Hence, emotions obstruct them from engaging in a potentially profitable trade.
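The XYZ option's payoff can be sketched as a small function (the function and its parameters are mine; this ignores real-world frictions like transaction costs):

```python
# Payoff of the XYZ call option example: a 40-dollar strike bought for 1 dollar.
def option_profit(stock_price, strike=40, premium=1):
    # Exercise only if the stock trades above the strike; otherwise
    # let the option expire and lose just the premium.
    if stock_price > strike:
        return (stock_price - strike) - premium
    return -premium

print(option_profit(50))  # 9: buy at 40, sell at 50, minus the 1-dollar fee
print(option_profit(39))  # -1: the option expires worthless
```

The payoff is skewed by construction: losses are capped at the premium, while gains are open-ended, which is exactly the asymmetry the frequent small losses obscure.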
We Don’t Understand Mean and Median
Our struggle with skewness extends to misunderstanding the concepts of "mean" and "median," often leading to a misjudgment of real-life risks.
The mean represents the average of a set of numbers. However, it's commonly mistaken for the middle number. A single outlier in a set can significantly skew the mean, leaving most of the other numbers above or below it. For instance, if nine people each have an income of one hundred dollars and one person earns a thousand dollars, the mean income among them is 190 dollars, and ninety percent of the individuals earn less than the mean.
Similarly, if you hear that the average investor makes one thousand dollars a month with a particular fund, it's crucial to dig deeper to understand how this figure is calculated. If most people are losing money while a few reap substantial gains, you might choose to avoid investing in that fund.
On the other hand, the median refers to the middle number in a set of values but does not reflect the range of these numbers. In the set "1, 3, and 47,000," the median is 3. If you're evaluating risk based on the median, you might overlook the significant outlier of 47,000.
For example, a cancer diagnosis might come with a prognosis of six more months to live, based on the median survival rate for that cancer type. Upon deeper analysis, this median figure could reveal that while half the patients pass away within six months, the other half live for many more years. Understanding this discrepancy could significantly influence your end-of-life decisions.
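Both pitfalls can be sketched with the numbers from the examples above (variable names are mine):

```python
# The mean hides its skew: one outlier drags it above 90 percent of earners.
incomes = [100] * 9 + [1000]
mean_income = sum(incomes) / len(incomes)
print(mean_income)  # 190.0: ninety percent of earners fall below the mean

# The median hides its range: it says nothing about the extreme value.
values = [1, 3, 47000]
median_value = sorted(values)[len(values) // 2]
print(median_value)  # 3: the 47,000 outlier is invisible
```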
We Don’t Allow Enough Time for Rare Events to Happen
Skewness also manifests in the nonlinear (uneven) nature of randomness, where minor factors can have major impacts. For instance, a single grain of sand can cause a sandcastle to crumble.
This idea is closely related to chaos theory, which describes how a slight alteration can trigger a disproportionate effect, like a population explosion stemming from a minor change at an initial stage.
Our minds are not naturally equipped to grasp nonlinearity, either in thought or emotion. We usually assume that if one variable increases, the related variable will increase proportionally. However, that's not always true. For instance, you might practice an instrument for a long while without noticeable improvement, but eventually, something clicks and your skill drastically improves. If you'd quit out of frustration earlier, you'd miss out on this nonlinear progression.
Success often follows such nonlinear paths, but many lack the patience to endure slow phases. Positive rare events favor those who wait for them.
We Misunderstand How Probabilities Change With Time
People often struggle with understanding probabilities because they may hold true at one moment but not the next. Over time, a probability may shift from being unconditional (true in general) to conditional (true under specific circumstances).
Take, for example, a financial planner who says that since the average life expectancy of an American is 75 years, a 69-year-old should plan for 6 more years. This common misinterpretation confuses life expectancy at birth (unconditional) with life expectancy as one ages (conditional). A life expectancy of 75 at birth accounts for those who pass away at ages 5, 20, 60, and so on. However, if you’ve made it to 69, your life expectancy might now be an additional 10 years. And if you reach 79, your life expectancy might extend by another 5 years.
We See Meaning Where There Is None
A third obstacle in recognizing randomness is our inherent tendency to overlook it and seek out patterns or meanings, even when none exist. This trait of seeking meaning is hardwired within us, an evolutionary tool for survival. However, it misguides us when we find meaning in randomness and let that perceived meaning drive our decisions. Specifically, we often misinterpret:
- Empty but intelligent-sounding verbiage
- Random events
- Random noise
All three mistakes can lead us to make poor decisions and are further explored below.
We See Meaning in Empty but Intelligent-Sounding Verbiage
Fancy phrasing can make people think a piece of communication is significant when it in fact is nonsense. Feed a sentence-generating computer program phrases like “shareholder value,” “position in the market,” and “committed to customer satisfaction,” and you’ll get a paragraph that sounds like it has meaning but doesn’t. This kind of language can be found frequently in corporate and investment fund communications.
When you have experts in a field using industry-specific buzzwords and convoluted sentences, they project an aura of expertise that is often unwarranted. All too often, people buy into this aura, don’t properly question the advisors’ policies, and end up losing money on poor investments.
The experiences of some investors in the late 1990s illustrate this. Economists at the International Monetary Fund (IMF) at the time misunderstood the true risk of default by the Russian government but sounded like they knew what they were talking about. Many emerging-market traders invested deeply in Russian Principal Bonds on the advice of these IMF experts and lost hundreds of millions of dollars each.
We Try to Find Meaning in Random Events
We often seek meaning in random happenings, attributing rational explanations to luck-driven scenarios. For example, if a trader earns a windfall, people might credit her intelligence or market insight. If she later faces losses and exits the trade, people will again search for reasons, perhaps blaming a lax work ethic.
(It’s notable that we usually see our misfortunes differently compared to others'. We may attribute others' failures to a lack of skill, while blaming our own on bad luck. This is known as attribution bias, the flawed logic we apply to interpret our and others’ actions.)
Given enough data, it's likely you'll stumble upon correlations that seem meaningful, even when they're not. This leads to the confusion between correlation and causation. Correlation occurs when two events coincide due to chance or a hidden common cause; causation happens when one event triggers the other.
The concept of “data mining” showcases this confusion. It involves scouring large data sets for patterns. While crucial in fields like insurance and healthcare, it can be misused to force meaning onto meaningless data. If you dig deep enough, you can find a pattern in any random series of events.
For instance, certain bestsellers claim to find irregularities in the Bible, asserting these as evidence of biblical “predictions” of already occurred events. They retrospectively match these events to their so-called “predictions,” enhancing the illusion of foresight.
This error also manifests in investment strategies. An investor might sift through a database of historical stock prices, testing various rules to find one that fits. They may hope to discover a “magic” rule for predicting future price shifts. Yet, randomness affects rules just as it does events: given enough rules and a large data set, some correlations will appear. However, a rule explaining past events doesn't necessarily forecast future ones.
For instance, a trader might test the outcome of buying stocks that closed 2 percent higher than their price the previous week. If this rule falls short, she might try 1.8 percent as a threshold, and so on, until she finds a condition yielding a specific result. Yet, applying this finding to future trades seldom yields the same outcome.
Interestingly, any random data set will show some patterns. If it didn't, and was genuinely “random,” it would seem fabricated. Picture a painting of a night sky. For realism, stars must cluster in ways suggesting constellations. A sky with evenly spaced stars, though more “meaningless,” would look unmistakably unnatural.
We See Meaning in Random Noise
Similarly, we often try to find meaning in random fluctuations. Small shifts typically don't warrant analysis; they're probably random, lacking clear causes. Yet, people often dive into dissecting them for significance. For instance, various factors can influence stock market movements. But listening to a pundit dissect minor shifts is often a futile exercise.
Consider watching a marathon where one runner finishes a second ahead of another. Such a minuscule difference hardly calls for scrutiny. It’s unlikely to stem from a significant difference in diet or training; rather, a random gust of wind during the run might be the culprit.
Moreover, pinning down a single cause for a minor event can be challenging due to numerous potential influences. For example, the dollar's value may sway against the euro, the euro against the yen, the market against interest rates, interest rates against inflation, and inflation against OPEC decisions, among others. Isolating one cause amid these myriad influences, especially to explain minor, frequent market shifts, is an unachievable task.
Part 4: Applying the Understanding of Randomness
With a deeper grasp of the significant role randomness plays in success, and of the common misinterpretations surrounding it, it's time to delve into how you can navigate risk with these insights in mind. We'll look at the traits of individuals who often misjudge risk and discuss actionable steps you can take to mindfully prepare for potential risks.
How Our Misguided Beliefs Lead to Failure
Individuals in domains heavily influenced by randomness, notably traders but also those in economics or politics, often exhibit certain traits that contribute to their misjudgment of risk, leading to ill-advised decisions:
- Overconfidence in their convictions: They fail to recognize that their past successes might be a result of mere chance. The analysis of past events tends to overshadow the randomness involved. Traders who face severe losses often believed their expertise rendered failure impossible. When they take risks under such false pretenses, it's not bravery, but ignorance propelling them forward.
- Lack of foresight: They don't envision scenarios where their strategy could falter and tend to react impulsively under stress. For instance, a market dip might trigger frantic buying of more stocks, driven by panic rather than a well-thought-out strategy.
- Inconsistency or over-persistence in strategy: At times, traders oscillate between short-term and long-term strategies, swayed by insignificant market 'noise' they should have disregarded. Conversely, they might overlook crucial market signals, holding on to sinking assets far too long instead of minimizing losses.
- Over-simplification: Accustomed to distilling arguments to their bare essentials, they find themselves ill-equipped to navigate the intricate web of randomness and probabilities ruling the market. While simplicity suffices in handling various business affairs like managing product logistics, it falls short in dealing with mathematically complex scenarios that defy reduction.
How to Avoid Failure in a Random World
To steer clear of the misconceptions mentioned earlier and to better evade failure, it's crucial to remember that success rooted in diligence and astute decisions is more enduring compared to success built on luck. Hence, steer clear of chasing success that's heavily dependent on luck. Be mindful of how your instincts and emotions can obscure the unpredictable nature of life. Brace yourself for risks and avoid situations that could lead to unbearable losses. Below are some suggestions on how to foster this resilient mindset.
Anticipate Your Emotional Reactions
Reacting rationally can be challenging when we have inherent biases influencing us, even when we're aware of the better choice. We often know how we're supposed to behave, but impulsiveness takes over. The issue isn’t a lack of knowledge; it's in the execution.
Being humans, we are naturally inclined to respond emotionally to different stimuli. One effective method to prevent emotions from muddling our judgment is by avoiding the triggers that spark them. For instance, if you struggle with controlling your chocolate cravings, a wise strategy would be to not store it in your desk. Similarly, minimize your exposure to market triggers.
To dodge unnecessary emotional reactions to minor fluctuations, lessen your exposure to the constant stream of information coming your way: Avoid checking market reports, watching television, and scrolling through online finance analyses. Shield yourself from minor shifts that might emotionally hook you, leading to hasty decisions.
Set specific criteria for your data, so you only get alerted when the market experiences relatively large swings, with the magnitude defined by you beforehand. (Remember, the significance of changes isn’t linear. A 2 percent shift isn't merely double a 1 percent shift; it's more like 4 to 10 times, depending on the specific figures. Set your significance threshold with this in mind.)
Essentially, sift through the noise similarly to how an engineer filters out background static during a phone call. A phone's technology identifies sounds with minor amplitude variations, labels them as noise, and filters them out, ensuring only sounds with notable amplitude changes reach the listener. Without such technological aid, our ears struggle to discern voices amidst the noise. Likewise, our brains find it difficult to filter out important shifts from noise, especially while being swamped with "expert" advice.
Don’t Think Like a Bull or a Bear
Instead of fixating on whether the market will climb (as in a bullish market) or decline (as in a bearish one), focus more on the effects either movement could have on your outcomes. While accurately predicting small market movements could net you minor gains, protecting against significant drops or surges could be far more rewarding.
Strive to profit from rare events rather than from slow and steady ventures; rare events—be they positive or negative—usually offer more profitability over time. Adopt a crisis hunter mindset: stay vigilant for uncommon opportunities or unexpected threats, and devise plans to gain from either scenario. Crisis hunters are the ones betting against the market anticipating a major fall, or purchasing options hoping for an unlikely price surge. Even if these scenarios are less likely compared to market stability, they could provide much larger returns and are therefore worth the effort.
For example, suppose you estimate an 80 percent chance that the market will rise, but only by 2 percent, and a 20 percent chance that it will fall, but by 20 percent. You'd be better off betting against the market and expecting the fall, as that strategy yields the higher expected return.
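The arithmetic behind this asymmetric bet can be sketched directly, using the 80/20 figures from the example (variable names are mine):

```python
# Expected market move: a rise is likely, but the rare fall is ten
# times larger, so the expected move is negative.
p_up, move_up = 0.80, 0.02       # 80 percent chance of a 2 percent rise
p_down, move_down = 0.20, -0.20  # 20 percent chance of a 20 percent fall

expected_move = p_up * move_up + p_down * move_down
print(expected_move)  # about -0.024: the downside bet has the edge
```

Even though "the market will rise" is the right call four times out of five, the expected move is a 2.4 percent loss, so the profitable position is the unlikely-sounding one.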
Be Aware of Your Superstitions
Superstitions are ingrained in our human nature: we naturally seek links between unrelated items, events, or patterns, and are often more inclined to accept a hypothesis rather than reject it. So, if someone suggests that our success could be tied to what we had for breakfast or a quirky market strategy we employed, we tend to give it some thought instead of dismissing it right away.
While you might associate superstitions with age-old cultural myths, most people carry a handful of superstitions that lead them to see cause-and-effect relationships between unrelated things—like believing a particular pair of shoes brings good trading luck. These quirks are sometimes dubbed "gambler’s ticks."
Steer clear of letting such baseless superstitions guide you. Don't hunt for links where there aren't any—embrace the randomness inherent in your profession and avoid giving significance to trivial matters. Reflect on your fondness for your 'lucky' pair of shoes or a peculiar strategy, realizing they hold no real fortune.
Don’t Stay Wedded to Your Positions
Successful traders often share a trait—they are willing to change their positions swiftly. George Soros is famous for this; he might express bearish views about the market and then suddenly make bullish buys.
We mentioned “path dependence” earlier while discussing how initial success influences future success. A quick reminder, path dependence implies that a sequence of events unfolds based on an initial, significant event. Practically, it suggests that we are often guided by our previous decisions, like choosing a particular trading stance, unless we decide otherwise.
This tendency likely evolved to maintain a stable society rather than having one where people change their spouse, job, and life choices daily. However, on the flip side, it can hinder individuals from reconsidering their emotional commitment to a decision, even when evidence suggests they should.
This issue is not limited to investing but prevails in other fields too. For instance, it’s rare for a scholar to suddenly contradict all their previous research. Typically, a scholar “defends” her thesis; it would be unusual for her to change her stance during her doctoral defense.
In trading, this manifests when traders keep doubling down on losing positions, insisting their strategy will eventually prevail. This often leads to significant losses and sometimes an exit from the industry.
Rather than acknowledging their mistakes and learning from them, such traders blame market forces. Admitting that all the beliefs and efforts up till now were incorrect is tough. However, a wiser approach is to recognize that errors happen, learn from them, and integrate those lessons into future decisions. Once you embrace the reality of making mistakes, you'll find it easier to alter your position when indications suggest it's misaligned.
Some Final Notes on Randomness
Sometimes randomness is an acceptable—even desirable—part of the human experience. Outside of survival matters, like making a living through investing or other means, randomness can enrich our lives. For example, poetry often comprises random-sounding phrases, and a computer-generated poem can sound just as lyrical as one created by a human. However, in the business world, minimizing or sometimes harnessing randomness is key to achieving lasting success.
We can't control the random events that occur in our lives. The best we can do is plan for them and react appropriately when they inevitably happen. Expect the risks you can predict, and leave room in your plans for those you can't. No matter how much you plan, always remember that randomness is around the corner. Bear this in mind and react with dignity to the rare events that inevitably hit you:
- Greet misfortune with stoicism: a combination of wisdom and courage.
- Face the future with an optimistic yet accepting attitude.
- Don't act like a victim. Don’t show self-pity.
- Be courteous to those below you; be forgiving of those who contributed to your misfortune.
- Remember that the only thing luck does not control is your attitude.