Cognitive Biases: Rethink What You Know

Written By: Vidya Sinha

In the age of readily available information, we are constantly inundated with stimuli. The interpretation of media presents a slew of challenges that our hunter-gatherer brains are not equipped to handle, and in our attempt to parse complex information, we inadvertently fall prey to unsubstantiated views about the world. In this article of Science ReWired, we will explore some common cognitive biases that impair human judgment, and consider how a measure of self-awareness can help counteract them.

What Is a Heuristic?

There are two modes of reasoning: deductive and inductive. Deductive reasoning applies general truths to a specific situation through meticulous, step-by-step analysis to arrive at a logically certain conclusion, while inductive reasoning generalizes from particular observations and past experience to arrive at a probable one. In the ancestral landscape, we were often forced to lean on induction; if an unidentified animal is sprinting toward you at sixty miles per hour, it is unwise to pause and deduce its predatory status from first principles, and it is much more pragmatic to quickly assume that the animal is dangerous based on your past experiences with carnivorous animals. This assumption prompts you to flee, maximizing your chances of survival. In this scenario, your survival depends on the use of a heuristic, or mental shortcut, to instantaneously categorize the animal as a predator.

Although the vast majority of human beings no longer face such life-threatening challenges in their daily lives, our residual instincts still govern our behavior in unmistakable ways, influencing how we interact with the modern world. While heuristics serve a crucial role in informing decisions, they remain a double-edged sword that can be dangerous when misapplied. Once a tactic for escaping ravenous tigers and avoiding food poisoning from a neon-pink mushroom, heuristics now fuel much of our confusion and many of our qualms about the modern world, manifesting in voter apathy and late-night internet skirmishes, destabilizing relationships, and tainting our perceptions of reality.

Correspondence Bias

Have you ever had a fraught encounter with a stranger online and thought to yourself, “That was such a nasty person!”? Did you stop to consider the possibility that they were facing subpar circumstances that day? Did you ponder the reasons behind their nastiness? Chances are, you simply concluded that they were a nasty person. This is a typical example of the correspondence bias, defined by psychologist Daniel T. Gilbert as “the tendency to draw inferences about a person’s unique and enduring dispositions from behaviors that can be entirely explained by the situations in which they occur.”

In combating correspondence bias, the goal is not to absolve a person of a misdeed, but rather to acknowledge that a single action may not indicate a person’s long-term behavioral trajectory. To reiterate in plain English: people have bad days. Moreover, it is irrational to generalize from one example of a person’s behavior, unless the action perpetrated lies at a moral extreme, such as murder.

Correspondence bias is fairly common-sensical; readers with above-average empathy may already be shaking their heads, convinced that they are unaffected by cognitive biases in general. (Which, in fact, is an example of overgeneralization! I caught you!) But have you ever heard of the availability heuristic?

The Availability Heuristic 

The availability heuristic permeates our daily lives in a much more covert fashion than the correspondence bias. Have you ever been afflicted with a sense of prophetic doom regarding the future of the world, wholly convinced that Western politics are incorrigible, human violence is at its zenith, and going to the beach is a surefire way to get your two-year-old mauled by the shark from “Jaws”? Bingo. That’s the availability heuristic at work!

The availability heuristic is the formation of judgements based on readily available information. Unfortunately, what is readily available, especially the information delivered by news outlets, is often curated to specific standards: news must be newsworthy, and therefore a plurality of media stories will be shocking enough to incite a reaction in the reader. Because of this filtration, our stored information about an issue may be tilted toward extremes, leading to irrational fears and ill-founded cynicism.

The truth is that societies undergo natural highs and lows, and that incorrigibility is seldom the case. The truth is that violence is rarer today than at any other point in recorded human history. The truth is that malfunctioning toasters account for more annual fatalities than sharks. These truths are not inherently strange, but the evidence we hold at hand makes each of these claims feel counterintuitive. This is the subtle danger of the availability heuristic.
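
To see the distortion in miniature, consider the small Python sketch below. The fatality counts and coverage weights are invented for illustration (they are not real statistics); the sketch simply contrasts a reader who judges danger by the share of headlines each event occupies, as the availability heuristic would, with the events’ actual shares.

    import random

    # Hypothetical yearly fatality counts and news-coverage weights,
    # invented purely for illustration; not real statistics.
    TRUE_RATE = {"shark attack": 5, "toaster accident": 300}
    COVERAGE_WEIGHT = {"shark attack": 300, "toaster accident": 1}

    def headlines(n=10_000):
        """Simulate the stories a reader encounters: each event's chance of
        being covered is its true rate times its shock value."""
        events = list(TRUE_RATE)
        weights = [TRUE_RATE[e] * COVERAGE_WEIGHT[e] for e in events]
        return random.choices(events, weights=weights, k=n)

    def availability_estimate(seen):
        """Judge relative danger the way the availability heuristic does:
        by how large a share of remembered headlines each event occupies."""
        return {e: seen.count(e) / len(seen) for e in TRUE_RATE}

    if __name__ == "__main__":
        seen = headlines()
        total = sum(TRUE_RATE.values())
        for event, perceived in availability_estimate(seen).items():
            actual = TRUE_RATE[event] / total
            print(f"{event:16}  perceived share {perceived:5.1%}  actual share {actual:5.1%}")

Because the rare-but-dramatic shark attack is covered hundreds of times more eagerly than the mundane toaster accident, it dominates the reader’s perceived risk despite accounting for a small fraction of the actual harm.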

Scope Insensitivity

Another curious fact about human psychology is that our concern for an issue does not increase linearly with the number of people affected. In fact, our concern for each individual life diminishes as the number of affected individuals grows. This phenomenon is referred to as scope insensitivity. It runs completely counter to utilitarian ethics: our brains cannot intuitively process large numbers, so once the number of human beings involved exceeds a certain threshold, our moral judgements stop tracking the amount of good there is to maximize. This is incredibly important at the level of national decision-making, as it can shed light on indifference to large-scale problems. By becoming aware of scope insensitivity, we can begin to reconsider our perspectives on global issues, as well as the prosaic concerns of our daily lives.
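
As a rough, hypothetical illustration of that non-linearity, the Python sketch below contrasts what strictly utilitarian arithmetic would predict with a scope-insensitive response modeled as logarithmic. The functional form and numbers are assumptions chosen to make the pattern visible, not measurements from the cited research.

    import math

    def linear_concern(lives, value_per_life=1.0):
        """What strictly utilitarian arithmetic would predict:
        concern grows one-for-one with the number of lives at stake."""
        return value_per_life * lives

    def scope_insensitive_concern(lives, scale=10.0):
        """A scope-insensitive response, modeled here (as an assumption)
        as logarithmic: each tenfold jump in lives adds only a fixed bump."""
        return scale * math.log10(lives + 1)

    if __name__ == "__main__":
        for lives in (1, 10, 1_000, 100_000, 10_000_000):
            print(f"{lives:>10,} lives   linear: {linear_concern(lives):>12,.0f}"
                  f"   scope-insensitive: {scope_insensitive_concern(lives):6.1f}")

In the linear column, ten million lives register ten million times more strongly than one; in the scope-insensitive column, the difference barely moves the needle, which is roughly how our intuitions behave.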

Conclusion

This is by no means an exhaustive list of cognitive biases. Behavioral science is replete with equally fascinating examples of subconscious reasoning leading subjects astray. However, this is not an indication to abandon quick judgment and trust only careful, deliberate reasoning; the subconscious mind is inextricably tied to everything we do, and it is our responsibility to harness it effectively. Knowledge of our minds is the corrective lens through which we can begin to see the world for what it is.

Works Cited

Dickert, S., Västfjäll, D., Kleber, J., & Slovic, P. (2015). Scope insensitivity: The limits of intuitive valuation of human lives in public policy. Journal of Applied Research in Memory and Cognition, 4(3), 248–255. https://doi.org/10.1016/j.jarmac.2014.09.002

Gilbert, D. T., & Malone, P. S. (1995). The correspondence bias. Psychological Bulletin, 117(1), 21–38. https://doi.org/10.1037/0033-2909.117.1.21

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232. https://doi.org/10.1016/0010-0285(73)90033-9
