Cognitive Bias Field Manual

The Cognitive Bias Field Manual reveals where we can, and cannot, trust our intuitions when we try to form accurate judgements in our lives.


What’s covered in this manual? This will be a fascinating journey through psychology. We’ll begin by exploring the way you think, looking at the many biased judgements your brain makes every single day to ensure your survival against the chaotic forces of nature.


What will I gain from reading this? We’ll equip you, the reader, with knowledge of the automated thinking processes that often blind you to the true realities of your external environment, before arming you with one of the oldest and most powerful strategies for dealing with your cognitive biases.


It’s always interesting just to see how the human mind is relating to the natural universe, and what we try to make of it just so we can believe we understand what’s going on.

— Neil deGrasse Tyson

Introduction


Welcome to your Cognitive Bias Field Manual. This guide will explore the numerous mental shortcuts that the human brain depends upon to make rapid decisions: biases that served our ancestors admirably for thousands of years, ensuring our survival against the chaotic forces of nature.

We’re particularly concerned in this guide with how these mental shortcuts serve you in today’s environment: how they often deceive you, and continuously lead you to believe that you’re in possession of the objective truth when, in fact, you’re not.[1]

In my write-up on human fallibility, I made the case that we’re each limited in our capacity to comprehend objective truth, and that we’d all be better off if we acknowledged this while making an active effort to mitigate it through critical thinking. This Field Manual is a further effort of mine to assist you in that endeavour by outlining the psychological factors at play when you’re trying to make sense of the world around you.


What is Cognitive Bias?

A cognitive bias is a habit or pattern in our thinking that makes us conclude the wrong things from time to time. It’s in our wiring: unconscious and unavoidable. These psychological thinking processes are optimised to interpret and act upon an environment of limitless information, quickly and with your survival in mind. The result is a subjective social reality inside your mind that differs from the objective world around you.[2]

Here we can immediately grasp the problematic nature of this phenomenon: we’re continuously forming a subjective social reality that, unless we intervene, goes unopposed. Furthermore, our judgements are often guided by the impressionable forces of the crowd, which tends to reinforce the unrelenting confidence we have in our chosen beliefs. So much so that we’re sometimes overconfident in our beliefs even when we’re objectively wrong. But why is this?

Grasping a concept as profound as this one demands nothing short of your undivided attention. Humility towards your own thinking process is mandatory if you wish to identify and understand the common errors in judgement covered within this guide. We’ll be exploring some of the themes developed by the world-renowned psychologist and Nobel Prize winner Daniel Kahneman, author of the New York Times bestseller Thinking, Fast and Slow.[3]

The Two Systems: System 1 Thinking & System 2 Thinking

Let’s begin. To understand and identify the moment our cognitive biases take place in real time, we must first comprehend how our brain processes our choices and judgements. Thanks to the wonderful work of Daniel Kahneman, we can now confidently discuss our cognitive processes through a useful construction that we can all make sense of:

System 1: The all-powerful, subconscious agency that processes our thoughts rapidly and automatically.

System 1 contains your personal model of the world, which is continuously cross-checked so that external events register as either normal or surprising. This automated thinking process is heavily influenced by context and your previous experiences, helping you assimilate newly acquired stimuli into pre-existing knowledge structures. The whole process is truly spectacular when you think about it.

To witness System 1 thinking in action, glance at the following image:

[Image: a close-up photograph of a man with an angry facial expression.]

As you glanced at the image above, your intuitive thinking took over. A rapid, automated observation determined that this man is angry. That conclusion didn’t require any logical assessment of its accuracy; it felt self-evidently valid. (Experience alone is enough for the belief to be held.)

This is the magnificent, beneficial thinking process that has allowed humans to make rapid choices when exposed to foreign stimuli in time-critical, stressful environments. However, you can probably guess how this self-evident thinking system could cause a few blind spots from time to time, particularly when our choices are funnelled through System 1 on a regular basis.


System 2: Our logical thinking system.

You’re probably already familiar with this system, as it’s typically the one we identify with when we think of ourselves (however inaccurate that may be; that’s another conversation). System 2 is our rational, conscious self, formulating our everyday plans, beliefs and actions through calculated, methodical thinking.

System 2 is the system we empower when we develop our critical thinking skills. We grant it more permission to weigh in on matters that may once have been monopolised by automated System 1 thinking in our everyday activities: a changing of the guard, welcoming a new shift in our mindful authority. We’ll be exploring a heuristic process for acknowledging our cognitive blind spots that will better enable us to overcome the shortfalls of lazy thinking when the stakes are high. And you shouldn’t be too hard on yourself, either. System 2 processing is cognitively demanding, and intrinsically more taxing than your default thinking, so this will be difficult. If your body can avoid complex thinking, it will. The nervous system consumes more glucose than most other parts of the body, and heavy mental activity via System 2 thinking appears to be rather expensive in the currency of glucose.[4]

To witness the ever-present authority of System 1, close your eyes and think of nothing. Clear the mind of all thoughts, present and future (System 2 is now engaged and overriding your automated thinking), and behold as random thoughts begin to creep back into the mind without your consent. This is System 1, making itself loud and clear.

It’s important to note that these systems are not systems in the standard sense of physical entities, and there is no single part of the brain that either would call home. The construction we’re using here to describe the mind is a linguistic aid that allows us to discuss the intricate complexities of choice and judgement without misconception. Humans are funny like that. We’re able to overcome vast cognitive limitations of complexity by forming useful little devices like “that mountain is 14 football pitches high”, and that is precisely what we’re doing here: utilising language that accurately describes the brain’s modus operandi in a way we can understand. These systems need not explain all the facts to be useful; they simply need to provide a better paradigm for discussing the unfamiliar terrain of cognitive thought than those previously used.

For anybody interested in further reading on these thinking systems, I highly recommend the spectacular book Thinking, Fast and Slow by Daniel Kahneman.

How to identify Cognitive Biases

There will be countless moments in our lives when our cognitive biases are unavoidable, simply because System 2 lacks the clues to indicate that an error is taking place. Continuous vigilance over our automated thinking processes would be bloody tiring, to say the least, and very impractical. Thus, our best solution is a compromise: we will learn to recognise the situations that are prone to mistakes, and we will learn to ask the right questions.

[Image: the Müller-Lyer illusion, two horizontal lines of equal length whose arrowheads make the top line appear longer.]

Take a look at the above image to see this process in action. Glancing at these shapes, it seems quite obvious that the top line is longer than the bottom line. This is incorrect: the lines are the same length. The error is a consequence of the brain using cognitive shortcuts to ensure one’s behavioural responses have the best chance of contending successfully with retinal projections whose sources are inherently uncertain.[5] It doesn’t matter how many times you’ve witnessed this illusion, or verified the length of the lines through reasoned analysis; your brain will continue to see the top line as the longer one. If we desire greater accuracy in our judgements, we should all be aware of the hard-wired mental shortcuts in play and the limitations they may bring.

Now, let’s take a look at another type of problem:

A bat and a ball cost $1.10 in total. The bat costs $1 more than the ball.
How much does the ball cost?

For most people, impulsive thinking will likely conclude that the ball costs 10 cents. This is incorrect: the ball in fact costs 5 cents. A frustrating revelation for many of you, I’m sure; at this moment, cognitive reflection is likely in full effect. Check out the following explanation if you’re still struggling to work it out:

(incorrect) If the ball = $0.10, the bat must cost $1.10 to be $1 more. Total = $1.20.
(correct) If the ball = $0.05, the bat must cost $1.05 to be $1 more. Total = $1.10.
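
If it helps to see the arithmetic checked mechanically, here is a minimal sketch in Python (my own illustration, not something from Kahneman’s work) that tests both candidate answers:

```python
# Check both candidate answers to the bat-and-ball problem.
# Work in cents to avoid floating-point rounding issues.
for ball in (10, 5):              # the intuitive answer first, then the correct one
    bat = ball + 100              # the bat costs exactly $1.00 more than the ball
    total = ball + bat
    verdict = "correct" if total == 110 else "incorrect"
    print(f"ball = {ball}c, bat = {bat}c, total = {total}c ({verdict})")
```

Running it prints a total of 120 cents for the intuitive answer and 110 cents for the correct one, which is exactly the check System 2 performs when it is called upon.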

There you have it, folks: an example of System 2 bailing out your automated, impulsive System 1 when it goofs up in the field. But let’s not forget, if it wasn’t for this guide pointing out the incorrect answer in real time, you may well have been satisfied with it. And this is the problem. We all regularly move through our local environments making impulsive choices that are not factually accurate, unaware of our errors due to the absence of clues that would imply we’re missing the mark.

In the guide ahead, I’ll be highlighting four big problems that your brain faces off against every day, along with the cognitive biases present in each. By exploring the various biases, you’ll see why your System 1 thinking is optimised to ensure successful outcomes through impulsive decision-making. While accuracy is often compromised, the capability to act decisively in difficult, time-critical scenarios has been paramount to our survival.

This is your challenge: learn to understand when it is more beneficial to act impulsively, and when it is important to employ a methodical approach to a problem. Mastering this process is the key to manoeuvring through this world with more accuracy in one’s assessments and resulting actions.

The Problems and their respective Cognitive Biases


Real World Problem 1: There is too much information

We now reside in a world of vast information, and our brain has no choice but to filter almost all of it out. Here are some of the tricks our brain utilises automatically to assist us with this problem:

We are drawn to details that confirm our own existing beliefs: We’re not naturally impartial. We regularly turn a blind eye to details that contradict our own positions. See: Confirmation bias, Congruence bias, Choice-supportive bias, Selective perception, Observer-expectancy effect, Ostrich effect, Subjective validation, Continued influence effect, Semmelweis reflex

We’re more likely to notice things that are fresh in our memory or repeated often: Have you ever noticed that after hearing about something, the frequency of its appearance seems to increase? Suddenly you’re noticing it on billboards and everyone is talking about it? Yeah, that’s normally our brain’s doing, not the world’s. See: Availability heuristic, Attentional bias, Illusory truth effect, Mere exposure effect, Cue-dependent forgetting, Frequency illusion, Empathy gap, Omission bias, Base rate fallacy.

We notice when something has changed: A collection of interesting quirks that sway our choices through perception. See: Distinction bias, Conservatism, Anchoring, Money illusion, Framing effect, Weber–Fechner law.

Peculiar/funny/visually-striking/anthropomorphic things stand out more than ordinary/unfunny things: Our brains often prioritise the importance of things that are surprising or unusual to us. Alternatively, we’re often quick to skip over information that we think is expected or ordinary. See: Bizarreness effect, Von Restorff effect, Picture superiority effect, Self-relevance effect, Negativity bias.


Real World Problem 2: Not enough clarity and meaning

On top of the vast information that we all have at our disposal today, making sense of it is a big problem, and we often lack clarity. But the brain needs to make sense of the world around it to survive. So, after our filtered information stream comes in, we analyse it, then fill in the missing links with our predetermined model of the world.

“It is an acknowledged fact that we perceive errors in the work of others more readily than in our own.”

― Leonardo da Vinci

We fill in characteristics from stereotypes, generalities, and prior knowledge whenever there are new specific instances or gaps in information: When we’re unclear about a specific thing, within a context we’re already familiar with, we tend to automatically fill the gaps with our best guesses. See: Group attribution error, Ultimate attribution error, Stereotyping, Essentialism, Functional fixedness, Moral credential effect, Just-world hypothesis, Argument from fallacy, Authority bias, Automation bias, Bandwagon effect, Placebo effect.

We find patterns and stories, even within limited data: The data may be plentiful at times, but it is filtered and often full of holes. It seems we never quite have the full story at hand. And yet, our brain reconstructs the world around us to complete the puzzle inside our heads. See: Confabulation, Clustering illusion, Insensitivity to sample size, Neglect of probability, Anecdotal fallacy, Illusion of validity, Masked man fallacy, Recency illusion, Gambler’s fallacy, Hot-hand fallacy, Illusory correlation, Pareidolia, Anthropomorphism.

People and things we’re familiar with or fond of are seen as better than people and things we’re unfamiliar with: The worth of things we know seems very high within our own personal economy. Often a troubling predicament for those who lack our perspective. See: Halo effect, In-group bias, Out-group homogeneity bias, Cross-race effect, Cheerleader effect, Well-traveled road effect, Not invented here, Reactive devaluation, Positivity effect.

We think we know what others are thinking: We model the thinking of others on our own. This includes the illusion that people are thinking about us as much as we’re thinking of ourselves. Spoiler alert: they’re not. See: Curse of knowledge, Illusion of transparency, Spotlight effect, Illusion of external agency, Illusion of asymmetric insight, Extrinsic incentive error.

We simplify numbers and probabilities to make them easier to think about: System 1 is not very good at math and often fails to call upon the resources of System 2 (likely due to energy preservation). This often results in errors when predicting the likelihood of something happening. See: Mental accounting, Normalcy bias, Appeal to probability fallacy, Murphy’s Law, Subadditivity effect, Survivorship bias, Zero-sum bias, Denomination effect, Magic number 7±2.

We often project our current mindset and assumptions onto our past and future: This is one of the big ones. We show little to no remorse for our inaccurate perceptions of time as we tweak our memory banks to fit the present narrative. See: Hindsight bias, Outcome bias, Moral luck, Declinism, Telescoping effect, Rosy retrospection, Impact bias, Pessimism bias, Planning fallacy, Time-saving bias, Pro-innovation bias, Projection bias, Restraint bias, Self-consistency bias.


Real World Problem 3: Time

When there is no time to think, it is imperative that we can still act. System 1 thinking has provided our species with the autonomous processes that allow us to act decisively when presented with time-critical challenges. Nature often punishes those who are unable to formulate decisive actions when the clock is ticking.

In order to act, we need to be confident in our ability to make an impact: In reality, this can be classified as overconfidence, but without it, we might not act at all. See: Overconfidence effect, Egocentric bias, Optimism bias, Social desirability bias, Third-person effect, Barnum effect, Illusion of control, False consensus effect, Dunning–Kruger effect, Hard–easy effect, Illusory superiority, Lake Wobegon effect, Self-serving bias, Fundamental attribution error, Defensive attribution hypothesis, Trait ascription bias, Effort justification, Risk compensation.

We favour simple solutions over more complex, ambiguous ones: We prefer quick, simple solutions over complicated strategies, even if the complicated strategy is ultimately a better use of our time and energy. See: Ambiguity bias, Information bias, Belief bias, Rhyme-as-reason effect, Law of Triviality, Conjunction fallacy, Occam’s razor, Less-is-better effect.

We’re motivated to preserve our autonomy within a group, to reduce self-inflicted mistakes, and to avoid irreversible decisions: It seems we have an intrinsic pull towards the hive mind. We tend to choose the options that are least threatening to the status quo. See: System justification, Reverse psychology, Decoy effect, Social comparison bias, Status quo bias.

To stay focused, we favour the immediate, relatable thing in front of us over the delayed and distant: Once again, we disregard the objective realm to serve our personal economy. We place more value on stuff in the present than in the future. See: Hyperbolic discounting, Appeal to novelty, Identifiable victim effect.

We’re motivated to complete things that we’ve already invested our time and energy in: Rationality often takes a back seat to personal goal attainment and self-justification. See: Sunk cost fallacy, Escalation of commitment, Loss aversion, IKEA effect, Generation effect, Zero-risk bias, Disposition effect, Pseudocertainty effect, Endowment effect, Backfire effect.


Real World Problem 4: What should we remember?

There’s an incalculable amount of information in the universe. This measureless realm of data is not directly comprehensible to us within our standard thinking models. The human brain can only afford to keep around the segments of information that may prove useful in the future. To achieve this goal efficiently, our brains are constantly betting on which items will be valuable to us, discarding plentiful detail in the process.

We reduce events and lists to their key elements: We tend to judge entire experiences by the emotion we felt at the event’s peak, rather than evaluating the event piece by piece. Thus, our memories are typically represented by a few key items rather than the complete reality of the experience we had. See: Peak–end rule, Leveling and sharpening, Misinformation effect, Duration neglect, Serial recall effect, Modality effect, Memory inhibition, Serial position effect.

We reinforce and modify some of our memories after the fact: We can swap various details within a memory, remove them and even add specifics to a memory that were never present to begin with. See: Misattribution of memory, Cryptomnesia, Confabulation, Suggestibility, Spacing effect.

We discard specifics to form generalities: This area of social cognition tends to be one of the most problematic cognitive behaviours within our modern world. It is here, that many of us form anecdotal stereotypes that conflict with the facts of reality. See: Implicit stereotypes, Prejudice, Negativity bias, Fading affect bias.

We store memories differently based on how they were experienced: We catalogue and prioritise our memories based on a substantial number of fascinating factors, including which senses we used to experience them, whether we used System 2 thinking to encode their meaning, and how easily the information can be accessed online at a later time. See: Levels of processing effect, Google effect, Testing effect, Absent-mindedness, Tip of the tongue phenomenon.


Critical Thinking: The Socratic Method

The oldest, and still one of the most powerful, tactics for dealing with your cognitive biases and forming objective conclusions is the Socratic method. This distinguished method is named after a man credited as one of the founders of Western philosophy: Socrates.

Socrates was a classical Greek (Athenian) philosopher who continuously engaged his students in skilful questioning in an unrelenting search for truth. He became famous for the process he used to seek out his truths: probing the foundations of his students’ and colleagues’ views to see whether they could hold up to a logical onslaught of questioning designed to expose the weaknesses in faulty ways of thinking.

The Socratic method is a series of questions that tests the legitimacy of a statement asserted as the answerer’s own personal belief.[6] It does this via the following process (sketched in code after the worked example below):

  1. Focus on a common sense statement.
  2. Find an exception to that statement.
  3. Reject the statement if an exception is found.
  4. The respondent reformulates the statement to account for the exception.
  5. Keep repeating the process until a statement cannot be overturned.

This is a beautiful method that we’re all naturally effective at. Curiosity, asking questions and testing things are intuitive traits of human nature.

Let’s see it in action:

  1. “To be successful, one must have a good education.”
  2. “Cristiano Ronaldo is one of the world’s most successful and wealthy athletes. He didn’t graduate from high school.”
  3. Reject the statement: to be successful, one must have a good education.
  4. “This is true; however, people who complete higher levels of education typically earn more than those who don’t, across large population groups.”[7]
  5. Unable to find an exception to the modified statement, accept the newly proposed statement: “People who complete higher levels of education typically earn more than those who don’t, across large population groups.”
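
For the programmatically minded, the five-step loop above can be expressed as a simple procedure. This is a toy sketch of my own, not part of the classical method; find_exception and reformulate are hypothetical stand-ins for the hard human work of questioning:

```python
# A toy sketch of the Socratic loop described above. The two helper
# functions are hypothetical placeholders for human judgement; they are
# not part of any real library or of the historical method itself.
def socratic_refine(statement, find_exception, reformulate, max_rounds=10):
    """Repeatedly test a statement and refine it until no exception is found."""
    for _ in range(max_rounds):
        exception = find_exception(statement)  # step 2: hunt for a counterexample
        if exception is None:                  # step 5: nothing overturns it,
            return statement                   # so accept the statement
        # steps 3-4: reject the current statement and reformulate it
        # to account for the exception that was found
        statement = reformulate(statement, exception)
    return statement  # stop refining after max_rounds; keep the latest version
```

Note that “accepting” a statement here means only that no exception has been found yet; like the method itself, the loop never proves a statement true, it merely fails to overturn it.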

The Socratic approach is a valuable method to utilise when engaging in matters of debate and guarding against faulty conclusions. If you’re genuinely interested in learning the Socratic method, click on the link below, where I’ve elaborated a bit more on the process:

Learn more about the Socratic Method — Capable Men of History: Socrates

Conclusion

Within this manual, I’ve unravelled the automated thinking system that has proven paramount to our survival here on Earth. To now stand on the shoulders of giants and even consider the possibility of more System 2 oversight within our everyday thinking is a privilege afforded to us only because of the triumphs of our ancestors. Through modern psychology, we have documented these cognitive biases with replicable research in this grand endeavour of ours to better understand the human condition.

We’re survivalists within an untried environment that our predecessors never witnessed. And it is here and now that the capable amongst us must adapt, thrive, and seek innovation and discovery wherever possible in our own respective environments.

Use this guide as a reference tool to remind you of your cognitive limitations when it comes to navigating this complex world objectively. Bookmark it and read it from time to time to keep the lessons fresh in the mind (taking full advantage of the availability heuristic). Finally, share this guide with loved ones and allies to assist them in their own endeavours. Our world will benefit greatly from the harmonious perspective of critical thinking as we move forward in this grand, human story.

References


  • [1] Kahneman, D. & Tversky, A. (1972). “Subjective probability: A judgment of representativeness”. Cognitive Psychology.
  • [2] Haselton, M. G., Nettle, D. & Andrews, P. W. (2005). “The evolution of cognitive bias”. In D. M. Buss (Ed.), The Handbook of Evolutionary Psychology. Hoboken, NJ: John Wiley & Sons, pp. 724–746.
  • [3] Kahneman, D. (2011). Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux.
  • [4] “Self-Control Relies on Glucose as a Limited Energy Source: Willpower Is More Than a Metaphor” (2007). Journal of Personality and Social Psychology, Vol. 92.
  • [5] Howe, C. Q. & Purves, D. “The Müller-Lyer illusion explained by the statistics of image–source relationships”.
  • [6] Vlastos, G. (1983). “The Socratic Elenchus”.
  • [7] Education Counts – Impact of education on income.