The Cognitive Bias Field Manual reveals where we can and cannot trust our intuitions when we look to formulate accurate judgements within our lives.
What’s covered in this manual? This will be a fascinating journey through psychology. We’ll begin by exploring the way you think, looking at the many biased judgements your brain makes every day to ensure your survival against the chaotic forces of nature.
What will I gain from reading this? We’ll equip you, the reader, with knowledge of the automated thinking processes that often blind us to the true realities of our external environment, before finally arming you with one of the oldest and most powerful strategies for dealing with your cognitive biases.
It’s always interesting just to see how the human mind is relating to the natural universe, and what we try to make of it just so we can believe we understand what’s going on.
-Neil deGrasse Tyson
Welcome to your Cognitive Bias Field Manual. This guide explores the numerous mental shortcuts that the human brain depends upon to form rapid decisions: biases that served our ancestors admirably for thousands of years, ensuring our survival against the chaotic forces of nature. Today, the modern human resides within a highly complex world, comprised of countless novel challenges that we must overcome with little to no historical reference to guide us along this untrodden frontier. It is here that many social institutions rely upon individuals to make sound, rational judgements to keep the machine running: a tremendous problem for a world that filters a considerable portion of its choices through cognitive biases, often leaving us confused, unsure and alone in our efforts to make sense of it all when those choices are torn apart by external observers.
We’re often led to believe that the mind is hopelessly lost in the face of such a complex, modern environment, and that we should simply embrace rationality at the expense of our default thinking habits to achieve real-world victories. However, it would be foolish to dismiss the intricate, splendid thinking mechanisms that have been tried and tested throughout human history. The automated mind is not to be dismissed so recklessly: it is a highly advantageous mindset to depend upon in specific situations, particularly when time is against us and we need to win.
- What is Cognitive Bias?
- The Two Systems: System 1 thinking vs System 2 thinking
- How to identify Cognitive Biases
- List of Cognitive Biases
- Critical Thinking: The Socratic Method
What is Cognitive Bias?
A cognitive bias is a habit or pattern in our thinking that leads us to the wrong conclusion from time to time. It’s in our wiring: unconscious and unavoidable. Our biological systems do their best to interpret and manoeuvre within a universe comprised of limitless information, often under time pressure and always with our survival in mind. This means our brains construct a subjective social reality that differs from the objective world around us.
Here, we can immediately grasp the problematic lifestyle one could face if this subjective social reality is left unopposed. Furthermore, our judgements are often guided by the impressionable forces of the hive mind, which typically reinforces the unrelenting confidence we have in our chosen beliefs. So much so, that we’re sometimes overconfident in our beliefs even when we’re objectively wrong! But why is this? Why are we so effective at spotting the flawed judgements of others, but not our own?
To grasp a concept as profound as this demands nothing short of your undivided attention. Self-awareness of your own thinking process is mandatory if you’re to identify and understand the common errors in judgement covered within this guide. This journey towards mindfulness is an immense, life-changing experience that grants clarity to the victors, drawing upon some of the most astute insights of cognitive psychology available today. Within this guide, we will explore themes developed by the world-renowned psychologist and Nobel Prize winner Daniel Kahneman, author of the New York Times bestseller Thinking, Fast and Slow.
The Two Systems: System 1 Thinking vs System 2 Thinking
To understand and identify the moment our cognitive biases take place in real time, we must first understand how the brain processes our choices and judgements. Thanks to the wonderful work of Daniel Kahneman, we can now confidently discuss our cognitive processes through a useful construction that we can all make sense of:
System 1: Introducing the all-powerful, subconscious agency that processes our thoughts rapidly and automatically.
System 1 contains your personal model of the world, which is continuously cross-checked to perceive external events around you as normal or surprising. This automated thinking process is heavily influenced by context and previous experience, which help assimilate newly acquired stimuli into pre-existing knowledge structures. The whole process is truly spectacular when you think about it.
To witness System 1 thinking in action, take a glimpse of the following image:
As you glanced at the image above, your intuitive thinking took over. A rapid, automated observation determined that this man is angry. This conclusion didn’t require any logical assessment of its accuracy; it is self-evidently valid. (Experience alone is enough for the belief to be held.)
This is the magnificent, beneficial thinking process that has allowed humans to form rapid choices when exposed to foreign stimuli in time-critical or stressful environments. However, you can probably guess how this self-evident thinking could create a few blind spots from time to time, particularly when our choices are funnelled through System 1 on a regular basis.
System 2: Introducing our logical thinking system.
You’re probably already familiar with this system, as it’s typically the one we identify with when we think of ourselves. (However inaccurate that may be; that’s another conversation.) System 2 is our rational, conscious self that formulates our everyday plans, beliefs and actions through calculated, methodical thinking.
System 2 is the logical system we empower when we develop our critical thinking skills. We’re granting System 2 more permission to weigh in on matters that may once have been monopolised by automated System 1 thinking in our everyday activities: a changing of the guard that welcomes a new shift in our mindful authority. We’ll explore a heuristic process for acknowledging our cognitive blind spots, better enabling us to overcome the shortfalls of lazy thinking when the stakes are high. And don’t be too hard on yourself: System 2 processing is cognitively demanding and intrinsically more taxing than your default thinking, so this will be difficult. If your body can avoid complex thinking, it will. The brain consumes more glucose than most other organs, and heavy mental activity via System 2 appears to be rather expensive in the currency of glucose.
To witness the ever-present authority of System 1: close your eyes and think of nothing. Clear the mind of all thoughts, present and future (System 2 is now engaged and overriding your automated thinking), and behold as random thoughts begin to creep back into the mind without your consent. This is System 1, making itself loud and clear.
It’s important to note that these systems are not systems in the standard sense of physical entities, and there is no single part of the brain that either would call home. The construction we’re using here to describe the mind allows us to grasp the intricate complexities of choice and judgement without misconception. Humans are funny like that. We overcome vast cognitive limitations of complexity by forming useful little devices like: that mountain is 14 football pitches high… and that is what we’re doing here, using a language that accurately describes the brain’s modus operandi in a way we can all understand.
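The “football pitches” device above is just a unit conversion. As a minimal sketch (with illustrative, made-up numbers, not a real mountain), re-expressing an unfamiliar magnitude in a familiar unit looks like this:

```python
# Re-express an unfamiliar magnitude in a familiar unit (illustrative numbers).
PITCH_LENGTH_M = 105       # a standard football pitch is roughly 105 m long
mountain_height_m = 1470   # hypothetical mountain height in metres

pitches = mountain_height_m / PITCH_LENGTH_M
print(f"That mountain is {pitches:.0f} football pitches high")  # prints 14
```

The precision is deliberately coarse: the device trades accuracy for an instantly graspable mental picture, which is exactly the System 1 trade-off this guide describes.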
For anybody interested in further reading on these thinking systems, I would highly advise you to read the spectacular book: Thinking, Fast and Slow by Daniel Kahneman.
How to identify Cognitive Biases
There will be countless moments in our lives when our cognitive biases are unavoidable, simply because System 2 lacks the clues to indicate when an error is taking place. Continuous vigilance over our automated thinking would be bloody tiring, to say the least, and very impractical. Thus, our best solution is a compromise: we will learn to recognise the situations that are prone to mistakes, and we will learn to ask the right questions.
Take a look at the above image to see this process in action. Glancing at these shapes, it seems quite obvious that the top line is longer than the bottom line. This is incorrect: the lines are the same length. The error isn’t created by faulty intelligence or misinformation, but is merely the consequence of our intrinsic perceptual wiring, which persists in spite of critical, objective thinking. It doesn’t matter how many times you’ve witnessed this illusion, or verified the length of the lines through reasoned analysis: your brain will continue to see the top line as the longer one. If we’re to navigate our environment with accuracy, we should be aware of the hard-wired limitations in play when we observe and interpret our world.
Now, let’s take a look at another type of problem:
A bat and a ball cost $1.10 in total. The bat costs $1 more than the ball.
How much does the ball cost?
For most people, impulsive thinking concludes that the ball costs 10 cents. This is incorrect: the ball in fact costs 5 cents. A frustrating revelation for many of you, I’m sure; at this moment, cognitive reflection is likely in full effect. Check out the following explanation if you’re still struggling to work it out:
(incorrect) If the ball costs $0.10, the bat must cost $1.10 to be $1 more. Total = $1.20.
(correct) If the ball costs $0.05, the bat must cost $1.05 to be $1 more. Total = $1.10.
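The arithmetic can also be checked directly. A minimal sketch in Python (working in cents to avoid floating-point noise):

```python
# Bat-and-ball problem: bat + ball = 110 cents, and bat = ball + 100 cents.
# Substituting: (ball + 100) + ball = 110  =>  2 * ball = 10  =>  ball = 5.
total = 110       # total cost in cents
difference = 100  # the bat costs this much more than the ball

ball = (total - difference) // 2
bat = ball + difference

print(ball)  # 5 cents, not the intuitive 10
print(bat)   # 105 cents
assert bat + ball == total and bat - ball == difference
```

The intuitive answer (10 cents) satisfies only the first constraint; the final assertion checks both, which is exactly the verification step System 1 skips.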
There you have it, folks: an example of System 2 bailing out your automated, impulsive System 1 when it goofs up in the field. But let’s not forget, if this guide hadn’t pointed out the incorrect answer in real time, you may well have been satisfied with it. And this is the problem. We all regularly stroll around our local environments making impulsive choices that are not factually accurate, unaware of our errors due to the absence of clues that would imply we’re missing the mark.
We’re now going to help you with that. In the next section, we’ll look at four big problems that the brain faces every day, along with the cognitive biases present in each. Here, you’ll see exactly why your System 1 thinking is optimised for successful outcomes through impulsive decision-making. While accuracy is often compromised, the capability to act decisively in difficult scenarios has been paramount to our survival.
This is your challenge: learn to understand when it is more beneficial to act impulsively, and when it is essential to take a methodical approach to a problem. Mastering this process is key to manoeuvring this world with formidable proficiency.
List of Cognitive Biases
If you’ve ever had the pleasure of reading Wikipedia’s list of cognitive biases, you might find it difficult to organise the list before you into a comprehensible structure that you can make use of. I’ve personally found the Cognitive Bias Cheat Sheet by Buster Benson to be one of the most effective structures to work from when placing cognitive biases into a real-world context.
Real World Problem 1: There is too much information
We now reside within a world of vast information and our brain has no choice but to filter almost all of it out. Here are some of the tricks that our brain utilises automatically to assist us with this problem:
We are drawn to details that confirm our own existing beliefs: We’re not naturally impartial. We regularly turn a blind eye to details that contradict our own positions. See: Confirmation bias, Congruence bias, Choice-supportive bias, Selective perception, Observer-expectancy effect, Ostrich effect, Subjective validation, Continued influence effect, Semmelweis reflex
We’re more likely to notice things that are fresh in our memory or repeated often: Have you ever noticed sometimes after hearing about something, the frequency of its appearance increases? Suddenly you’re noticing it on billboards and everyone is talking about it? Yeah, that’s normally our fault. See: Availability heuristic, Attentional bias, Illusory truth effect, Mere exposure effect, Cue-dependent forgetting, Frequency illusion, Empathy gap, Omission bias, Base rate fallacy
We notice when something has changed: A collection of interesting quirks that sway our choices through perception. See: Distinction bias, Conservatism, Anchoring, Money illusion, Framing effect, Weber–Fechner law.
Peculiar/funny/visually-striking/anthropomorphic things stand out more than ordinary/unfunny things: Our brains often prioritise the importance of things that are surprising or unusual to us. Alternatively, we’re often quick to skip over information that we think is expected or ordinary. See: Bizarreness effect, Von Restorff effect, Picture superiority effect, Self-relevance effect, Negativity bias.
Real World Problem 2: Not enough clarity and meaning
On top of the vast information we all have at our disposal today, making sense of it is a big problem, and we often lack clarity. But the brain needs to make sense of the world around it to survive. So, after our filtered information stream comes in, we analyse it, then fill in the missing links with our predetermined model of the world.
We fill in characteristics from stereotypes, generalities, and prior knowledge whenever there are new specific instances or gaps in information: When we’re unclear and looking upon a specific thing, within a context we’re already familiar with, we tend to automatically fill the gaps with our best guesses. See: Group attribution error, Ultimate attribution error, Stereotyping, Essentialism, Functional fixedness, Moral credential effect, Just-world hypothesis, Argument from fallacy, Authority bias, Automation bias, Bandwagon effect, Placebo effect.
We find patterns and stories, even within limited data: The data may be plentiful at times, but it is filtered and often full of holes. It seems we never quite have the full story at hand. And yet, our brain reconstructs the world around us to complete the puzzle inside our heads. See: Confabulation, Clustering illusion, Insensitivity to sample size, Neglect of probability, Anecdotal fallacy, Illusion of validity, Masked man fallacy, Recency illusion, Gambler’s fallacy, Hot-hand fallacy, Illusory correlation, Pareidolia, Anthropomorphism.
People and things we’re familiar with or fond of are seen as better than people and things we’re less familiar with: The worth of things we’re familiar with seems very valuable within our own personal economy; often a troubling predicament for those who lack our perspective. See: Halo effect, In-group bias, Out-group homogeneity bias, Cross-race effect, Cheerleader effect, Well-traveled road effect, Not invented here, Reactive devaluation, Positivity effect.
We think we know what others are thinking: We model the thinking of others to our own. This also includes the illusion that people are thinking about us as much as we are thinking of ourselves. Spoiler alert: They’re not. See: Curse of knowledge, Illusion of transparency, Spotlight effect, Illusion of external agency, Illusion of asymmetric insight, Extrinsic incentive error.
We simplify numbers and probabilities to make them easier to think about: System 1 is not very good at math and often fails to call upon the resources of System 2 (likely due to energy preservation). This often results in errors when predicting the likelihood of something happening. See: Mental accounting, Normalcy bias, Appeal to probability fallacy, Murphy’s Law, Subadditivity effect, Survivorship bias, Zero sum bias, Denomination effect, Magic number 7±2.
We often project our current mindset and assumptions onto our past and future: This is one of the big ones. We seem to show little to no remorse for our inaccurate perceptions of time when we find ourselves tweaking our memory banks with the present narrative. See: Hindsight bias, Outcome bias, Moral luck, Declinism, Telescoping effect, Rosy retrospection, Impact bias, Pessimism bias, Planning fallacy, Time-saving bias, Pro-innovation bias, Projection bias, Restraint bias, Self-consistency bias.
Real World Problem 3: Time
When there is no time to think, it is imperative that we can still act. System 1 thinking has provided our species with the autonomous processes that allow us to act decisively when presented with time-critical challenges. Nature often punishes those who are unable to formulate decisive actions when the clock is ticking.
In order to act, we need to be confident in our ability to make an impact: In reality, this can be classified as overconfidence, but without it, we might not even act at all. See: Overconfidence effect, Egocentric bias, Optimism bias, Social desirability bias, Third-person effect, Barnum effect, Illusion of control, False consensus effect, Dunning-Kruger effect, Hard-easy effect, Illusory superiority, Lake Wobegon effect, Self-serving bias, Fundamental attribution error, Defensive attribution hypothesis, Trait ascription bias, Effort justification, Risk compensation.
We favour simple solutions over more complex, ambiguous solutions: We prefer quick, simple solutions over complicated strategies. Even if the complicated strategy is ultimately a better use of our time and energy. See: Ambiguity bias, Information bias, Belief bias, Rhyme as reason effect, Law of Triviality, Conjunction fallacy, Occam’s razor, Less-is-better effect.
We’re motivated to preserve our autonomy within a group to reduce self-inflicted mistakes and to avoid irreversible decisions: It seems we have an intrinsic pull towards the hive mind. We tend to choose the options that are the least risky towards the status quo. See: System justification, Reverse psychology, Decoy effect, Social comparison bias, Status quo bias.
To stay focused, we favour the immediate, relatable thing in front of us over the delayed and distant: Once again, we disregard the objective realm to serve our personal economy. We place more value on stuff in the present than in the future. See: Hyperbolic discounting, Appeal to novelty, Identifiable victim effect.
We’re motivated to complete things that we’ve already invested our time and energy in: Rationality may often take a back seat over personal goal attainment and self-justification. See: Sunk cost fallacy, Escalation of commitment, Loss aversion, IKEA effect, Generation effect, Zero-risk bias, Disposition effect, Pseudocertainty effect, Endowment effect, Backfire effect.
Real World Problem 4: What should we remember?
There’s an incalculable amount of information within the universe. This measureless realm of data that we find all around us is not directly comprehensible within our standard thinking models. The human brain can only afford to keep the segments of information that may prove useful to us in the future. To achieve this efficiently, our brains are constantly betting on which items may be valuable to us, discarding plentiful details in the process.
We reduce events and lists to their key elements: We tend to judge entire experiences by the emotion we felt at the event’s peak, rather than evaluating the event piece by piece. Thus, our memories are typically represented by a few key items rather than the complete reality of the experience we had. See: Peak–end rule, Leveling and sharpening, Misinformation effect, Duration neglect, Serial recall effect, Modality effect, Memory inhibition, Serial position effect.
We reinforce and modify some of our memories after the fact: We can swap various details within a memory, remove them and even add specifics to a memory that were never present to begin with. See: Misattribution of memory, Cryptomnesia, Confabulation, Suggestibility, Spacing effect.
We discard specifics to form generalities: This area of social cognition tends to be one of the most problematic cognitive behaviours within our modern world. It is here, that many of us form anecdotal stereotypes that conflict with the facts of reality. See: Implicit stereotypes, Prejudice, Negativity bias, Fading affect bias.
We store memories differently based on how they were experienced: We catalogue and prioritise our memories based on a substantial number of fascinating factors. These include what senses we used to experience it, whether we used System 2 thinking to encode its meaning, and how easily the information can be accessed online at a later time. See: Levels of processing effect, Google effect, Testing effect, Absent-mindedness, Tip of the tongue phenomenon.
Critical Thinking: The Socratic Method
The oldest, and still one of the most powerful, tactics for dealing with your cognitive biases and forming objective conclusions is the Socratic method: a series of questions that attacks the complacencies arising when our System 1 thinking has become too convincing.
- Focus on a common sense statement.
- Find an exception to that statement.
- Reject the statement if an exception is found.
- The respondent reformulates the statement to account for the exception.
- Keep repeating the process until a statement cannot be overturned.
This is a beautiful method that we’re all naturally effective at. Curiosity, asking questions and testing things are intuitive traits of human nature.
Let’s see it in action:
- “To be successful, one must have a good education.”
- Cristiano Ronaldo is one of the world’s most successful and wealthy athletes. He didn’t graduate High School.
- Reject the statement: “To be successful, one must have a good education.”
- Reformulate the statement to account for the exception: “People who complete higher levels of education typically earn more than those who don’t, across large population groups.”
- Unable to find an exception. Accept the statement: “People who complete higher levels of education typically earn more than those who don’t, across large population groups.”
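The question-and-answer loop above can be sketched as a simple refinement procedure. This is a minimal, hypothetical sketch: the `find_exception` and `reformulate` functions stand in for the human work of hunting counterexamples and rewording claims.

```python
def socratic_refine(statement, find_exception, reformulate, max_rounds=10):
    """Repeatedly test a statement against exceptions, refining it each round.

    find_exception(statement) returns a counterexample, or None if none is found.
    reformulate(statement, exception) returns a revised statement covering it.
    """
    for _ in range(max_rounds):
        exception = find_exception(statement)
        if exception is None:
            return statement  # no exception found: accept the statement
        statement = reformulate(statement, exception)  # reject and reformulate
    return statement

# Toy usage mirroring the education example (hypothetical lookup table):
exceptions = {"To be successful, one must have a good education": "Cristiano Ronaldo"}
refined = socratic_refine(
    "To be successful, one must have a good education",
    find_exception=lambda s: exceptions.get(s),
    reformulate=lambda s, e: "Higher education typically raises earnings "
                             "across large populations",
)
print(refined)  # the reformulated, exception-free statement survives
```

The `max_rounds` cap reflects the practical reality that the dialogue ends when a statement can no longer be overturned, or when the participants run out of counterexamples to test.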
The Socratic approach is a valuable method to utilise when engaging in matters of debate and preventing faulty conclusions.
Within this cognitive bias field manual, we have unravelled the automated thinking system that has proven paramount to our survival here on Earth. To now stand on the shoulders of giants and even consider the possibility of more System 2 oversight within our everyday thinking is a privilege afforded to us only because of the triumphs and lessons of our ancestors. Modern psychology has confirmed these cognitive biases through replicable research, in this grand endeavour of ours to better understand the human condition.
We’re survivalists within an untried environment that our predecessors have never witnessed before. And it is here and now, where the capable amongst us, must adapt, thrive, and seek innovation and discovery where it is possible. This guide is your personal tool to remind you of your cognitive limitations when it comes to navigating this complex world objectively. Bookmark this guide and visit it once in a while to keep the lessons fresh and ready. (Taking full advantage of the Availability heuristic.) Finally, share this guide with loved ones and allies to assist them in their own endeavours. Our world will benefit highly from the harmonious perspective of critical thinking as we move forward in this grand, human story.
I believe it is perspective that has proven one of the biggest challenges to human collaboration. When a man looks within his own mind and knows something to be true, how can he be told by another man what is right and wrong? Perspective: a continuously moving target that isn’t helped by the cognitive blind spots we all fall victim to from time to time when battling the fierce obstacles of life. Use this guide as your reminder that the challenges we face are often camouflaged and hidden. But you can rest assured: through capability you will conquer this environment like all of your ancestors before you.
-  Psychological Review 1996, Vol. 103 – Reasoning the Fast and Frugal Way: Models of Bounded Rationality
-  Wikipedia – Cognitive Bias
-  Wikipedia – Book: Thinking, Fast and Slow
-  Journal of Personality and Social Psychology 2007, Vol. 92 – Self-Control Relies on Glucose as a Limited Energy Source: Willpower Is More Than a Metaphor
-  Wikipedia – Müller-Lyer illusion
-  Wikipedia – Socratic Method
-  Education Counts – Impact of education on income