As subjects of the Information Age, we’re bombarded daily with half-truths, misleading statements, bias, corporate spin and out-and-out fabrications masquerading as facts. Finding what to trust in a world of growing disinformation is becoming ever more troublesome. In this article, we’ll be looking at how we can identify the BS and find trustworthy and objective sources of information.
One of the virtues we’re continuously advocating here at Capable Men is critical thinking: forming an objective analysis of the issues around us in order to reach judgements that let us act decisively against the masters of spin and deception. You, me and everyone you know have been deceived at some point by political leaders, lobbying groups, news outlets and the corporate world, all exploiting your emotions to advance someone else’s position.
Sometimes it’s just harmless spin, perhaps on the subject of moisturising cream or some other trivial product. The next time, it’s political spin that leads you into the wrong voting booth. Arming yourself with the tools to evaluate information is key if you have any desire to navigate this world on your own terms. Far too many of us have simply given up caring about such matters, and then find ourselves easily exploited by those with the skills to do so.
This matters more today than ever before, simply because spin now reaches us in ways that didn’t exist a few decades ago. The Internet has created an information environment in which falsehoods breed, replicate and spread like the plague. News organisations (newspapers, online editorials and TV/media) now compete for viewership in an increasingly hostile news environment, running sensationalist headlines that prioritise speed and impact over substance and facts. We all navigate this deceptive environment every day; we can no longer trust our media at face value, and must become capable in our information-gathering tactics moving forward.
Firstly, we must look for the red flags of bad information, some of which are featured in a great book on this subject that I highly recommend reading – UnSpun: Finding Facts in a World of Disinformation.
- Fear – Does the content contain elements of fear? Fear sells, and it’s no secret that our judgement becomes deeply impaired when we are fearful of a threat. The buildup to the Iraq War in 2003 was a prime example of fear tactics being used to influence the populace with misinformation. Some circumstances will genuinely warrant an alarming narrative (global warming, for example). The point here is that fear is often a deceptive cloak hiding a lack of real evidence.
- The Writing Style – Is the writing clear, coherent and formatted well? If the body of text is unable to adhere to the basic principles of grammar, then we’re off to a bad start.
- The Blame Game – Blaming others is often a reflex, with little regard for facts. Most people take no ownership of their failures or weaknesses and are quick to point the finger at others – a potential red flag.
- Glittering Generalities – Learn to recognise glittering generalities: vague, feel-good blanket terms and fancy adjectives that sound appealing at face value while avoiding facts and deflecting questions.
- Lack of a Source – One of the best things we can do when evaluating information is to seek the original source of the story or data and analyse the content as it was originally presented. If no source is provided for a sensationalist claim or piece, treat that as a potential red flag.
“Coca-Cola isn’t just carbonated water that’s been flavored and sweetened, it’s “the Real Thing.” United isn’t just an airline emerging from bankruptcy, it’s your access to “the friendly skies.” Allstate isn’t just a colossal insurance company, it’s “good hands.”
— UnSpun: Finding Facts in a World of Disinformation
The above photos formed a highly deceptive piece of content that made the rounds on social media in 2015 during the European refugee crisis. As refugees pushed towards Europe, discomfort and debate spread across the continent. The imagery and its false commentary fed on that dissonance, presented as a visual warning of what was supposedly coming Europe’s way.
However, the photographs were misrepresented and served as propaganda designed to stoke emotion. They were not from mid-2015; they were published in 1991, documenting an influx of people migrating from Albania to Italy. The facts did nothing to stop the content’s viral momentum across social media.
“The science of psychology has taught us a lot about how and why we get things wrong. As we’ll see, our minds betray us not only when it comes to politics, but in all sorts of matters, from how we see a sporting event, or even a war, to the way we process a sales pitch. Humans are not by nature the fact-driven, rational beings we like to think we are. We get the facts wrong more often than we think we do. And we do so in predictable ways: we engage in wishful thinking. We embrace information that supports our beliefs and reject evidence that challenges them.”
— UnSpun: Finding Facts in a World of Disinformation
We’re not guilt-free in this cycle of disinformation. Cognitive bias affects us all, and if a story suits our own agenda, it often gets a free pass at the expense of facts. Taking ownership of oneself, seeking the truth and understanding that we carry inherent bias in our primal subconscious will go a long way towards stopping the BS from gaining momentum.
To help you understand the hidden forces of bias, we have created a comprehensive Cognitive Bias Field Manual to assist you in this endeavour. In the meantime, here are a few examples of cognitive bias which may obstruct our critical thinking:
- The ambiguity effect – Decision making is affected by a lack of information or ambiguity. The effect implies that people tend to select options for which the probability of a favourable outcome is known, over an option for which the probability of a favourable outcome is unknown.
- Belief bias – The tendency to judge the strength of arguments based on the plausibility of their conclusion rather than how strongly they support that conclusion. In other words, if people agree with a viewpoint, they are inclined to believe that the process used to obtain the results must also be correct.
- Authority bias – The tendency to attribute greater accuracy to the opinion of an authority figure (unrelated to its content) and be more influenced by that opinion.
- Bandwagon effect – The phenomenon whereby the rate of uptake of beliefs, ideas, fads and trends increases the more they have already been adopted by others. As more people come to believe in something, others also “hop on the bandwagon” regardless of the underlying evidence.
- Moral credential effect – A bias that occurs when a person’s track record as a good egalitarian establishes in them an unconscious ethical certification, endorsement, or license that increases the likelihood of less egalitarian decisions later. This effect occurs even when the audience or moral peer group is unaware of the affected person’s previously established moral credential.
- Normalcy bias – People with a normalcy bias have difficulties reacting to something they have not experienced before. It causes people to underestimate both the possibility of a disaster and its possible effects.
I wasn’t entirely sure whether to discuss identity politics in this piece, but after consideration, I decided to. Identity politics is a big reason why groups defined by race, gender, class, ethnicity, nationality or religion have produced false information to further their agendas. Such methods are rarely challenged from within, due to in-group favouritism. Sam Harris has an excellent bit on this (taken from his podcast) that I implore you all to listen to (3 min 55 sec, YouTube).
Thinking about the data
- Don’t Confuse Anecdotes with Data – One or two fascinating tales don’t prove anything. They could be far from typical. These tales could be on the fringes of the bell curve and yet, they may be used as tools to prop up a weak concept to be believed as an everyday occurrence.
- Seeing versus Believing – It’s completely natural to trust what we can see with our own eyes and senses, but ultimately humans are poor data-taking devices. Our own experience can often mislead us, which is why eyewitness testimony is regarded as the lowest form of evidence in science. Yet people will fight tooth and nail to defend something they have seen, because they know it to be true. It’s usually not their observation that is wrong, but their interpretation of the experience. I’ll never forget the video of Neil deGrasse Tyson explaining this phenomenon when he was asked if he believed in UFOs.
“There was a police officer who was tracking a UFO that was swaying back and forth in the sky. In a squad car, chasing a UFO and the UFO is moving back and forth like this… (Moves hands) Later it turned out that the cop car was chasing Venus and he was driving on a curved road! But he was so distracted by Venus he thought Venus was the one moving!”
— Neil deGrasse Tyson
- Counterfactual Method – If a source makes claim A, consider whether there’s a good case for not-A. What questions does that raise? Thinking with this approach helps you identify the flaws in an argument and leads you down a path to finding out more about the subject.
- Not All ‘Studies’ are Equal – Studies come in all shapes and sizes and often form the basis of sensationalist headline news without the study itself being criticised or assessed. Often a study will involve an alarmingly small sample size with a high margin of error, and yet this is routinely overlooked by yellow journalism. When assessing a study, ask:
- Who stands behind the information?
- Does the source have a motive?
- What method did the source use to obtain the information?
- How old is the data?
- What assumptions did those collecting the information make?
- How much guesswork was involved?
- Can the information provided by the study be verified elsewhere? Public records? Respected scientific journals?
- Are other, unrelated sites or news organisations talking about this? What is their input?
- Is there a general consensus on the facts of the story?
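The sample-size point above can be made concrete. A rough sketch in Python, using the standard normal-approximation formula for the margin of error of a sample proportion (the figures and sample sizes below are illustrative, not from any real study):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate margin of error for a sample proportion.

    p: observed proportion (e.g. 0.6 for '60% of respondents')
    n: sample size
    z: z-score for the confidence level (1.96 corresponds to ~95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A headline-grabbing 'study' of 25 people reporting a 60% effect:
print(round(margin_of_error(0.6, 25), 2))    # 0.19, i.e. +/- 19 percentage points
# The same 60% finding from 1,000 respondents:
print(round(margin_of_error(0.6, 1000), 2))  # 0.03, i.e. +/- 3 points
```

With 25 participants, a “60% of people” claim could plausibly be anywhere from roughly 41% to 79%; with 1,000 participants the uncertainty shrinks to a few points. A headline rarely tells you which of these two situations you’re looking at.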
We have now identified spin, the warning signs to look for, the psychological biases we inherently bring to data, and the process we can use to think about data objectively. At this point it would be natural to list some reputable sources of information and data that we can all use online, but I’ve decided not to.
Developing your own palette of respected, authoritative sources is something we should all challenge ourselves to do, using the various techniques above. Remember to mentally accredit sources when they deliver accurate data (it’s always easy to remember the falsehoods and never the facts) and dismiss those who continuously spew BS.
Hold friends and family accountable for their words. Let it be known that weak arguments will not be respected if they can’t be defended. This isn’t hostility; it’s intellectual accountability, and a refusal to play along with senseless information that can lead to irreversible decisions and conflict.