Illustration by Catarina Morais

In his bestselling book Thinking, Fast and Slow, Nobel Prize-winning psychologist Daniel Kahneman proposes that we have two basic systems of thought. System 1 is automatic and intuitive — it loves to simplify, make assumptions and jump to conclusions. System 2, by contrast, is our slow, conscious and effortful mode of reasoning.

System 1 deals with answering simple maths problems, like 2 + 2, driving a car on an empty road and completing phrases like “bread and …”. Your System 1 brain operates quickly and involuntarily. You can’t stop yourself from knowing that 2 + 2 = 4 or from thinking of butter when someone says “bread and …”.

System 2, on the other hand, is activated by mental activities that require sustained attention. Think parallel parking in a narrow space, trying to focus on one person’s voice in a crowded, noisy room, or working out the product of 16 × 23.
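To get a feel for the effort involved, consider the deliberate steps a typical mental calculation walks through (one possible route among many): 16 × 23 = (16 × 20) + (16 × 3) = 320 + 48 = 368. There is no instant System 1 answer here; each intermediate product has to be worked out and held in memory along the way.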

We identify most closely with our System 2 self, the one that makes careful, conscious choices and judgements. But in reality, it’s System 1 that’s in charge of the majority of our decision-making throughout the day.

“Although System 2 believes itself to be where the action is,” Kahneman writes, “the automatic System 1 is the hero”. This unfortunately leaves us susceptible to unconscious errors and systematic biases, which can lead to poor decision-making.

Most of the time, our brain’s two-system approach to judgement and choice works well. “The sophisticated allocation of attention has been honed by a long evolutionary history,” says Kahneman. We rely on System 1 thinking to guide the majority of our thoughts and actions because it’s impossible for us to analyse everything going on around us. System 2, which requires significant cognitive effort, only steps in when System 1 runs into difficulty and slow, deliberate thinking is needed.

“The human mind, at this point in time, is a mystifyingly powerful inference engine,” says Stephen Dewitt, cognitive scientist and psychology lecturer at University College London. “However, like anything, it is limited and the world is so complex that it does resort to mental shortcuts to save time and energy.”

System 1 has been critical to our survival, helping us make life-saving decisions by sensing danger and responding quickly to perceived threats. But problems arise when we let our fast, intuitive system make decisions that really require our slow, analytical one. This is because the operative features of System 1 give rise to unintentional errors known as cognitive biases.

“Cognitive biases are psychological tendencies to focus on some limited set of information at the expense of other relevant information, without there being an explicit process of reasoning behind the selection of information,” explains Dr Katherine Puddifoot, assistant professor of philosophy at Durham University. We make these mistakes because the brain can only handle a finite amount of information, so it uses a number of tricks to avoid getting overwhelmed.

The digital age bombards us with more information than we can possibly handle. This leaves us more susceptible to cognitive biases, which in turn cause us to make more suboptimal judgements and decisions than ever before. “With the large volume of information and repeated exposure to messages on different platforms from different people, there is a significant vulnerability of cognitive biases to influence people’s decision-making,” says Dr Eryn Newman, cognitive psychologist and senior lecturer at the Australian National University. As much as we might like to think we can’t be unconsciously swayed, the truth is, we’re all vulnerable to the mis- and disinformation we encounter online.

There are numerous cognitive biases, but, according to Kahneman, overconfidence is the most significant. When it comes to consuming information online, our confidence in our ability to discern fact from fiction tends to outstrip the ability itself. Ironically, this makes us more likely to fall victim to misinformation and unwittingly participate in its circulation. Indeed, a large-scale study published in the Proceedings of the National Academy of Sciences found that three in four Americans overestimate their ability to distinguish between legitimate and false news headlines.

If we want to get better at identifying false news content, we must slow down and engage in self-monitoring behaviours. Lisa Bortolotti, professor of philosophy at the University of Birmingham, suggests pausing to ask yourself where a story has come from before you share it: “Check whether there is independent evidence supporting it, who would have an interest in spreading it and what effect it might have on debates we care about.”

When it comes to the conspiracy theories that thrive on the internet, we tend to think of those who fall for them as irrational and deluded. But Bortolotti proposes that people are drawn to them because they satisfy social-psychological motives, including the need for understanding and control. “In times of uncertainty, human beings seek explanations that restore our sense of control and solve a problem once and for all,” she explains. “When new challenges emerge and quick solutions are not available we may turn to conspiracy theories, according to which malevolent powerful people are responsible for the threat, because the uncertainty is hard for us to tolerate.”

Nothing is more certain to us than the facts or opinions that we already hold to be true, and confirmation bias — the tendency to seek out and believe information that confirms our pre-existing beliefs — is rife in our fast-paced digital world. Dr Puddifoot explains that if the information we encounter online supports, for example, our own political stance, we’re likely to believe it. But if it doesn’t fall in line with our beliefs or what we already know, we’re inclined to subject the information to more scrutiny and are more likely to disregard it.

“Research shows that two people with opposite opinions can read the same article and both become more convinced of their opinions,” explains Dewitt. This makes us vulnerable to false claims that are congruent with our existing values. It also leads to echo chambers where we are only exposed to information that matches our opinion. These make it difficult to correct our inaccurate beliefs, enhancing our biases, perpetuating ignorance and strengthening polarisation.

We similarly gravitate towards information that feels familiar to us, a tendency known as familiarity bias. “When information is repeated, it feels familiar so we see it as more plausible, trustworthy and true,” says Newman. This makes us prone to accepting information we come across multiple times, regardless of whether it is based on legitimate fact or comes from a credible source. We might be sceptical of a false claim the first time we come across it, but if we keep seeing it pop up on our newsfeed, we’re likely to start believing it to be true.

Dr Newman also notes that the fluency heuristic — a phenomenon whereby the quicker we’re able to process information, the more we believe it — can have a significant impact. When information is easy to perceive (high colour contrast), easy to hear (good audio) and easy to pronounce (a name that the receiver is familiar with), we tend to evaluate that information positively, even if it is not true. “The exchange of information online is infused with variables, such as repetition of claims, clear fonts and the addition of photos, that produce a feeling of ‘truthiness’,” says Newman.

Clearly, there are numerous cognitive biases that make us susceptible to unintentionally believing and sharing misinformation, but what is it that motivates us to knowingly spread false information? “There are strong pro-social forces, which make us want to feel a part of a group, but also make us want to appear to be a part of a group,” says Dr Puddifoot. So we spread information we know to be false in order to signal group membership and express support for our “team”. Conformity bias is relevant here too — individuals who want to conform won’t pass on accurate evidence that contradicts the beliefs of their in-group, keeping false beliefs in place and driving polarisation. For example, if you are anti-vaccine and your community is anti-vaccine too, you won’t be inclined to share evidence that suggests vaccines are safe because it doesn’t match the group’s view.

According to Dr William J Brady, a Yale University social psychologist, we may also intentionally spread misinformation online because of the social rewards it brings in terms of likes and shares. Bortolotti agrees, pointing out that “we all like to be the one that tells an alternative account that conflicts with the official story, because people find it intriguing and it makes us sound interesting and better informed than others.”

So, what can we do to improve the quality of the judgements and decisions we make when we engage with information in our fast-paced digital world? The fact that many of our biases operate under the radar of consciousness means they are difficult to tackle.

“Errors can be prevented only by the enhanced monitoring and effortful activity of System 2,” writes Kahneman. “As a way to live your life, however, continuous vigilance is not necessarily good, and it is certainly impractical.” Instead, he suggests recognising situations in which mistakes are likely and working harder to avoid significant mistakes when the stakes are high.

While we now have more information and less time to process any individual piece of information, Puddifoot argues the extra data we have could in fact allow us to overcome some of our cognitive biases. “We have the capacity to use the information found in these new platforms in a careful and reflective way,” she says. “Whether or not the 24/7 nature of the internet and the emergence of new social media platforms amplifies our cognitive biases will depend on how we respond to the information that is provided.” In other words, we must slow down and invoke System 2 when dealing with complex issues online.

“The mere fact that we can become aware of our biases and limitations is a strength,” adds Bortolotti, “because it enables us to take measures to get closer to the standards of rationality we value so much.”