Why We Fall for Fake News

Why do people fall prey to fake news stories? Can't they just tell when they come across them? Having such easy access to real facts and fact checkers surely takes care of the problem, right? Unfortunately, part of the reason fake news is such a problem is that people do fall for these stories, and being shown the facts doesn't reliably correct the error. This is due to features of human thinking called cognitive biases. A cognitive bias is a gap in reasoning, remembering, or evaluating something that can lead to mistaken conclusions. They're universal. Everyone has them.

This section explains how cognitive biases operate: what goes on inside people's heads that makes us more likely to fall for fake news, and to keep believing false information even after it has been corrected.

Cognitive biases result from the automatic mental shortcuts that humans naturally take every day. Normally, these shortcuts make our lives easier. You don't need to re-learn your path from home to work or school each morning; it has become so ingrained in your mind that you can follow it almost without thinking. This "automatic processing" allows us to conserve mental energy for things that are more complicated [1]. At the same time, cognitive biases can cause mistakes in our thinking. They can act as blind spots, keeping you from noticing something that would otherwise be apparent, or they can let one part of your mental perspective play a disproportionate role in your thinking. A common example is younger people mistakenly believing they don't need health insurance because they are generally healthy [2]. This seems perfectly logical to a healthy person, but it overlooks the possibility of catastrophic events.

Cognitive Biases and Fake News

Mental shortcuts, or cognitive biases, also affect the way we use information. Four types of cognitive biases are especially relevant to fake news and its influence on society. First, we tend to act on the basis of headlines and tags without reading the articles they're attached to. Second, social media convey signals about how popular a piece of information is, and apparent popularity makes us more willing to accept it. Third, fake news takes advantage of the most common political mental shortcut: partisanship. And fourth, false information has a stubborn tendency to stick around even after it has been corrected.

Acting without Reading: (Not) Reading is Fundamental

The first of these biases is the tendency to act on the superficial cues that fake news sends without evaluating the accompanying information very deeply. An unfortunate fact about getting news via social media is that many people form opinions about news articles without ever having read them [3].

Source: NPR Facebook page

A humorous, if alarming, example comes from a social experiment by National Public Radio (NPR). As a joke, NPR posted a link on its Facebook page with the headline "Why Doesn't America Read Anymore?" Users who clicked the link were taken to a page on NPR's website explaining that the article was a joke. But many viewers went right ahead and posted comments responding to the headline, clearly never having read the article [4].

This isn't an isolated incident. Other news and interest sites report that many comments on the articles they publish come from people reacting to the headline rather than to the article itself [5]. In one study of Twitter, researchers examined 2.8 million online news articles that users shared, sometimes adding comments of their own. According to the click records, more than half of the users who shared those articles never clicked the link that would have let them read the story. People are evidently more than happy to share, retweet, or like things they have never read [6]. In terms of the spread and influence of fake news, this can be quite damaging. The rise of clickbait, for example, relies on flashy headlines that draw attention [7]. If people read only the headline, they may take it as fact without ever learning whether the story itself expresses doubt or presents another side. Sharing without reading can also make stories look as though they are gaining popularity, or trending [8], which in turn makes other people more likely to read or retweet them. It becomes a socio-cognitive epidemic.

Popularity Cues Affect Acceptance: Fifty Million Frenchmen Can't Be Wrong

Another bias we fall into when it comes to fake news has to do with how popular a news item appears to be. One of the best-researched cognitive biases explaining how popularity influences our thinking is the "bandwagon effect" [9]. As the phrase "jumping on the bandwagon" suggests, when it appears that a lot of other people like something, we're more likely to support it, too.

In terms of fake news, the bandwagon effect leads us to focus on how many times something has been shared or liked rather than on the content itself [10]. Popularity also lets us pass off the responsibility for verifying information: if thousands of other people have shared a piece of news, surely someone else has verified it, right? Unfortunately, as we've already learned, shares and likes often come from people who never read what they're sharing [6]. Moreover, as we discuss elsewhere, the popularity of fake news items is often inflated automatically by "bots" whose sole purpose is to make a fake news story seem especially popular.

Popularity may influence not only our perceptions but our behavior as well. Just as we tend to like what others like, we also want to be liked and to present a good image of ourselves to others [11]. Research has found that the desire to appear in-the-know is one reason many people report sharing information that they haven't read [3].

The Pernicious Impact of Partisanship

James McDaniel…said he created a fake news website last month as a joke to see just how naive Internet readers could be. UndergroundNewsReport.com was launched Feb. 21. In less than two weeks, more than 1 million people had viewed stories on the site and spread them across social media platforms. …"I continued to write ridiculous things they just kept getting shared and I kept drawing more viewers," McDaniel told PolitiFact. "I saw how many fake ridiculous stories were making rounds in these groups and just wanted to see how ridiculous they could get."

McDaniel even tried to warn viewers by putting a disclaimer on the bottom of his web pages saying his posts "are fiction, and presumably fake news." While a handful of people took the time to email him to ask if stories were real or send hate mail, most of the comments on his links blindly accepted what he wrote as the truth.

(Quoted from Politifact.com, March 9, 2017)

A third type of bias comes from the way our personalities and attitudes lead us to see the world, which can pull us into mental traps that are difficult to escape. When it comes to fake news specifically, one's political attitudes (liberal vs. conservative) have a huge impact on what we readily believe or reject in the news, regardless of its truthfulness. As uncomfortable as this may be to accept, research has demonstrated that most politically oriented fake news during the 2016 election campaigns was consumed by conservatives, with Donald Trump supporters being especially likely to encounter and visit fake news sites [12].

Source: Guess et al., 2018 [12]

On average, Clinton supporters were more likely to visit fact-checking websites and less likely to visit fake news websites, while Trump supporters were less likely to visit fact-checking websites and more likely to visit fake news websites.

We don't yet understand exactly why this was the case. Many private individuals who tried to make money off of fake news, and who had no political preference at all, claimed that they manufactured stories designed to attract both conservatives and liberals, but abandoned the pro-liberal fake news specifically because liberals were not clicking on it [13]. Russian propagandists, in contrast, were more capable of spreading false or misleading advertisements with either pro-liberal or pro-conservative messages. In any case, fake news in the 2016 presidential election was a predominantly conservative phenomenon.

But that tendency seems to be reversing course in 2018 [14]. In the summer of 2018, Facebook uncovered and removed numerous Russian-backed fake accounts that had announced bogus right-wing events (like "Unite the Right 2") in order to mobilize liberals, feminist groups, and minority members to stage counter-protests.

Source: Facebook Newsroom [15]

A number of real-life, liberal-leaning interest groups were taken in by the charade until Facebook stepped in.

The Persistence of Inaccuracy

One final way that cognitive biases can be particularly troubling is in how long they last and how they create barriers to undoing our mistaken beliefs. It would be nice to assume that a quick fix for fake news would be simply to tell people when the information they're consuming is false. Intuitively, this seems appealing. However, it has two major drawbacks.

First, researchers have found that once we've seen something, our memory is quite poor at keeping track of whether it was real or not. In the case of fake news, Professor Emily Thorson at Boston College found that "belief echoes" often remain even in the face of corrections [16]. Belief echoes occur when people continue to remember fake news as true even after they have been presented with correct information. Misinformation is notoriously sticky in people's heads, and simply correcting it with another message can only go so far.

Second, even if fake news stories carried some form of correction warning people to take them with a grain of salt, the absence of those warnings may have a greater impact than their presence. Drs. Gordon Pennycook and David Rand at Yale University examined whether warnings about fake news would affect people's belief in information. People were indeed less likely to believe stories that had a warning attached, but they were more likely to believe stories, fake or not, when no warning was present [17]. When people know that warnings are a possibility, they may let their guard down; if a story carries no warning, they assume the information must be believable, which unfortunately may not be the case.

Other research, focusing on Twitter, has shown that users who share fact-check articles about a fake news story often add misleading content of their own, and what they write may actually contradict what the fact check indicates [6]. Even when people take the further step of examining the original source of a news item, insidious clickbait artists have been known to make the hosting site look as similar to a real news site as possible [7].

We have much more information about fact checking elsewhere on this site, but from the perspective of cognitive biases, it's possible that fact-checking services aren't used much simply because people don't think they need them. The most common way people evaluate whether information is true is to rely on their own intuition [18]. People tend to be more confident in their ability to spot false information than they should be [19]. People also believe that they themselves are less likely to be influenced by media messages than other people are, an illusion commonly called the "third-person effect" [20]. Fact checkers don't have enough of an impact because we mistakenly assume we don't need them, even when we're being fooled.

There are countless cognitive biases that influence everything from how we think to the decisions we make to how we feel about others. The table below lists some of the more common ones to look out for. Everyone has these biases, and the first step toward reducing their influence over you is knowing what they are.

This image describes the most well-researched cognitive biases, and what effect they have on us. (Source: Wikimedia Project)

References

[1] S. T. Fiske and S. E. Taylor, Social Cognition: From Brains to Culture, 2nd ed. Los Angeles, CA: SAGE, 2013.

[2] D. Bennett, “What If Healthy People Don’t Want to Buy Obamacare?,” The Atlantic, 03-Jun-2013. [Online]. Available: https://www.theatlantic.com/national/archive/2013/06/what-if-healthy-peo.... [Accessed: 03-Aug-2018].

[3] T. Hale, “Marijuana Contains ‘Alien DNA’ From Outside Of Our Solar System, NASA Confirms,” IFLScience, 13-Jul-2013. [Online]. Available: https://www.iflscience.com/editors-blog/marijuana-contains-alien-dna-from-outsid.... [Accessed: 03-Aug-2018].

[4] NPR.org, “Why Doesn’t America Read Anymore?,” NPR.org, 14-Dec-2016. [Online]. Available: https://www.npr.org/2014/04/01/297690717/why-doesnt-america-read-anymore. [Accessed: 02-Aug-2018].

[5] M. Gabielkov, A. Ramachandran, A. Chaintreau, and A. Legout, “Social Clicks: What and Who Gets Read on Twitter?,” in Proceedings of the 2016 ACM SIGMETRICS International Conference on Measurement and Modeling of Computer Science, New York, NY, USA, 2016, pp. 179–192.

[6] C. Shao et al., “Anatomy of an Online Misinformation Network,” PLOS ONE, vol. 13, no. 4, p. e0196087, Apr. 2018.

[7] C. Silverman and L. Alexander, “How Teens In The Balkans Are Duping Trump Supporters With Fake News,” BuzzFeed News, 23-Nov-2016. [Online]. Available: https://www.buzzfeednews.com/article/craigsilverman/how-macedonia-became.... [Accessed: 02-Aug-2018].

[8] G. King, J. Pan, and M. E. Roberts, “How the Chinese Government Fabricates Social Media Posts for Strategic Distraction, not Engaged Argument,” American Political Science Review, vol. 111, no. 3, pp. 484–501, 2017.

[9] S. S. Sundar, S. Knobloch-Westerwick, and M. R. Hastall, “News Cues: Information Scent and Cognitive Heuristics,” Journal of the American Society for Information Science and Technology, vol. 58, no. 3, pp. 366–378, Feb. 2007.

[10] M. J. Metzger and A. J. Flanagin, “Using Web 2.0 Technologies to Enhance Evidence-Based Medical Information,” Journal of Health Communication, vol. 16 Suppl 1, pp. 45–58, 2011.

[11] C. A. Insko, R. H. Smith, M. D. Alicke, J. Wade, and S. Taylor, “Conformity and Group Size: The Concern with Being Right and the Concern with Being Liked,” Personality and Social Psychology Bulletin, vol. 11, no. 1, pp. 41–50, 1985.

[12] A. Guess, B. Nyhan, and J. Reifler, “Selective Exposure to Misinformation: Evidence from the Consumption of Fake News During the 2016 U.S. Presidential Campaign,” 09-Jan-2018. [Online]. Available: https://www.dartmouth.edu/~nyhan/fake-news-2016.pdf. [Accessed: 03-Aug-2018].

[13] L. Sydell, “We Tracked Down A Fake-News Creator In The Suburbs. Here’s What We Learned,” NPR.org, 23-Nov-2016. [Online]. Available: https://www.npr.org/sections/alltechconsidered/2016/11/23/503146770/npr-.... [Accessed: 02-Aug-2018].

[14] E. Dwoskin and T. Romm, “Facebook Says it Shut Down 32 False Pages and Profiles Engaged in Divisive Messaging Ahead of the U.S. Midterm Elections,” The Washington Post, 31-Jul-2018. [Online]. Available: https://www.washingtonpost.com/technology/2018/07/31/facebook-says-it-ha.... [Accessed: 03-Aug-2018].

[15] Facebook Newsroom, “Removing Bad Actors on Facebook,” 31-Jul-2018. [Online]. Available: https://newsroom.fb.com/news/2018/07/removing-bad-actors-on-facebook/.

[16] E. Thorson, “Belief Echoes: The Persistent Effects of Corrected Misinformation,” Political Communication, vol. 33, no. 3, pp. 460–480, Jul. 2016.

[17] G. Pennycook and D. G. Rand, “The Implied Truth Effect: Attaching Warnings to a Subset of Fake News Stories Increases Perceived Accuracy of Stories Without Warnings,” Social Science Research Network, Rochester, NY, SSRN Scholarly Paper ID 3035384, Dec. 2017.

[18] E. C. Tandoc, R. Ling, O. Westlund, A. Duffy, D. Goh, and L. Zheng Wei, “Audiences’ Acts of Authentication in the Age of Fake News: A Conceptual Framework,” New Media & Society, vol. 20, no. 8, pp. 2745–2763, Aug. 2018.

[19] M. Metzger, A. Flanagin, and E. Nekmat, “Comparative Optimism in Online Credibility Evaluation Among Parents and Children,” Journal of Broadcasting & Electronic Media, vol. 59, no. 3, pp. 509–529, Jul. 2015.

[20] A. C. Gunther and J. D. Storey, “The Influence of Presumed Influence,” Journal of Communication, vol. 53, no. 2, pp. 199–215, Jun. 2003.