Tracey Kyles

Cognitive Bias: How to communicate effectively with someone who’s gone down the internet rabbit hole

Nodson, D. (Photographer). (n.d.). Stone Tunnel [digital image].

In my last entry, I mentioned how scientific illiteracy, combined with confusing media communication, has fueled misinformation. I argued that if we closed the gap between science communicators and the general public, and if the media reported more carefully, we would likely have had better communication during Covid. That alone would have helped more people and pushed back against the misinformation out there.

But even if we fixed all of that right now, a good portion of people are already so far gone that they're finding all sorts of questionable alternatives online. Even outside of medical communication, people are spreading and believing eccentric sociopolitical ideas and conspiracy theories. I have no doubt that some of you reading this know someone who has gone down an internet rabbit hole, and you probably know how difficult it has become to communicate with them. So the subject at hand is this: how do you communicate with someone close to you who is now difficult to reach?

To answer that properly, let's start with another question: why are there groups of people who can't trust mainstream information? On top of the reasons I covered in my last post, one big psychological phenomenon is at play: cognitive bias. Selective exposure, amplified by social media, only makes it worse.

Cognitive Bias and Selective Exposure

To break this down, a cognitive bias occurs when people construct their own version of reality from their perceptions (Haselton et al., 2005). The simple explanation for its cause is that it's a product of our minds trying to make sense of the world. It can manifest as superstition, unsubstantiated hunches, pseudoscience, and conspiracy theories (Žeželj & Lazarević, 2019), sources of ideas many of us have probably encountered online.

And that brings us to the second part of this. What is selective exposure theory? It's the idea that people seek out only information that confirms their pre-existing views (Hart et al., 2009). This theory builds on cultivation theory, which holds that an opinion or belief can seem more real or believable if it's repeated enough (Gerbner et al., 2002), and (you guessed it) on cognitive bias. In a nutshell, it's the classic "echo chamber" I'm sure many of you have heard about. The typical story goes like this: a person joins an online community. Once there, they're exposed only to the ideas of that select group. Members repeat the same information to each other until it becomes all they believe, and everyone on the outside is wrong, or "blue-pilled," as some communities say.

So how did we get here? As the internet evolved into what we know today, people became their own content creators on social media. As a result, random outspoken individuals have become the new voices of expertise, the sources of information people turn to. This has become an unforeseen problem. Research into selective exposure noted its far-reaching effects, with individuals' perceptions becoming more distorted the deeper they went into their bubbles (An et al., 2014).

This all comes back to a bigger picture. As I said in the prior entry, the information out there is complex and confusing for many people. Enter the Elaboration Likelihood Model, or ELM (Cacioppo & Petty, 1984). In short, it's harder to persuade people with dense, logic-driven arguments, while simplified emotional arguments resonate much faster. An argument driven by logic takes much longer to process, and how much a person is willing and able to process depends in part on their education. This is why emotionally charged messages seem to rile the masses so much faster.

So, to tie it all together and summarize my point: when the world is saturated with complicated information that is hard to decipher, thanks to scientific illiteracy and confusing media (especially online), we have a perfect breeding ground for cognitive bias. It's human nature to want to fill in an answer to a problem you can't solve, and that's what many people do. Individuals turn to other corners of the internet, which leads them to charismatic voices of "reason" (YouTube personalities and loud, opinionated pundits, for example), and eventually down into online echo chambers in search of more answers that match their newfound beliefs.

So what do we do about this?

I understand some of this advice might sound hard, especially now that the political climate is so tense. Nonetheless, these techniques work when you have to deal with someone close to you who is far down the rabbit hole. At the very least, change starts with those near us. There's another part to ELM theory worth knowing: if the person receiving an argument already has a negative attitude toward the position, or toward the person making it, they're likely to resist the message no matter what.

So to start, don't use force. Don't insult or attack the person; that will only further trigger their cognitive bias against you. Instead, be objective, informative, and empathetic. Above all, listen, and let them explain themselves. Someone close to the person can build a stronger emotional connection, so they're more willing to listen when you are trustworthy and show that you care. When the time comes, calmly bring up the sources you have. If you can, encourage them to take a break from social media; spend time together away from phones and computers, just one-on-one. Extended breaks often help people avoid the triggers for their cognitive biases. Lastly, understand that this is a process that takes time.

This approach is exemplified in a story from Alabama that ran this year in the New Yorker (Al-Sayyad, 2021). A local woman named Dorothy Oliver and the county commissioner, Drucilla Russ-Jackson, spoke with locals in their small town to resolve vaccine hesitancy. After getting at least 40 people to sign up, the two persuaded a nearby hospital to set up a pop-up site in town so people had easy access to vaccines. And this was done not through abrasive argument, but through friendly neighborhood conversation.

“I just be nice to them. I don’t go at them saying, ‘You gotta do that.’ ” Oliver says when asked how she succeeds. Even when people disagreed, she simply showed respect and patience as they explained their side of the argument. As a result of their efforts, 94% of the adults in their town agreed to be vaccinated.

This isn't just a habit we need for dealing with cognitive bias; it's a habit we need in general. And for those of you thinking it: no, this isn't always easy. When tensions are high, fair discourse can seem impossible, and with some topics the chance of a peaceful resolution may not exist at all. Even so, we should aim for it when we can. My final message is to always remember that you're dealing with other human beings who have their own emotions and thoughts, even when they're lost. Thank you for reading.

BONUS: Here’s a link to some info on how to talk to family about the Covid-19 vaccine.


Haselton, M. G., Nettle, D., & Andrews, P. W. (2005). The evolution of cognitive bias. In D. M. Buss (Ed.), The handbook of evolutionary psychology (pp. 724–746). John Wiley & Sons.

Žeželj, I., & Lazarević, L. B. (2019). Irrational beliefs. Europe's Journal of Psychology, 15(1), 1.

Hart, W., Albarracín, D., Eagly, A. H., Brechan, I., Lindberg, M. J., & Merrill, L. (2009). Feeling validated versus being correct: A meta-analysis of selective exposure to information. Psychological Bulletin, 135(4), 555.

Gerbner, G., Gross, L., Morgan, M., Signorielli, N., & Shanahan, J. (2002). Growing up with television: Cultivation processes. In J. Bryant & D. Zillmann (Eds.), Media effects: Advances in theory and research (pp. 43–67). Lawrence Erlbaum Associates Publishers.

An, J., Quercia, D., & Crowcroft, J. (2014, October). Partisan sharing: Facebook evidence and societal consequences. In Proceedings of the second ACM conference on online social networks (pp. 13–24).

Cacioppo, J. T., & Petty, R. E. (1984). The elaboration likelihood model of persuasion. ACR North American Advances.

Al-Sayyad, Y. (2021, August 11). An Alabama woman's neighborly vaccination campaign. The New Yorker.
