Tag: cognitive bias

  • Is the World Going to Hell in a Hand Basket? – the Negativity Bias

    Brief contact with a cockroach will usually render a delicious meal inedible. [But] the inverse phenomenon—rendering a pile of cockroaches on a platter edible by contact with one’s favorite food—is unheard of.[ref]Rozin, Paul, and Edward B. Royzman. ‘Negativity Bias, Negativity Dominance, and Contagion’. Personality and Social Psychology Review 5/4 (2001): 296–320.[/ref]

    Why does our media constantly report on the negative aspects of our society, with only the occasional short snippet of positive news? Why do people spouting polemic and invective get so much more airtime than those building constructive arguments? While some of this is undoubtedly due to the prevalence of polemics and the problem of evil in the world, there is also a cognitive bias lurking beneath the surface: the negativity bias. This bias describes our human predisposition to pay more attention to things of an overall negative nature than to those of a positive nature.

    The negativity bias, also known as the positive-negative asymmetry effect, has been regularly observed in a wide range of studies, and is intrinsically felt by many people. For example, the loss of a pet, or more significantly a relative, will resonate with many individuals for a significant period. For a child at school, being picked last on a sports team or failing a test will carry a stronger and longer-lasting charge than all the times they were picked first or excelled. Psychologically, the effects of negative events stick with us longer than positive ones.[ref]Baumeister, Roy F., Catrin Finkenauer, and Kathleen D. Vohs. ‘Bad Is Stronger than Good’. Review of General Psychology 5/4 (2001): 323–370.[/ref] Furthermore, in a range of psychological studies, even when frequency and occurrence are artificially controlled for, the majority of people pay more attention to negative events than positive ones, and are even internally motivated to avoid negative projection rather than emphasise positive projection.

    [Cartoon caption: ‘Typical media bias. First they label the wolf “the big BAD wolf”, then they only give Little Red Riding Hood’s point of view.’]

    So is this just the old adage of the world going to hell in a hand basket? Are things just getting ever worse, until eventually the situation becomes completely untenable and the world implodes in on itself in a sea of negativity? Well, not quite, or so the research says. Although negative events obviously occur, and seemingly regularly, the overall tenor of the research is that these events are in decline. As Steven Pinker found in his research on violence in The Better Angels of Our Nature, we may actually be living in the most peaceful era ever.[ref]Pinker, Steven. The Better Angels of Our Nature: Why Violence Has Declined. New York: Viking, 2011.[/ref] While I’m usually quite suspicious of Whig-type historiography, the research in Pinker’s book appears to stack up. Why then do we feel that the world is just getting worse? A significant portion of it likely comes down to our negativity bias.

    With our inbuilt negativity bias pushing us to pay more attention to negative events than positive ones, it is understandable that the various media tend to report on negative events, especially those outlets that depend on sales or clicks to pay the bills. In turn this feeds our negativity bias, and so the cycle is perpetuated. However, it is important to note that this is not a chicken-and-egg scenario: from the plethora of studies it is quite clear that the cycle originates within our cognitive biases, which are then subsequently fed and reinforced.

    The same effect occurs when people are asked to make decisions based on the evidence provided. Many will decide on a course of action, or a held belief, based on the negative arguments put forward rather than the positive ones. We humans tend to make decisions based on what we may lose rather than what we may gain.[ref]Rozin, Paul, and Edward B. Royzman. ‘Negativity Bias, Negativity Dominance, and Contagion’. Personality and Social Psychology Review 5/4 (2001): 296–320.
    In a possible confirmation of the FUTON bias: https://sites.sas.upenn.edu/rozin/files/negbias198pspr2001pap.pdf[/ref]
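    This weighting of losses over gains can be illustrated with the value function from Kahneman and Tversky’s prospect theory. The sketch below uses the commonly cited parameter estimates (alpha ≈ 0.88, lambda ≈ 2.25); treat the specific numbers as illustrative rather than definitive.

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain (x > 0) or loss (x < 0),
    per the prospect-theory value function: losses are scaled
    up by the loss-aversion coefficient lam."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# A $100 loss "feels" roughly twice as large as a $100 gain:
gain = prospect_value(100)
loss = prospect_value(-100)
print(abs(loss) / gain)  # 2.25 -- losses loom larger than gains
```

    On this toy model, the same dollar amount hurts about two and a quarter times as much as it pleases, which is one way to see why negatively framed arguments carry extra weight.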

    What does this have to do with academia and the public square, other than being something cool to know about? Well, one of the primary applications concerns how positions are argued. While proving the null hypothesis probably won’t suffice for formal arguments, it certainly suffices for the public square. What is more, because of the negativity bias, these negative arguments tend to be taken on board more readily than the same argument phrased positively. As an example, I was recently reading an article posted to that erudite news source, Facebook. Essentially the author was arguing positively for the existence of a town from archaeological and historical evidence, and quite convincingly too, might I add. However, from the comments on Facebook it was evident that the take-home message for many readers was that the author was arguing negatively, against the proposed thesis that the archaeological evidence pointed to the non-existence of the town. Many readers completely failed to acknowledge the positive arguments, even when quizzed.

    So what does this mean for us? Well, in academia and public discourse there is a tendency to provide solely constructive arguments, since within the scientific method it is difficult to prove the null hypothesis (don’t get me started on Bayesian theory and NHST again). However, when it comes to the reception of that discourse, we need to be aware that negatively framed arguments tend to carry more weight. Negative arguments must therefore be engaged with, rather than merely dismissed. Of course, the difficulty is how to engage with them without merely being negative in response, and on that question I’m sad to say the answer is highly contextual.

    How do you deal with negative arguments, and how are you aware of the negativity bias at play in your own work? Tell me below, in the comments.

  • Why are some people just so obstinate? – Bayes Theorem & The Backfire Effect

    What do Flat-earthers, Anti-vaxxers, 9/11-conspiracists, Jesus-mythicists and Obama citizenship deniers have in common?

    Well, subconsciously at least, they all display a contempt for Bayes’ theorem, and arguably they strongly manifest the ‘Backfire effect.’ Welcome to another instalment of Cognitive Bias Wednesday, today looking at the ‘backfire effect,’ with a prelude on Bayes’ theorem.

    Why is it that, even when presented with all the data and information, some people refuse to modify their beliefs? Why, even with all the weight of evidence that the world is round, or that vaccines don’t cause autism, do they refuse to budge, and even seem to become more set in their ways? While some of this behaviour is certainly conscious, at least some is also the product of a cognitive bias operating behind the scenes: the ‘backfire effect.’ This is also why most arguments that occur on the internet are relatively fruitless. However, before we get to the cognitive bias, it is worth taking a brief journey through a neat cognitive feature: the theory of Bayesian inference.

    Thomas Bayes was an 18th-century Presbyterian minister, philosopher, and statistician; and while he published works in both theology and mathematics, he is best known, posthumously, for his contribution to statistics. His work on what would eventually become known as Bayes’ theorem was only published after his death, and its impact would be completely unknown to him. Bayesian modelling and statistics are applicable across a wide spectrum of fields and problems, from the memory theory and list-length effects that my previous lab worked on (MaLL),[ref]Dennis, Simon, Michael D. Lee, and Angela Kinnell. ‘Bayesian Analysis of Recognition Memory: The Case of the List-Length Effect’. Journal of Memory & Language 59/3 (2008): 361–76.[/ref] through to Richard Carrier’s application to historiography,[ref]Carrier, Richard. Proving History: Bayes’s Theorem and the Quest for the Historical Jesus. Amherst, N.Y.: Prometheus Books, 2012.[/ref] and many more (just don’t get me started on the null hypothesis significance testing debate).[ref]Lee, Michael D., and Eric-Jan Wagenmakers. ‘Bayesian Statistical Inference in Psychology: Comment on Trafimow (2003)’. Psychological Review 112/3 (2005): 662–68. doi:10.1037/0033-295X.112.3.662.[/ref] The key factor for this investigation is the Bayesian logic applied to the belief-revision loop. In lay terms, Bayesian statistics can be used to predict how strongly someone will believe a proposition when presented with a given evidence set.
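    In symbols, Bayes’ theorem says P(H|E) = P(E|H)·P(H) / P(E). A minimal sketch of one belief-revision step, with illustrative numbers of my own choosing:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) via Bayes' theorem.

    prior:            P(H),  belief before seeing evidence E
    p_e_given_h:      P(E|H)
    p_e_given_not_h:  P(E|~H)
    """
    # Total probability of the evidence, P(E)
    evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / evidence

# Start undecided (0.5) and see evidence four times as likely
# if H is true as if it is false:
print(bayes_update(0.5, 0.8, 0.2))  # 0.8 -- belief strengthens
```

    Feeding each posterior back in as the next prior is the belief-revision loop: beliefs drift toward whatever the accumulating evidence favours.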

    Take for example the good old coin toss. Suppose you have a coin with one side heads and the other tails. Logic and the laws of probability indicate that each flip should be a 50-50 chance of heads. But what happens if you flip the coin 5 times and get 5 heads in a row? Statistically speaking, the next flip is still a 50-50 chance, even though the probability of a run of n identical results is only 1 in 2^(n−1). What about if you get 92 heads in a row,[ref]Rosencrantz and Guildenstern Are Dead: Act 1[/ref] or 122 heads,[ref]Lutece Twins, Bioshock: Infinite[/ref] do the odds change then? Probability and statistics give us a clear no: it is still 50-50 no matter how big n is. However, if you ask gamblers at a casino, or for that matter most people on the street, you will get a startling response. Many respondents will say that, since it is a 50-50 probability, the chance of the next coin toss being tails increases, to even out the overall trend. Why? It is a faulty belief-revision loop, and this trend can be predicted by Bayes’ theorem. Applying Bayesian inference to epistemology, we can model the belief-revision loop and see that degrees of belief in the outcome of a coin toss will rise and fall with the results, even though the statistics remain the same. Furthermore, in most people these modifications are overwhelmingly conservative, and this should give us pause when we find evidence that challenges our beliefs.
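    The rising degree of belief can be made concrete with a standard Beta-Binomial update (my choice of model, not anything specific to the examples above): start from a uniform prior over the coin’s heads-probability and update it with each observed flip.

```python
from fractions import Fraction

def posterior_heads_bias(heads, tails, a=1, b=1):
    """Posterior mean of P(heads) under a Beta(a, b) prior
    (a = b = 1 is a uniform prior) after observing the flips."""
    return Fraction(a + heads, a + b + heads + tails)

# Belief that the coin favours heads grows with the streak,
# even though each flip of a genuinely fair coin stays 50-50:
print(posterior_heads_bias(5, 0))   # 6/7   ~ 0.86
print(posterior_heads_bias(92, 0))  # 93/94 ~ 0.99
```

    Note what the model is actually revising: not the odds of the next flip of a known-fair coin, but the belief that the coin is fair at all. After 92 heads, a good Bayesian should strongly suspect a trick coin.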

    But what does this have to do with the backfire effect, I hear you ask? Well, the backfire effect is essentially where the Bayesian model of the belief-revision loop fails, and fails badly. Normally, when people are presented with information that challenges their beliefs and presuppositions, they engage in the Bayesian belief-revision loop as above and slowly change (even if more slowly than you would think). However, when testing how people respond to corrections of misinformation, Nyhan and Reifler found that in some cases, rather than modifying their beliefs to accommodate the information they received, participants instead clung to their beliefs more strongly than before.[ref]Nyhan, Brendan, and Jason Reifler. ‘When Corrections Fail: The Persistence of Political Misperceptions’. Political Behavior 32/2 (2010): 303–30. doi:10.1007/s11109-010-9112-2.[/ref] Essentially, in their tests the presence of correcting material for political misconceptions served to strengthen the misconception rather than modify it. They dubbed this the ‘backfire effect.’
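    The contrast can be caricatured in a few lines. This is a toy sketch of my own, not a model from Nyhan and Reifler: a well-behaved Bayesian updater revises its belief down when the evidence tells against it, while a ‘backfire’ updater hardens instead.

```python
def bayesian_revision(belief, likelihood_ratio):
    """Standard belief revision: convert belief to odds, multiply
    by the likelihood ratio of the new evidence, convert back."""
    odds = belief / (1 - belief) * likelihood_ratio
    return odds / (1 + odds)

def backfire_revision(belief, likelihood_ratio, resistance=0.05):
    """Toy caricature of the backfire effect: contradicting
    evidence (likelihood_ratio < 1) pushes the belief *up*."""
    if likelihood_ratio < 1:
        return min(1.0, belief + resistance)
    return bayesian_revision(belief, likelihood_ratio)

# Corrective evidence, four times likelier if the belief is false:
print(bayesian_revision(0.9, 0.25))  # ~0.69 -- belief weakens
print(backfire_revision(0.9, 0.25))  # 0.95  -- belief hardens
```

    The point of the sketch is only the direction of the arrows: the same piece of evidence moves the two updaters in opposite directions.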

    Now, this isn’t displayed by everyone in the populace, although I would argue that it works within our subconscious all the time. Some early research shows that the backfire effect commonly rears its head when the matter is emotionally salient. So even though some of the more amusing incidences of the backfire effect commonly highlighted involve people sharing satirical news stories from The Onion or The Backburner as if they were real news articles, others are less benign. Indeed, for almost every amusing incidence of people failing to check their Bayesian revision loops and falling prey to the backfire effect, there are just as many where people are strongly reinforced in faulty beliefs on matters of consequence. One notable item recently has been vaccination, where I have seen several acquaintances hold strongly to the discredited and fraudulent research that ‘linked’ vaccines with autism. The overwhelming body of evidence finds no link between the two, and yet they strenuously hold to it.

    So what can be done about it? Well, this blog post is one useful step. Being aware of the backfire effect should help us evaluate our own belief systems when we are challenged with contradictory evidence. After all, we are just as susceptible to the backfire effect as anyone else. We should be evaluating our own arguments and beliefs, and seeing where our Bayesian inference leads us, with the humility that comes from knowing our own cognitive biases and the fact that we might be wrong. It should also help us sympathise with those whom we think are displaying the backfire effect, and hopefully help us to contextualise and relate in ways that defuse some of the barriers that trigger it.

    Please weigh in on the comments as to what you thought about my explanation of Bayesian inference and the backfire effect. Also let me know what other cognitive biases you would like to see covered.