EMT and the difference between Sceptics and Believers

“EMT” in the title refers to Error Management Theory, a theory in Evolutionary Psychology which posits that there are two types of error that can be made in situations of uncertainty:

  • Type I: the false positive error and
  • Type II: the false negative error.

If this sounds at all familiar, Michael Meadon recently directed us to an excellent article on EMT at The Mouse Trap when he reviewed the forty-fourth Encephalon.

Before I continue to discuss EMT, it is important to note a couple of points. Firstly, being a theory in Evolutionary Psychology, it does not prescribe how people should think; rather, it is a descriptive framework for the evolutionary reasons behind the way that we do think (for more on the scope of the field of Evolutionary Psychology, take a look at this previous entry on the topic). And secondly, because it is not prescriptive, we should avoid labelling people who make Type I errors, or even people who make Type II errors, as “primitive” thinkers.

What is Error Management Theory?

In the human evolutionary past, we had to deal with hidden dangers. Suppose, for example, that a group of early humans has spent the night in a cave and, before leaving in the morning, one of them hears a rustling in the bushes near the entrance. In this situation there are four possibilities:

  1. there is a predator lying in wait at the entrance to the cave and the cave-dwellers choose to stay in the cave fearing the unseen predator
  2. there is a predator lying in wait at the entrance to the cave and the cave-dwellers do not think that there is a predator there and exit the cave
  3. there is no predator and the cave-dwellers choose to stay in the cave fearing the unseen predator
  4. there is no predator and the cave-dwellers do not think that there is a predator there and exit the cave

The scenarios described in points 1 and 4 are successes. We are not interested in the successes here; we are interested in the errors, and more specifically in the relative costs of the errors. Scenario 3 is a false positive error (a Type I error) and scenario 2 is a false negative error (a Type II error).
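For the more programmatically minded, the four possibilities form a simple two-by-two of world state against decision. Here is a minimal sketch of that classification (the function and labels are mine, just restating the list above):

    # A sketch of the four cave outcomes as a 2x2 of world state vs. decision.
    # The scenario numbers follow the list above.

    def classify(predator_present: bool, left_cave: bool) -> str:
        if predator_present and not left_cave:
            return "1: success (real threat, correctly feared)"
        if predator_present and left_cave:
            return "2: Type II error (false negative - a miss)"
        if not predator_present and not left_cave:
            return "3: Type I error (false positive - a false alarm)"
        return "4: success (no threat, goal achieved)"

    for predator in (True, False):
        for left in (False, True):
            print(classify(predator, left))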

For a more up-to-date example, consider the fire alarm scenario painted by Martie Haselton (refer to her document, detailed a little further down). If we have a fire alarm which goes off often, it is a minor inconvenience to believe the alarm is always accurate (even when it is not) and be wrong many times (Type I errors); it is far more costly to always ignore the infuriatingly frequent alarms and be wrong only once (a Type II error).
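To make the asymmetry concrete, here is a minimal sketch of that reasoning. The probability and costs below are made-up numbers for illustration, not figures from Haselton’s work:

    # Fire-alarm example with assumed, illustrative numbers: real fires are
    # rare among alarms, but missing one is vastly more costly than an
    # unnecessary evacuation.

    P_FIRE_GIVEN_ALARM = 0.01   # assumed: 1 in 100 alarms signals a real fire
    COST_FALSE_ALARM = 1        # assumed: minor inconvenience of evacuating
    COST_MISSED_FIRE = 10_000   # assumed: catastrophic cost of staying put

    # Expected cost per alarm of each blanket policy:
    always_believe = (1 - P_FIRE_GIVEN_ALARM) * COST_FALSE_ALARM  # only Type I errors
    always_ignore = P_FIRE_GIVEN_ALARM * COST_MISSED_FIRE        # only Type II errors

    print(f"Always believe the alarm: {always_believe:.2f} per alarm")  # 0.99
    print(f"Always ignore the alarm: {always_ignore:.2f} per alarm")    # 100.00

Even though the believer is wrong 99 alarms out of 100, the expected cost of all those Type I errors is tiny next to the expected cost of the single Type II error.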

In the field of Evolutionary Psychology, researchers have found that people systematically make the same types of errors. On the whole, a person is predisposed to making one type of error or the other. This is not to say that people always make an error, but when they do make errors (whether frequently or infrequently), they tend to make the same type of error: either they hide from imagined fears or they ignore hidden dangers.

If you would like to consider more applications of EMT, there is a great introduction to Error Management Theory written by Martie Haselton, published as part of the Encyclopedia of Social Psychology.

Are all errors equal?

From the examples I gave above, you might be tempted to think that it is always better to make Type I errors than Type II errors; after all, it’s better to be inconvenienced by the fire alarm than to die in flames. “Liewer ‘n bang Jan as ‘n dooie Jan”, as the Afrikaans say (rather be a scared John than a dead John). Indeed, Evolutionary Psychologists agree with this and have postulated that we are hard-wired to make Type I errors in certain situations. But, crucially, we are also hard-wired to make Type II errors in certain situations.

Martie Haselton – Research Overview

When the costs of a false positive error are recurrently greater than the costs of a false negative error, natural selection builds adaptations biased toward a false negative error, a miss. When the costs are reversed, selection biases the adaptation toward making the opposite error, a false alarm.

We have applied EMT to human courtship signaling [Haselton & Buss 2000 – Error Management Theory: A New Perspective on Biases in Cross-Sex Mind Reading]. EMT predicts that men possess a bias toward over-inferring sexual interest in women on the basis of ambiguous cues, such as a smile. We propose that this bias minimized missed reproductive opportunities in ancestral environments, and hence it should evolve instead of mechanisms with high thresholds for inferring sexual interest or those designed for maximum accuracy. According to another EMT hypothesis, the cost structure was reversed for ancestral women when they developed inferences about men’s interest in commitment. Some initial skepticism about a man’s commitment would have been more beneficial (and less costly) than falling victim to deceptive signals of commitment intent. As a result, we expect commitment inferences in women that are biased toward false negative errors (misses), at least in the early phases of courtship.
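The cost-reversal logic in the passage above can be sketched in a few lines. The numbers below are purely hypothetical stand-ins for the ancestral costs Haselton and Buss describe; the point is only that flipping the cost structure flips which error the cheaper rule tolerates:

    def cheaper_bias(p_event, cost_false_alarm, cost_miss):
        # Expected cost of each blanket policy, as in the fire-alarm sketch:
        expected_type_i = (1 - p_event) * cost_false_alarm   # always act on the cue
        expected_type_ii = p_event * cost_miss               # always dismiss the cue
        if expected_type_i < expected_type_ii:
            return "bias toward false alarms (Type I)"
        return "bias toward misses (Type II)"

    # Hypothetical numbers: a missed opportunity is modelled as the
    # expensive error, so the cheaper rule over-infers interest.
    print(cheaper_bias(p_event=0.2, cost_false_alarm=1, cost_miss=50))
    # -> bias toward false alarms (Type I)

    # Reverse the cost structure (being deceived is now the expensive error)
    # and the cheaper rule flips to initial scepticism.
    print(cheaper_bias(p_event=0.2, cost_false_alarm=50, cost_miss=1))
    # -> bias toward misses (Type II)

The first call corresponds to the predicted male bias toward over-inferring sexual interest; the second, with the costs reversed, to the predicted female scepticism about commitment.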

So that’s pretty much as far as I have found the research in this area to have gone to date. I personally think that these findings hint at a much larger significance in the difference in thinking patterns between Sceptics and Believers. From here on out it is all a hypothesis, but one that I think has testable components and could certainly be falsified … I’ll go into more detail about the testing in a follow-up post.

What I think is that one of the key features that believers look for in a warning / prophecy / conspiracy works hand in hand with the “hidden threat / agent” component. The feature I am talking about is that less evidence is somehow taken as more significant of a hidden agent – after all, it is easy to see why most people can’t detect the dangerous predator outside the cave while the special few, “the heroes”, can detect the hidden agent and inform everyone.

I also think that society as a whole is predisposed to accepting Type I errors over Type II errors because doing so is likely to contribute towards the continued propagation of the group’s genes. Those who warn about hidden dangers (even when there is no evidence of the hidden danger) are taken more seriously than those who are often seen to be “shooting down” the people who are “only trying to help” – those who are thus “recklessly endangering” the group because, after all, “what if the warnings are right?”

Does this mean that sceptics are always the villains to the believers’ heroes? Not at all. Sceptics (and here I am talking about modern Scientific Sceptics rather than the colloquial “sceptic”, which refers to any naysayer) are not trying to make Type II errors. Sceptics are trying to make informed decisions about the state of the world so that we can have more successful outcomes when evaluating the threat of unseen dangers.

If you recall, earlier in this post I mentioned four outcomes to the cave-dwellers scenario and said that, for the purposes of the discussion, we are only interested in the errors and not the “successes”. But for a moment I would like to explore the “successful” scenarios. Number 4 really is a success in the fullest sense of the word: the cave-dwellers survived, and they also got to fulfil their goal of leaving the cave. By contrast, the first scenario is only partly a success; it is true that they survived, but they did not manage to leave the cave – they are trapped by their own fear. “Liewer ‘n bang Jan?” – I don’t think so.

Some contemporary examples (warnings, conspiracies and predictions)

14 October 2008

Blossom Goodchild is a self-described psychic and medium to the spirit world. It is rather memorable that she predicted that the world would be visited by a giant alien spacecraft on 14 October. She was trying to be a hero. It wasn’t a warning about malevolent alien beings visiting us and thus endangering our lives; it was more the sort of warning that we should take our heart medication before venturing outside on the 14th. Perhaps you think it unfair to select examples from the past because hindsight is 20/20, so let’s look at some current examples.

The Large Hadron Collider Conspiracies

The LHC was recently shut down for repair work, even before it was able to smash its first particles. Particle physicists are on tenterhooks waiting for the planned start-up in the European spring months. But that just gives the conspiracy theorists more time to peddle their wares to the unsuspecting public.

One example is the YouTuber gorilla199, who has made a series of videos warning us that the end of the world will come at the moment the LHC begins colliding particles. But, rather oddly, he also warns about the end of the world in 2012 – so perhaps he is hedging his bets early?

People will follow him; they will grow fearful of the invisible danger and learn a distrust of science. Even when gorilla199 is proven wrong – when 2009 passes without Armageddon, and again when 2012 does – people will think that he may have been wrong that time, but that he is “only trying to help”. Science is portrayed by these people as the irresponsible risk-taker, charging headlong into danger, when this characterisation is absurd. Scientists do not want to make errors, but the pay-off (the doubly successful scenario 4) is the ultimate goal: to explore the unknown a little more, push back the darkness, and reveal that the irrational fear of the invisible is a shackle we must escape.

Armageddon 2012

South Africa has not been immune to the absurdity which is the prediction that the world will end in 2012 – why should we be? We are all human, after all, and so our population prefers to give time to the warnings of the “philanthropic hero” rather than the “villainous sceptic”. A few new age writers (such as Desre Koertze) talk about a change in human spirituality towards knowing the Oneness (whatever that means) but shy away from making apocryphal prophecy (of course we know what this means: untestable and thus worthless).

But there are those believers (such as Letta Pretorius, who has registered the domain http://www.armageddon2012.co.za) who are willing to stick their necks out and predict an event we can all witness. I agree with the Skeptic Detective; I love a testable claim.

Wayne Herschel, a South African proponent of alien visitation and alien influence over earth’s government, has even spoken about Armageddon 2012 but rather slyly claims success whether the event happens or not. From the website Daily Grail:

Since my book (The hidden Records) has had a lot of interference in getting into book stores, due to its forbidden content, I need to take the next step in spreading a critically important message it was supposed to be leading up to:

I have good reason to believe there will be a meteor impact event in the northern hemisphere around the year 2012 IF humanity has not made some critical efforts to relieve the suffering in this world.

This world has been protected from impact events for 10000 years… but no more.

Can we overcome this obstacle?

Beyond the conspiracies and prophecies mentioned above, there are, of course, many hundreds of examples of people believing in some danger or hidden agent without evidence, and it is these examples that we sceptics battle on a regular basis. But if people are wired to prefer making Type I errors over making Type II errors, are we fighting a losing battle? My answer is “No”, because we are not asking believers to make Type II errors; we are asking them to join us in thinking critically about evidence, thus avoiding Type I errors and scoring more “double successes” than they would experience with self-deception and self-imposed fear.


~ by James on 30 November 2008.

5 Responses to “EMT and the difference between Sceptics and Believers”

  1. “But if people are wired to prefer making Type I errors over making Type II errors, are we fighting a losing battle? My answer is “No”, because we are not asking believers to make Type II errors; we are asking them to join us in thinking critically about evidence, thus avoiding Type I errors and scoring more “double successes” than they would experience with self-deception and self-imposed fear.”

    It does feel like we’re fighting a losing battle, though. The thing is, how do we… 1. get people to look at the evidence critically, and 2. do it in such a way that we don’t come across as irritating know-it-alls?
    In my experience, no matter how tactfully you try to get people to look at a situation critically, they just aren’t interested in being told they may be wrong. To quote a recent example, a family member forwarded the “HIV positive blood in the tomato sauce” email. So I sent it back saying it was a hoax, along with the relevant links, as well as info on how HIV cannot survive outside the human body for long. She sent a reply saying that she knows it might be a hoax, but she still didn’t want blood in her tomato sauce. I replied that THERE was NO blood – that it was all a hoax. The reply I got was that she’s not going to take any chances anyway, so hoax or not she chose to believe the email – and it seemed to confirm her view that I think I’m a smart ass. *Sigh* being a sceptic isn’t easy… :-/

  2. Haha, I know exactly what you mean. Chain letters are a bugbear of mine mostly because I don’t understand how anyone could fall for them. I was considering including chain letters as an example in the article but it was already very long, so I’m glad that you brought it up in the comments. 🙂

    I also try to rationally explain the situation with email hoaxes, and I do have limited success. I can “dead-end” the “trick” chain letters (such as the incredible mind reading eMail) and the computer-related hoaxes (like virus warnings, Microsoft prizes and “send to 15 people to see a naked lady” type eMails) – but that’s probably because they trust my authority because I’m in the IT industry (sigh, how can I fix that now?). When the chain letter describes “good luck” scenarios or “angel blessings” depending on the number of people that you send it to, or “revealed religious prophecy”, I am largely ignored.

    A few strategies that I try to employ that may help:
    (1) Recommend Snopes (or Google) as a first step.
    (2) For non-good luck/bad luck eMails I like to end with a quote: “A lie travels around the world before the truth gets its boots on” – Mark Twain. This is usually a bad idea for blessing/religious eMails, but I hope it has a lasting effect when the person reads chain letters in the future.
    (3) Tell the person that they should not be embarrassed to inform people higher up the chain that they were wrong. You are not alone in being duped; you can see all the eMail addresses of the people who were also wrong. Perhaps it sounds a bit arrogant – I know what you mean.

    A recent example of mine only highlights what a difficult task it can be. From a family member I received a “Mondex smart-cards (which they confused with RFID) are the technology that will bring on Armageddon” chain letter which was laced with Christian scripture. I was very careful to address only the technological issues in the eMail (how the claims were impossible) and said absolutely nothing about religion. The family member only replied that she believed the prophecy to be accurate – even if the technology that satan used would be slightly different. So I asked if she would send the technological rebuttal to the other people that she had sent the eMail to. Her reply was that people should make up their own minds about the truth (of course I agree) but she would not send my reply to them. Ha! How’s that for allowing them to make up their own mind?

  3. Very interesting indeed, thanks.

  4. […] difficulty in confirming such hypotheses. The subject becomes even more interesting when used to explain the lack of skeptical thinking in human beings, as Acinonyx Scepticus has done in a recent post. If we could somehow work in a discussion of boobs […]

  5. […] few weeks ago I wrote about my thoughts on the link between Belief or Scepticism and the types of errors in Error Management Theory. I mentioned that I thought that this was a testable idea so I’m writing this to “put […]
