Welcome to Skeptics Digest, wherein we discuss scientific skepticism (check out the seminal diary (link below) for a more specific explanation of the type of skepticism we’re working with here).
How many of you have ever tossed salt over your shoulder? How many of you have ever knocked on wood, actually thinking it would help? Do you say "bless you" or "gesundheit" when someone sneezes? Why? These are all examples of superstitions or superstitious thinking. Some of them (the sneeze response comes to mind) are performed out of habit, not because we consciously believe our actions or rituals will have any effect. Some of these superstitions are passed down from generation to generation, and some are newly invented or altered. Most common superstitions are harmless, but they are all representative of the kind of logic traps that lead us to make much more dangerous errors.
Scientific skeptics don’t generally spend a lot of time discussing minor superstitions, but scientific skeptics do need to combat the kind of superstitious thinking that causes many people to make dangerous decisions based on false assumptions. Skepticism, in the scientific sense of the word, should protect people against the common judgment errors and easily reached misperceptions that are ubiquitous in human society. We all know they happen, but we need a mechanism to mitigate the damage they can cause. Common superstitions are not often responsible for that kind of damage, but they can be instructive by illustrating the flaws in our thought patterns which lead to bigger failings of logic.
Post Hoc, Ergo Propter Hoc and Correlation vs. Causation
Have you ever uttered the phrase "now you’ve jinxed it"? Superstitions come in all shapes and sizes, but a great many of them get started by way of the post hoc, ergo propter hoc ("after this, therefore because of this") logical fallacy. This is one of the most common logical fallacies in the book. It is essentially the false assumption that correlation equals causation. Whenever two things happen in sequence and seem to coincide (e.g. the rooster crows and the sun rises) there is a strong tendency in the human brain to assign a causal link to those related events. Often these causal links based on observed correlation turn out to be correct. Take, for example, the observation that after it rains plants grow and get greener. However, many correlations exist because of a third factor that relates the two in an unseen way, and ignoring this third factor often leads to hazardous assumptions. Sometimes a correlation is pure coincidence, and is mistakenly seen as causal.
On occasion, sets of correlated data, whether in temporal sequence or not, seem to establish some causation. Statistics can be bent to subtly or subconsciously imply causality where none exists. We see this all the time in dirty political advertisements. One example would be comparing two data sets, such as the number of crimes committed in a state during the term of someone’s opponent for Attorney General and the number of acquittals during the same period. Without much more data, or a broader understanding of the nature of crime and court procedure, there is a powerful pull for some to see these numbers as directly related. People who create political ads, commercial ads, or other uses of statistics can get away with this type of game because it is not explicitly false, and it relies on the human brain’s penchant for tricking itself.
Scientific inquiry is usually required to accurately prove the correct reason for a correlation. Some causal links, like ‘watering plants helps them grow,’ have been long established by human beings by an early scientific process of empirical observation. Given enough data and experience, this can be enough to establish a working knowledge of the relationships between correlation and causation. The largest factor that complicates this correlation=causation mistake is the "see the hits and ignore the misses" phenomenon.
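To make the third-factor trap concrete, here is a minimal Python sketch (not from the original diary; the "summer heat drives both ice cream sales and drownings" scenario is a standard illustrative example, and all the variable names are invented for it). A hidden common cause makes two otherwise unrelated measurements look tightly linked:

```python
import random

# Illustrative sketch: a hidden "third factor" drives two otherwise
# unrelated measurements, producing a strong correlation with no
# causal link between them.
random.seed(42)

n = 1000
heat = [random.gauss(0, 1) for _ in range(n)]          # hidden confounder
ice_cream = [h + random.gauss(0, 0.5) for h in heat]   # driven by heat
drownings = [h + random.gauss(0, 0.5) for h in heat]   # also driven by heat

def pearson(xs, ys):
    # Pearson correlation coefficient, computed from scratch.
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(ice_cream, drownings)
print(f"correlation: {r:.2f}")  # strongly positive, yet neither causes the other
```

No amount of correlation in that output tells you ice cream causes drowning; only accounting for the hidden variable, or a designed experiment, can sort out the causal story.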
Selective Observation, or "See the Hits and Ignore the Misses"
We humans have a very large quantity of data to process every waking minute. As a mechanism to cope with this deluge of information, we have developed methods of processing less important things in the subconscious. The squeakiest wheels get the grease. Something that deviates from the expected is more likely to get our attention than things that are subtle or simply not happening at all. In other words, things that are ‘active’ or ‘unexpected’ are in the foreground, and things that are ‘passive’ or ‘expected’ end up in the background. Because of that, when a suggestion is made that events are correlated, and we are open to that suggestion as a plausible idea, a strong force in our conscious mind seeks examples to corroborate the proposal, while we tend to dismiss or completely fail to notice any examples that contradict it (psychologists call this confirmation bias). When we collect data in this subconsciously biased way, we create a data set that is susceptible to the correlation=causation fallacy.
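The "see the hits and ignore the misses" filter can be simulated directly. The following Python sketch is an illustrative toy, not a claim about any real study: it generates days whose quality is completely independent of anything, then shows how a log that keeps only the confirming days makes a superstition look perfect:

```python
import random

# Toy model: days are randomly good or bad regardless of which side of
# the bed you get out on, but a biased observer records only the days
# that fit the superstition.
random.seed(7)

days = [(random.choice(["left", "right"]), random.random() < 0.5)
        for _ in range(10_000)]  # (bed side, had a good day?)

# Unbiased tally: the good-day rate is about 50% on either side.
left_good = sum(good for side, good in days if side == "left")
left_total = sum(1 for side, _ in days if side == "left")

# Biased log: keep only confirming days -- good days on the left,
# bad days on the right.
logged = [(side, good) for side, good in days
          if (side == "left" and good) or (side == "right" and not good)]
hit_rate = sum(1 for side, good in logged
               if (side == "left") == good) / len(logged)

print(f"real left-side good-day rate: {left_good / left_total:.2f}")  # ~0.50
print(f"'hit rate' in the biased log: {hit_rate:.2f}")                # 1.00
```

The underlying data is pure noise, but the biased log reports a perfect record, which is exactly what selective observation does to a believer's memory.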
Examples
- "If I get out of bed on the left side, I have a good day at work." This sort of conclusion combines both the post hoc and selective data collection errors. If one is inclined to believe this, then they will most likely log the ‘good things that happen to them at work’ in support of the superstition on the days they exit the bed on the left side, and log the ‘bad things that happen to them’ in support of the superstition on the days they exit on the right side. Both sets of days contain both bad and good things, but when one is looking for a particular type of data in a random data set they usually see a pattern where none necessarily exists.
- "Jinxing" something is a superstition a lot of people actually take seriously. People want to believe it actually happens, and therefore notice any incidence that fits the perception. What they fail to notice, is that there is invariably a statistically significant set of occurrences that directly discredit the idea of something being "jinxed." These occurrences are either not noticed or dismissed by another convenient superstitious explanation. People frequently "see what they want to see" and not what is really there, which is usually just a set of random data.
- "A miracle" occurs when a highly improbable thing happens to someone who confuses "highly improbable" with "impossible." Many people are convinced that a set of random events which culminated in a highly positive outcome (say, winning the lottery) must constitute a miracle. Funny how the exact same people would not consider it a miracle if the same events had led to a highly improbable and extremely negative outcome (getting struck by lightning, perhaps). If enough people buy a lottery ticket, one of them will eventually win, but no matter how much money trouble that person was in, that victory was no more a "miracle" than it would be a miracle for your ticket to match the exact numbers of a drawing held 3 weeks before it was purchased. The odds of those two events are identical, but it is the addition of the positive outcome that makes the actual winner’s case seem miraculous. This concept applies to pretty much any situation that people could consider "a miracle."
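The lottery "miracle" in the last example is just arithmetic. This Python sketch assumes illustrative jackpot odds of roughly 1 in 292 million (a Powerball-style figure chosen for illustration, not taken from this diary) and shows that while any one ticket is a near-impossibility, *somebody* winning becomes likely once enough tickets are sold:

```python
# Assumed, illustrative jackpot odds (Powerball-style).
p_win = 1 / 292_201_338

def p_someone_wins(tickets_sold: int) -> float:
    # Probability that at least one of many independent tickets wins:
    # the complement of every single ticket losing.
    return 1 - (1 - p_win) ** tickets_sold

for n in (1, 1_000_000, 300_000_000):
    print(f"{n:>11,} tickets -> P(at least one winner) = {p_someone_wins(n):.4f}")
```

For one ticket the probability is vanishingly small; for 300 million tickets it is better than even. A one-in-292-million event happening to *someone* is not a miracle, it is an expectation.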
Currently Unknown vs. Unknowable
Another common mistake we humans make is confusing the things that we don’t know with the things we can’t know. To muddy the water, there are infinite things in the universe that we don’t know and/or can’t know and rightly ignore. For example, I don’t know how many ants there are in my backyard, yet it is something I could conceivably know if I were to put a massive amount of effort into it. It is something that I don’t know, could know, and shouldn’t want to. I could never know if there is an invisible Care Bear forest on a moon orbiting a planet in a solar system in a galaxy so far away we can’t see it with telescopes 100,000 times more powerful than Hubble. (I know I may get a philosophical argument about my use of the word "never" here, but save it for another day, please.) That is something that, in my estimation, the human race will never know and shouldn’t care to know, unless the Care Bears develop a way to travel here and vaporize us with a twisted, evil Care Bear Stare. If that happens, though, we would have a lot bigger problems on our hands than knowing where the invisible forest home of our conquerors was. Anyhow, back to reality.
So, in a world with so much superfluous data and infinite lines of inquiry, it becomes difficult to distinguish what we don’t know from what we can’t know, and from what we don’t care to know. We have come a very long way as a species in understanding our Universe and our place within it. Almost every hurdle of knowledge that our forebears thought impossible to clear has been leapt over. As soon as one roadblock is removed, another appears a little farther down the line. Each generation progresses to heights many of our ancestors could never have imagined. As we stand today, we have our own achievements to make, and goals that seem impossible to us. A century from now those impossibilities may be mundane, replaced by a new set of "impossibilities." The point being made here is that for every hole in our knowledge there has been someone to exclaim that it can "never be known." A lot of these holes are temporarily plugged with superstition. When the Aztecs didn’t know what "caused" the sun to rise, they developed superstitions that involved human sacrifice to keep the universe in balance and allow the sun to rise. These superstitions were obviously very unfortunate for the victims of those sacrifices. This is a prime example of something that could be known, but wasn’t, leading to dangerous superstition.
Not all of our knowledge holes have been plugged yet, but a great many of them have. All of those people who claimed to know an alternate truth that explained those holes, or claimed that they could never be filled, have been made to look like fools. There is no reason to believe that, given infinite time, we cannot know all that can be known. Obviously this is hypothetical, because you would be hard-pressed to find anyone who believes our species will have infinite time in existence. We’ll undoubtedly meet our end within what would be a gnat’s fart worth of time on the cosmic scale. But on our time scale, we hopefully have a lot more time to learn and discover things we never knew before. When someone tells you we "can’t know" something, yet paradoxically claims to explain it, ask yourself if it is something that can be known, and whether or not it would be rational to believe that person’s explanation for it. This doesn’t just cover broad questions of physics, cosmology, or evolutionary biology, but the basic everyday BS people are willing to believe with no apparent rationale. Be a scientific skeptic, and apply reason to it.
False Dichotomy
There are many variants on the false dichotomy fallacy, but essentially it involves a deception (though not always a deliberate one) in choice. A person may be given only two choices as possible outcomes to a situation when a large variety of other outcomes could be chosen but are either intentionally suppressed or otherwise unknown to the person making the choice. This is also sometimes referred to as a "false choice." A pertinent example for this community would be those who offer this gem: "you have to keep Gitmo open or we’ll let a bunch of terrorists out to kill our children." This false dichotomy, though strangely popular, is easily dispelled. There are clearly other options than these.
Another version of this is to present two polar-opposite extremes and nullify the more moderate options. This can lead someone to make a dangerously excessive choice when a safer, more rational alternative exists. Take, for example, the person with a stash of drugs who sees a police officer approaching. They may see only two extreme ways out of their dilemma. They could take all the drugs quickly to eliminate the immediately visible evidence, or they could run as fast as they can away from the officer. Both of these choices could easily result in their injury or death, and in either case they would probably end up under arrest and possibly incarcerated. The person with the drugs could choose a more moderate course of action, such as trying to throw the drugs away, or calmly greeting the officer in the hope that they will not be searched. They could even turn themselves in and probably get a much lighter punishment for their honesty. The point of the example is that frequently people see only one or two dramatic courses of action where many more reasonable ones may be conceived.
Human beings are often misled into setting up black-and-white visions of decisions that must be made. I suspect that our aversion to making morally ambivalent or ambiguous decisions leads us to set up our own false dichotomies, which satisfy our need to have strong justification for the things we do. The insidious thing about false dichotomies is that they are sometimes hard to spot, easy to fall prey to, and often employed by those with devious intentions. When someone tells you "if you don’t compromise, and give up the public option, no healthcare will be passed and we’ll be stuck with no reform," they are employing a false dichotomy, giving you two negative outcomes in the hope that you’ll choose the less negative one. When you apply a process of skepticism to this statement, you can see that there are alternatives to these two outcomes, and further exploration of the players and forces involved may illuminate them.
Jumping off to the rest of the series
I have laid out several of the common ways humans arrive at false conclusions. There are several more, and quite a few variants of the particular ones mentioned. Some of those may also come up as we go. Look for these to provide a basic foundation for the more specific fallacies that scientific skepticism will address in this series.
Stay tuned for the next installment of Skeptics Digest next Sunday. We’ll be discussing some current, hot button issues that might stir up quite a controversy. Don’t miss it!
Link to first diary:
The Skeptics Digest: What is Skepticism?