You probably won't like this diary. You will very likely try to find every possible reason you can to conclude that while it may apply to others (especially Republicans), it doesn't apply to you. And by doing so, you will be demonstrating that the findings of neuroscience are indeed correct. :)
All of us like to think that we are logical and rational creatures who have come to our belief systems (whether about politics, religion, or social viewpoints) through dispassionate and detached evaluation of reality, and that we defend our positions with objective facts and evidence.
Alas, the findings of behavioral neurology and cognitive brain science show that we are all wrong. Humans do no such thing at all. Not me, not you--none of us. Instead, neurological research has shown that all humans tend to believe whatever our social peer group teaches us to believe, that all humans tend to make their decisions almost entirely based on emotions and only use "facts" and "logic" afterwards as rationalizations to justify what we already want to believe, and that all humans tend to support those beliefs by making up false memories and fictional frameworks and lying not only to others, but also to ourselves. And since this is the normal way that all human brains have evolved to work, all of this happens without our even being consciously aware of it happening.
Meet the Brain
The African Primate
Humans did not appear suddenly out of nowhere, complete with all our special human-ness. Like every other living thing, we evolved, and our evolutionary ancestors were primate apes who lived very much like today's chimpanzees do. Not only did we inherit our ape ancestors' chimplike body structure, but we also inherited their brain architecture and much of their social structure. Although we live today in a modern 21st-century world of computers and mass communications and global society, we still carry around in our heads the brain of a bipedal African primate who lived in small groups on the savannah, and that brain still sees the world in the same way it did back then.
So what do we do with that chimp-brain we carry around? Well, from the time we are born, we are learning machines. The human brain is so well-developed for learning that it's not even finished growing when we're born--we pop out before our microwave has "dinged". It takes almost two years after birth for our brains to stop growing and for our internal nerve wiring to be complete enough for us to at last be capable of doing ordinary human things, like walking and talking. From then on, until the time we are about 30, our brains soak in information about everything around us. As we learn about the world around us, our brain builds extensive networks of neurons that store this information. Experiences which are repeated often in our environment will reinforce the appropriate neural networks, while the networks that store rare or seldom-used experiences tend to be broken apart and decay. In this way, the wiring inside our brain literally comes to reflect the world around us--focusing on those experiences and memories which we need most often to survive in our world (and to pass on our genes), and neglecting those which are not.
For the African savannah dwellers, there were two areas in particular which were crucially important, and which made up most of their experiential brain wiring: the business of staying alive by finding food and water and avoiding predators, and the intricacies of our own particular social group. And like all primates who live in social groups, these two areas overlapped extensively. Humans are unalterably social animals--our very lives depend on the existence of our social group and our role within it, and we can no more live as isolated independent individuals than a honeybee or a wolf can.
Although infants and young children are born with a natural curiosity to learn everything they can about everything around them, in the rough business of life there simply is not enough time for children to learn everything they need to know on their own. And indeed very many such lessons would be lethal--learning that snakes can be deadly by picking one up (and then dying) does not help us in the game of survival. As a result, human brains are hard-wired at birth to be particularly adept at being taught--at learning through the words and shared experiences of the other individuals in our social group, most crucially our parents. On the African plains, children who ignored or doubted the lessons taught to them by their parents, and who insisted on learning everything on their own, quickly ended up dead.
And even as adults, the people who survived were the people who were wired to make instant snap judgments about things--"is that noise a lion?" "is that guy from the next tribe over friendly?"--not the ones who took the time to sit down and carefully consider all the evidence and make a logical conclusion. The ability to make an instant decision based on "gut feelings" was often the difference between life and death. Those whose brain was wired to be good at such snap decisions survived; those who had to take their time and think about it, died.
In our modern industrialized world, finding food and water and avoiding predators are no longer daily concerns. But our brains have not evolved to keep up--they are still living on that African savannah, and are still wired to make their decisions the same way they did back then.
Those African primates also needed to absorb knowledge about one other area that is still vitally crucial to modern humans--how to get along within our primate social structure. Chimps (and our human ancestors) live in a society that is rigidly hierarchical. At the top is the alpha male, the unquestioned ruler of the roost. He makes all the group decisions: where the troop will move today, when it is feeding time, how to react when we meet a neighboring troop. But the alpha male cannot rule alone. He needs the active support of the majority of the other males, and indeed everyone's place in the hierarchy is directly dependent upon how effectively each individual is able to form alliances and friendships with others in the group, on a "you help me and I'll help you" basis. Human society, then, like that of all other primates, is directly based on an innate deference to authority, coupled with a need to maintain a place in the social hierarchy by "winning friends and influencing people". These are things that every primate learns assiduously, and that every primate's brain is hard-wired to facilitate. We are social animals, and our entire lives are dependent upon our ability to understand others in our social group, and then to cooperate with them, learn from them, and share experiences with them, but also to dispute with them, manipulate them, exploit them, and deceive them. This social world is just as much a part of the primate environment as lions and bananas and thunderstorms are. And our brains evolved their wiring to enable us to operate, quickly and without hesitation, in that environment.
But there is a crucial limit to the primate process of constant learning. By the time the primate reaches biological adulthood (in humans this is around age 30), the brain itself begins to change its focus. By now, the brain has decided "how the world is"; it has internalized and absorbed a coherent "worldview", a mental framework of "how things are", everything from "how water acts" to "what is potentially dangerous" to "which foods I like to eat" to "which kind of companions I like and which kind I don't". It forms a mental map that we always carry around with us, and which we use to interpret and understand all of our environment and experiences. Although we never entirely stop learning from our experiences, at this point the brain's focus changes; now, instead of putting its energy into building new neural learning pathways, the brain begins putting its energy more into reinforcing the existing ones, and the ability for the brain to form new neural nets from experience and learning becomes diminished. The brain quite literally becomes "set in its ways". As we continue to get older, the brain continues to become less and less adaptable, and less capable of altering its existing wiring--and therefore less capable of changing the mental framework within which it operates.
Genetics and Personality, or, Nature vs Nurture
It is the oldest debate in the study of human behavior, going back even before we knew what a "gene" was: are humans born the way they are, or do our surroundings shape us into what we are? Are our personality and behavior inborn and set, or are they learned and therefore changeable? Nature, or nurture? Genetics, or culture? Today, a sub-field of neuroscience known as behavioral genetics has an answer to that question--it is both.
One of the reasons why the "nature vs nurture" question has always been so difficult to answer is that it is not a simple "either/or" situation. Unlike organisms which operate entirely on genetic instinct, like ants, humans have remarkably complex brains capable of understanding our world and learning from it. Indeed, humans are successful as a species precisely because we are not slaves to our genes: we are capable of passing complex learned behaviors to the members of our social group, a process known as culture. If humans were dependent upon changes in our genes to alter our behavior, we'd still be chipping lava cobbles out on the African veldt. But because we can learn and share our experiences with others, we can build space shuttles and supercomputers. Our behavior is enormously plastic, capable of a stunning variety of learned cultural beliefs and practices which are passed down entirely independently of our genes. The interrelationship between genetics, learning, culture, and environment is a complex web--and it also differs depending on which particular trait we are considering.
To try to isolate the genetic component from the cultural, behavioral geneticists depend upon "twin studies". Human identical twins have the same DNA; fraternal twins do not. Researchers, then, can take advantage of the unique characteristics of twins to isolate the effects of genes and environment and study them separately. Most of these studies have centered around six basic personality traits: intelligence, shyness/introversion, empathy/sociability, openness to new experiences, conscientiousness/organization, and anxiety/moodiness. Each of these traits is a continuum with two opposite poles, with each individual falling at a particular point along this axis, and those positions tend to remain fixed throughout each person's life--people who are extroverted as children tend to be extroverted as adults; children who are reluctant to enter new situations or experiences tend to be the same way as adults. Further, studies show that these traits correlate strongly between identical twins raised apart, but less strongly between fraternal twins raised together--indicating that they have a strong genetic component.
Genetics, then, sets the basic personality traits that we are all born with: shyness, amenity to change, confidence, empathy, gregariousness, sense of adventure. And in turn, humans tend to choose environments (jobs, friends, activities, etc) that are compatible with their own personality traits (a process called "niche picking"). But this does not mean that our behavior is genetically determined. Our genes set only the broad outlines of our behavior; it is learned culture that takes over from there. Our genes may place us at a particular level of behavioral "aggressiveness", but it is our learned culture that decides whether we channel that "aggression" into pro football or into politics or mammoth hunting. Other individuals may have genes that give them "good spatial skills", but that doesn't determine whether they will become fighter pilots or interior designers or pyramid-builders--culture and learning decide that. Nurture plays an enormous role in our behavior, but it can only operate within the limits that our genes place on our basic personality traits. Conversely, our genes set our basic personality parameters, but our specific behaviors can vary widely because of our enormously plastic and adaptable cultural learning.
Nevertheless, genes can still have a powerful influence on certain areas of our behavior, which we find very difficult to change. One area in which this is particularly noticeable is in our diet. As savannah dwellers, our brains are wired to crave the most high-calorie food available, and to eat as much of it as possible during times of plenty so we could store enough fat energy to get us through the inevitable lean times of shortage. But now our primate brain works against us, and our bodies pay the price for it--our innate craving for sugar causes rampant health problems from dental cavities to diabetes, and our biological urge to hoard fat now makes us grossly overweight, which produces a whole suite of health problems, from heart disease to high blood pressure, that make up our leading causes of death. As anyone who has tried and failed at dieting knows, we have a hard time changing that innate hard-wiring, and we find it tremendously difficult to override the brain's now-obsolete urgings. It is not a matter of individual "willpower" or the lack of it--we are fighting against the very biological wiring of our own brain, and that is a fight that is virtually impossible to win. (And not just in dieting, but in every other area as well.)
The Star of Our Own Show
So, what, then, do we do with this brain that both nature and nurture have manufactured inside our head? Mostly, we tell stories about the world, which help us to understand it and function within it.
As far as your brain is concerned, there are two unalterable rules of life.
1. You are always right. Period. You are never wrong. Ever. About anything. Oh, and you're smarter and more honest than everyone else, too.
2. You are always the Good Guy. You operate only from the most lofty and honorable of motives, and only with the very best of all intentions. And nothing bad is ever your fault. Ever.
Those two rules may strike you as incredibly selfish and arrogant. But you can relax--everyone else operates by precisely the same two rules. They are an innate function of how our brains work, a product of that chimp brain we all carry around inside our heads. In a primate society with a hierarchical structure, in which social advancement comes from one's ability to win friends and influence people, there is simply no place for self-doubt. If you get the chance to mate with that high-status partner, or to win a dominance fight with that higher-ranking individual and move yourself up in the social order, but you hold back because of self-examination or second-guessing, then that self-confident primate next to you will seize the chance instead--making babies with the high-status partner, or forming a gang to beat up the higher-ranking individual and moving up the social ladder. Because of your indecisiveness, your genes don't go into the next generation--theirs do. As a result, the brain of even the most sociable and humanitarian and philanthropic of humans is, deep down inside, driven by only one thing ---> me me me me me me.
It is remarkable to see to what lengths our brains will go to maintain those two rules. Examples of it can be found right here at DKos, daily, by the thousands--we've all seen the participant in some pie fight or another who uses every available sophistic verbal trick and every possible wiggle and goalpost-move to avoid simply admitting they were wrong about something, even if it's about a trivial matter. For all of us, admitting we are wrong is like pulling teeth--except ten times as painful. (Particularly in public--and even more so for the type of motivated ideological evangelists who are self-selected to join groups like DKos.) So we do whatever we can to avoid it, and we only give in when there is absolutely no other alternative (and then we usually turn it to our advantage anyway by using the occasion to gain social status and transform ourselves into the Good Guy after all, with a "See, I'm big enough to admit I'm wrong" performance). And you know what?--you do it too. And so do I. And so does everyone. Those two rules are innately wired in as the brain's primary operating principles. We can no more tell our brain to stop following them than we can tell our gall bladder to stop producing bile.
And if you are still trying to convince yourself that you're not really doing that, but it's people on the other side who do that, your denial of it is rule number two coming into play. You are always the Good Guy. Period. Even the most evil and wicked of people always manage to convince themselves that they are in the right. Hitler firmly believed he was doing the whole human species a favor by exterminating "undesirables". Terrorists of all sorts and all ideologies sincerely believe they are doing "what must be done" to make the world a better place. Murderers on death row will tell you a big long tale about how their victim actually "had it coming" and "got what he deserved". (In what surely must be the clearest example of this phenomenon, one killer confessed to the police that he shot his neighbor, but insisted that it was the neighbor's own fault because "he shouldn't have been picking on the size of my ears".) So as far as your brain is concerned, you are not only the star of your own show, but you are always the Hero.
But here is the paradox that inevitably arises from the brain's two primary rules--in primate society, if you act like a selfish bastard, you find yourself unable to win friends and influence people (the core of all success in primate society), and instead of being the alpha male, you find yourself sitting alone in a corner, marginalized and shunned by all, with no allies to help if anyone decides to beat the crap out of you, eventually to die from lack of access to the good resources. We are, all of us, selfish bastards who desperately need to convince others in the group that we are not selfish bastards--and furthermore, in order to act with confidence and ease, we must also fool ourselves into thinking we are the Good Guy, and not a selfish bastard. This basic dichotomy is the foundation for all of primate social structure. How successfully we balance those two contradictory but equally-vital needs, selfishness and sociability, directly determines our ability to win friends and influence people, and therefore our success in the game of survival and life.
How Do We Get Our Beliefs?
The Emotional Brain
There is one crucial area, however, in which human brains differ from those of our primate ancestors--and it is a significant difference: we can develop a mental framework of global "cause and effect". Other intelligent animals, such as ravens and chimps, demonstrate in experiments that they understand the principle of linear causes (ravens, for example, can perform a series of tasks in which they must modify various things in a particular order to complete the goal--and are then capable of skipping steps that are no longer necessary), but only humans are able to understand the world as a whole, the very universe itself, as a network of causes and effects. Other intelligent animals can use cause and effect to manipulate the world, but we can use it to understand the world, and explain it.
And that leaves our brain with a quandary. As we have already seen, the brain works with mental maps of how the world operates, and fits all its experiences and observations into those frameworks. In chimps and crows, this is not a very complex process, since their world is limited to the individual and those of its surroundings that it needs to understand in order to survive. But humans, with their understanding of the much larger universe (both physical and mental) of cause and effect, have an even more complex task--all of our mental maps and worldviews must, if they are to be useful, make seamless sense of the entire universe--and that includes the internal mental universe as well. The world must make sense to us--and so must our own individual role within it. But also, in accordance with the brain's two primary rules, that mental image of ourselves must reinforce the crucially important points that we are always right, and we are always the Good Guy. No mental worldview will satisfy us unless it does all of those things and fits them all seamlessly together.
But on the other hand, it is simply impossible for any individual to have complete, or even pretty good, knowledge of the whole world. So what does the brain do to deal with that inherent incompleteness and produce a coherent self-consistent worldview? It uses "beliefs" to fill in the gaps--things which we don't know are true, but assume are true. The human brain is, uniquely, what one scientist has called a "belief engine". Our brain is what makes us primates, but our beliefs are what makes us human. Other animals don't have them. When faced with a situation that involves a gap in its knowledge, the non-human animal cannot act; it doesn't know what to do. But humans can fill that gap with belief, and act on that belief as if it were reality. That opened the door to all the immense mental powers that humans have, all based on the ability to manufacture a mental picture of things that do not really exist. And while religious beliefs are the most visible manifestation of this, they are by far not the only ones; everything from our social views to our political stances to our patriotic notions of nation and society, are "beliefs". And all of our social views and political stances, from Nazis to teabaggers to partisan Democrats to Marxists, have their share of ideologues--people who cannot and will not view anything in any way other than from their own belief framework, and who are always trying to cram everything into their ideological belief system, whether it fits or not.
And that raises the most interesting question of all: how exactly do we get our beliefs? Politically, socially, religiously: why do we believe what we believe, and not something else? And here is where things get really fascinating . . .
We may like to ... well ... believe ... that our beliefs come from our own intellectual study and careful logical analysis of available facts and experience. But behavioral neurology shows that they do not--and indeed, our belief frameworks do not even need to be true in order to function effectively. The criterion for deciding which beliefs to accept and which to reject lies entirely in the brain's need to form a coherent mental view of the world that is both self-consistent and keeps you at its center. In other words, once your religious or political or social opinions are set and they "work" for you, your brain will resist making any but the most minor of changes to them, unless particular portions begin to conflict so much with the rest that they threaten the stability of the entire framework (a state called "cognitive dissonance"). But because the brain is so good at maintaining and protecting its worldview, by plugging in whatever beliefs are needed to keep it intact, very few of us ever even reach that point. The vast majority of us die with the same basic views of the world that we grew up with.
But since we are not born with any pre-set beliefs and have no innate worldview of our own, how do we get our initial beliefs and form our first mental map and worldview? As with any primate, we get it from our parents, by accepting them as an authority . . . .
Deference to Authority
In 1945, the world became keenly aware of the human propensity to accept and obey authority. While the Axis countries were military police states, and coercion and force were a part of life, the reality is that most of the population followed unquestioningly and without protest, and only a tiny minority of people needed to be coerced or punished for disobedience. Most were entirely willing and voluntary actors, who sincerely thought they were doing the right thing, or at least were following legitimate orders.
The evil brutalities of the Second World War sparked intense academic study into "the problem of evil". A whole slew of experiments were carried out, testing the human reaction to authority and our willingness to do what we are told. Some of the most famous of these are the Stanford Prison Experiment (in which a random group of college students was divided into "guards" and "prisoners"--and the "guards" quickly became so brutal that the experiment was halted), the Asch line-length experiment (in which a college student in a fake study group was asked to say whether two lines were longer, shorter or the same length as each other: the rest of the group was secretly part of the experiment and intentionally gave wrong answers--and most of the test subjects then altered their own answers to match those of the group, even though the group's answer was obviously wrong), and the Milgram electric-shock experiment (in which students were asked to help in a "learning study" by giving increasingly strong electric shocks to subjects who gave deliberately wrong answers--and most of the students followed orders all the way up to shock levels that would have been lethal). The findings were both unmistakable and disturbing: every one of us has an evil person living inside, who will do the most unspeakably brutal things, without question, if told to do so by someone we hold to be in authority--especially if the brutality is directed against people we hold to be outside our own "tribe", and our authority figures justify it with something we already want to believe.
Our first authority figure in our lives is our parents. For most of us, that is the authority that sets the worldview we will carry with us for the rest of our lives. Like all primates, humans are hard-wired at birth to learn from our parents, to imitate their actions, to follow their example, and, when we take the uniquely human step of developing language, of learning from the explanations they give to us.
But here is the crucial part: because we are born before our brains are fully developed, we do not yet have our adult brain circuitry, including the wiring to the parts of the brain that enable us to do skeptical analysis and critical evaluation on the information we receive. As a result, as children we accept all of the information that comes from our parents as true, period. We question none of it and we reject none of it: it is all accepted as gospel. Ask any typical toddler who the smartest people in the world are, and every one of them will answer "Mommy and Daddy".
The things that human parents teach their children, and which those children accept uncritically and unquestioningly, run the entire gamut of everything human--everything from political ideology to religious beliefs to cultural norms about race, gender, and behavior. As the saying goes, we take all of that in with our mother's milk. Most of us then hold onto those beliefs, with only minor changes, for the rest of our lives. The biggest predictor of our own religion, political affiliation, and cultural attitudes is the beliefs of our parents.
But not all of us will keep those beliefs. In primates, most of the social and cultural instruction takes place individually. Humans, however, take this a huge step further: we gather our young into social groups and teach them all the same lessons, collectively. And that produces a peer group. Primate societies have peer groups as well: it is through play with others of their own age that young primates begin to apply the social lessons they have learned, and first begin to figure out how to win social positions for themselves while functioning within a group--a crucial skill which they carry into their adult lives, where social success depends on the ability to win and maintain a place in a peer group of social allies and friends.
But humans have, through institutionalized social learning, turned the "peer group" into a much more powerful force than it is in other primates. Because humans have such extraordinarily long learning periods, humans have a social institution that other animals do not: school. Outside of the family, the first peer group that human children encounter is their schoolmates. "School" introduces the young human into the first real "society" he or she will encounter: a group of unrelated individuals into which the youngster must enter, and develop and maintain a place in the hierarchy based largely on his or her own social skills. For most, it is a brutally harsh lesson. It is in this peer group that humans learn to curb their selfishness (think about how hard it is for young children to learn to "share") and to use their sociability to win friends and influence people (the core, as always, of every primate society). "All I Really Need to Know I Learned in Kindergarten" is not just a clever title for a book--it is human reality. Our success (or failure) in those crucial years follows us, for better or worse, for the rest of our lives.
As we get older and reach our teenage years, the parts of our brain structures that allow us to carry out critical thinking and reasoned analysis finally begin to come online. For the first time, we gain the capacity to question authority, particularly the authority of our parents. This is a crucial ability if we are to gain a sense of self-identity which allows us to successfully live our own life as an adult. But it comes at a cost, which every teenager knows: the painful search for that self-identity. Now, we are faced with a bewildering plethora of "authorities", everything from religions to political ideologies to cultural figures. During these years, teens move from one peer group to another as they try out different ideologies, different lifestyles and different sources of authority, and mostly reject them. It is often said that the "teenage rebellion" years are a time when teens come to reject authority, but that is not actually true: what they are really doing is searching for the authority to follow which suits them best, and which they want to believe. In particular, what they are searching for are those beliefs that fit best within the basic personality traits--openness to new experiences, shyness, empathy, etc--which have been established by their genetics. And, ironically, after all the Sturm und Drang of the teenage angst years, virtually all teenagers eventually return to the same basic ideologies that they were taught to believe by their parents. (Not surprising, since they share the same basic genetic personality traits as their parents.) Only a tiny handful do not: they instead adopt the beliefs that are taught to them by some other peer group and its favored authority figure.
So, in human society, what we naturally end up with are groups of humans who share an ideology and a worldview, which they enforce and protect through conformity and submission to an agreed-upon authority, which actively advances itself against rival groups with different views. In addition to the wired chimp brain that produces the human tendency to see the world as "I" versus "we", there is also the wired primate tendency to see "us" vs "them". Neurobiologists refer to these as "tribes".
So Why Do Wrong Beliefs Persist?
All of the various teen peer groups have one characteristic in common--a fierce ideology of "us vs them". Once the human brain is capable of differentiating different types of viewpoints (and no longer simply accepts what it is taught as a "given"), the innate tendency is to draw sharp distinct lines between the "in" group (yours) and the "out" group (everyone else's). And that continues, unchanged, into adulthood.
This is another biological characteristic of primate society. Like chimps and monkeys, our hominid ancestors in Africa lived in small bands, most of whose members were related to the others, which was completely dependent for all its resources upon a particular geographical area. As a matter of survival, every such band had to defend its territory against members of neighboring bands, who might otherwise come in, take all the resources, and starve you and the rest of your band. Primate society is inherently tribalistic: members of each social group live together in cooperation, but there is nothing but suspicion and enmity against those who are not part of the tribe. We are, for better or worse, tribal animals.
Humans, though, with our greater cognitive ability, are able to take the brain's innate biological concept of "our tribe" to an extreme not even imagined by apes: for us, the "tribe" can consist of things even as abstract as "members of my race" or "citizens of my country" or "fans of my football team" or "people on my side of this social issue". And the effect in humans is the same as it is in chimps: we are implacably, irrationally, emotionally hostile to anyone whom we do not view as one of our tribe. We can see thousands of examples of this here at DKos every day, in every fight between this tribe and that, whether it's "rox/sux", "Snowden hero/traitor", "gunz/no gunz", "GMO/anti-GMO", "Israel/Palestine", or any of the dozens of other emotional issues that we have all drawn tribal lines around. It is primate tribalism at its very clearest, and it is why the constant pie fights here will never end. Ever. They are about emotional tribal identity, not about facts or evidence.
It is when we combine the innate tendency of humans to form tribal groups with the equally inherent tendency of humans to unquestioningly adopt the beliefs of our peer group and bow to its authority that things really get interesting. Within tribes, there are always cultural markers that set their members apart from the others. For football fans, wearing your team colors serves as a marker of membership. In politics, your stand on hot-button issues, such as abortion or global warming, serves as a cultural marker, differentiating those within the tribe from those outside it. In these cases, the pressures are enormous for everyone in the tribe to adopt the appropriate markers (by wearing the appropriate colors or believing the appropriate way on the issues), and those who either refuse or aren't as ardent as expected are usually viewed with suspicion by the rest of the group. As a result, people tend to adapt their opinions and positions to match those of the tribe, accepting the group as their authority figure and the beliefs and opinions of the group as self-evident truths. Criticism of any aspect of those beliefs, therefore, no matter how minor, is viewed not as a mere disagreement of opinion, but as a full-on attack on the tribe. The chimps will all respond appropriately, by defending the group and attacking the other chimps en masse. Indeed, rather than invoking discussion or prompting people to "examine their beliefs", criticism of the tribe's cultural markers usually prompts the opposite reaction, goading members of the group to band together, harden their beliefs, and search even more deeply for reasons to defend them. (And once again, we see examples of this here at DKos, on both sides of every issue, thousands of times every day.)
And that is why out of all the hundreds of thousands of "rational" and "logical" debates and arguments you have seen in your lifetime, you can at best count on the fingers of one hand the number of people whose minds you have actually ever changed--or who have changed yours.
When particular ideological positions become cultural markers for particular tribes, then it no longer even matters if that ideological position is "true" or not--all that matters is the cohesion of the tribe and its defense against outsiders. And before you nod your head sagely and conclude, "That is why they won't change their minds in the face of plain fact"--our side does it too. Our side is just as emotionally tribal as any other tribe, and we act in exactly the same way. You do it, I do it, every human does it. We are no different than White Sox and Cubs fans: we cheer our team and boo their team, no matter what. Our side is always right, and our side is always the Good Guy. Our brains are wired to believe that, just as theirs are. It's all part of our human need for tribal identity.
Some of the "tribes" that humans find themselves in are maintained and enforced through coercion and a system of punishments and rewards. Human nations pass "laws" which allow their members to behave in particular ways and restrain them from behaving in other particular ways. But the vast majority of human tribes, particularly the social, political, religious and ideological versions, do not have the ability to use formal sanctions to enforce conformity in their members. Instead, once again, the wired structure of our chimp brain comes to the rescue. Maintaining social cohesion and group loyalty is vitally important to the survival of any primate troop, and our primate brains have evolved to meet that need. In essence, our own brain forces us into unquestioning tribal loyalty.
As we have seen, human tribes living out on the African plains did not have the luxury of stopping to rationally consider all the available evidence before deciding whether that lion in the distance is a potential danger, or whether that person from the neighboring tribe is just here to trade roots and berries with us. Instead, natural selection favored those people who were good at making instant snap judgments, usually in the face of no evidence whatsoever. Through experience and social learning, all humans as they grow up develop a rough and ready set of "rules of thumb", called "heuristics", that enable them to make instant judgments on a wide variety of things, especially those things we have limited actual experience of. These are the basic rules we fall back on to make quick decisions. Some of them are hard-wired emotional rules like "be afraid of the dark", "don't trust strangers", and "run away from loud noises". Some are cultural lessons like "don't take food from the alpha male" or "don't criticize anyone on our side". On the African veldt, such rules were usually right, at least often enough to enable us to survive. If we suddenly came upon a stranger from another tribe and instantly decided "Danger! Run away!", we lived more often than those who assumed every stranger was a friend and walked over with extended hand only to get clubbed over the head by a raiding party.
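For the programmers in the audience, a heuristic can be caricatured in a few lines of code. This is purely my own toy illustration (a real brain is of course nothing so tidy), but it captures the essential shape: a table of fast trigger-and-reaction rules that fires before any slow deliberation even begins.

```python
# Toy sketch (my own illustration, not an actual cognitive model):
# heuristics as a lookup table of instant trigger -> reaction rules,
# consulted before any slow, effortful reasoning.
SNAP_RULES = {
    "loud noise": "run away",     # hard-wired emotional rule
    "stranger": "distrust",       # hard-wired emotional rule
    "dark": "be afraid",          # hard-wired emotional rule
}

def snap_judgment(stimulus):
    """Return an instant verdict if any rule's trigger appears in the
    stimulus; only fall back to slow deliberation when no rule fires."""
    for trigger, reaction in SNAP_RULES.items():
        if trigger in stimulus:
            return reaction       # instant -- no evidence weighed at all
    return "deliberate"           # the rare case: actually think it over

print(snap_judgment("a stranger approaching at dusk"))  # -> "distrust"
```

Note that the rules fire on crude pattern matches, with no evidence weighed--which is exactly why they are fast, and exactly why they misfire in a modern world the rules were never built for.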
But in our modern world, we are exposed to an avalanche of ideas, experiences and information that humans of the African veldt would never have seen and that our heuristics are unprepared for--and inevitably, much of that information will conflict directly with our own comfortable worldview. Especially here on the Internet. We have all had that sinking feeling in the gut that comes when one sees something, on this blog or elsewhere, that, even if just for a moment, says to us "OMIGOD I MIGHT BE WRONG ABOUT SOMETHING!!!!!!". Maybe it's an anti-nuclear activist who sees data indicating that one of his or her pet arguments is incorrect. Maybe it's a racist who sees an example of a person of another race acting selflessly and compassionately to help his or her fellow human beings. Maybe it's a creationist who sees a fossil that he or she can't explain. Psychologists refer to this conflict as "cognitive dissonance". It happens when some bit of knowledge threatens to overturn our worldview. It can, if not dealt with, cause enormous emotional and intellectual upheaval.
So what does our brain do when faced with a threatening challenge to its comfortable worldview? Does it carefully seek out all the available evidence to determine the objective truth (and selflessly change its mind if it finds it is indeed mistaken)? Um, nope. What it does is search desperately for any acceptable excuse why it was actually correct all along. After all, your brain's rule number one is that you are never wrong.
The first and most obvious way the brain defends its worldview against outside challenges is "confirmation bias", also known as "motivated reasoning". This is our tendency, when researching a question, to pick out and remember only the bits of information that support the conclusion we already want to reach, and to ignore or downplay any data that doesn't. The alternate form of this, which is probably even more common, is "disconfirmation bias", where once we are challenged with an opposing belief, we search assiduously for any evidence that it is wrong, no matter how trivial or minor--ignoring any evidence that indicates it might be correct. Once the brain is satisfied that it has found enough (selective) information to show us what we already want to see, the search for information ends (neuroscientists call this the "Stop Now" point). Mission accomplished.
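Again for the programmers, here is a toy sketch of that process (my own illustration only--the "Stop Now" point is the term used above, but this code is nobody's actual model of cognition). Notice how the combination of a biased filter and an early stopping rule guarantees the verdict no matter what the evidence actually says.

```python
# Toy sketch (my own illustration, not an actual model from the
# literature): confirmation bias plus a "Stop Now" point.
def motivated_search(prior_belief, evidence_stream, needed=5):
    """Scan the evidence, remember only the items that agree with the
    prior belief, silently ignore the rest, and stop searching as soon
    as enough confirmations pile up (the "Stop Now" point)."""
    confirmations = 0
    examined = 0
    for item in evidence_stream:
        examined += 1
        if item == prior_belief:      # agrees with what we want to believe
            confirmations += 1        # noticed and remembered
        # disagreeing items fall straight through, unrecorded
        if confirmations >= needed:   # "Stop Now": mission accomplished
            break
    return prior_belief, examined     # the conclusion was never in doubt

# Evidence that runs two-to-one *against* still yields a "for" verdict.
verdict, looked_at = motivated_search("for", ["against", "against", "for"] * 20)
print(verdict)  # -> "for"
```

The `return prior_belief` line is the whole joke: the search only ever determines how long we rummage, never what we conclude.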
This is not necessarily a deliberate attempt at dishonesty or deception. Even if we are consciously trying our very best to be completely impartial and unbiased (as in a jury trial, for instance), our brain will still make a snap emotional judgment, and only then go looking for the facts and data which will support arguments confirming what the brain has already decided, at the emotional level, that it wants to believe. That is simply how our brains work.
This same process, of bending reality to fit what we want to believe, happens backwards in time as well. We like to think that our memory is an accurately-recalled record of what objectively happened to us at some point in the past. The reality, though, is that most of our memories of the past are actually false memories, completely inaccurate, and were made up by our brain to make the past fit into our current emotional framework and worldview. Our memory of unpleasant people or places (ex-partners, for example, or previous jobs) often has nothing in common with how we actually felt and acted at the time--but conforms perfectly to how we feel about them now.
False memories are especially vulnerable to being planted by peer pressure, which can often prompt us to "remember" things that never actually happened, or did not happen in the way we remember them. In one series of experiments, test subjects were allowed to listen to interviews with family members and friends who described an incident that supposedly happened when the subject was very young: he or she got separated and lost at the local mall, was found by a security guard, and was kept in the office for a short while before being picked up by the parents. When the subject was then interviewed and asked about it, he or she would often go into great detailed description about what had happened and how they felt at the time. In reality, the whole incident had been made up--it never actually happened. It was all a sincere but completely false memory created to jibe with the fake story told by the relatives. But the subjects nevertheless believed it to be entirely true, because their brain told them it was true, even though it was not. And the brain told them it was true because human brains are wired to believe whatever the social group tells them to believe.
So what does the brain do if, despite its best efforts at selective memory and selective facts, a stubborn bit of information remains lodged in its cherished worldview? It simply makes up a plausible story to explain it away. "Confabulation" is the ability of the brain to fill in information that isn't really there, in order to rationalize a plausible story that allows us to keep believing what we want to believe. Even if there isn't a shred of evidence for it, the brain will treat the story as if it were true anyway. All of the various types of conspiracy theories are good examples of confabulations. They assume dots that are not there, and draw connections between dots that are not really connected, all to tell us what we already want to hear.
On the African plains, confabulation was a crucial survival skill. Our brains had to make quick but crucial life-or-death decisions, and they had to make them even if there were gaps in the available information. As a result, the brain evolved the ability to plug the gaps in its knowledge with any information that might make plausible sense, and then act on that as if it were the truth.
Experiments at the University of Wisconsin demonstrate that we still have this ability. Researchers there made an audio recording of a TV newscaster saying the sentence, "The bill passed both houses of the legislature"--a quite common phrase that any newspaper reader or TV news watcher would be familiar with. But the researchers tossed in a twist: they added some audio static to obscure one or two words in the sentence. When they played the altered recording for their test subjects, they found that all of the subjects were able to correctly repeat the complete sentence. Their brain had simply filled in the gap with information it judged to be appropriate to that context. But here is the interesting part: all of the subjects reported that they had heard the audio static noise, but most of them could not say which specific word or words had been obscured. Their brain had filled in the gap so seamlessly (and so unconsciously) that they were not even aware of how it had been done. The subjects did not hear what was actually there; they heard only what the brain assumed was there, and reported as reality. And here is the really interesting part: when other researchers did a similar test, but obscured enough of the sentence so that its meaning became ambiguous and unclear, the brain very often still filled in the gap, and this time all it could do was take a good guess as to what information was actually missing. But the test subjects still could not say what words had been obscured; they simply accepted whatever their brain had come up with, as if it were reality. Their brain had confabulated so thoroughly that the subjects were unaware that it had even been done.
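One more toy sketch for the programmers (again my own caricature--perception obviously does not work by string matching, and the `fill_gap` helper here is entirely made up for illustration): gap-filling amounts to picking, from phrases you already know, the completion that best fits the surrounding context, and then reporting the patched sentence as what you "heard".

```python
# Toy sketch (my own illustration only): perceptual gap-filling as
# choosing the best-fitting familiar phrase for an obscured word.
def fill_gap(sentence_with_gap, familiar_phrases):
    """Given a sentence with one word obscured (marked "###"), pick the
    replacement from known phrases whose context best matches, and
    return the seamlessly patched sentence."""
    words = sentence_with_gap.split()
    gap = words.index("###")
    best, best_score = None, -1
    for phrase in familiar_phrases:
        cand = phrase.split()
        if len(cand) != len(words):
            continue
        # Score = how many context words line up (the gap itself is free).
        score = sum(1 for i, w in enumerate(words) if i != gap and cand[i] == w)
        if score > best_score:
            best, best_score = cand[gap], score
    if best is None:                 # nothing familiar fits: leave the gap
        return sentence_with_gap
    words[gap] = best                # patch it in, seamlessly
    return " ".join(words)

known = ["the bill passed both houses of the legislature",
         "the bill failed in the senate vote today"]
print(fill_gap("the bill passed both ### of the legislature", known))
# -> "the bill passed both houses of the legislature"
```

The point of the sketch is that the caller gets back a complete sentence with no indication of which word was patched--just as the subjects in the experiment heard the static but could not say which word it had covered.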
The brain's job is to make sure that everything it receives is integrated seamlessly into the familiar framework within which it views the world, and if anything happens which conflicts with that worldview, its job is to make up a story that makes sense of the anomaly and allows the brain to continue on, blissfully unaware that the story it just confabulated is entirely untrue.
Can We Change Our Beliefs?
The short answer? No, in general, we can't--at least not in the things that matter to us. We may change our minds and ideas on trivial things. But when our core beliefs change to any significant degree, it is rare and usually painful. For the most part, it only happens when our emotional base changes for some reason beyond our control--a huge change in life circumstances such as a death or divorce, for instance, or if we are forced by circumstances into seeking support (material, emotional, or both) from some new peer group with different beliefs. Religious conversions provide perhaps the best examples. Most people simply accept whatever religious beliefs (or lack of them) their parents had. But for some people, undergoing a drastic life change can be enough to shatter their previous worldview and leave them open to adopting a different one from some new peer group that is able to provide support. If you read the "testimonies" of religious conversions (whether to or from), it becomes apparent that a shattering life event is usually at the core of it. In those conditions, we don't change our mind; our mind is changed for us by circumstances beyond our control.
Even when one of our core beliefs does seem to change, it often turns out to not actually be as big a change as we might think. A good example is the religious fundamentalist who, because of some shattering life event, "loses his faith" and converts to atheism. Oftentimes, nothing really changes: the new convert is still just as much a fundamentalist as before and is still fervently preaching his opinions about religion to everyone who will listen. The only thing that has changed is which opinion he is preaching about. Similar non-changes can be seen in libertarians who become anarchists, or Trotskyites who become neoconservatives, or Republicans who become Democrats. Often, the underlying emotional motivator hasn't changed at all; just the social expression of it has. They are still the same bird; they just have different feathers now. And, like religious conversions, the extreme rarity of ideological conversions makes it clear how well-insulated the chimp brain is against making serious changes in its core beliefs.
Simply put, our brains are designed by Mother Nature to not change. We are innately wired to adopt the worldview taught to us by our social group, and then to use confabulation, confirmation bias and false memories to maintain that worldview so we can function as a part of a cohesive social tribe. It is vitally important to realize, though, that all the confabulation and confirmation bias and contrived memories we inflict on ourselves are NOT, repeat NOT, as in N-O-T, deliberate intentional deceptions. Quite the opposite: they are entirely unconscious, unintended and unnoticed. Making up coherent stories for us to believe is what the brain does, just like pumping blood is what the heart does. We are completely unaware that any of it is going on, which is why it's so easy for us to convince ourselves that it doesn't happen. We are entirely unaware that our basic belief about ourselves--that we are an independent free-thinker who makes his or her own decisions rationally and logically after carefully considering all the circumstances--is simply wrong; it is a confabulated story made up by our brains to allow us to function efficiently within our social tribe.
Indeed, studies show that even if we do become aware of what our brain is doing and how it does it, it doesn't affect our behavior at all. In a series of studies, college psychology students were given, at the beginning of the year, tests to determine their propensity to confabulate and select biased facts to fit their beliefs. Then, after the end of the course, when they had been taught exactly how the brain actually works (and could therefore intellectually guard themselves against the brain's tendency to fool us into believing what we want to believe), they were given a similar test again--and still confabulated their beliefs at the same rate as before. Indeed, other studies have demonstrated that having high intelligence or a good education does not diminish the tendency for a person to use irrational emotional methods to defend and protect their core beliefs; all it does is make it easier for the intelligent person to confabulate a greater number of rationalizations for why the beliefs they want to believe are always correct. In the end, the chimp brain always wins.
For the most part, our rational logical brain never makes any real decisions. It only springs to life when we encounter something new or particularly interesting. We operate almost entirely on "automatic pilot", acting out our familiar routines without thinking about anything we are doing. We let our simple emotional heuristics do most of our day-to-day thinking for us. At our biological core, we are prompted by our brains to keep believing what we want to believe and to fit every new experience into that belief system--and that belief system is usually whatever we were taught as children. We may be able to alter those beliefs around the edges a bit, maybe even dress them up in different ideological clothing. But our core internal beliefs, the ones we grew up with, don't change. (And this is particularly true if the belief in question is one that serves as an identifying marker for a particular tribe that we identify with, such as global warming, evolution, nuclear power, GMOs, vaccines, the free market, etc.) At best, we may be able to use rational thinking to override our core beliefs temporarily (just as we are sometimes able to use individual willpower to override our brain's cravings and carry out a successful diet). But it is enormously rare, it only applies to one thing at a time, and it doesn't last.
And right now, everyone reading this is confabulating to themselves, "Aha, see? You are wrong--I'm one of the ones that can use my rational mind to override my emotional beliefs."
Um, no you're not. Your Inner Chimp is just confabulating that you are. (shrug)
LINKS FOR FURTHER READING
This is not an academic paper, so it doesn't contain citations and footnotes. All of the information here is standard textbook cognitive science and behavioral neurology (NONE of it is "my opinion" or "my conclusion"). But of course the subject of neuroscience and cognitive behavior is simply too vast to contain in a mere diary on a blog (indeed I had to cut a lot of stuff out of my diary draft, and it still remains uncomfortably long). So here are some resources available on the Web for those who may want to look deeper into how our Inner Chimp's primate brain works:
Neuroscience of Belief (Video Lecture)
Steven Novella: The Skeptical Neurologist (Video Lecture)
The Science of Why We Don't Believe Science
Our brains, and how they're not as simple as we think
Why We Don't Believe in Science
Your Brain on Politics: The Cognitive Neuroscience of Liberals and Conservatives
The Seductive Allure of Neuroscience and the Science of Persuasion
The War on Reason
Neuroscience vs philosophy: Taking aim at free will
Cognitive Psychology and Cognitive Neuroscience (PDF--online textbook)