This is the rough draft of a lecture I intend to give the students in my Library and Information Sciences 101 class next fall. I think that in order for students to understand the information they receive from the media about global warming, they must first be armed with some facts about the scientific method. This is one part of a series on how information is created. If you find fault with my explanation of scientific concepts, please let me know.
The scientific method is essentially the formal process that scientists follow to advance understanding. It goes like this:
Ask a Question
Do Background Research
Construct a Hypothesis
Test Your Hypothesis by Doing an Experiment
Analyze Your Data and Draw a Conclusion
Communicate Your Results
Here is how that method played out for a few scientists who were studying the climate. This is not a thorough review of all the major climate change scientists, nor is it even a list of the most important. But I hope it will show you how the scientific method works in real life.
Back in the 1820s, a French scientist named Joseph Fourier asked a really important question that apparently no one had thought to pursue before: “What determines the average global temperature of a planet like earth?” In other words, if light from the sun comes down to earth and warms it up, that’s great, but why doesn’t it just keep on heating up? His theory was that when sunlight hits the earth, the earth warms up and sends that energy back into space as infrared radiation. But when he tested it and ran the numbers, he found that the earth should be well below freezing. So he realized that he had to go back to the drawing board. And that is the scientific method in a nutshell. You come up with a hypothesis, test it, and review the results. If the results do not back up your hypothesis, then you try something else. But if they do, then you are onto something. The next step is that you make those results available for other people to replicate, study, pick apart, argue about, and improve upon. This is called PEER REVIEW. And because it is a really important concept, I will rephrase it: that’s when you make your results available to the community of experts in your field, and they double-check your work to make sure that you have reported everything accurately and that your experiment actually corresponds to your hypothesis.
So when Fourier’s hypothesis came back null, he researched it more and figured out that he had not considered the effect of the earth’s atmosphere, which traps a portion of the heat radiation and keeps it from bouncing back out into space. This is what we now call the greenhouse effect, but no one was talking about that yet. You would have to jump about a century into the future to hear the first rumblings about it. Notably, in the late 1930s, an engineer named Guy Callendar noted that temperatures around the globe had been warming. His hobby was weather statistics. That was his hobby. My hobby is comic books, and that makes me feel pretty dumb. He blamed CO2 for the rise in temperature. The idea was met with some skepticism, because, you know, the world is so big, and how could human industry possibly affect something so big?
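You can actually reproduce Fourier’s back-of-the-envelope result yourself: treat the earth as a bare rock with no atmosphere, and find the temperature where sunlight absorbed equals heat radiated away. The sketch below uses modern values (a satellite-era solar constant, an albedo of 0.3, and the Stefan-Boltzmann law, which was only formulated after Fourier’s time), so the numbers are my illustration of the idea, not Fourier’s own figures:

```python
# Energy-balance sketch: what temperature would the earth settle at
# with NO atmosphere? Sunlight absorbed must equal heat radiated out.
#
# Absorbed per m^2:  S * (1 - albedo) / 4   (the /4 spreads the
#                    intercepted disk of sunlight over the sphere)
# Emitted per m^2:   sigma * T**4           (Stefan-Boltzmann law)

S = 1361.0          # solar constant, W/m^2 (modern satellite value)
albedo = 0.3        # fraction of sunlight reflected straight back
sigma = 5.670e-8    # Stefan-Boltzmann constant, W/m^2/K^4

# Set absorbed = emitted and solve for T:
T = (S * (1 - albedo) / (4 * sigma)) ** 0.25
print(round(T, 1), "K, or about", round(T - 273.15, 1), "C")
# Roughly 255 K, i.e. about -18 C: well below freezing, just as
# Fourier found. The ~33 C gap between this and the real average
# (~15 C) is the warming supplied by the atmosphere.
```

The point of the exercise is exactly Fourier’s point: the bare-rock answer comes out frozen, so something else, the atmosphere, must be keeping the heat in.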
But because scientists share their work, he knew that there was a reasonable basis for that conclusion: in 1859, a scientist named John Tyndall had sort of unwittingly become the great-great-grandfather of global warming. He asked the question, “How does the atmosphere affect temperature?” The prevailing science of the day assumed that all gases were transparent and that heat radiation would pass right through them. The main gases in the earth’s atmosphere are oxygen and nitrogen, which, indeed, he did find were transparent and didn’t affect heat radiation at all. But then, instead of packing it in, he tried coal gas in his experiment. Coal gas was readily available; after the industrial revolution, you could have it pumped into your office to provide heat. It turned out that when he did the same experiment using coal gas, the gas was completely opaque to heat radiation. That is, heat would not pass through it. So he tried other gases as well and found that CO2, carbon dioxide, was likewise opaque. Now, there was a relatively small amount of CO2 in the atmosphere back then, about 280 parts per million (ppm). But he warned us: “Hey, you know all these smokestacks that are going up everywhere? If they keep putting CO2 in the atmosphere, we can expect global temperatures to rise.”
Everyone got a good laugh.
The industrial revolution had brought great progress and riches to the world, and so the last thing people wanted to think about was turning off the power plants just in case the relatively few smokestacks would one day cause unspecified problems at some unspecified time way, way, way in the future. So his findings were widely ignored. Besides which, crunching the numbers to see exactly how it would play out was far beyond anyone’s capabilities back then; you would need computers to run all those numbers. And even if you could measure the amount of CO2 in the air in one place, it could change minute to minute if there was a breeze, or rain, or a campfire, or a group of people standing in a line doing jumping jacks, or anything. It seemed pretty hopeless.
About 100 years later, right in the heart of the Cold War, through a long and complicated series of events best summed up as “the Pentagon wanted to see if we could manipulate weather to destroy Russia,” a scientist named Gilbert Plass began studying CO2 on a new invention, the electronic digital computer (which at the time was about the size of this room). Being part of the military complex, he had access to many of them, which was necessary given the kinds of computations he was making. After all the number crunching, in 1956, he came out and said that raising or lowering the amount of CO2 in the atmosphere could affect the climate by reducing or increasing the amount of heat radiation escaping into space. He stated that human activity at present levels would raise the average global temperature about 1.1 degrees C per century.
After he made those results public, other scientists quickly shot them down, because the computations were crude and left out a lot of factors (e.g., changes in temperature would increase the amount of water vapor in the air, which would in turn trap more heat, and so on). And answering questions like that would take far more computing power than anyone had back then.
Time passed, many other scientists worked in the field, created new models, employed advanced technology (such as radiocarbon dating), built satellites, studied their observations, and wrote, reviewed, and published articles about the climate and climate change. So we get to skip a few years. By 1977, the National Academy of Sciences reported that catastrophic changes in climate could be on the horizon. They didn’t tell people to stop driving or turn off the smokestacks or anything like that, because they recognized that there was still a lot they did not understand. However, they were fairly certain that there was a direct correlation between energy output and global temperature.
About that same time, there was a scientist named Ed Lorenz, a chaos theorist and student of “strange attractors.” He was the guy who said that a butterfly flapping its wings could lead to a hurricane a thousand miles away. When I find the right tattoo artist, I am going to get a tattoo of the Lorenz attractor, a computer-generated plot of the differential equations from his study of atmospheric convection (i.e., temperature differences in the atmosphere). Curiously, the plot turned out to be roughly the shape of a butterfly, which (awesomely!) emerged from the math itself, not from his design. But the main thing he did for climate science was make everyone realize that it is all a lot more complicated than previously imagined, so computer models would have to be equally complicated. Many other teams then worked on computer models that became increasingly complex. The scientific method was repeated over and over again.
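For the curious: Lorenz’s famous attractor comes from just three coupled equations, which he published in 1963 as a drastically simplified model of atmospheric convection. Here is a minimal sketch that traces the trajectory; his classic parameter values are standard, but the starting point, step size, and step count are my own choices for illustration:

```python
# Lorenz's 1963 convection model: three coupled differential
# equations whose trajectory traces the butterfly-shaped attractor.
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0  # Lorenz's classic values

x, y, z = 1.0, 1.0, 1.0   # arbitrary starting point
dt = 0.005                # time step for a simple Euler integration
points = []
for _ in range(20000):
    dx = sigma * (y - x)          # rate of change of x
    dy = x * (rho - z) - y        # rate of change of y
    dz = x * y - beta * z         # rate of change of z
    x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
    points.append((x, y, z))

# The trajectory never settles down and never exactly repeats, yet
# it stays confined forever to the same bounded, two-lobed region.
print(min(p[0] for p in points), max(p[0] for p in points))
```

Nudge the starting point by a hair and the trajectory soon diverges completely from the original; that sensitivity is the butterfly effect, and it is a big part of why climate modelers ended up needing ever more complicated models and ever more computing power.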
And then in 1987, a scientist named Wallace Broecker made a breakthrough by more fully realizing the ocean’s role in absorbing heat and CO2 from the atmosphere. It was not his idea alone; it built on the work of many scientists who had come before him. The understanding was not just that the earth was heating up, but that coupling the warming with its effect on the ocean could result in severe climate changes globally. He stated that we had been treating climate change as a “cocktail hour curiosity,” but that now we had to view it “as a threat to human beings and wildlife.” The climate system was a “capricious beast,” and we were “poking it with a sharp stick.”
Around that time, NASA scientist James Hansen appeared before Congress to warn them about global warming. He stated that, among other problems caused by global warming, the future would probably hold extreme events like summer heat waves... and the likelihood of heat wave drought situations in the Southwest and Midwest. Years later, he and his colleagues made a strong case that if we let CO2 in the atmosphere rise past 350 ppm, we would risk the worst-case scenarios of climate change.
This year we passed 400 ppm.
So climate change was becoming more public. And because scientists were pointing at specific industries as a cause of rising CO2 emissions, and because those industries have friends in government, global warming moved from the realm of science into the realm of political debate. Just as scientists were starting to approach consensus on the effects of CO2 on the atmosphere (at present around 97% of scientists publishing in the field agree that humanity is driving climate change), certain industry leaders and their politicians were lining up against it. For instance, the Wall Street Journal’s editorial page accused global warming scientists of groupthink, which is what social scientist Irving Janis called the psychological phenomenon in which members of a group value consensus so much that they set aside their personal objections to go with the flow... even if it means that the group refuses to consider other possible (or even probable) courses of action. Some of the trademarks of groupthink:
1. Illusion of invulnerability – Creates excessive optimism that encourages taking extreme risks.
2. Collective rationalization – Members discount warnings and do not reconsider their assumptions.
3. Belief in inherent morality – Members believe in the rightness of their cause and therefore ignore the ethical or moral consequences of their decisions.
4. Stereotyped views of out-groups – Negative views of the “enemy” make effective responses to conflict seem unnecessary.
5. Direct pressure on dissenters – Members are under pressure not to express arguments against any of the group’s views.
6. Self-censorship – Doubts and deviations from the perceived group consensus are not expressed.
7. Illusion of unanimity – The majority view and judgments are assumed to be unanimous.
8. Self-appointed “mind guards” – Members protect the group and the leader from information that is problematic or contradictory to the group’s cohesiveness, view, and/or decisions.
In groupthink, the group settles on a favored theory or belief and then unconsciously protects it. Members avoid dissenting opinions and ignore evidence that runs contrary to their beliefs. But is that really the same as consensus? Doesn’t the scientific method itself guard against this phenomenon?
If I had to sum it up in a couple of sentences, I would say: scientific consensus demands adherence to established rules through which decisions are made and ideas are either rejected or advanced. Certain lines of thinking do get set aside, but not because the participants want one outcome over another and don’t want to rock the boat; it is because those ideas have been eliminated through the process of science and don’t need to be considered any more.
But the process is sometimes misunderstood or misrepresented. For instance, in September of 2013 the UN’s Intergovernmental Panel on Climate Change (IPCC) released its report on the global scientific consensus on man-made global warming. Back in July, though, The Economist magazine had published a leaked portion of the draft that seemed to show that the IPCC’s worst-case predictions were not going to happen. Blogs, newspaper editorial pages, denialist scientists, and politicians quickly ran with it, disseminating the message to the public that even the IPCC had to admit that things were not going to be as bad as it had previously said they would be.
A couple of days after the leak, responding to the news reporting (which was based on a draft from a single working group within the larger framework of the review), the IPCC released a statement which read, in part:
The text is likely to change in response to comments from government and expert reviewers. It is therefore premature and can be misleading to attempt to draw conclusions. Draft reports are intermediate products and do not represent the scientific view that the IPCC provides on the state of knowledge of climate change and its potential environmental and socioeconomic impacts at the conclusion of the process.
We see in this statement the exact process of science. To get to where we are today, scientists asked questions, did background research, constructed hypotheses, tested those hypotheses, analyzed data, drew conclusions, and then subjected those conclusions to review by their peers. So when scientists talk about consensus, it isn’t a popularity contest or something they just decided to agree upon. The consensus on global warming has come from more than a century and a half of scientists doing what scientists do. To suggest otherwise seems at best disingenuous and at worst dishonest.