I blame technology for our concerns about climate change. Not so much the centuries-old technologies of steam engines, nor even their successors, the various flavors of internal combustion engines. I don’t blame power plants. I don’t blame cement production.
It is modern technology that is at fault for our concerns. Without satellites we would not be worried about the extent of Arctic sea ice. Without the sophisticated combination of technologies used to pinpoint sea level rise, it would not be a concern. Without communications technologies we would not be aware that the global average temperature has risen.
This has led to the creation of a new scientific discipline: climate science. Although we have studied the climate for centuries, this new science uses new tools, primarily computer-based ones, and folds several other sub-disciplines into itself.
Most of the phenomena captured and studied by this new science have good data going back only 30 years, in some cases a century and a half. Older data exists, but it has been corrupted by time to the extent that using it properly means highlighting large error bands and other indicators of uncertainty. What climate scientists are doing is extrapolating from a few decades' worth of data, extending trends without a good grip on natural variability.
We know from well-established physics that doubling CO2 concentrations should raise the global average temperature by about 1 degree Celsius. We know from observation that we are well on our way to doubling the concentration that existed in about 1880. And we know that the global average temperature has risen by about 0.8°C. Beyond that, we are making guesses, some educated, others less so. This is particularly true of our understanding of the atmosphere's sensitivity to a doubling of CO2 concentrations, and of our understanding of the net effects of clouds.
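The physics behind that roughly 1 degree figure can be sketched in a few lines. This is a minimal back-of-the-envelope calculation, not from the original: it assumes the standard logarithmic forcing approximation (about 5.35 W/m² per natural log of the concentration ratio) and a textbook no-feedback Planck response of about 3.2 W/m² per kelvin.

```python
import math

# Radiative forcing from a change in CO2 concentration,
# using the common logarithmic approximation: dF = 5.35 * ln(C / C0) in W/m^2.
def co2_forcing(c, c0):
    return 5.35 * math.log(c / c0)

# No-feedback (Planck) response: assumed textbook value of ~3.2 W/m^2 per kelvin.
PLANCK_RESPONSE = 3.2  # W/(m^2 K)

forcing_doubling = co2_forcing(2.0, 1.0)      # doubling, ~3.7 W/m^2
delta_t = forcing_doubling / PLANCK_RESPONSE  # warming before any feedbacks

print(f"Forcing from doubling CO2: {forcing_doubling:.2f} W/m^2")
print(f"No-feedback warming: {delta_t:.2f} K")
```

The result is a bit over 1 K of warming per doubling before feedbacks; everything beyond that, clouds included, is where the guesswork begins.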
Perhaps the most pernicious application of technology to climate change is a bit unexpected: the proliferation of statistical analysis software that permits number crunching without understanding. This has led to misinterpretation of data, faulty extension of trend lines, and a religious observance of the 95% confidence level that permits publication of results under banner headlines without any backwards look at how that level of confidence was arrived at, or what the remaining 5% chance of a faulty result may mean. We now have a number of climate scientists who do not actually study the climate. They study computer output.
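That 5% is not an abstraction. A minimal sketch, using only assumed illustrative parameters: feed many samples of pure noise, with no real effect in them at all, through the conventional two-sided test at the 95% level, and about one run in twenty comes back "significant" anyway.

```python
import random

def z_test_significant(sample, sigma=1.0, z_crit=1.96):
    """Two-sided z-test with known sigma: does the sample mean look 'significantly' nonzero?"""
    n = len(sample)
    mean = sum(sample) / n
    return abs(mean) > z_crit * sigma / n ** 0.5

random.seed(0)
trials = 10_000
# Each trial is 30 draws of pure Gaussian noise: there is no effect to find.
false_positives = sum(
    z_test_significant([random.gauss(0.0, 1.0) for _ in range(30)])
    for _ in range(trials)
)
print(f"Spurious 'significant' results: {false_positives / trials:.1%}")  # close to 5%
```

Run enough analyses on enough datasets and the 95% threshold guarantees a steady supply of headline-ready false positives, which is the point the paragraph above is making.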
Just as meteorologists are advised to look out the window prior to making a forecast, we should require climatologists to spend more time in the field gathering data and more time in the lab looking at it.
Otherwise we end up in a situation similar to that of U.S. food safety programs in the 1950s and 60s, when advances in science's ability to detect ever smaller pesticide residues led the FDA to set ever tighter limits on their presence on the surface of fruits and vegetables. Those limits were tightened not because of any finding of harm, but merely because the residues had become detectable.
Science, like every other part of modern society, has new toys to play with. What it needs is more history to put them in perspective. What we need is a science historian. The tragedy we face is that someone like Naomi Oreskes holds the job title. She is committed to the cause of eradicating human CO2 from the planet, to the point where she gamed her seminal study to conceal the diversity of opinion on climate change rather than describe it.
So the one scientific position that could help us put new technological findings in perspective has been discredited by its premier office holder.
So the game continues.