Steve is the author of three books on sustainability, climate change and energy. His latest book, Outside the Green Box: Rethinking Sustainable Development, tells readers “what their green consultant didn’t tell them.” More than 100,000 copies of his books are now in print.
Steve holds an MS in Electrical Engineering from the University of Illinois and an MBA from the University of Chicago. He has more than 30 years of experience at Fortune 100 and private companies in engineering and executive roles. In his last industry position, he was vice president and general manager of an engineering and manufacturing operation with 350 employees and annual sales of $300 million. Steve is a husband and father of three and resides in Illinois.
On Sunday, Death Valley temperatures reached 129°F, a new June record high for the United States, according to the National Weather Service. Temperatures at McCarran International Airport in Las Vegas reached 117°F, tying the previous record set in 1942 and 2005. National Geographic, NBC News, and other media ran stories attributing the Southwest heat wave to human-caused global warming. But history shows that today’s temperatures are nothing extraordinary.
The United States high temperature record was set in Death Valley on July 10, 1913. Twenty-three of the 50 US state high-temperature records date from the 1930s, and 70 percent were set before 1970.
The alarm about climate change is all about one degree. According to the National Oceanic and Atmospheric Administration (NOAA), global surface temperatures have increased about 1.3°F (0.7°C) since 1880. Proponents of the theory of man-made warming claim that this is evidence that man-made greenhouse gases are raising global temperatures.
One degree over more than 130 years isn’t very much. In contrast, Chicago temperatures vary from about -5°F to 95°F, about 100 degrees, each year. When compared to this 100-degree annual swing, the rise in global temperatures since the 1800s is trivial, captured by a thin line on a graph.
Nevertheless, NOAA repeatedly raises concern about global temperatures. The NOAA website proclaims that “May 2013 global temperatures were the third highest on record.” This sounds alarming unless one understands that “on record” refers to the thermometer record, which only dates back to about 1880.
Climate changes over hundreds and thousands of years. Data from ice cores show several periods during the last 10,000 years that were warmer than today, including the Roman Climate Optimum at the height of the Roman Empire and the Medieval Warm Period, when the Vikings settled southwest Greenland. The warm and cool eras since the last ice age were due to natural climate cycles, not greenhouse gas emissions. The “on record” period that NOAA references is only a tiny part of the climatic picture.
Global average temperature is difficult to measure. The NOAA data sets are an artificial estimate at best. They start with a patchwork of thousands of thermometer stations that cover the globe unevenly. Coverage of the oceans and of the far northern and southern latitudes is sparse and inconsistent. For areas without thermometers, values are estimated by averaging readings from surrounding stations to fill in the holes.
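The fill-in averaging described above can be illustrated with a simple inverse-distance-weighted interpolation — a generic sketch of the idea, not NOAA’s actual method. The station coordinates and readings below are made up for illustration.

```python
import math

# Hypothetical station readings: (latitude, longitude, temperature in °F).
# These values are illustrative only, not real data.
stations = [
    (40.0, -88.0, 71.2),
    (41.5, -87.6, 69.8),
    (39.2, -90.1, 73.0),
]

def idw_estimate(lat, lon, stations, power=2):
    """Estimate temperature at a point with no thermometer by weighting
    nearby stations inversely by distance (closer stations count more)."""
    num = den = 0.0
    for s_lat, s_lon, temp in stations:
        d = math.hypot(lat - s_lat, lon - s_lon)  # crude planar distance
        if d == 0:
            return temp  # the point coincides with a station
        w = 1.0 / d**power
        num += w * temp
        den += w
    return num / den

# Estimate for an unmonitored location between the three stations.
print(round(idw_estimate(40.5, -88.5, stations), 1))
```

The estimate always falls between the coolest and warmest surrounding stations, which is exactly why such infilling cannot capture local extremes in poorly covered regions.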
In addition to coverage problems, gauge measurements often contain large errors. Man-made structures such as buildings and parking lots absorb sunlight, artificially increasing local temperatures. Cars, air conditioners, and other equipment generate heat when operating, creating what is called an Urban Heat Island effect.
The accuracy of the US temperature record is questionable. Meteorologist Anthony Watts, creator of the science website WattsUpWithThat, led a team of volunteers that audited more than 1,000 US temperature gauge stations from 2007 to 2011. Over 70 percent of the sites were found to be located near artificial heating surfaces such as buildings or parking lots, rated as poor or very poor by the site rating system of the National Climatic Data Center, a NOAA organization. These stations were subject to temperature errors as large as 3.6°F (2°C).
Simple problems can throw off gauge readings. Temperature stations are louvered enclosures that are painted white to reflect sunlight and minimize solar heating. As a station weathers and the paint ages, it reads artificially high. A study published last month found that after only five years of aging, a temperature station will read about 2.9°F (1.6°C) too high. That error alone is greater than the one-degree rise over the last 130 years that NOAA is alarmed about.
In addition to temperature measurement error, NOAA makes “adjustments” to the raw temperature data. According to a 2008 paper, after raw thermometer data is received, a computer algorithm “homogenizes” the data, adjusting for time-of-observation, station moves, thermometer types, and other factors to arrive at the official temperature data set.
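The kind of adjustment described above — correcting a station’s record for a documented change such as a station move — can be sketched in a highly simplified form. This is a toy illustration of the general idea, not NOAA’s actual homogenization algorithm, and all numbers below are made up.

```python
# Toy "homogenization" sketch: a station is assumed to have moved at a known
# point in its record, introducing an artificial jump. We estimate the jump by
# comparing the station against a reference series (e.g., a neighbor average)
# before and after the break, then subtract it from the post-break segment.

def homogenize(series, reference, break_index):
    """Shift the post-break segment so its mean offset from the reference
    matches the pre-break offset."""
    pre = [s - r for s, r in zip(series[:break_index], reference[:break_index])]
    post = [s - r for s, r in zip(series[break_index:], reference[break_index:])]
    step = sum(post) / len(post) - sum(pre) / len(pre)  # estimated artificial jump
    return series[:break_index] + [s - step for s in series[break_index:]]

# Hypothetical annual means (°F): the station reads about 1°F high after an
# assumed move at index 3.
station   = [70.0, 70.2, 69.9, 71.1, 71.3, 71.0]
reference = [70.0, 70.1, 70.0, 70.1, 70.2, 70.0]
adjusted = homogenize(station, reference, 3)
```

The point of contention in the article is not that such corrections exist, but the direction and size of the net adjustment they produce in the official data set.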
This sounds reasonable until one looks at the adjustments NOAA has added. For temperature data from 1900 to 1960, very little adjustment is applied. But after 1960, NOAA adds an upward adjustment to the thermometer data that grows to 0.5°F (0.3°C) by the year 2000. This gives a whole new meaning to the phrase “man-made global warming.”
Heat waves are real, just as climate change is real. But a heat record in Las Vegas, or one degree of temperature rise since the Civil War, is not evidence that humans are the cause.
[First Published in The Washington Times]