By Nancy Thorner
Model data are too limited to predict far into the future; they can be used only in a very limited way to test a hypothesis. It makes no sense to depend on model predictions 50 or 100 years into the future as valid, or to change our way of life on the assumption that those predictions are true. – Dr. Anthony R. Lupo
On Tuesday, September 10, The Heartland Institute hosted the second in its series of conference calls with friends and allies previewing the Nongovernmental International Panel on Climate Change’s (NIPCC) Climate Change Reconsidered II: Physical Science ahead of its digital release on Tuesday, September 17th.
Featured on the September 10th call was Dr. Anthony R. Lupo, Department Chair and Professor of Atmospheric Science at the University of Missouri – Columbia and a lead author of the chapter on climate models in Climate Change Reconsidered II. Dr. Lupo was a Fulbright Scholar to the Russian Academy of Sciences during the 2003-2004 academic year.
He received his master’s and Ph.D. degrees in atmospheric physics from Purdue University and is a member of the American Meteorological Society, the National Weather Association, and the American Geophysical Union, and a Fellow of the Royal Meteorological Society. Lupo’s research has appeared many times in peer-reviewed journals, including National Weather Digest, Journal of Geophysical Research, and Bulletin of the American Meteorological Society.
Prior to introducing Dr. Lupo, Joseph Bast, CEO and president of The Heartland Institute, reiterated that Climate Change Reconsidered II is the result of a collaboration among three organizations: the Science & Environmental Policy Project, the Center for the Study of Carbon Dioxide and Global Change, and The Heartland Institute, with Heartland in charge of the editing and publishing. The lead authors and editors of the report are Dr. Craig D. Idso, Dr. Robert M. Carter, and Dr. S. Fred Singer.
Dr. Anthony Lupo gave a brief summary of the findings presented in his chapter. As related by Lupo, the first part of the chapter examines how numerical modeling works and some of its limitations. The second part deals with the use of models in climate forecasting and with the concept of “blocking” as a large-scale phenomenon and its impact.
Dr. Lupo then answered a series of questions posed by Joe Bast. Here are my own summaries and notes from some of the questions and answers:
Bast: How are findings of climate models validated?
Lupo: One way is to run the models backwards. The problem is validating models against today’s climate through this backward procedure so that they capture the variability of the elements affecting climate change. There are at least 78 circulation models, and one point of controversy is that each model arrives at a different forecast in a slightly different way. This reflects how incompletely the climate system is understood.
Bast: What role should models play in forecasting the future, given the spread of results produced by the 78 models?
Lupo: Models are the best tools we have, but we must view them with an eye toward their shortcomings. They are not gospel truth, only purveyors of possible outcomes. Consider the predictions of the IPCC: by the year 2100 the temperature will increase by 2 to 12 degrees Fahrenheit, while some models even show a slight cooling. The realistic likelihood is that the increase will be more like 2 – 4 degrees Fahrenheit.
Bast: How did the models fail to predict that during the past 16 – 17 years the temperature has been constant or flat?
Lupo: Models have their limitations, and their results can’t be treated as the truth about anything. Model data are too limited to predict far into the future; they can be used only in a very limited way to test a hypothesis. It makes no sense to depend on model predictions 50 or 100 years into the future as valid, or to change our way of life on the assumption that those predictions are true.
Jim Lakely, Director of Communications at The Heartland Institute, then took questions from those on the conference call. Many of the questions had to do with the models themselves and their reliability.
- Responses from Lupo (again in my words, not direct quotes) included the following. Regarding Jim Hansen, a long-time NASA scientist who is now retired: he has certainly been a big player in the global warming debate for a number of years, but with his funding dependent on the success of the global warming hypothesis, it would be unwise to rely on his forecasts.
- Why are climate models given any respect when their projections have been inaccurate for 15 years and, given the flaws in their methodology, never will be accurate? Lupo said many scientists are aware of the faulty model projections, but models are the best tools we have at this time, and in time they could be made more accurate. He expects they will continue to generate a range of outcomes, but with higher confidence in a narrower range than is now the case. He admitted, however, that we may never reach the point where models can determine with certainty that a given prediction will come to pass.
- Why does the government (EPA) continue to make policy decisions based on faulty model projections? Lupo compared any attempt to change government policy [especially one relating to a political agenda] to trying to turn the Titanic around. There is so much money and momentum behind the claim that CO2 is the main culprit of global warming that backing away from it will be slow and painful.
- Can models actually teach us anything worthwhile, given that they can be tweaked and modified to produce outputs that seem to support almost anyone’s agenda? Lupo took an optimistic view, noting that even our failures can teach us something. To circumvent faulty model assumptions, averaging a bank of models can yield a more realistic projection. “Ensemble” forecasts are also employed, whereby initial conditions are tweaked as many as a dozen times. If all or most of the tweaked model runs come up with basically the same numerical prediction, there is a relatively high degree of confidence that the prediction can be taken at face value.
- Why does the IPCC repeatedly forecast more warming, and express more confidence in computer models, than climate science actually supports? Lupo said the IPCC from the beginning has overestimated climate sensitivity to carbon dioxide, whereas skeptics properly understand that climate “is a very strong beast” that resists forcing through sensitivity to its surroundings. The IPCC ignores natural weather cycles and new evidence of a greater solar impact on climate than previously thought. The IPCC will acknowledge that the temperature has been flat for the last 15-20 years but at the same time declare: “Just wait, global warming is coming!”
An extensively peer-reviewed study published last December in the Journal of Atmospheric and Solar-Terrestrial Physics, with research conducted by Nicola Scafetta, a scientist at Duke University, addressed these gross omissions by the IPCC, noting that climate changes observed since 1850 are linked to cyclical, predictable, naturally occurring events in Earth’s solar system, with little or no help from us.
Two more calls are scheduled in the CCR-II conference call series:
*Tuesday, September 17 at Noon CST
Climate Change Reconsidered II Release Day
Speakers: Dr. Craig Idso, Dr. Willie Soon, and Dr. Fred Singer
*Tuesday, September 24 at Noon CST
Response to the Intergovernmental Panel on Climate Change (IPCC) Report (www.ipcc.ch)