March 12, 2009

The Other Side Of The Story

from The Resilient Earth

Why Climate Modeling Is Not Climate Science

Almost everyone has heard the recent announcement that global warming has been put on hold for 20 to 30 years. Earth's variable climate continues to make fools of climate scientists, obstinately refusing to follow the IPCC's climate change script. Why? Because the climate change doomsayers have put their faith in computer models, not in hard science.

Following a 30-year warming trend that started in the 1970s, global temperatures have leveled off and even declined since 2001 (see Greenland's Ice Armageddon Comes To An End). Despite rising greenhouse gas concentrations, and a heat surplus that should have caused global temperatures to continue rising, nature has pulled a thermal about-face and sent temperatures downward. Climate scientists are flummoxed.

“It is possible that a fraction of the most recent rapid warming since the 1970's was due to a free variation in climate,” Isaac Held of the US National Oceanic and Atmospheric Administration (NOAA) wrote in an email to Discovery News, “suggesting that the warming might possibly slow down or even stagnate for a few years before rapid warming commences again.”

“This is nothing like anything we've seen since 1950,” Kyle Swanson of the University of Wisconsin-Milwaukee said. “Cooling events since then had firm causes, like eruptions or large-magnitude La Niñas. This current cooling doesn't have one.” Why didn't the vast cadre of global warming researchers see this cooling trend coming? After all, they have been predicting how Earth's climate will behave 100 years in the future; surely they can predict what will happen over the next decade or two?

The problem here is that all the predictions bandied about by the IPCC and various climate change celebrities are based on General Circulation Models (GCMs). These are large, complex computer programs that attempt to model Earth's entire environment. A computer model is a simplified stand-in for some real system: a computer network, a protein molecule, the atmosphere, or Earth's entire climate. A modeler tries to capture the most important aspects of the system being modeled while leaving out unnecessary detail. Of course, determining what's important and what isn't is the trick.

This means that a model is always less complicated than the thing being modeled, which in turn means that a model never behaves exactly like the thing being modeled. When you have a large, complex system that can react in unexpected ways—called non-linear responses by mathematicians—the accuracy of models becomes even more suspect. In fact, it may be impossible to create an accurate model at all.
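
To see why non-linearity is so troublesome, consider a deliberately tiny sketch built on the textbook logistic map. This is a toy chaotic system, not a climate model, and the numbers are purely illustrative; it only shows how fast tiny input differences grow in a non-linear system:

    # Two runs of a simple non-linear system, the logistic map, whose
    # starting states differ by one part in a million.
    def logistic(x, r=3.9):
        return r * x * (1.0 - x)

    a, b = 0.400000, 0.400001  # two nearly identical "initial states"
    for step in range(50):
        a, b = logistic(a), logistic(b)

    print(f"after 50 steps: a = {a:.4f}, b = {b:.4f}, gap = {abs(a - b):.4f}")

After 50 steps the two runs are nowhere near each other. If a one-line system can defeat prediction this way, a model juggling oceans, clouds, and ice cannot hope to pin down exact future states; at best it can say something about trends.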

Modeling the atmosphere alone is complex, requiring knowledge of incoming solar radiation, the movement of air currents over land and sea, heat convection, the amount of water vapor, the effects of clouds, and on and on. People have been trying to model Earth's atmosphere for decades, primarily to predict the weather. The weather forecasts you hear on your evening news are all based on computer models. How accurate are these models? In the near term, a few days out, local weather forecasts are about 60% accurate when predicting high temperatures. A GCM must model many more features on top of these. In fact, GCMs started off very simple and have continued to grow in complexity, if not accuracy, over the years.

Storm track prediction is an example of quantitative modeling, where the expected results of a model are hard numbers. In the case of hurricanes, a storm's track and changes in strength over time are the desired results. But hurricane trackers get constant feedback from the real storms they track, allowing them to continually correct their models when they go wrong. We have so little accurate climate data that such corrections are not really possible with GCMs. When a GCM prediction diverges from reality, is the model wrong, is it missing some critical factor that has been overlooked, or is the baseline data erroneous or non-representative? It is very hard to tell. Ignoring the known inaccuracies of GCMs, scientists are still asking models to provide hard answers, something they just cannot do.
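
To make the contrast concrete, here is a minimal sketch of the predict-correct loop that observational feedback makes possible. Everything in it is hypothetical and made up for illustration: the velocities, the six-hour fix schedule, and the nudging gain bear no relation to any operational tracking system.

    import random

    def advect(position, velocity, dt):
        """Model step: move the storm at its estimated speed (km over dt hours)."""
        return position + velocity * dt

    # Truth the model doesn't know: the storm is actually moving faster.
    true_velocity, model_velocity = 12.0, 10.0   # km/h (illustrative numbers)
    true_pos = model_pos = 0.0
    gain = 0.5   # how strongly each new fix pulls the forecast toward reality

    for hour in range(1, 25):
        true_pos = advect(true_pos, true_velocity, 1.0)
        model_pos = advect(model_pos, model_velocity, 1.0)
        if hour % 6 == 0:                            # a new position fix arrives
            fix = true_pos + random.gauss(0.0, 1.0)  # noisy observation
            model_pos += gain * (fix - model_pos)    # the correction step
            print(f"h{hour:02d}: model {model_pos:6.1f} km, truth {true_pos:6.1f} km")

Without the correction step the model drifts 2 km farther from the truth every hour; with it, the error is repeatedly knocked back down. A GCM forecasting decades ahead gets no such fixes until it is far too late to matter.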

Climate change modeling is an example of qualitative modeling. Models of this type yield general trends and the overall effects of changing parameters. They are used to provide insight into processes where scientists' intuition fails. For example, a qualitative model can tell us that adding more CO2 to the atmosphere should cause warming. But qualitative models should not be used to make concrete predictions of future conditions, such as the global average temperature 100 years from now. As Richard W. Hamming put it, “The purpose of computing is insight, not numbers.”
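
A zero-dimensional energy balance calculation shows what a qualitative model can and cannot deliver. The 3.7 W/m^2 forcing figure for doubled CO2 is a standard textbook value; the feedback parameter lambda is genuinely uncertain, so the range below is purely illustrative:

    # Equilibrium warming dT = F / lambda: the sign is robust, the size is not.
    F_2XCO2 = 3.7                      # radiative forcing of doubled CO2, W/m^2

    for lam in (0.8, 1.2, 2.0):        # feedback parameter, W/m^2 per K (illustrative)
        dT = F_2XCO2 / lam             # equilibrium temperature change, K
        print(f"lambda = {lam:.1f} W/m^2/K  ->  dT = {dT:.1f} K")

Every positive value of lambda gives warming, so the qualitative conclusion holds. But the predicted number ranges from under 2 K to over 4.5 K depending on a parameter no one can pin down, which is exactly why Hamming's dictum applies.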

That, unfortunately, has not stopped the IPCC and affiliated climate alarmists from trotting out model predictions as scientific fact. In doing so they commit the cardinal sin of computer modeling: believing that the model is the thing being modeled. Factually speaking, such numbers are at best a guess and at worst outright lies. So, why is Earth cooling down?

Russian scientists have been saying for years that we're headed for a cold spell, based on known solar cycles. Swanson credits the alignment of a series of climate processes for the cooling, but what is actually causing it remains a mystery. Sinking water currents in the north Atlantic Ocean could be transferring heat into the ocean depths.

Extensive low cloud cover in the tropics may be reflecting more of the sun's energy than usual back out into space. This would be in line with historically low solar activity, according to the predictions of Henrik Svensmark and Nir Shaviv. In 1997, Svensmark and Eigil Friis-Christensen popularized a theory linking galactic cosmic rays and global climate change, mediated primarily by variations in the intensity of the solar wind, which they have termed cosmoclimatology. This theory was later expanded on by Shaviv. The influence of cosmic rays on cloud cover has been ignored by the traditional climate science community because it downplays the impact of CO2 on global temperature.

In 2008, a new computer model developed by German researchers, reported in the journal Nature, suggested that cooling will counter greenhouse warming for the next decade. However, that model predicts temperatures will be rising again by 2020. More recently, a study in Geophysical Research Letters predicts global cooling for the next 20 to 30 years. But that report also predicts that, once the hiccup is over, “warming will return and be very aggressive.”

Why do all of the “mainstream” climate researchers always add that “global warming will be back”? Because they are still using the outputs of the same models that failed to predict the current cooling trend, the same models that have never correctly predicted Earth's climate. Climate modeling has become a crutch for a previously ignored scientific community thrust into the public light by the global warming scare. They don't have real scientific answers, so they use their wonky models, hope for the best, and keep asking for more grant money. It is time the public said enough.
