What Is A “Normal” Climate?

Guest Opinion: Dr. Tim Ball – Re-Blogged From http://www.WattsUpWithThat.com

There is a form of argument called reductio ad absurdum: if following a position to its logical conclusion produces an absurdity, the position was absurd in the first place. It doesn’t work as well as it used to because there is an embarrassment of absurdity in today’s world. However, some arguments are exposed by such an approach.

Al Gore’s fairy tale movie, An Inconvenient Truth, claimed global temperature was “just right.” It is as Goldilocks said about the porridge, “Not too hot, not too cold, but just right.” The movie fully deserved the Oscar because it was a fairy tale produced in Hollywood, the land of make-believe. The great Goracle declares we must maintain this normal because the wicked witch CO2 threatens it. From whichever castle he currently occupies, he dictates that we maintain the status quo so he can continue his “normal” lifestyle, including profiting from selling the tale. He delights in referencing people from the past, such as Arrhenius, Callendar, or Roger Revelle, conveniently ignoring that they all lived through different “normals”. He also ignores the “normal” conditions his Ice Age ancestors enjoyed.

Gore wants to maintain his “normal” by reducing the amount of CO2 in the atmosphere to pre-industrial levels. The Intergovernmental Panel on Climate Change (IPCC) says this was 270 ppm. This is incorrect, but let’s assume it is true in order to consider the consequences of achieving that level. Assume also that the IPCC is correct that CO2 caused virtually all the increase in global temperatures from the nadir of the Little Ice Age, especially since 1950. The IPCC says current levels are 400 ppm, so presumably achieving pre-industrial levels requires a 130-ppm reduction. According to the science of the IPCC and fellow Nobel laureate Al Gore, CO2 levels determine temperature, so this reduction would mean a return to Little Ice Age conditions. A multitude of sources itemizes those conditions, particularly Jean Grove’s The Little Ice Age, a listing at CO2Science.org, and the reports of the Nongovernmental International Panel on Climate Change (NIPCC).
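For concreteness, here is the arithmetic in a few lines of Python. It is only a back-of-envelope sketch: the 270 ppm and 400 ppm figures are the IPCC numbers quoted above, treated as exact purely for illustration.

```python
# Back-of-envelope arithmetic for the IPCC figures quoted above.
# Treating 270 ppm (pre-industrial) and 400 ppm (current) as exact
# is an assumption made purely for illustration.
PRE_INDUSTRIAL_PPM = 270.0
CURRENT_PPM = 400.0

reduction_ppm = CURRENT_PPM - PRE_INDUSTRIAL_PPM     # 130 ppm
reduction_pct = 100.0 * reduction_ppm / CURRENT_PPM  # ~32.5%

print(f"Required reduction: {reduction_ppm:.0f} ppm "
      f"({reduction_pct:.1f}% of the current level)")
```

That 32.5% figure matters below, because the fertilization studies report vegetation responses as percentage changes in CO2.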

The IPCC and Gore consider only the temperature implications of CO2, but it is essential to plant life, which in turn determines the oxygen levels essential to all life. How much vegetative loss would occur with a 130-ppm reduction? It is only a computer-model determination, but the abstract of Donohue et al. (pay-walled) explains:

Satellite observations reveal a greening of the globe over recent decades. The role in this greening of the “CO2 fertilization” effect—the enhancement of photosynthesis due to rising CO2 levels—is yet to be established. The direct CO2 effect on vegetation should be most clearly expressed in warm, arid environments where water is the dominant limit to vegetation growth. Using gas exchange theory, we predict that the 14% increase in atmospheric CO2 (1982–2010) led to a 5 to 10% increase in green foliage cover in warm, arid environments. Satellite observations, analyzed to remove the effect of variations in precipitation, show that cover across these environments has increased by 11%. Our results confirm that the anticipated CO2 fertilization effect is occurring alongside ongoing anthropogenic perturbations to the carbon cycle and that the fertilization effect is now a significant land surface process.

Again assuming they are correct, a 14% increase in CO2 resulted in an 11% increase in vegetation. What impact would a reduction of 130 ppm, approximately a 32% decrease, have? And what would the combined impact of reduced CO2 fertilization and lower temperature be? Grove and others showed the impact of the temperature reduction, but not of CO2.
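One crude way to frame the question is a linear extrapolation of the Donohue et al. numbers. The sketch below is mine, not the paper’s: it assumes the foliage response is proportional to the relative CO2 change and that a warm-arid-environment result can be stretched further than the authors claim, so treat the output as an order-of-magnitude illustration only.

```python
# Crude linear extrapolation of the Donohue et al. figures quoted above.
# ASSUMPTIONS (mine, not the paper's): the foliage response is linear
# in the relative CO2 change, and the warm-arid result generalizes.
co2_increase = 0.14    # 14% rise in CO2, 1982-2010 (from the abstract)
cover_increase = 0.11  # 11% observed rise in green foliage cover

sensitivity = cover_increase / co2_increase  # ~0.79% cover per 1% CO2

co2_decrease = 130.0 / 400.0                 # ~32.5% cut to pre-industrial
estimated_loss = sensitivity * co2_decrease

print(f"Naive estimate: ~{100 * estimated_loss:.0f}% loss of green foliage "
      f"cover for a {100 * co2_decrease:.1f}% CO2 reduction")
```

On those assumptions, the answer is a loss of roughly a quarter of green foliage cover, before any temperature effect is added.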

In the 1970s, when global cooling was the consensus, Martin Parry produced studies of the impact of cooling over the course of the Little Ice Age (Figure 1).


Figure 1

Figure 1 shows the county of Berwickshire in the Borders Region of the UK, with a high percentage of land lost from cultivation over the period. What was normal for the people living through those times? The answer is whatever they experienced.

The World Meteorological Organization (WMO) introduced the 30-year normal, purportedly to address this problem of what is normal or average for planning and other applications. As they explain,

“The Standard Climate Normals underpin many climate services and applications, including climatologies, and also comprise the reference period for the evaluation of anomalies in climate variability and change monitoring.”

A problem becomes evident when comparing historic records, such as those for the period 1781 to 1810, with a modern normal. Which modern normal would you use, the first one, 1931–1960, or the current one, 1981–2010? William Wright wrote a paper about the problem in which he

“…argued in favour of a dual normals standard. CCl-MG concurred with the conclusion that there is a need for making frequent updates in computing the normals for climate applications (prediction and climatology purposes), based on the need to base fundamental planning decisions on average and extreme climate conditions in non stationary climate conditions.”

And there is the rub, “non stationary climate conditions.”
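The base-period problem is easy to demonstrate. The sketch below builds a synthetic temperature record (invented purely for illustration) and computes the same observation’s anomaly against the first WMO standard period and the current one.

```python
import numpy as np

# Synthetic annual-mean temperatures, 1931-2010, with a slow warming
# trend plus noise.  The numbers are invented purely for illustration.
rng = np.random.default_rng(0)
years = np.arange(1931, 2011)
temps = 8.0 + 0.01 * (years - 1931) + rng.normal(0.0, 0.5, years.size)

def normal(start, end):
    """30-year climate normal: the mean over [start, end] inclusive."""
    mask = (years >= start) & (years <= end)
    return temps[mask].mean()

old_normal = normal(1931, 1960)  # the first standard period
new_normal = normal(1981, 2010)  # the current standard period

obs = temps[-1]  # the 2010 observation
print(f"vs 1931-1960 normal: anomaly {obs - old_normal:+.2f} C")
print(f"vs 1981-2010 normal: anomaly {obs - new_normal:+.2f} C")
```

The same 2010 observation looks markedly warmer against the 1931–1960 normal than against the 1981–2010 one, which is precisely the comparison problem behind Wright’s dual-normals proposal.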

There is also the problem of the adjustments endemic to all “official” data. The National Oceanic and Atmospheric Administration (NOAA) says,

“Several changes and additions have been incorporated into the 1981-2010 Normals. Monthly temperature and precipitation normals are based on underlying data values that have undergone additional quality control. Monthly temperatures have also been standardized to account for the effects of station moves, changes in instrumentation, etc.”

Presumably this means you cannot compare the results with those of earlier “normals”.
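To see why, here is a toy version of one common adjustment for a station move. It is my illustration, not NOAA’s actual procedure, and the readings and offset are invented: the point is only that the “normal” computed from adjusted values differs from the one computed from the raw record.

```python
# Toy homogenization: adjust for a documented station move by shifting
# the earlier segment to match the level of the new site.
# Data and offset are invented purely for illustration.
before_move = [11.2, 11.4, 11.1, 11.3]  # annual means, old site (C)
after_move  = [10.6, 10.8, 10.5, 10.7]  # annual means, new site (C)

offset = (sum(after_move) / len(after_move)
          - sum(before_move) / len(before_move))
adjusted = [t + offset for t in before_move] + after_move

raw = before_move + after_move
print(f"normal from raw data:      {sum(raw) / len(raw):.2f} C")
print(f"normal from adjusted data: {sum(adjusted) / len(adjusted):.2f} C")
```

Two normals built from differently processed versions of the same record are not directly comparable, which is the point above.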

NOAA informs us that

“Normals are a large suite of data products that provide users with many tools to understand typical climate conditions for thousands of locations across the United States.”

No, they aren’t! They are only 30-year averages that add nothing to understanding typical climate conditions for any location. Because the 30-year average shifts with mechanisms that operate on longer-than-30-year timescales, it tells you only the climate of that particular period. The IPCC’s omission of the Milankovitch mechanisms illustrates the problem. As reported, Professor Lindzen observed at the recent APS workshop,

He also notes that the IPCC estimate of the man-made effect is about 2 Watts/m2 in AR5 and that is much smaller than the Milankovitch effect of 100 Watts/m2 at 65 degrees north, see Edvardson, et al.

Figure 2 shows the 100 Watts/m2 insolation variability at 65°N calculated by Berger in 1978 and discussed in my article “Important But Little Known ‘Earth’ Scientists.”


Figure 2: Variations in the amount of insolation (incoming solar radiation) at 65°N

Source: BERGER, A. 1978. Long-term variations of daily insolation and Quaternary climatic changes. J. Atmos. Sci. 35: 2362–2367.
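The scale of the Milankovitch effect can be reproduced from the standard daily-mean top-of-atmosphere insolation formula. The sketch below is my own illustration, not Berger’s 1978 series expansion: it computes summer-solstice insolation at 65°N for the two extremes of Earth’s obliquity cycle, holding the Earth–Sun distance at its mean.

```python
import math

S0 = 1361.0  # solar constant, W/m^2

def daily_mean_insolation(lat_deg, decl_deg, dist_factor=1.0):
    """Standard daily-mean top-of-atmosphere insolation (W/m^2).

    decl_deg equals the obliquity at the summer solstice; dist_factor
    is (mean distance / actual distance)^2, set by eccentricity and
    precession (1.0 = mean distance).
    """
    phi, delta = math.radians(lat_deg), math.radians(decl_deg)
    cos_h0 = max(-1.0, min(1.0, -math.tan(phi) * math.tan(delta)))
    h0 = math.acos(cos_h0)  # sunset hour angle, clamped for polar day/night
    return (S0 / math.pi) * dist_factor * (
        h0 * math.sin(phi) * math.sin(delta)
        + math.cos(phi) * math.cos(delta) * math.sin(h0))

# Obliquity swings between about 22.1 and 24.5 degrees over ~41,000 years.
low = daily_mean_insolation(65.0, 22.1)
high = daily_mean_insolation(65.0, 24.5)
print(f"65N solstice insolation: {low:.0f} to {high:.0f} W/m^2 "
      f"(difference {high - low:.0f} W/m^2 from obliquity alone)")
```

Obliquity alone moves the figure by roughly 40 W/m2 here; the distance factor driven by eccentricity and precession adds several percent more, which is how insolation swings of the order Berger computed arise, dwarfing the roughly 2 W/m2 anthropogenic figure quoted above.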

During a radio program, an IPCC modeler told me they omitted Milankovitch because they considered the time scale inappropriate.

Apparently people are planning and making management decisions on the basis of these “normals”. NOAA reports,

In addition to weather and climate comparisons, Normals are utilized in seemingly countless applications across a variety of sectors. These include: regulation of power companies, energy load forecasting, crop selection and planting times, construction planning, building design, and many others.
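To make one of those applications concrete: utilities commonly turn temperature normals into degree days for energy load forecasting. The sketch below uses the conventional 18 °C base with invented daily normals; if the underlying normal shifts, so does every forecast built on it.

```python
# Heating degree days (HDD), a standard use of temperature normals in
# energy load forecasting: HDD = max(0, base - daily mean temperature).
# The 18 C base is conventional; the daily normals are invented.
BASE_C = 18.0
january_normals = [-5.2, -4.8, -6.1, -5.5, -4.9]  # daily normals, invented

hdd = sum(max(0.0, BASE_C - t) for t in january_normals)
print(f"Heating degree days over the period: {hdd:.1f}")
```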

They are assuming that those conditions will continue. It reminds me of a presentation by Michael Schlesinger at a conference in Edmonton on the future of climate on the Canadian Prairies. A bureaucrat said, “We are planning reforestation in parts of Southern Alberta, and your data shows it will be a desert in 50 years. How accurate is your prediction?” Schlesinger said about 50 percent. The bureaucrat replied, “My Minister wants 98 percent.”

It is no better today. Figure 3 is a map of 12-month precipitation forecast accuracy for Canada. The accuracy is less than 40% for 95% of Canada when compared to the 30-year normal for 1981–2010.


Figure 3

I understand 30 years was chosen because 30 is considered a statistically significant sample size (n) for any population (N). That is of no value for climate patterns created by mechanisms that operate over much longer periods. NOAA acknowledges this when they write,

In fact, when the widespread practice of computing Normals commenced in the 1930s, the generally-accepted notion of the climate was that underlying long-term averages of climate time series were constant.
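That constancy assumption is easy to probe. The sketch below invents a toy climate, a constant baseline plus a 60-year oscillation of the kind associated with multidecadal ocean cycles, and shows successive 30-year “normals” drifting even though the long-run mean never moves.

```python
import math

# A toy "climate": constant baseline plus a 60-year oscillation.
# All numbers are invented purely to illustrate the statistics.
BASELINE = 10.0   # degrees C
AMPLITUDE = 0.4   # degrees C
PERIOD = 60.0     # years

def annual_mean(year):
    return BASELINE + AMPLITUDE * math.sin(2.0 * math.pi * year / PERIOD)

def normal_30yr(start):
    return sum(annual_mean(y) for y in range(start, start + 30)) / 30.0

for start in (1901, 1931, 1961, 1991):
    print(f"{start}-{start + 29} normal: {normal_30yr(start):.2f} C")
```

Successive normals here differ by about a quarter of a degree although nothing about the underlying climate has changed; a 30-year window merely samples whatever phase of the longer cycle it lands on.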

That idea of a constant climate permeated and became the fundamental public understanding, making current changes appear unnatural. The 30-year normal also became the “average” quoted by radio and TV weather people. NOAA confirms this adoption.

Meteorologists and climatologists regularly use Normals for placing recent climate conditions into a historical context. NOAA’s Normals are commonly seen on local weather news segments.
