Five Points About Climate Change

By Professor Philip Lloyd – Re-Blogged From http://www.WattsUpWithThat.com

Daily we are told that we are wicked to burn fossil fuels. The carbon dioxide which is inevitably emitted accumulates in the atmosphere, and the result is “climate change.” If the stories are to be believed, disaster awaits us. Crops will wither, rivers will dry up, polar bears will disappear and malaria will become rampant.

It is a very big “IF”. We could waste trillions for nothing. Indeed, Lord Stern has estimated that it would be worth spending a few trillion dollars each year to avoid a possible disaster in 200 years’ time. Because he is associated with the London School of Economics, he is believed – by those whose experience of insurance is limited. Those who have experience know that it is not worth insuring against something that might happen in 200 years’ time – it is infinitely better to make certain your children can cope. With any luck, they will do the same for their children, and our great-great-great-grandchildren will be fine individuals more than able to deal with Lord Stern’s little problem.
So I decided to examine the hypothesis from first principles.

There are five steps to the hypothesis:

1. The carbon dioxide (CO2) content of the atmosphere is rising.

2. The rise in CO2 in the atmosphere is largely paralleled by the increase in fossil fuel combustion. Combustion of fossil fuels results in emission of CO2, so it is eminently reasonable to link the two increases.

3. CO2 can scatter infra-red radiation, primarily at wavelengths around 15 µm. Infra-red of that wavelength, which should be carrying energy away from the planet, is scattered back into the lower troposphere, where the added energy input should cause an increase in the temperature.

4. The expected increase in the energy of the lower troposphere may cause long-term changes in the climate system, which could be characterized by an increasing frequency and/or magnitude of extreme weather events, an increase in sea temperatures, a reduction in ice cover and many other changes.

5. The greatest threat is that sea levels may rise and flood large areas presently densely inhabited.

Are these hypotheses sustainable in the scientific sense? Is there a solid logic linking each step in this chain?

The increase in CO2 in the atmosphere is incontrovertible. Many measurements show this. For instance, since 1958 there have been continuous measurements at the Mauna Loa observatory in Hawaii:

[Figure: continuous CO2 measurements at Mauna Loa, 1958 to the present, showing the annual cycle and the long-term rise]

The annual rise and fall is due to deciduous plants growing or resting, depending on the season.  But the long-term trend is ever-increasing levels of CO2 in the atmosphere.

There were only sporadic readings of CO2 before 1958, no continuous measurements. Nevertheless, there is sufficient information to construct a view back to 1850:

[Figure: atmospheric CO2 levels reconstructed back to 1850]

There was a slight surge in atmospheric levels about 1900, then a period of near stasis until after 1950, when there was a strong and ongoing increase which has continued to this day. Remember this pattern – it will re-appear in a different guise.

The conclusion is clear – there has been an increase in the carbon dioxide in the atmosphere. What may have caused it?

Well, there is the same pattern in the CO2 emissions from the burning of fossil fuels and other industrial sources:

[Figure: CO2 emissions from fossil fuel combustion and other industrial sources, 1850 to the present]

A similar pattern is no proof – correlation is not causation. But if you try to link the emissions directly to the growth in atmospheric CO2, you fail. There are many partly understood “sinks” which remove CO2 from the atmosphere. Trying to follow the dynamics of all the sinks has proved difficult, so we do not have a really good chemical balance between what is emitted and what turns up in the air.

Fortunately isotopes come to our aid. There are two primary plant chemistries, called C3 and C4. C3 plants are ancient, and they tend to prefer the 12C carbon isotope to the 13C. Plants with a C4 chemistry are comparatively recent arrivals, and they are not so picky about their isotopic diet. Fossil fuels primarily come from a time before C4 chemistry had evolved, so they are richer in 12C than today’s biomass. Injecting into the air 12C-rich CO2 from fossil fuels should therefore cause the 13C in the air to drop, which is precisely what is observed:

[Figure: the declining proportion of 13C in atmospheric CO2]

So the evidence that fossil fuel burning is the underlying cause of the increase in the CO2 in the atmosphere is really conclusive.  But does it have any effect?

Carbon dioxide scatters infra-red over a narrow range of energies.  The infra-red photons, which should be carrying energy away from the planet, are scattered back into the lower troposphere. The retained energy should cause an increase in the temperature.

Viewing the planet from space is revealing:

[Figure: outgoing infra-red spectrum of the Earth seen from space, compared with the spectrum of a 280 K blackbody]
The upper grey line shows the spectrum which approximates that of a planet of Earth’s average albedo at a temperature of 280 K. That is the temperature about 5 km above the surface, where incoming and outgoing radiation are in balance. The actual spectrum is shown by the blue line. The difference between the two is the energy removed from the outgoing radiation by scattering and absorption by greenhouse gases. Water vapour has by far the largest effect. CO2 contributes to the loss between about 13 and 17 µm, and ozone contributes to the loss between about 9 and 10 µm.


The warming effect of carbon dioxide increases only logarithmically with concentration: each added increment has less effect than the one before, so doubling the concentration will not double the effect. At present there is ~400 ppm in the atmosphere. We are unlikely to see a much different world at 800 ppm. It will be greener – plants grow better on a richer diet – and it may be slightly warmer and slightly wetter, but otherwise it would look very like our present world.


However, just as the effect of each added increment lessens as the concentration rises, the effect of an increment was proportionately larger when concentrations were lower. If there are to be any observable effects of past increases, they should therefore be visible in the historical records. Have we seen them?


There are “official” historical global temperature records. A recent version, from the Hadley Centre/Climatic Research Unit (HadCRUT), is:

[Figure: global temperature anomaly record (Hadley/CRU)]


The vertical axis gives what is known as the “temperature anomaly”, the change from the average temperature over the period 1950-1980. Recall that the rise in carbon dioxide only became significant after 1950, so we can look at this figure with that fact in mind:

* from 1870 to 1910, temperatures dropped, while there was no significant rise in carbon dioxide;

* from 1910 to 1950, temperatures rose, while there was no significant rise in carbon dioxide;

* from 1950 to 1975, temperatures dropped, while carbon dioxide increased;

* from 1975 to 2000, both temperature and carbon dioxide increased;

* from 2000 to 2015, temperatures rose only slowly, while carbon dioxide increased strongly.

Does carbon dioxide drive temperature changes? Looking at this evidence, one would have to say that, if there is any relationship, it must be a very weak one. In one study I made of the ice core record over 8 000 years, I found that there was a 95% chance that the temperature would change naturally by as much as ±2 °C during 100 years. During the 20th century, it changed by about 0.8 °C. The conclusion? If carbon dioxide in the atmosphere does indeed cause global warming, then the signal has yet to emerge from the natural noise.

One of the problems with the “official” temperature records such as the Hadley series shown above is that the official record has been the subject of “adjustments”. While some adjustment of the raw data is obviously needed, such as that for the altitude of the measuring site, the pattern of adjustments has been such as to cool the past and warm the present, making global warming seem more serious than the raw data warrants.

It may seem unreasonable to refer to the official data as “adjusted”. However, the basis for the official data is what is known as the Global Historical Climatology Network, or GHCN, and it has been arbitrarily adjusted. For example, it is possible to compare the raw data for Cape Town, 1880-2011, to the adjustments made to the data in developing GHCN series Ver. 3:

[Figure: raw Cape Town temperature data, 1880-2011, and the GHCN Ver. 3 adjustments applied to it]

The GHCN is compiled by the US National Oceanic and Atmospheric Administration, and the Goddard Institute for Space Studies uses it in its global temperature analysis. The Institute was approached for the metadata underlying the adjustments. They provided a single line of data, giving the station’s geographical co-ordinates and height above mean sea level, and a short string of meaningless data including the word “COOL”. The basis for the adjustments is therefore unknown, but the fact that about 40 successive years of data were “adjusted” by exactly 1.10 °C strongly suggests fingers rather than algorithms were involved.

There has been so much tampering with the “official” records of global warming that they have no credibility at all. That is not to say that the Earth has not warmed over the last couple of centuries.  Glaciers have retreated, snow-lines risen. There has been warming, but we do not know by how much.

Interestingly, the observed temperatures are not unique. For instance, the melting of ice on Alpine passes in Europe has revealed paths that were in regular use a thousand years and more ago. They were then covered by ice which has only melted recently. The detritus cast away beside the paths by those ancient travellers is providing a rich vein of archaeological material.

So the world was at least as warm a millennium ago as it is today. It has warmed over the past few hundred years, but the warming is primarily natural in origin, and has nothing to do with human activities.  We do not even have a firm idea as to whether there is any impact of human activities at all, and certainly cannot say whether any of the observed warming has an anthropogenic origin. The physics say we should have some effect; but we cannot yet distinguish it from the natural variation.

Those who seek to accuse us of carbon crime have therefore developed another tool – the global circulation model. This is a computer representation of the atmosphere, which calculates the conditions inside a slice of the atmosphere, typically 5 km × 5 km × 1 km, and links each slice to its neighbours (if you have a big enough computer – otherwise your slices have to be bigger).

The modellers typically start their calculations some years back, for which there is a known climate, and try to see whether they can predict the (known) climate from their starting point up to today. There are many adjustable parameters in the models, and by twiddling enough of these digital knobs, they can “tune” the model to history.

Once the model seems to be able to reproduce historical data well enough, it is let rip on the future. There is a hope that, while the models may not be perfect, if different people run different tunings at different times, a reasonable range of predictions will emerge, from which some idea of the future may be gained.

Unfortunately the hopes have been dashed too often. The El Niño phenomenon is well understood, and it has a significant impact on the global climate, yet none of the models can cope with it. Similarly, the models cannot do hurricanes/typhoons – the 5 km × 5 km scale is just too coarse. They cannot do local climates either – a test of two areas only 5 km apart, one of which receives at least 2 000 mm of rain annually while the other averages just on 250 mm, failed badly. There were good wind and temperature data, and the local topography was known. The problem was modelled with a very fine grid, but there were not enough tuning knobs to be able to match history.

Even the basic physics used in these models fails. The basic physics predicts that, between the two Tropics, the upper atmosphere should warm faster than the surface. We regularly fly weather balloons carrying thermometers into this region. There are three separate balloon data sets, and they agree that there is no sign of extra warming:

[Figure: modelled versus balloon-measured warming trends in the tropical troposphere]


The average of the three sets is given by the black squares. The altitude is given in terms of pressure: 100 000 Pa at ground level and 20 000 Pa at about 9 km above the surface. There are 22 different models, and their average is shown by the black line. At ground level, measurement shows warming of 0.1 °C per decade, but the models predict 0.2 °C per decade. At 9 km, measurement still shows close to 0.1 °C per decade, but the models show an average of 0.4 °C per decade, with extreme values as high as 0.6 °C per decade. Models that are wrong by a factor of 4 or more cannot be considered scientific. They should not even be accepted for publication – they are wrong.

The hypothesis that we can predict future climate on the basis of models that are already known to fail is false. International agreements to control future temperature rises to X degrees C above pre-industrial global averages have more to do with the clothing of emperors than reality.

So, at the third step of our examination of the climate boondoggle, we can only conclude that yes, the world is warming, but by how much, and why, we really haven’t a clue.

What might the climate effects of a warmer world be? First, what is “climate”? It is the result of averaging a climatological variable, such as rainfall or atmospheric pressure, measured typically over a month or a season, where the average is taken over several years so as to give an indication of the weather that might be expected in that month or season.

Secondly, we need to understand the meaning of “change”. In this context it clearly means that the average of a climatological variable over X number of years will differ from the same variable averaged over a different set of X years.  But it is readily observable that the weather changes from year to year, so there will be a natural variation in the climate from one period of X years to another period X years long.  One therefore needs to know how long X must be to determine the natural variability and thus to detect reliably any change in the measured climate.

This aspect of “climate change” appears to have been overlooked in all the debate.  It seems to be supposed that there was a “pre-industrial” climate, which was measured over a large number of years before industry became a significant factor in our existence, and that the climate we now observe is statistically different from that hypothetical climate.

The problem, of course, is that there is very little actual data from those pre-industrial days, so we have no means of knowing what the climate really was.  There is no baseline from which we can measure change.

Faced by this difficulty, the proponents of climate change have modified the hypothesis. It is supposed that the observed warming of the earth will change the climate in such a way as to make extreme events more frequent. This does not alter the difficulty; in fact, it makes it worse.

To illustrate, assume that an extreme event is one that falls outside the 95% probability bounds. In 100 years, one would then expect 5 extreme events on average. Instead of 100 years of data from which to obtain the average climate, there are now only 5 events from which to characterize the extremes, and the relative error in averaging 5 variable events is obviously much larger than the relative error in averaging 100.

The rainfall data for England and Wales demonstrate this quite convincingly:

[Figure: England and Wales annual rainfall record, detrended, with 95% bounds]

The detrended data are close to normally distributed, so it is quite reasonable to use normal statistics here. The 5% limits are thus two standard deviations either side of the mean. In the 250-year record, 12.5 extreme events (those outside the 95% bounds) would be expected. In fact, there are 7 above the upper bound and 4 below the lower bound, or 11 in total. Thus it takes 250 years to obtain even a reasonable estimate (within about 12%) of the frequency of extreme rainfall. There is no possibility of detecting any change in that frequency, as would be needed to demonstrate “climate change”.

Indeed, a human lifespan is insufficient even to establish the frequency of the extreme events. In successive 60-year periods, there are 2, 4, 2 and 2 events, an average of 2.5 events with a standard deviation of 1.0. There is thus a 95% chance of seeing between 0.5 and 4.5 extreme events in 60 years, where 3 (5% of 60) are expected. Several lifetimes are necessary to determine the frequency with any accuracy, and many more to determine any change in the frequency.

The world is known to have been warming for at least 150 years. If warming had resulted in more extreme weather, some evidence of an increase in extreme events over that period might have been expected. The popular press certainly tries to be convincing when an apparently violent storm arises. But none of the climatological indicators with data going back at least 100 years shows any sign of an increase in the frequency of extreme events.

For instance, there have been many claims that tropical cyclones are increasing in their frequency and severity.  The World Meteorological Organisation reports:  “It remains uncertain whether past changes in tropical cyclone activity have exceeded the variability expected from natural causes.”

It is true that the damage from cyclones is increasing, but this is not due to more severe weather.  It is the result of there being more dwellings, and each dwelling being more valuable, than was the case 20 or more years ago.  Over a century of data was carefully analysed to reach this conclusion.  The IPCC report on extreme events agrees with this finding.

Severe weather of any kind is most unlikely to make any part of our planet uninhabitable – that includes drought, severe storms and high winds. In fact, this is not too surprising – humanity has learned how to cope with extreme weather, and human beings occupy regions from the most frigid to the most scalding, from sea level to heights where sea-level-dwellers struggle for breath. Not only are we adaptable, but we have also learned how to build structures that will shield us from the forces of nature.

Of course, such protection comes at a cost. Not everyone can afford the structures needed for their preservation. Villages are regularly flattened by storms that would leave most modern cities undamaged. Flood control measures are designed for one-in-a-hundred-year events, and they generally work – whereas low-lying areas in poor nations are regularly inundated for want of suitable defences.

Indeed, the ability of engineers to protect us against all manner of natural forces is remarkable. For instance, the magnitude 9 Tōhoku earthquake of 2011 (which caused the tsunami that destroyed the reactors at Fukushima) caused little physical damage to buildings, whereas earlier that year the far smaller magnitude 6.3 earthquake in Christchurch, New Zealand, toppled the cathedral, which was not designed to withstand earthquakes.

We should not fear extreme weather events. There is no evidence that they are any stronger than they were in the past, and most of us have adequate defences against them. Of course, somewhere our defences will fail, but that is usually because of a design fault by man, not an excessive force of Nature. Here, on the fourth step of our journey, we can clearly see the climate change hypothesis stumble and fall.

In the same way, most of the other scare stories about “climate change” fail when tested against real data. Polar bears are not vanishing from the face of the earth; indeed, the International Union for the Conservation of Nature can detect no change in the rate of loss of species over the past 400 years. Temperature has never been a strong determinant of the spread of malaria – lack of public health measures is a critical component in its propagation. Species are migrating, but whether temperature is the driver is doubtful – diurnal and seasonal temperature changes are so huge that a fractional change in the average temperature is unlikely to be the cause. Glaciers are melting, but the world is warmer, so you would expect them to melt.

There remains one last question – will the seas rise and submerge our coastlines?

First it needs to be recognized that the sea level is rising. It has been rising for about the past 25 000 years. However, for the past 7 millennia it has been rising more slowly than ever before:

[Figure: sea level rise over the past 25 000 years]

The critical question is whether the observed slow rate of rise has increased as a result of the warming climate. There are several lines of evidence that it has not.  One is the long-term data from tide gauges. These have to be treated with caution because there are areas where the land is sinking (such as the Gulf of Mexico, where the silt carried down the Mississippi is weighing down the crust), and others where it is rising (such as much of Scandinavia, relieved of a burden of a few thousand metres of ice about 10 000 years ago). A typical long-term tide gauge record is New York:

[Figure: New York tide gauge record, 1860-2014]

The 1860-1950 trend was 2.47-3.17 mm/a; the 1950-2014 trend was 2.80-3.42 mm/a, both ranges at the 95% confidence level. The two ranges overlap, so the trends are statistically indistinguishable and there is no significant evidence of any acceleration after 1950.

Another line of evidence comes from satellite measurements of sea level. The figure below shows the latest available satellite information – it extends back only to 1993. Nevertheless, the 3.3 ± 0.3 mm/a rise in sea level is entirely consistent with the tide gauge record:

[Figure: satellite-measured global mean sea level since 1993]

Thus several lines of evidence point to the present rate of sea level rise being about 3 mm/a, or 30 cm per century. Our existing defences against the sea already cope with diurnal tidal changes of several metres, and with low-pressure-induced storm surges of several metres more. The average height of our defences above mean sea level is about 7 m, so an extra 0.3 m over the next century would make little difference to the number of waves that occasionally overtop the barrier.

The IPCC predicts that the sea level will rise by between 0.4 and 0.7m during this century. Given the wide range of the prediction, there is a possibility they could be right. Importantly, even a 0.7m rise is not likely to be a disaster, in the light of the fact that our defences are already metres high – adding 10% to them over the next 80 years would be costly, but we would have decades to make the change, and should have more than adequate warning of any significant increase in the rate of sea level rise.

To conclude, our five steps have shown:

· the combustion of ever-increasing quantities of fossil fuel has boosted the carbon dioxide concentration of the atmosphere;

· the physical impact of that increase is not demonstrable in a scientific way – there may be some warming of the atmosphere, but at present any warming is almost certainly hidden in the natural variation of temperatures;

· there is no significant evidence for any increase in the frequency or magnitude of extreme weather phenomena, or for climate-related changes in the biosphere;

· any sea level rise over the coming century is unlikely to present an insuperable challenge.

Attempts to influence global temperatures by controlling carbon dioxide emissions are likely to be both futile and economically disastrous.
