Global Energy Balances … Except When It Doesn’t

By Willis Eschenbach – Re-Blogged From WUWT

I came across an interesting 2014 paper called “The energy balance over land and oceans: an assessment based on direct observations and CMIP5 climate models”. In it, the authors make a number of comparisons between observational data and 43 climate models regarding the large-scale energy flows of the planet. Here’s a typical graphic:

Figure 1. ORIGINAL CAPTION: “Fig. 7 Average biases (model − observations) in downward solar radiation at Earth’s surface calculated in 43 CMIP5 models at 760 sites from GEBA. Units W m−2”. “CMIP5” is the Coupled Model Intercomparison Project, Phase 5, the fifth iteration of a project that compares the various models and assesses how well they perform.

Now, what this is showing is how far the forty-three models are from the actual observations of the amount of solar energy that hits the surface. Which observations are they comparing to? In this case, it’s the observations stored in the Global Energy Balance Archive (GEBA). Per the paper:

Observational constraints for surface fluxes primarily stem from two databases for worldwide measurements of radiative fluxes at the Earth surface, the global energy balance archive (GEBA) and the database of the Baseline Surface Radiation Network (BSRN).

GEBA, maintained at ETH Zurich, is a database for the worldwide measured energy fluxes at the Earth’s surface and currently contains 2500 stations with 450,000 monthly mean values of various surface energy balance components. By far the most widely measured quantity is the solar radiation incident at the Earth’s surface, with many of the records extending back to the 1950s, 1960s or 1970s. This quantity is also known as global radiation, and is referred to here as downward solar radiation. Gilgen et al. (1998) estimated the relative random error (root mean square error/mean) of the downward solar radiation values at 5 % for the monthly means and 2 % for yearly means.
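
To make the arithmetic behind Figure 1 concrete, here is a minimal Python sketch. It is not from the paper, and the station values in it are invented; it simply shows how a per-station bias in the sense of “model minus observations”, and a relative random error in the Gilgen et al. sense of “root mean square error divided by the mean”, could be computed from monthly means.

```python
import numpy as np

# Hypothetical monthly-mean downward solar radiation (W/m^2) at two
# GEBA-style stations: one row per station, one column per month.
# These numbers are invented purely to illustrate the arithmetic.
obs = np.array([
    [182.0, 205.0, 231.0, 250.0],   # station A, observed
    [118.0, 140.0, 166.0, 190.0],   # station B, observed
])
model = np.array([
    [190.0, 214.0, 240.0, 256.0],   # station A, one model's values
    [111.0, 133.0, 160.0, 185.0],   # station B, same model
])

# Bias in the sense of Figure 1: model minus observations, averaged over
# time at each station.
station_bias = (model - obs).mean(axis=1)    # W/m^2 per station
print("per-station bias (W/m^2):", station_bias)

# Relative random error as defined in the quote from Gilgen et al. (1998):
# root mean square error divided by the mean. In GEBA this characterises
# the uncertainty of the observations themselves; here it is applied to the
# invented model-minus-observation differences just to show the formula.
rmse = np.sqrt(((model - obs) ** 2).mean())
print("relative random error:", rmse / obs.mean())
```

In the paper, biases of this kind are averaged across the 43 CMIP5 models and mapped at the 760 GEBA sites to produce Figure 1.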

So downwelling solar radiation at the surface is very well measured at a number of sites over decades. And surprisingly, or perhaps unsurprisingly given their overall poor performance, the climate models do a really, really bad job of emulating even this most basic of variables—how much sunshine hits the surface.

Now, bear in mind that for these models to be even remotely valid, the total energy entering the system must balance the energy leaving the system. And if the computer models find a small imbalance between energy arriving and leaving, say half a watt per square metre or so, they claim that this is due to increasing “net forcing, including CO2 and other GHGs”, and that it will slowly heat up the earth over the next century.
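
For a sense of the magnitudes involved, here is a back-of-envelope sketch, mine rather than the paper’s: the solar constant and planetary albedo are commonly cited round figures, and the outgoing longwave value is purely hypothetical, chosen only to leave roughly the half-watt imbalance the modellers point to.

```python
# Back-of-envelope global-mean energy balance. The solar constant and albedo
# are commonly cited round numbers; the outgoing longwave value is purely
# hypothetical, chosen to leave about a half-watt imbalance.
SOLAR_CONSTANT = 1361.0   # W/m^2, top-of-atmosphere irradiance
ALBEDO = 0.30             # fraction of sunlight reflected back to space

incoming = SOLAR_CONSTANT / 4.0           # ~340 W/m^2 averaged over the sphere
absorbed = incoming * (1.0 - ALBEDO)      # ~238 W/m^2 absorbed by the system
outgoing_longwave = 237.7                 # hypothetical value, W/m^2

imbalance = absorbed - outgoing_longwave  # ~0.5 W/m^2, the claimed "signal"
print(f"absorbed solar:    {absorbed:.1f} W/m^2")
print(f"outgoing longwave: {outgoing_longwave:.1f} W/m^2")
print(f"imbalance:         {imbalance:.2f} W/m^2")
```

The claimed warming signal is thus a difference of a fraction of a watt between two flows that each run to hundreds of watts per square metre.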

So their predictions of an impending Thermageddon are based on half a watt or a watt of imbalance in global incoming and outgoing energy … but even after years of refinement, they still can’t get downwelling sunlight at the surface even roughly correct. The average error at the surface is seven watts per square metre, and despite that, they want you to believe that they can calculate the energy balance, which includes dozens of other energy flows, to the nearest half a watt per square metre?

Really?

Now, I wrote my first computer program, laboriously typed into Hollerith cards, in 1963. And after more than half a century of swatting computer bugs, I’ve learned a few things.

One thing I learned is the mystic power that computers have over people’s minds. Here’s what I mean by “mystic power”: if you take any old load of rubbish and run it through a computer, when it comes out the other end there will be lots of folks who will believe it is absolutely true.

For example, if I were to tell you “I say that in the year 2100 temperatures will average two degrees warmer than today”, people would just point and laugh … and rightly so. All I have to back it up are my assumptions, claims, prejudices, and scientific (mis)understandings. Anybody who tells you they know what the average temperature will be in eighty years is blowing smoke up your astral projection. Nobody can tell you with any degree of certainty what the average temperature will be in two years, so how can they know what the temperature will be in eighty years?

But when someone says “Our latest computer model, which contains over a hundred thousand lines of code and requires a supercomputer to run it, says that in the year 2100 temperatures will be two degrees warmer than today”, people scrape and bow and make public policy based on what is nothing more than the physical manifestation of the programmers’ same assumptions, claims, prejudices, and scientific (mis)understandings.

And how do we know that is a fact, rather than just a claim that I’m making based on a half-century of experience programming computers?

Because despite the hundred thousand lines of code, and despite the supercomputers, and despite the inflated claims, the computer models can’t even calculate how much sunshine hits the surface … and yet people still believe them.

