The Picasso Problem

By Willis Eschenbach [See update at end.] – Re-Blogged From WUWT

Let me start explaining the link from Picasso to climate science by looking at what Dr. Nir Shaviv called “the most boring graph I have ever plotted in my life”.

This is the graph of the changes in the best estimate of the range of what is called “climate sensitivity” over the last forty years or so.

What is climate sensitivity when it’s at home? To explain that, I’ll have to take a slight detour. First, downwelling radiation.

“Downwelling” in climate science means headed down towards the planetary surface. Downwelling radiation is the total radiation going downwards towards the surface. It is composed of sunshine (shortwave) plus thermal radiation from the atmosphere (longwave). In climate science, this quantity, total downwelling radiation, is called “forcing”, abbreviated “F”.

The central paradigm of modern climate science is that if you change the amount of downwelling radiation (forcing), the surface temperature perforce will change. The claim is that everything else averages out, and if the forcing increases, then the surface temperature must change to maintain the global energy balance. It has to change. It must.

In short, the central paradigm of modern climate science is the following:

In the long run, global temperature change is proportional to global forcing change.

The putatively constant proportion between the two, which is the temperature change divided by forcing change, is called the “climate sensitivity”.

“Climate sensitivity” is often expressed as the assumed change in temperature given a change of 3.7 watts per square metre (W/m2) in downwelling radiation. The determination of this so-called “climate sensitivity” is a central question arising out of the paradigm that temperature change is proportional to forcing change.
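In code form, that paradigm is a one-liner. Here is a minimal sketch; the sensitivity value used is purely illustrative (roughly in the range the assessment reports discuss), not a measured constant:

```python
# The central paradigm in one line: delta_T = lambda * delta_F.
# lambda_sens below is an illustrative value chosen for this sketch,
# NOT a measured constant -- pinning it down is the whole debate.
lambda_sens = 0.8        # climate sensitivity, °C per (W/m^2), assumed
delta_f_2xco2 = 3.7      # forcing change from doubling CO2, W/m^2

delta_t = lambda_sens * delta_f_2xco2
print(f"Implied warming per CO2 doubling: {delta_t:.2f} °C")
```

With an assumed 0.8 °C per W/m2 this comes out near 3 °C per doubling, in the middle of the canonical 1.5–4.5 °C range; change the assumed sensitivity and the answer moves with it, which is exactly why the spread in Figure 1 matters.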

Which leads me to the most boring graph below. It shows the changes over time in the estimate of the value of the climate sensitivity.

Figure 1. Changes over time in the estimate of the climate sensitivity parameter “lambda”. “∆T2x(°C)” is the expected temperature change in degrees Celsius resulting from a doubling of atmospheric CO2, which is assumed to increase the forcing by 3.7 watts per square metre. FAR, SAR, TAR, AR4, and AR5 are the first through fifth UN IPCC Assessment Reports, each giving an assessment of the state of climate science as of its publication date.

It is worth noting that since 1979, entire new scientific fields like DNA analysis have first been envisioned, then have come into being, and now have reached amazing levels of development … and during that same time, what Dr. Shaviv rightly calls “the most important question in climate” has gone nowhere. No progress at all.

Since 1979, the amount of computing power that we have available, both as individuals and as large organizations, has skyrocketed. My trusty PowerMac has more computing ability than most universities had available in 1979. The cost has dropped as well, from $100,000 per “MIPS” (million instructions per second) to less than $1 per MIPS today. And the speed has gone through the roof, with supercomputers running climate models at more than a trillion floating-point operations every second (a rate with the lovely name of “teraFLOPs”). The number of people investigating the value of climate sensitivity has also grown over time. And billions and billions of dollars have been spent on trying to answer the question.

So … since the Charney report on climate sensitivity in 1979 we’ve had huge, stupendous increases in:

Computing power working on the question

Hours of intensive research applied to the question

Discussion, debate, and interest in the question

Money spent on the question

And despite those huge increases in time, work, discussion, and computer power, the investigation of the question of the value of climate sensitivity has gone exactly nowhere. No progress.

How can we understand this scientific oddity? What is the reason that all of that valuable time, money, and effort has achieved nothing? I mean zero. Nada. No movement at all. The most boring graph.

Let me suggest that climate science is the victim of what I call the “Picasso Problem”. Pablo Picasso once said something that has stuck with me for a long time. He said:

“What good are computers? They can only give you answers.”

Now, I wrote my first computer program in 1963, more than half a century ago. I was sixteen. It ran on a computer the size of a small room. I’ve been programming computers ever since then. I’ve written programs to do everything from designing fabric patterns for huge catenary tents, to calculating next year’s tides from this year’s tide tables, to making the plasma-cutting files to guide the cutting of the steel parts for building 25-metre fishing boats, to analyzing the data and doing the math and creating the graphics for this very post. And over the years I’ve made big bucks with my succession of computers.

So when I read that Picasso was dissing computers with that statement, my initial response was to say “Whaa? Computers are great! What is this mad artist on about? I’ve made lots of money with my computer. How can they be no good?”

But upon more mature reflection, I realized that Picasso was right. Here’s what he meant:

Even the best computer can’t give you the right answer unless you ask it the right question.

To me, this was a crucial insight, one that has guided many of my scientific peregrinations—don’t focus too much on the answers. Put some focus on the questions as well.

So regarding climate science, what is the wrong question, and what is the right question? Once again, please allow me to get side-tractored a bit.

I first got interested in climate science around the turn of the century because of the increase in serial doomcasting regarding some rumored upcoming Thermageddon™. So I started with the basics, by learning how the poorly-named “greenhouse effect” was keeping the earth far warmer than the temperature of the Moon, which is at the same distance from the sun.

However, along the way, I read that the best estimate of the warming over the entire 20th century was on the order of 0.6 degrees Celsius. When I read that, I thought … “Whaa … less than one degree??? All this fuss and the temperature has changed less than one degree?”

I was surprised because of my experience repairing machinery which had a governor, and my experience with solar energy. I viewed the climate as a giant solar-driven heat engine, wherein the energy of the sun is converted into the ceaseless movement of the atmosphere and the ocean working against the brake of friction against the mountains and shores and the endless turbulent losses.

When you analyze the efficiency or other characteristics of a heat engine, or when you use tools like the Stefan-Boltzmann equation to convert temperature into the equivalent amount of thermal radiation, you have to use the Kelvin temperature scale (abbreviated “K”). This is the scale which starts at absolute zero. Temperature is a function of the motion of the molecules or atoms involved, and absolute zero is where molecular motion stops entirely.
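As a worked illustration of that conversion, here is the Stefan-Boltzmann relation in a few lines of Python (this is textbook physics, not anything specific to this post):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def thermal_flux(t_kelvin):
    """Blackbody thermal radiation E = sigma * T^4, with T in kelvin."""
    return SIGMA * t_kelvin ** 4

# At the Earth's ~287 K mean surface temperature this gives roughly 385 W/m^2.
# Feeding in 14 (the Celsius value) instead would give a nonsense answer,
# which is why the absolute scale is mandatory here.
print(f"{thermal_flux(287.0):.0f} W/m^2")
```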

You can’t use degrees Celsius or Fahrenheit for these calculations, because °C and °F have arbitrary zero points. You have to use the Kelvin scale, it’s the only one that works. Kelvin has the same size units as Celsius, just a different zero point, which is at minus 273.15°C (minus 459.67°F).

Now, the average surface temperature of the Earth is on the order of 14° Celsius, which is 57° Fahrenheit … or 287 Kelvin. And with that average global temperature of 287 Kelvin, the global temperature variation of 0.6 K over the 20th century is a temperature variation of a fifth of one percent.
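The arithmetic behind those numbers is simple enough to check directly (the conversions are standard; the 0.6 K figure is the one quoted above):

```python
def c_to_k(t_c):
    """Celsius to kelvin: same unit size, zero point shifted to absolute zero."""
    return t_c + 273.15

def c_to_f(t_c):
    """Celsius to Fahrenheit."""
    return t_c * 9.0 / 5.0 + 32.0

t_mean_c = 14.0
t_mean_k = c_to_k(t_mean_c)                 # ~287 K
variation_pct = 0.6 / t_mean_k * 100.0      # 0.6 K change over the century
print(f"{t_mean_c:.0f} °C = {c_to_f(t_mean_c):.0f} °F = {t_mean_k:.0f} K")
print(f"20th-century variation: {variation_pct:.2f}% of absolute temperature")
```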

This was the oddity that shaped my investigation of the climate … over a hundred-year period, the temperature had varied by only about one fifth of one percent. This was amazing to me. I’d had lots of experience with governed systems because of my work with electrical generators. These need to be tightly governed so that their speed remains constant regardless of the changing load on the system. And what I’d found in my work with mechanical governors is that it’s quite hard to regulate a mechanical system to within one percent.

Yet despite droughts and floods, despite huge volcanic eruptions, despite constantly changing global cloud cover, despite all kinds of variations in the forcing, despite the hemispheric temperatures changing by ~ 13°C twice over the course of each and every year, despite the globe being balanced on a greenhouse effect which is holding it on the order of ~ 50°C warmer than the moon … despite all of those variations and changes, the average temperature of the Earth didn’t vary by even a quarter of one percent over the entire 20th century.

That is amazingly tight regulation. Here’s a real-world example of why I was surprised by that stability.

I was looking at the speedometer today with my truck on “cruise control”. Cruise control is a governor that keeps the speed of the vehicle constant regardless of changes in the load on the engine. I set it for 50 miles per hour. Up and down hills it varied by plus and minus one mile per hour. That’s a computer-controlled engine that is speed-regulated to within ±2%, pretty tight regulation … but the Earth’s temperature is far better regulated than that. It stays within about plus or minus one tenth of one percent.
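The comparison is easy to put side by side; both percentages fall straight out of the numbers above:

```python
# Regulation tightness: ± swing as a percentage of the operating point.
cruise_set_mph, cruise_swing_mph = 50.0, 1.0   # the cruise-control example
earth_mean_k, earth_swing_k = 287.0, 0.3       # half of the 0.6 K century range

cruise_reg = cruise_swing_mph / cruise_set_mph * 100.0   # ±2%
earth_reg = earth_swing_k / earth_mean_k * 100.0         # ~±0.1%
print(f"cruise control: ±{cruise_reg:.1f}%   Earth: ±{earth_reg:.2f}%")
```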

To me at the time, that thermal stability was a clear sign of the existence of some unknown set of natural thermostatic processes that acted in a very efficient manner to maintain the Earth’s temperature within those narrow bounds. So my own quest in the field of climate science was to find out what the natural phenomena were that explained the tight regulation of century-long planetary surface temperatures.

Which left me in a curious position. All of the established climate scientists were, and still are, trying to find out why the temperature is changing so much. They spend time looking at graphs like this, showing the variations in the Earth’s surface temperature:

Figure 2. HadCRUT global average surface temperature anomaly.

On the other hand, because I’m someone with an interest in heat engines and governors, I was trying to find out why the temperature has been changing so little. I spent my time looking at the exact same data as in Figure 2, but expressed in graphs like this:

Figure 3. HadCRUT global average actual surface temperature (the same data shown in Figure 2) and also the approximate average lunar temperature, in kelvin.

And that brings me back, after plowing that distant field, to the question of climate sensitivity and to Picasso’s prescient question, viz: “What good are computers? They can only give you answers.”.

I say that we have made zero progress in four decades of attempting to measure or calculate climate sensitivity because we are using our awesome computer power to investigate why the global temperature changes so much.

For me, this is entirely the wrong question. The question that we should be asking is the following:

Why does the global temperature change so little?

After much thought and even more research, I say the reason that global average temperature changes so little is that temperature is NOT proportional to forcing as is generally believed. As a result, the so-called “climate sensitivity” is not a constant as is assumed … and since it is not a constant, trying to determine its exact value is a fool’s errand because it has none. That’s why we can’t make even the slightest advance on measuring it … because it’s a chimera based on a misunderstanding of what is happening.

Instead, my hypothesis is that the temperature is maintained within narrow bounds by a variety of emergent phenomena that cool the earth when it gets too hot, and heat it up when it gets too cool. I have found a wide variety of observational evidence that this is actually the case. See the endnotes for some of my posts on my hypothesis.
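To be concrete about what such a governor would mean mechanically, here is a toy thermostat model. Everything in it (the setpoint, the feedback strength, the forcing excursions) is an illustrative choice of mine, not a fitted or physical value; the point is only the qualitative behavior:

```python
def step(temp_k, forcing, setpoint=287.0, feedback=0.5):
    """One time step of a toy thermostat: a restoring term pushes the
    temperature back toward the setpoint, mimicking emergent phenomena
    that cool when hot and warm when cool. All values are illustrative."""
    response = 0.1 * forcing                     # small direct response to forcing
    restoring = feedback * (setpoint - temp_k)   # the "governor" term
    return temp_k + response + restoring

temp = 287.0
for f in [3.7, 3.7, -2.0, 5.0, 0.0]:  # arbitrary forcing excursions, W/m^2
    temp = step(temp, f)
print(f"final temperature: {temp:.2f} K")
```

Despite forcing swings of several W/m2, the toy temperature never strays more than a fraction of a kelvin from the setpoint, and no single “sensitivity” constant relates the forcing to the temperature.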

But hey, that’s just my answer. And I freely agree that my answer may be wrong … but at least it is an answer to the right question. The true mystery of the climate is its amazing thermal stability.

Finally, how did an entire field of science get involved in trying to answer the wrong question? I say that it is the result of the 1988 creation of the Intergovernmental Panel on Climate Change (IPCC) by the United Nations.

In 1988, the field of climate science was fairly new. Despite that, however, the UN was already convinced that it knew what the problem was. Typical bureaucratic arrogance. As a result, in the UN General Assembly Resolution 43/53 from 1988, the Resolution which set up the IPCC, it says that the UN General Assembly was:

Concerned that certain human activities could change global climate patterns, threatening present and future generations with potentially severe economic and social consequences,

Noting with concern that the emerging evidence indicates that continued growth in atmospheric concentrations of “greenhouse” gases could produce global warming with an eventual rise in sea levels, the effects of which could be disastrous for mankind if timely steps are not taken at all levels,

And in response, it jumped right over asking whether this was scientifically correct, and went straight to taking action on something that, of course, the General Assembly knew nothing about. The Resolution says that the General Assembly:

… Determines that necessary and timely action should be taken to deal with climate change within a global framework;

Calls for action always make bureaucrats happy. So the IPCC, an expressly political “Intergovernmental” organization, became the de facto guiding light for an entire field of science … which turned out to be a huge mistake.

Now, up until that time, and since that time as well, every other field of science has managed to make amazing strides in understanding without any global “Intergovernmental” panel to direct their efforts. We’ve had astounding successes with our usual bumbling catch-as-catch-can scientific method, which involves various scientists working fairly independently around the planet on some scientific question, sometimes cooperating, sometimes competing, without needing or wanting anyone to “summarize the science” as the IPCC claims to do.

And given the lack of progress shown by the “Most Boring Graph” at the top of this post, I’d say that the world should never again put a bunch of United Nations pluted bloatocrats in charge of anything to do with science. If we had set up an “Intergovernmental Panel on DNA Analysis” when the field was new, you can be certain that long ago the field would have gone uselessly haring down blind alleys lined by nonsensical claims that “97% of DNA scientists agree” …

Over at Dr. Judith Curry’s excellent blog, someone asked me the other day what I didn’t like about the IPCC. I replied:

Here are some of the major reasons. I have more.

First, it assumes a degree of scientific agreement which simply doesn’t exist. Most people in the field, skeptics included, think the earth is warming and humans may well have an effect on it. But the agreement ends there. How much effect, and how, and for how long, those and many other questions have little agreement.

Second, it is corrupt, as shown inter alia by the Jesus Paper.

Third, it generally ignores anything which might differ from climate science revealed wisdom.

Fourth, it is driven by politics, not by science. Certain paragraphs and conclusions have been altered or removed because of political objections.

Fifth, in an attempt to be inclusive of developing countries, it includes a number of very poor scientists.

Sixth, any organization that ends up with Rajendra Pachauri as its leader is very, very sick.

Seventh, they’ve ignored actual uncertainty and replaced it with a totally subjective estimate of uncertainty.

Eighth, it lets in things like the Hockeystick paper and the numerous “Stick-alikes” despite them being laughably bad science.

Ninth, it makes “projections” that have little to no relationship to the real world, like Representative Concentration Pathway 8.5 (RCP 8.5).

Tenth, it generally excludes skeptics of all types, either directly or because skeptics know better than to associate with such an organization.

Eleventh, anyone making “projections” that go out to the year 2100 is blowing smoke up your fundamental orifice.

Twelfth, it is far, far too dependent on untested, unverified, unvalidated climate models.

Thirteenth, the IPCC generally assumes, without ever examining the question, that warming is bad, bad, bad … which is the opposite of the actual effects of the warming since the Little Ice Age.

Fourteenth, the IPCC was given the wrong task at its inception. Rather than setting out to find what actually controls the climate, it was given the task of finding out how much CO2 we could emit before it became dangerous. That tasking assumed a whole host of things which have never been established.

Fifteenth … aw, heck, that’s enough. I have more if you are interested.

So … that’s the climate Picasso Problem. The field of climate science is trying to use computers to find an answer to the wrong question, and as a result, the field is going nowhere.

[UPDATE]

In the comments below, someone brought up the following graphic of recent estimates of climate sensitivity and said that it showed I was wrong about the climate sensitivity.


Here is my answer from below:


Thanks, David. The discrepancy in your first graph is much more apparent than real. Here are the recent ECS estimates overlaid on the Charney/IPCC data:

As you can see, the only ones outside the IPCC uncertainty limits are the estimates at 5 & 6 °C per doubling of CO2. Here’s a boxplot of the recent estimates:

The recent estimates are not at all unusual given the Charney/IPCC estimates.

Finally, you still haven’t grasped the nettle. The surface temperature is NOT a function of the forcing. For example, in large areas of the Pacific, the surface temperature controls the amount of sunshine. Here’s a plot showing that result.

Correlation between the solar radiation at the surface, and the surface temperature. This is calculated on a 1° x 1° gridcell basis.

Now, the areas in green and blue are areas in which, as the temperature goes UP, the amount of sunshine hitting the ground goes DOWN.
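The mechanics of such a map are simple: correlate the two time series in each gridcell. Here is a hedged numpy sketch with synthetic stand-in arrays; the actual map would be built from observed gridded fields, which are not reproduced here:

```python
import numpy as np

# Sketch of a per-gridcell correlation map between surface solar radiation
# and surface temperature. The arrays are synthetic stand-ins; a real map
# would use observed monthly fields on a 1° x 1° grid.
rng = np.random.default_rng(0)
n_months, n_lat, n_lon = 120, 4, 4
solar = rng.normal(200.0, 20.0, (n_months, n_lat, n_lon))    # W/m^2
temp = 0.02 * solar + rng.normal(287.0, 0.5, (n_months, n_lat, n_lon))  # K

corr = np.empty((n_lat, n_lon))
for i in range(n_lat):
    for j in range(n_lon):
        corr[i, j] = np.corrcoef(solar[:, i, j], temp[:, i, j])[0, 1]

# Cells with negative values would be the green/blue areas of the map:
# warmer surface coinciding with less sunshine reaching the ground.
print(corr.round(2))
```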

So here’s the question. What is the “climate sensitivity” in those areas?

The answer, of course, is that in such a situation the entire concept of climate sensitivity goes out the window. In those areas, the temperature is NOT a function of the forcing, linear or otherwise. Instead, in those areas, the forcing is a function of the temperature.

That’s the point I’ve been trying to make. The fundamental idea underlying modern climate science is incorrect—temperature is NOT a function of forcing.

So yes, people can calculate the putative “climate sensitivity” and get various answers … but it is a meaningless quest, and the results carry no weight at all.

My best to you,

w.


Here, we’re still in haze and smoke from the Camp Fire, and the number of fatalities is over seventy. I’m wearing an N95 mask when I go outside. Here’s the latest smoke map … Anthony Watts is up in Chico, in the bright red spot at the top, over 100 micrograms per cubic metre of smoke. I’m near the coast to the west of Santa Rosa, north of San Francisco, where it’s much better but still bad.

Keep a good thought over the fire victims, it’s hard times for all.
