Almost Earth-like, We’re Certain

By Kip Hansen – Re-Blogged From WUWT

There has been a lot of news recently about exoplanets. An extrasolar planet, or exoplanet, is a planet outside our solar system; the Wiki article has a list of them. I only mention exoplanets because there is a set of criteria specifying what could qualify as an “Earth-like planet”, of interest to space scientists, I suppose, because such planets might harbor “life-as-we-know-it” and/or be potential colonization targets.

One of those specifications for an Earth-like planet is an appropriate average surface temperature, usually said to be 15 °C.  In fact, our planet, Sol 3 or simply Earth, is very close to qualifying as Earth-like as far as surface temperature goes. Here’s that featured image full-sized:

[Featured image: chart of Earth’s average surface temperature and atmospheric composition]

This chart, from the Chemical Society, shows that Earth’s observed average temperature should be about 15°C. Note that our atmosphere consists mostly of nitrogen (78%), oxygen (21%) and argon (0.9%), which together make up 99.9% of the total, leaving about one-tenth of one percent for the trace gases, water vapor (H2O) and carbon dioxide (CO2).

Let’s look at the thermometer:

[Image: thermometer of planetary surface temperatures]

We see the temperatures believed to exist on the surfaces of the eight planets and Pluto (poor Pluto…).

The ideal Earth is right there at 15°C (59°F).

Mercury and Venus are up at the top, one due to proximity to the Sun and the other due to a crushingly dense atmosphere, well out of range for Earth-like planets.

Mars sits below the freezing temperature of water, due to its distance from the Sun and, mostly, its lack of atmosphere: daytime highs reach about 70°F (20°C) near the equator, but nights plummet to about minus 100°F (minus 73°C), for an estimated average of about -28°C. That average is a little low, but mankind lives on Earth in places with a similar temperature range, at least on an annual basis, so with adequate shelter and clothing (modified for the lack of breathable atmosphere), Mars might do.

The other four planets and Pluto (poor Pluto) don’t have a chance of being Earth-like.

This next thermometer shows that Earth provides a temperature range suitable for human and Earth-type life, ranging from 56.7°C (134°F) at the high end down to -89.2°C (-128.6°F), with an average of 15°C (59°F).

[Image: thermometer of Earth’s recorded high, low, and average temperatures]

Like Mars, Earth has an average that falls easily within what most people would consider a comfortable range, avoiding extremes, if one is properly dressed for the weather. For me, a southern California surfer boy by birth, 59°F (15°C) is sweater weather, or more properly, Pendleton wool shirt weather. 59°F (15°C) is the average Fall/Winter temperature of the surf at Malibu, and most of us required wetsuits to keep warm in the water.

[Image: close-up of the middle of the thermometer]

Taking a closer peep at the middle of our little graphic, we see that the IDEAL Earth-like planet would have an average surface temperature of 15°C. But from the 1880s through 1910 we were running a bit cool, at 13.7°C. Luckily, after the mid-century point of 1950, we started to warm up a little and got all the way up to 14°C, just 1°C short of the ideal.

So, how have we done since then?

There is good news.  Since the middle of the last century, when Earth was running a little cool compared to the ideal Earth-like temperature expected of it, we have made some gains.

[Image: 21st-century global average surface temperatures compared with the 15°C ideal]

By 2014, Earth had warmed up to an almost-there 14.55°C (with an uncertainty of +/- 0.5°C).

With the uncertainty in mind, we can see how close we came to the target of 15°C.  The uncertainty bracket on the left for 2014 almost reaches 15°C.

2016 was a banner year at 14.85°C, and could have been, uncertainty taken into account, a tiny bit over 15°C!

The numbers used in this image come from Gavin Schmidt, NASA GISS’s Director (and co-founder of the private climate blog that he and his pals work on while being paid by the government with your taxes). They are from his blog post in August 2017 and, as always, have since been adjusted to be a bit higher. The current, adjusted-higher numbers show 2017 running 0.1°C lower than 2016, which is the figure I have used.

That RealClimate blog post is quite a wonderful thing: it reveals several things, some of which I have written about in the past, and which form the part I quote under the image. Dr. Schmidt kindly informs us about one of the miracles of modern climate science. This miracle involves taking data that has a rather wide uncertainty, a full degree Centigrade wide (plus or minus 0.5°C), and turning it into accurate and precise data with almost no uncertainty at all!

In that blog post, Dr. Schmidt explained why GISS uses “anomalies” instead of “absolute global mean temperature” (in degrees):

“But think about what happens when we try and estimate the absolute global mean temperature for, say, 2016. The climatology for 1981-2010 is 287.4±0.5K, and the anomaly for 2016 is (from GISTEMP w.r.t. that baseline) 0.56±0.05ºC. So our estimate for the absolute value is (using the first rule shown above) is 287.96±0.502K, and then using the second, that reduces to 288.0±0.5K. The same approach for 2015 gives 287.8±0.5K, and for 2014 it is 287.7±0.5K. All of which appear to be the same within the uncertainty. Thus we lose the ability to judge which year was the warmest if we only look at the absolute numbers.”
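As a quick check on the arithmetic in that quote, here is a minimal sketch in Python. It assumes the “first rule” referred to is adding independent uncertainties in quadrature and the “second” is rounding to the precision the uncertainty supports; with those assumptions it reproduces his 2016 figures:

```python
import math

# Figures taken directly from the quote above
climatology, u_climatology = 287.4, 0.5   # 1981-2010 climatology, Kelvin
anomaly_2016, u_anomaly = 0.56, 0.05      # 2016 GISTEMP anomaly w.r.t. that baseline

# Absolute estimate: climatology plus anomaly
absolute_2016 = climatology + anomaly_2016

# Assumed "first rule": combine independent uncertainties in quadrature
u_absolute = math.sqrt(u_climatology**2 + u_anomaly**2)
print(f"{absolute_2016:.2f} +/- {u_absolute:.3f} K")   # 287.96 +/- 0.502 K

# Assumed "second rule": round to the figures the uncertainty supports
print(f"{absolute_2016:.1f} +/- {u_absolute:.1f} K")   # 288.0 +/- 0.5 K
```

Notice that the anomaly’s small +/- 0.05°C barely changes the combined uncertainty; the +/- 0.5 K attached to the climatology dominates the result.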

So, by changing the annual temperatures to “anomalies” they get rid of that nasty uncertainty and produce near certainty!

[Image: annotated GISTEMP anomaly graph. Source: https://data.giss.nasa.gov/gistemp/graphs_v3/]

Dr. Schmidt and the ClimateTeam have managed to take very uncertain data (so uncertain that the last four years of Global Average Surface Temperature, when straightforwardly presented in degrees Centigrade with their proper +/- 0.5°C uncertainty, cannot be distinguished from one another) and, through the miracle of “anomalization”, turn it into a new, improved sort of data: an anomaly so precise that they do not even bother to mention its uncertainty, except to add (at least on the graph above) a single uncertainty bar for the modern data that is 0.1°C wide, or in the language used in science, +/- 0.05°C. The uncertainty in the Global Average Surface Temperature has magically become a whole order of magnitude smaller, and all without a single new measurement being made.

The miracle is accomplished by the marvel of subtraction! One simply takes the current temperature in degrees, which has an uncertainty of +/- 0.5°C, subtracts from it the climatic-period mean (currently 1981-2010), and “voila”: an anomaly with a wee tiny uncertainty of only +/- 0.1°C.

Let’s see if that really works:

[Image: grid of possible anomaly values from the +/- 0.5°C ranges]

Here is the grid of all the possibilities, with a +/- 0.5°C range for the 2015 temperature average in absolute degrees and the same +/- 0.5°C range for the 1981-2010 climatic mean, both figures given by Dr. Schmidt in his blog post. One still gets a +/- 0.5°C (1°C wide) uncertainty range. It did not magically reduce to a range one-tenth of that; it did not turn out to be 0.1°C wide, as shown in Dr. Schmidt’s graph. I am pretty sure of my arithmetic, so what happened?
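Before answering that, here is a minimal sketch of the same check in code, using the figures Dr. Schmidt gives (2015 at 287.8 K and the 1981-2010 climatology at 287.4 K, each carrying +/- 0.5°C). Whether the climatology is treated as exact or as independently uncertain, the subtraction gives nothing remotely like a +/- 0.05°C result:

```python
import math

t_2015, u_t = 287.8, 0.5      # 2015 absolute estimate, Kelvin, +/- 0.5
clim, u_clim = 287.4, 0.5     # 1981-2010 climatology, Kelvin, +/- 0.5

anomaly = t_2015 - clim

# Case 1: pretend the climatology is exactly known -- the anomaly simply
# inherits the +/- 0.5 measurement uncertainty of the annual value
print(f"climatology treated as exact: {anomaly:.2f} +/- {u_t:.2f} C")

# Case 2: treat both values as independently uncertain -- standard
# quadrature makes the uncertainty of the difference larger, not smaller
u_diff = math.sqrt(u_t**2 + u_clim**2)
print(f"both treated as uncertain:    {anomaly:.2f} +/- {u_diff:.2f} C")
```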

How does GISS justify the new, improved, wee-tiny uncertainty? Ah, they use statistics! They ignore the actual uncertainty in the data itself and shift to using “the 95% uncertainties on the estimate of the mean.” Truthfully, they fudge on that a little bit as well, which you can see in their original data. [In their monthly figures, the statistical uncertainty (+/- 2 standard deviations) is a bit wider than the illustrated “0.1°C”.]

So rather than use the actual original measurement uncertainty, they use subtraction to find the difference from the climatic mean and then pretend that this allows the uncertainty to be reduced to the statistical construct of the “uncertainties” of the mean — standard deviations.
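To make the swap concrete, here is a toy illustration (with made-up numbers, not GISS’s actual gridded data) of how the standard error of a mean shrinks as more values are averaged, no matter how large the measurement uncertainty attached to each individual value remains:

```python
import random
import statistics

random.seed(1)
measurement_uncertainty = 0.5   # pretend each value carries +/- 0.5 C

for n in (10, 100, 10_000):
    # made-up anomalies scattered around a "true" value of 0.4 C
    values = [random.gauss(0.4, measurement_uncertainty) for _ in range(n)]
    sem = statistics.stdev(values) / n ** 0.5   # standard error of the mean
    print(f"n = {n:>6}:  mean = {statistics.mean(values):+.3f},  "
          f"2 x SEM = {2 * sem:.3f}  (each value still +/- 0.5)")
```

The 2 x SEM figure can be driven as small as you like simply by averaging more values; it says nothing about whether any individual measurement, or the mean itself, is actually accurate to that level.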

This is a fine example of what Charles Manski is talking about in his new paper, “The Lure of Incredible Certitude”, recently highlighted at Judith Curry’s Climate Etc. While Dr. Curry accepts Manski’s compliments to climate science, based on his perception that “Published articles on climate science often make considerable effort to quantify uncertainty”, we see here the purposeful obfuscation of the real uncertainty of the annual Global Average Surface Temperature data, replacing the admitted wide uncertainty ranges with the narrow “uncertainties on the estimate of the mean”.

Graphically, it looks like this:

[Animation: the transformation from uncertain absolute temperatures to “certain” anomalies]

Although I was a semi-professional magician in my youth, I have nothing that compares to the magic trick shown above: a totally, scientifically spurious transformation of Uncertain Data into Certain Anomalies, reducing the uncertainty of annual Global Average Surface Temperatures by a whole order of magnitude using only subtraction and a statistical definition. Note that the data and its original uncertainty are not affected by this magical transformation at all; like all stage magic, it is just a trick.

A trick to fulfill the need of the science field we call Climate Science to hide the uncertainty in global temperatures — an act of “disregard of uncertainty when reporting research findings” that will “harm formation of public policy.” (quotes from Manski).

The true answer to “Why does Climate Science report temperature anomalies instead of just showing us the temperature graphs in absolute degrees?” is exactly as Gavin Schmidt has stated: if Climate Science used absolute numbers, the annual temperatures would “appear to be the same within the uncertainty. Thus we lose the ability to judge which year was the warmest if we only look at the absolute numbers.”

Thus, they use anomalies and pretend that the uncertainty has been reduced.

It is nothing other than a pretense. It is a trick to cover up known, large uncertainty.

# # # # #

While I was preparing this essay, I thought to attempt to illustrate the true magnitude of the recent warming (since 1880) in a way that would satisfy the requirements of a Climatically Important Difference. Below we see the UAH Lower Troposphere temperatures (even these presented as errorless anomalies), graphed at the scale of temperatures allowed in my personal living quarters before our family resorts to either heating or cooling for comfort: about 8°C (15°F, as in 79°F down to 64°F). Overlaid in light blue is a 2°C range, into which the entire most recent satellite record fits comfortably, and in purple the prescribed 3-to-3.5°C comfort range from the Canadian Centre for Occupational Health & Safety for an office setting. If the office temperature varies more than this, the HVAC system is meant to correct it by heating or cooling. As we can see, the Global Average lower troposphere is very well regulated by this standard.

[Image: UAH lower-troposphere anomalies plotted on an 8°C household scale, with the 2°C and 3-to-3.5°C comfort ranges overlaid]

An interesting aside: the Canadian COHS allows an extra 0.5°C in the winter, increasing the comfort range to 3.5°C, to account for differences in the perception of temperature during the colder months.
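For readers who want to reproduce the comparison, here is a rough sketch of the check; the anomaly list below is just a made-up placeholder, so substitute the actual UAH lower-troposphere monthly anomalies to recreate the figure above:

```python
# Placeholder monthly anomalies in deg C -- NOT real data; substitute the
# actual UAH lower-troposphere series to reproduce the comparison above
anomalies = [-0.35, -0.10, 0.05, 0.20, 0.45, 0.30, 0.15, 0.00, -0.20, 0.10]

spread = max(anomalies) - min(anomalies)   # full range of the record

for label, band_width in [("2 C overlay band", 2.0),
                          ("COHS office comfort range, summer", 3.0),
                          ("COHS office comfort range, winter", 3.5)]:
    verdict = "fits inside" if spread <= band_width else "exceeds"
    print(f"Record spread of {spread:.2f} C {verdict} the {label} ({band_width} C)")
```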
