Resolution and Hockey Sticks, Part 1

By David Middleton – Re-Blogged From WUWT

Resolution vs. Detection

We geoscientists in the oil & gas industry spend much of our time integrating data sets of vastly different resolutions.  Geological data, primarily well logs, are very high resolution (6 inches to 2 feet vertical resolution).  Geophysical data, primarily reflection seismic surveys, are of much lower and highly variable resolution, dependent on the seismic velocities of the rocks and the frequency content of the seismic data.  The rule of thumb is that a stratigraphic unit must be at least as thick as one-quarter of the seismic wavelength (λ/4) to be resolved.

Figure 1a. Seismic wavelength vs velocity for 10, 25, 50 and 100 Hz dominant frequencies. (SEG Wiki)
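The arithmetic behind Figure 1a is simple enough to sketch out. The snippet below computes the λ/4 vertical resolution from λ = v/f; the velocities and dominant frequencies are illustrative values, not from any particular survey.

```python
# Minimal sketch: vertical seismic resolution from the lambda/4 rule of thumb.
# The velocities and dominant frequencies below are illustrative only.

def quarter_wavelength_resolution(velocity_mps, dominant_freq_hz):
    """Return the lambda/4 vertical resolution in meters."""
    wavelength = velocity_mps / dominant_freq_hz  # lambda = v / f
    return wavelength / 4.0

for velocity in (2000.0, 3000.0, 4500.0):      # m/s, a typical sedimentary range
    for freq in (10.0, 25.0, 50.0, 100.0):     # Hz, as in Figure 1a
        res = quarter_wavelength_resolution(velocity, freq)
        print(f"v = {velocity:6.0f} m/s, f = {freq:5.1f} Hz -> lambda/4 = {res:6.1f} m")
```

At 3,000 m/s and 25 Hz, λ/4 works out to 30 m, versus the 6-inch to 2-foot vertical resolution of a well log.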

Thinner beds can be detected, but not resolved.  The same principle applies to normal faults.  If the throw (vertical offset) is at least λ/4, the fault can be resolved and the throw can be accurately estimated from the seismic data. If the throw is less than λ/4, the best I can determine is that a fault is probably present.

Figure 1b. Seismic expression of a normal fault at λ/16, λ/8, λ/4, λ/2 and λ. (SEG Wiki)

When we integrate geological and geophysical data, we don’t just splice the well log onto a seismic profile.   We convert the well logs into a synthetic seismogram.  This is most effectively accomplished using sonic and density logs.

Figure 2. Synthetic Seismogram (AAPG Wiki)

The sonic and density logs are used to calculate acoustic impedance and a reflection coefficient series (RC).  The RC is convolved with a seismic wavelet, often extracted from the seismic data near the well.  The synthetic seismic traces (3 panels on the right) can then be directly compared to the seismic profile.  The resolution difference is quite large.  The trough-to-trough interval on the seismic pulse below is about 150 m.

Figure 3. Schematic comparison of a well log and a seismic pulse with a wavelength of 150 m. (University of Maryland)
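For readers who want to see the mechanics, here is a bare-bones sketch of the workflow just described. The velocity and density arrays are made-up stand-ins for sonic and density logs, and a Ricker wavelet stands in for one extracted from the seismic data near the well; nothing here reproduces any particular software package.

```python
import numpy as np

# Bare-bones synthetic seismogram sketch. The "logs" below are made-up values,
# not real sonic/density data, and the Ricker wavelet is a stand-in for a
# wavelet extracted from the seismic data near the well.

dt = 0.002  # two-way time sample interval, seconds

velocity = np.array([2200., 2200., 2600., 2600., 2600., 3100., 3100., 2800., 2800., 2800.])  # m/s
density  = np.array([2.20,  2.20,  2.35,  2.35,  2.35,  2.50,  2.50,  2.40,  2.40,  2.40])   # g/cc

# 1. Acoustic impedance: Z = rho * v
impedance = density * velocity

# 2. Reflection coefficient series: RC = (Z2 - Z1) / (Z2 + Z1) at each interface
rc = (impedance[1:] - impedance[:-1]) / (impedance[1:] + impedance[:-1])

# 3. A 25 Hz Ricker wavelet
def ricker(dominant_freq, dt, length=0.128):
    t = np.arange(-length / 2.0, length / 2.0, dt)
    arg = (np.pi * dominant_freq * t) ** 2
    return (1.0 - 2.0 * arg) * np.exp(-arg)

wavelet = ricker(25.0, dt)

# 4. Convolve the RC series with the wavelet to get the synthetic trace
synthetic = np.convolve(rc, wavelet, mode="same")

print(f"{rc.size} reflection coefficients convolved with a {wavelet.size}-sample wavelet")
print(f"synthetic trace: {synthetic.size} samples, peak amplitude {np.abs(synthetic).max():.3f}")
```

The point of the exercise is that the synthetic trace, not the raw log, is what gets compared to the seismic profile; the convolution step is exactly where the well-log resolution is given up.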

How Does This Relate to Climate “Science”?

Signal theory and signal processing principles apply to all signals, not just seismic data. A signal is a time-variant sequence of numbers. Almost all temperature, carbon dioxide and sea level reconstructions employ many of the same data processing methods as seismic data processing. Deconvolution is particularly essential to ice core carbon dioxide chronologies. Sometimes the signal processing methods are properly employed. Van Hoof et al., 2005 demonstrated that the ice core CO2 data represent a low-frequency, century to multi-century moving average of past atmospheric CO2 levels. They essentially generated the equivalent of a synthetic seismogram from the stomata chronology and tied it to the ice core.

Figure 4. Panel A is the stomatal frequency curve. Panel B is the D47 Antarctic ice core. The dashed line on Panel B is the “synthetic” ice core generated from the stomatal frequency curve. (Van Hoof et al., 2005)
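To illustrate the kind of smoothing involved (this is not Van Hoof et al.’s actual method, just a crude analogue), the sketch below applies a century-long moving average to a made-up annual CO2 history and shows how much of the short-term variability survives.

```python
import numpy as np

# Crude analogue (NOT Van Hoof et al.'s method): a century-scale moving average
# applied to a made-up annual CO2 history, illustrating how ice-core-style
# smoothing suppresses multi-decadal variability but keeps the slow trend.

years = np.arange(0, 2000)                                 # 2,000 years, annual steps
co2 = 280.0 + 5.0 * np.sin(2 * np.pi * years / 1000.0)     # slow millennial variation
co2 += 8.0 * np.sin(2 * np.pi * years / 60.0)              # multi-decadal variability

window = 100                                               # ~century smoothing
smoothed = np.convolve(co2, np.ones(window) / window, mode="valid")

print("Range of the 'atmospheric' series:   ", round(np.ptp(co2), 1), "ppm")
print("Range of the century-averaged series:", round(np.ptp(smoothed), 1), "ppm")
```

The century-averaged series retains the millennial swing almost untouched while the multi-decadal excursions are knocked down to a fraction of their original amplitude.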

From my geological perspective, most climate “hockey sticks” are the result of the improper integration of high resolution instrumental data (akin to well logs) and lower resolution proxy data (akin to reflection seismic data).  Many of these hockey sticks appear to have been the result of a careless, if not reckless, disregard of basic signal processing principles.

Temperature Reconstruction Hockey Sticks

Figure 5. “Mike’s Nature Trick.” Older is toward the left.

One of the most egregious violations of signal processing principles is the generation of climate reconstruction “hockey sticks” through variations of “Mike’s Nature Trick.”

In the aftermath of the “Climategate” scandal, Penn State conducted an “investigation” of Mike’s Nature Trick.   The Penn State whitewash was ludicrous…

After careful consideration of all the evidence and relevant materials, the inquiry committee finding is that there exists no credible evidence that Dr. Mann had or has ever engaged in, or participated in, directly or indirectly, any actions with an intent to suppress or to falsify data.

RA-10 Inquiry Report

It can’t be proven that he intended to suppress or falsify inconvenient data. It’s entirely possible that he accidentally devised a method to suppress or falsify inconvenient data.

This bit here was laughable…

In fact to the contrary, in instances that have been focused upon by some as indicating falsification of data, for example in the use of a “trick” to manipulate the data, this is explained as a discussion among Dr. Jones and others including Dr. Mann about how best to put together a graph for a World Meteorological Organization (WMO) report. They were not falsifying data; they were trying to construct an understandable graph for those who were not experts in the field. The so-called “trick” was nothing more than a statistical method used to bring two or more different kinds of data sets together in a legitimate fashion by a technique that has been reviewed by a broad array of peers in the field.

RA-10 Inquiry Report

The most benign possible interpretation of the “trick” is that they edited part of Keith Briffa’s reconstruction because the tree ring chronology showed that the 1930s to early 1940s were warmer than the late 1990s. So, they just substituted the instrumental record for the tree ring chronology.  In the private sector, this sort of behavior isn’t benign… It’s grounds for immediate termination or worse.

I suppose that there is no evidence that they did this with intent to deceive.  However, the fact that they called it “Mike’s nature trick” sure makes it seem like this sort of thing was standard operating procedure.

Taking a set of data that shows that the 1930s were warmer than the 1990s and using another data set to reverse that relationship is not bringing “two or more different kinds of data sets together in a legitimate fashion.” It’s a total bastardization of the data.

To see an example of “Mike’s Nature Trick,” go here… Proxy-based reconstructions of hemispheric and global surface temperature variations over the past two millennia 

Click this… EIV Temperature Reconstructions

Open up any of the **cru_eiv_composite.csv or **had_eiv_composite.csv files. All of them splice the high frequency instrumental data into the low frequency proxy data. To Mann’s credit,  he at least documents this one enough to sort it out.

This statement from their PNAS paper is totally unsupported by proxy reconstructions… “Recent warmth appears anomalous for at least the past 1,300 years whether or not tree-ring data are used. If tree-ring data are used, the conclusion can be extended to at least the past 1,700 years.”

The anomalous nature of the “recent warmth” is entirely dependent on the “tricky” use of the instrumental data. He didn’t use any proxy data post-1850. The eivrecondescription file states “Note: values from 1850-2006AD are instrumental data”.

This image from Mann’s 2008 paper implies that all of the reconstructions are in general agreement regarding the claim that the “recent warmth appears anomalous for at least the past 1,300 years”…

Figure 6. “Spaghetti” graph from Mann et al., 2008. Older is toward the left.

By cluttering up the image with many reconstructions and plastering the instrumental record onto the end of the graph, it’s impossible to see any details.

Here are Mann (Cru_EIV), Moberg 2005, Christiansen & Ljungqvist 2012 (un-smoothed), Ljungqvist 2010 and Esper 2002 (low frequency)…

Figure 7. Clutter can work both ways. Older is toward the left.

Zoomed in on post-1800 with HadCRUT4 NH added:

Figure 8. Older is toward the left.

The Modern Warming only appears anomalous because of the higher resolution of the instrumental record and its position at the tail-end of the time series.

Ljungqvist (2010) clearly explained the problem with directly comparing instrumental data to proxy reconstructions.

The amplitude of the reconstructed temperature variability on centennial time-scales exceeds 0.6°C. This reconstruction is the first to show a distinct Roman Warm Period c. AD 1-300, reaching up to the 1961-1990 mean temperature level, followed by the Dark Age Cold Period c. AD 300-800. The Medieval Warm Period is seen c. AD 800–1300 and the Little Ice Age is clearly visible c. AD 1300-1900, followed by a rapid temperature increase in the twentieth century. The highest average temperatures in the reconstruction are encountered in the mid to late tenth century and the lowest in the late seventeenth century. Decadal mean temperatures seem to have reached or exceeded the 1961-1990 mean temperature level during substantial parts of the Roman Warm Period and the Medieval Warm Period. The temperature of the last two decades, however, is possibly higher than during any previous time in the past two millennia, although this is only seen in the instrumental temperature data and not in the multi-proxy reconstruction itself.

[…]

The proxy reconstruction itself does not show such an unprecedented warming but we must consider that only a few records used in the reconstruction extend into the 1990s. Nevertheless, a very cautious interpretation of the level of warmth since AD 1990 compared to that of the peak warming during the Roman Warm Period and the Medieval Warm Period is strongly suggested.

[…]

The amplitude of the temperature variability on multi-decadal to centennial time-scales reconstructed here should presumably be considered to be the minimum of the true variability on those time-scales.

Ljungqvist, 2010

Figure 9. Ljungqvist demonstrated that the modern warming had not unambiguously exceeded the range of natural variability. The bold black dashed line is the instrumental record. I added the red lines to reflect the margin of error of the proxy data relative to the instrumental data. Older is toward the left.

Direct comparisons of the modern instrumental record to the older proxy reconstructions are not robust because the proxy data are of much lower resolution. The proxy data indicate the “minimum of the true variability on those time-scales.” The instrumental data depict something much closer to the actual variability.

The proxy data lack the high frequency component of the signal.  Filtering out the high frequency component of a signal also attenuates its amplitude.

Figure 10. Sine wave with a 10-point moving average applied. Note the reduction in amplitude due to the filtering and smoothing. (Wood for Trees) Older is toward the left.
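The effect in Figure 10 is easy to reproduce. Here is a minimal sketch, assuming a unit-amplitude sine and a simple 10-point running mean rather than the exact Wood for Trees settings.

```python
import numpy as np

# Minimal sketch of the effect in Figure 10: a running mean attenuates the
# amplitude of any cycle that is not long compared to the averaging window.
# The sine and window are illustrative, not the exact Wood for Trees settings.

t = np.arange(0, 200)
period = 20.0                                   # samples per cycle
signal = np.sin(2 * np.pi * t / period)         # unit amplitude

window = 10
smoothed = np.convolve(signal, np.ones(window) / window, mode="valid")

print("Original amplitude :", round(np.max(np.abs(signal)), 3))
print("Smoothed amplitude :", round(np.max(np.abs(smoothed)), 3))

# Theoretical gain of an N-point running mean at period T:
# |sin(pi*N/T) / (N*sin(pi/T))|
gain = abs(np.sin(np.pi * window / period) / (window * np.sin(np.pi / period)))
print("Theoretical gain   :", round(gain, 3))
```

For a cycle only twice as long as the averaging window, the running mean knocks the amplitude down to roughly 64% of the original; the longer the window relative to the cycle, the more severe the attenuation.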

The direct comparison of instrumental data to proxy data becomes even more problematic when the record length is extended beyond 2,000 years.

Figure 11. Holocene Climate Reconstruction, Andy May WUWT. Older is toward the right.

The supposed “four warmest years on record” have occurred about 300 years after the coldest century of the past 100 centuries.  This could only be described as a “climate crisis” or “climate emergency” by someone who was totally ignorant of basic scientific principles, particularly Quaternary geology and signal processing.  It’s actually a helluva lot better than just about any other possible evolution of Earth’s “climate.”

The longer the record length of the reconstruction, the more important the consistency of the temporal resolution becomes.

“Consistency of the temporal resolution” means that the resolution of the older proxies is consistent with that of the more recent proxies. Temporal resolution is a function of the sampling interval (a point sketched out just after the quote below)…

We believe the greater source of error in these reconstructions is in the proxy selection. As documented in this series, some of the original 73 proxies are affected by resolution issues that hide significant climatic events and some are affected by local conditions that have no regional or global significance. Others cover short time spans that do not cover the two most important climatic features of the Holocene, the Little Ice Age and the Holocene Climatic Optimum.

[…]

We also avoided proxies with long sample intervals (greater than 130 years) because they tend to reduce the resolution of the reconstruction and they dampen (“average out”) important details. The smallest climate cycle is roughly 61 to 64 years, the so-called “stadium wave,” and we want to try and get close to seeing its influence. In this simple reconstruction, we have tried to address these issues.

Andy May WUWT.
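The 130-year cut-off follows from the sampling theorem: a proxy sampled every Δt years cannot resolve cycles much shorter than about 2Δt. A minimal sketch with illustrative sample intervals (130 years is the cut-off quoted above):

```python
# Minimal sketch relating a proxy's sample interval to the shortest cycle it
# can resolve (the Nyquist limit, roughly 2x the sampling interval). The
# intervals are illustrative; 130 years is the cut-off quoted above.

for sample_interval_yr in (25, 75, 130, 300):
    shortest_resolvable_cycle = 2 * sample_interval_yr
    print(f"sample interval {sample_interval_yr:>3} yr -> "
          f"shortest resolvable cycle ~{shortest_resolvable_cycle:>3} yr")
```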

This is a table of all of the “used” proxies. They have fairly consistent temporal resolution. They have long record lengths and most, if not all, cover the Holocene Climatic Optimum and Little Ice Age, the warmest and coldest climatic phases of the Holocene. It’s about as close to “apples to apples” as you can get with a >10,000-yr global temperature reconstruction. Andy’s proxies have an average resolution of 75 yrs and an average record length of 11,697 yrs with low standard deviations (by proxy series standards). There is no significant trend of degrading resolution with time, as occurs in most proxy reconstructions.

Figure 12. Temporal resolution (left axis) and record length (right axis). Older is toward the right.

Andy’s reconstruction demonstrates that the nadir of the Little Ice Age was the coldest climatic period of the Holocene. This is a feature of every non-hockey stick reconstruction and even most hockey stick reconstructions, including the serially flawed Marcott et al., 2013.  It also demonstrates that the modern warming is inconspicuous relative to the Holocene’s pervasive millennial-scale climate signal (Bohling & Davis, 2001).

If you open the Reconstruction References spreadsheet and go to the far right column (Comments), Andy notes whether the proxy was “used” or explains why it was rejected. The three most common reasons for rejecting proxy series were:

  1. Coarse resolution (denoted as “resolution too big”)
  2. Not old enough
  3. Not young enough

Andy could have spliced the instrumental record onto the end of this and made a hockey stick… But that would be fraudulent anywhere outside of academic and government “science”. It’s akin to splicing a well log into a seismic line and calling it an anomaly.

Regarding Marcott, the authors even state that their Holocene reconstruction can’t be directly compared to instrumental data due to resolution differences… Yet they do so anyway.

Q: What do paleotemperature reconstructions show about the temperature of the last 100 years?

A: Our global paleotemperature reconstruction includes a so-called “uptick” in temperatures during the 20th-century. However, in the paper we make the point that this particular feature is of shorter duration than the inherent smoothing in our statistical averaging procedure, and that it is based on only a few available paleo-reconstructions of the type we used. Thus, the 20th century portion of our paleotemperature stack is not statistically robust, cannot be considered representative of global temperature changes, and therefore is not the basis of any of our conclusions. Our primary conclusions are based on a comparison of the longer term paleotemperature changes from our reconstruction with the well-documented temperature changes that have occurred over the last century, as documented by the instrumental record. Although not part of our study, high-resolution paleoclimate data from the past ~130 years have been compiled from various geological archives, and confirm the general features of warming trend over this time interval (Anderson, D.M. et al., 2013, Geophysical Research Letters, v. 40, p. 189-193; http://www.agu.org/journals/pip/gl/2012GL054271-pip.pdf).

Q: Is the rate of global temperature rise over the last 100 years faster than at any time during the past 11,300 years?

A: Our study did not directly address this question because the paleotemperature records used in our study have a temporal resolution of ~120 years on average, which precludes us from examining variations in rates of change occurring within a century. Other factors also contribute to smoothing the proxy temperature signals contained in many of the records we used, such as organisms burrowing through deep-sea mud, and chronological uncertainties in the proxy records that tend to smooth the signals when compositing them into a globally averaged reconstruction. We showed that no temperature variability is preserved in our reconstruction at cycles shorter than 300 years, 50% is preserved at 1000-year time scales, and nearly all is preserved at 2000-year periods and longer. Our Monte-Carlo analysis accounts for these sources of uncertainty to yield a robust (albeit smoothed) global record. Any small “upticks” or “downticks” in temperature that last less than several hundred years in our compilation of paleoclimate data are probably not robust, as stated in the paper.

Real Climate

If “the 20th century portion of our paleotemperature stack is not statistically robust”… Why was it included in the publication? The modern instrumental record would be a single data point at the resolution of Marcott’s reconstruction.
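A back-of-envelope sketch of that last point, using the numbers from the Real Climate Q&A quoted above (~120-year mean proxy resolution, no variance preserved below 300-year periods) and an approximate instrumental span:

```python
# Back-of-envelope sketch. The resolution figures come from the Real Climate
# Q&A quoted above; the instrumental span is approximate.

instrumental_span_years = 2019 - 1850          # roughly 170 years of thermometer data
marcott_mean_resolution = 120                  # years, average proxy sampling
shortest_preserved_cycle = 300                 # years, per the authors

samples_at_proxy_resolution = instrumental_span_years / marcott_mean_resolution
fraction_of_shortest_cycle = instrumental_span_years / shortest_preserved_cycle

print(f"Instrumental era at 120-yr sampling: ~{samples_at_proxy_resolution:.1f} samples")
print(f"Fraction of the shortest preserved cycle spanned: ~{fraction_of_shortest_cycle:.2f}")
```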

Why does this matter?

So, what would it mean, if the reconstructions indicate a larger (Esper et al., 2002; Pollack and Smerdon, 2004; Moberg et al., 2005) or smaller (Jones et al., 1998; Mann et al., 1999) temperature amplitude? We suggest that the former situation, i.e. enhanced variability during pre-industrial times, would result in a redistribution of weight towards the role of natural factors in forcing temperature changes, thereby relatively devaluing the impact of anthropogenic emissions and affecting future predicted scenarios. If that turns out to be the case, agreements such as the Kyoto protocol that intend to reduce emissions of anthropogenic greenhouse gases, would be less effective than thought.

Esper et al., 2005

It matters because the only way to directly compare the instrumental data to the pre-industrial proxy data is to filter the instrumental data down to the resolution of the proxy data.  This leads to climate reconstructions with “enhanced variability during pre-industrial times,” which would “result in a redistribution of weight towards the role of natural factors in forcing temperature changes, thereby relatively devaluing the impact of anthropogenic emissions and affecting future predicted scenarios.”
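What “filtering the instrumental data down to the resolution of the proxy data” could look like in practice is sketched below. The annual anomaly series is synthetic, and a simple 75-year block average (roughly the average resolution of Andy’s proxies) stands in for a low-pass filter matched to any particular proxy.

```python
import numpy as np

# Minimal sketch of reducing an annual instrumental series to proxy resolution.
# The anomaly series is synthetic, and the 75-year block average is a crude
# stand-in for a proper low-pass filter matched to a specific proxy.

rng = np.random.default_rng(0)
years = np.arange(1850, 2020)
# Synthetic annual anomalies: a slow warming trend plus year-to-year noise.
anomalies = 0.008 * (years - years[0]) + rng.normal(0.0, 0.15, size=years.size)

proxy_resolution = 75                          # years
n_blocks = years.size // proxy_resolution      # whole blocks only
trimmed = anomalies[: n_blocks * proxy_resolution]
proxy_like = trimmed.reshape(n_blocks, proxy_resolution).mean(axis=1)

print("Annual series:  ", years.size, "points, range",
      round(np.ptp(anomalies), 2), "deg C")
print("At proxy scale: ", proxy_like.size, "points, range",
      round(np.ptp(proxy_like), 2), "deg C")
```

Averaged to proxy resolution, the apparent variability of the instrumental era shrinks dramatically, which is the whole point: an apples-to-apples comparison requires both series at the same resolution.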

It matters because the advocates of the Anthropocene as a geologic epoch are relying on the Marcott hockey stick.

Figure 13. Run Away! The Anthropocene Has Arrived!!! Older is toward the left.

It matters because hockey sticks are being used to justify policy changes and carbon taxes, and to destroy individual liberty and prosperity.

Most of the asserted evidence that recent climate changes deviate from the norms of the Holocene is equally consistent with being the result of differences in the resolution of paleo-climate data and instrumental records.

