Weekly Climate and Energy News Roundup #263

Brought to You by www.SEPP.org

By Ken Haapala, President, The Science and Environmental Policy Project

False Precision: In their early education, many students of science faced the problem of significant figures (digits). A useful rule of thumb was that a chain is only as strong as its weakest link: in measurement, the least precise instrument determines the precision of any dataset built from its readings. A mathematical operation does not add precision to the instruments, or to the dataset. For example, as discussed in the January 21 TWTW, the widely used Automated Surface Observing System (ASOS) instruments at airports have a precision of plus or minus 1 degree C (1.8 F; to one significant figure, 2 F). Surface datasets using these measurements cannot be more precise than the instruments. Yet, routinely, some government agencies report data, after mathematical manipulation, with far greater precision – to hundredths of a degree C. Such precision is false.
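
As a minimal illustration (our sketch, not from TWTW): arithmetic will happily produce a mean of such readings to two decimal places, even though the instrument supports only whole degrees.

```python
# Illustrative only: hypothetical ASOS-style readings, each good to
# +/- 1 deg C. The mean can be printed to hundredths of a degree,
# but the extra digits are not supported by the instrument.
readings = [14.0, 15.0, 16.0, 15.0, 14.0, 16.0, 15.0]  # deg C, hypothetical

mean = sum(readings) / len(readings)
print(f"Computed mean: {mean:.2f} deg C")            # 15.00 -- false precision
print(f"Honest report: {round(mean)} +/- 1 deg C")   # limited by the instrument
```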

Writing in the non-conservative Boston Globe, columnist Jeff Jacoby gives a simple illustration of how small measurement errors can compound in a computer model containing many of them. Any assumption that the errors will cancel each other out needs to be demonstrated. However, in the reports of the UN Intergovernmental Panel on Climate Change (IPCC) and its followers, such cancellation of errors is not demonstrated.
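
To make the compounding concrete, here is a toy sketch (ours, not Jacoby's): a model that applies a growth factor over many steps, with each step's factor perturbed by a small symmetric random error, drifts away from the error-free answer rather than having the errors neatly cancel.

```python
# Toy model, not any actual GCM: 1,000 steps, each with a small
# symmetric random error. The errors do not reliably cancel; the
# result drifts from the error-free answer by a few percent.
import random

random.seed(42)
STEPS, GROWTH = 1000, 1.001   # hypothetical step count and true factor

exact = perturbed = 1.0
for _ in range(STEPS):
    exact *= GROWTH
    perturbed *= GROWTH * (1 + random.uniform(-0.001, 0.001))

print(f"error-free result: {exact:.4f}")
print(f"with small errors: {perturbed:.4f}")
```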

As reported in the January 7 TWTW, Roy Spencer estimates that: “The resulting 2016 annual average global temperature anomaly is +0.50 deg. C, which is (a statistically insignificant) 0.02 deg. C warmer than 1998 at +0.48 deg. C. We estimate that 2016 would have had to be 0.10 C warmer than 1998 to be significantly different at the 95% confidence level. Both 2016 and 1998 were strong El Niño years.” See links under Challenging the Orthodoxy and Measurement Issues – Atmosphere.
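
Spencer's arithmetic can be checked directly; a minimal sketch using only the numbers quoted above (the 0.10 deg C threshold is his, not recomputed here):

```python
# Figures quoted from Spencer above; nothing is recomputed.
anomaly_2016 = 0.50   # deg C, UAH annual average
anomaly_1998 = 0.48   # deg C
threshold_95 = 0.10   # deg C needed for 95% significance, per Spencer

diff = anomaly_2016 - anomaly_1998
print(f"2016 minus 1998: {diff:.2f} deg C")
print("significant at the 95% level?", diff >= threshold_95)   # False
```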

###################################################

Quote of the Week. “Accurate and minute measurement seems, to the non-scientific imagination, a less lofty and dignified work than looking for something new. But nearly all the grandest discoveries of science have been but the rewards of accurate measurement and patient long-continued labour in the minute sifting of numerical results.” – William Thomson, Lord Kelvin

###################################################

Number of the Week: 7,500,000 tonnes of lead-acid batteries (8,250,000 tons)

###################################################

Avoiding Dogma: One is tempted to use Spencer’s analysis “to prove” that carbon dioxide (CO2) has no influence on global warming. For example, using December data to minimize seasonal variation, one can calculate that from December 1997 to December 2016, CO2 as measured at Mauna Loa Observatory went up 11%; yet, there was no significant increase in atmospheric temperatures. Then, one can assert that this “proves” that CO2 has no influence on temperatures. But such an assertion is as dogmatic as many used by the IPCC and its followers.
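
For readers who want to reproduce the 11% figure, a quick sketch using approximate Mauna Loa December monthly means (the ppm values below are our approximations; TWTW does not list them):

```python
# Approximate NOAA Mauna Loa December means -- our values, for
# illustration; small differences will not change the ~11% result.
co2_dec_1997 = 364.4   # ppm, approximate
co2_dec_2016 = 404.5   # ppm, approximate

pct = (co2_dec_2016 / co2_dec_1997 - 1) * 100
print(f"CO2 increase, Dec 1997 to Dec 2016: {pct:.0f}%")   # ~11%
```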

Two points provide insufficient data. The real issues are two-fold: 1) what will happen following the recent El Niño; and 2) are the frequency and severity of El Niños indicative of a CO2-caused warming?

To answer the first question, we must wait and see what happens. To answer the second question, much research needs to be done. The IPCC and its followers have dismissed El Niños as weather events, not indicative of climate change. A relationship between CO2 and El Niños needs to be established. If there is none, that should be said. See links under Measurement Issues – Atmosphere.

******************

Challenging the Kiehl-Trenberth Model: Last week’s TWTW introduced the Kiehl-Trenberth Annual Global Mean Energy Budget Model, which provides the core for most, if not all, current global climate models (GCMs) used by the IPCC and its followers.

What is of particular interest is the center of the diagram: the latent heat (or latent energy) from evapotranspiration and, to a lesser degree, from thermals. The latent energy comes from the phase change of water at the surface evaporating into water vapor; the energy is then released as heat when the water vapor condenses in the atmosphere. This gives rise to the so-called “hot spot,” which B. Santer et al. incorrectly called the “distinct human fingerprint.”

This latent energy from the phase change of water is critical to the 1979 Charney Report. It was estimated, “guessed at”, and no empirical evidence was provided. The Charney Report provided the basis for the claim that a doubling of CO2 will increase surface temperatures by 3 degrees C, plus or minus 1.5 degrees C. Except for an increase in the lower estimate in the Fourth Assessment Report (AR-4, 2007), which was discarded in the Fifth Assessment Report (AR-5, 2013), the estimates in the Charney Report have been used by the IPCC and its followers for over thirty-seven years, without improvement.
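
For scale, the Charney range can be restated using the standard logarithmic forcing approximation of Myhre et al. (1998) – a formula we add here for illustration; it is not cited in the Charney Report or in TWTW:

```python
# Standard logarithmic CO2 forcing approximation (Myhre et al. 1998),
# used here only to put the Charney range of 3 +/- 1.5 deg C into
# W/m^2 terms. This is our illustration, not TWTW's calculation.
import math

def forcing_w_m2(c_ppm, c0_ppm):
    """Approximate radiative forcing for CO2 rising from c0 to c (ppm)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

df = forcing_w_m2(560, 280)            # a doubling: ~3.7 W/m^2
for sens in (1.5, 3.0, 4.5):           # Charney range, deg C per doubling
    print(f"{sens} deg C per doubling -> {sens / df:.2f} deg C per W/m^2")
```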

Any error in these calculations may produce significant errors in climate models over time. In personal correspondence, the Australian meteorologist William Kininmonth wrote:

“…While the radiation aspects of the GCMs are relatively well specified from experimental physics and the well-known structure of the atmosphere, this is not the case for processes involving heat and latent energy. In the Kiehl and Trenberth assessment (and other like budgets) the surface heat and latent energy exchanges are not verifiable from physical measurements but are values giving closure to the steady state model. In modelling these processes the best that can be done is to resort to unverified bulk aerodynamic formula, modified somewhat arbitrarily for changing terrain. There is great uncertainty as to how, globally, the surface heat and latent energies might change with temperature. Yet these rates of change are crucial to understanding how surface temperature will change under changing CO2.

“Then there is the issue of feedbacks. Surface heat and moisture fluxes are functions of temperature and moisture gradient as well as surface temperature. How do these gradients vary as surface temperature increases? In the troposphere, convection distributes heat and moisture vertically and the overturning converts latent energy to heat that is available to offset radiation divergence. If the convective overturning is not sufficiently rapid, then moisture stays longer in the boundary layer thus inhibiting any increase in latent energy flux with temperature increase. From the discussion above it can be recognised that any inhibition to latent energy flux increase with temperature will suppress the overall rate of surface energy loss as temperature rises. A new steady state surface energy balance from an increase in CO2 (and hence increased back radiation) will require a greater fraction of the incremental increase in surface energy loss to be taken up by surface radiation emission – that is, a higher temperature for a new steady state condition. Within GCMs, on average, surface evaporation increases at about 2 percent per degree C. This is about one-third of the Clausius-Clapeyron relationship. This discrepancy does not seem to trouble modelers.

“Modellers are not able to maintain long term stability of their models without imposing predetermined conditions on aspects of the energy flows through the system. Initially this was via surface flux adjustment; that is, constraining the very energy flow that realises temperature sensitivity to CO2 forcing. Later models use different methodologies but the constraints to ensure long term steady state also impact on the sensitivity.”
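
Kininmonth’s “about one-third of the Clausius-Clapeyron relationship” is easy to sanity-check. Using Bolton’s (1980) saturation vapor pressure approximation (our choice of formula, not his), the Clausius-Clapeyron scaling near typical surface temperatures is roughly 6-7% per degree C, so the ~2% per degree C evaporation increase in GCMs is indeed about one-third of it:

```python
# Sanity check of the "about one-third" claim using Bolton's (1980)
# approximation for saturation vapor pressure -- our formula choice.
import math

def e_sat_hpa(t_c):
    """Saturation vapor pressure (hPa) at air temperature t_c (deg C)."""
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

t = 15.0   # a typical surface temperature, deg C
cc_pct = (e_sat_hpa(t + 1.0) / e_sat_hpa(t) - 1) * 100
print(f"Clausius-Clapeyron scaling near {t:.0f} C: {cc_pct:.1f}% per deg C")
print(f"GCM evaporation (~2%/deg C) is ~{2 / cc_pct:.2f} of that")   # ~1/3
```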

It is the lack of physical verification of the surface heat and latent energy values that may be the source of major problems with the GCMs, particularly in their inability to correctly forecast the warming of the atmosphere, which the Charney Report states will occur. That warming is the primary amplifier of a CO2-caused warming which, the Charney Report says, would alone be modest, based on laboratory experiments.

Independently, using spectral analysis, Antero Ollila of Aalto University, Helsinki, addresses these issues in Watts Up With That? His calculations of the warming contribution of CO2 were very close to those of Kiehl and Trenberth – 27% and 26%, respectively. But Ollila goes on to state:

“…There is only one small problem, because the water content of this atmosphere is really the atmosphere over the USA and not over the globe. The difference in the water content is great: 1.43 prcm versus 2.6 prcm. I have been really astonished by the reactions of the climate scientists to this fact. It looks like they do not understand the effects of this choice or they do not care…”

If Ollila’s assertions are correct, then there is a major problem with the global climate models. About 80% of the tropical surface is water. Water vapor dominates the tropical atmosphere but is far less prevalent over the U.S. Thus, the transfer of surface heat and latent energy through the phase change of water may be stronger in the atmosphere over the US than over the tropics, where this energy transfer is inhibited by atmospheric saturation: the moisture gradient over land in most of the US is far less than the moisture gradient in the tropics. Such a difference would result in a far greater estimate of CO2-caused warming than what will actually occur. It may be time to re-examine the entire effort to blame CO2 for possible dangerous global warming. See links under Challenging the Orthodoxy and Defending the Orthodoxy.
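
The size of the discrepancy Ollila flags is easily expressed in relative terms (the two precipitable-water values are taken from his quote above):

```python
# Relative difference between the two column water vapor values
# quoted from Ollila; no other data are assumed.
pw_us     = 1.43   # prcm, US-profile atmosphere, per Ollila
pw_global = 2.60   # prcm, global mean, per Ollila

print(f"Global column water vapor exceeds the US profile by "
      f"{(pw_global / pw_us - 1) * 100:.0f}%")   # ~82%
```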

******************

Peak Oil Peaked? In 1970, oil production in the US peaked at 9,637 thousand barrels per day (BPD), giving rise to government energy models predicting that the world would run out of oil near the end of the 20th century and that the US would run out of natural gas at about the same time. In 2007, US oil production bottomed at 5,077 thousand BPD. Thanks to advances in precision drilling and in hydraulic fracturing, the US Energy Information Administration is projecting that in 2018 US oil production will exceed that of 1970, at an estimated 9,700 thousand BPD.

These projections do not include the major finds in Alaska and possible major development in the Gulf of Mexico. See links under Oil and Natural Gas – the Future or the Past?

******************

Number of the Week: 7,500,000 tonnes of lead-acid batteries (8,250,000 tons). Recently, South Australia had several major blackouts that can be largely attributed to over-reliance on unreliable wind power. Yet, the government wants to go 100% renewable. On Jo Nova’s web site, Paul Miskelly and Tom Quirk presented their calculations of the least costly way to provide reliable back-up when wind power fails: it would require about 60 to 90 billion Australian dollars and about 7,500,000 tonnes (8,250,000 tons) of lead-acid batteries. No doubt, the government will tuck them away in Adelaide.
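
As a rough check on the scale of that number (a back-of-envelope with our own assumed figures, not Miskelly and Quirk’s worksheet): at a typical lead-acid energy density of about 35 Wh/kg, 7.5 million tonnes stores enough energy to carry an assumed 1.5 GW average South Australian load for roughly a week of windless weather.

```python
# Back-of-envelope only. The energy density and average load are our
# illustrative assumptions, not Miskelly and Quirk's actual inputs.
battery_mass_kg = 7_500_000 * 1000   # 7.5 million tonnes
wh_per_kg       = 35                 # typical lead-acid energy density
avg_load_gw     = 1.5                # assumed South Australian demand

storage_gwh = battery_mass_kg * wh_per_kg / 1e9
hours = storage_gwh / avg_load_gw
print(f"~{storage_gwh:.0f} GWh of storage -> ~{hours / 24:.0f} days of back-up")
```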

******************

ARTICLES:

1. E.ON Posts Biggest-Ever Loss

Uniper spinoff and funding of nuclear-waste storage left deep scars on the company’s balance sheet

By Zeke Turner, WSJ, Mar 15, 2017

https://www.wsj.com/articles/e-on-posts-biggest-ever-loss-1489566692

SUMMARY: E.ON was once Europe’s largest shareholder-owned electricity company. It bet poorly that, at 10% of market generation, wind power would become reliable.

The report states: “German energy giant E.ON SE reported Wednesday the biggest loss in its history, reflecting the scars of the country’s clean-energy revolution and nuclear phaseout.

“E.ON said its net loss deepened 21% in 2016 to €8.45 billion ($8.96 billion), or €4.33 a share, from a loss of €6.99 billion, or €3.60 a share, in 2015. Sales fell 11% to €38.17 billion from €42.65 billion.

“The company incurred restructuring costs last year when it spun off its conventional coal and gas power plants into Uniper SE, seeking to recast itself as a company refocused on forward-looking technologies such as wind, solar and grid infrastructure.

“But it hasn’t been able to shed liabilities from the past—chief among them, the massive risk surcharge connected to finding and building a safe space underground to bury spent uranium fuel rods from its nuclear power plants.”

Subsequently, the report states:

“Earnings across Germany’s energy industry have also suffered from the country’s low electricity wholesale prices, which in the course of 2016 dipped below half of what they were in 2011, around €50 per megawatt hour. After the nuclear disaster at Japan’s Fukushima power plant that year, Germany moved quickly to boost the capacity of the country’s solar and wind power-generating capacity, leading to an oversupply that has wreaked havoc on wholesale prices.

“The country has also pivoted to a leaner subsidies model that will force wind-heavy companies such as E.ON to compete over prices at auction instead of relying on a fixed system.”

***************

2. Trump Heads to Detroit as EPA Reviews Fuel-Economy Targets

Auto makers contend Obama administration targets will be difficult to meet

By Mike Spector, WSJ, Mar 14, 2017

https://www.wsj.com/articles/trump-to-head-to-detroit-as-epa-reopening-review-of-fuel-economy-targets-1489499059

“Donald Trump is waging a war on the environment,” said Sen. Edward Markey (D., Mass.) during a recent conference call. “Undoing the fuel-efficiency standards would harm consumers, harm our energy security and increase global warming pollution.” [Boldface added]

SUMMARY: The report states: “President Donald Trump rattled car executives for months by threatening a stiff border tax on Mexican imports and questioning their commitment to U.S. jobs.

“Now, he’s granting them a much desired reprieve—on fuel-economy regulations.

“Mr. Trump heads to Detroit Wednesday, a trip coinciding with the expected reversal of an 11th-hour Obama administration decision to lock in tougher targets for tailpipe emissions.

“The Environmental Protection Agency, after weeks of industry lobbying, plans to reopen a review of the regulations. The standards call for companies to sell vehicles averaging 54.5 miles a gallon, or roughly 40 mpg in real-world driving, by 2025. They would remain in place while under review.

“Auto makers contend the targets, which start toughening in 2022, will be difficult to meet with low gasoline prices steering consumers to higher-emitting and fuel-thirsty pickup trucks and sport-utility vehicles. The EPA found auto makers are capable of meeting the standards without relying too much on electric-car technologies, and that the rules would cut oil consumption and greenhouse-gas emissions, while saving consumers $92 billion at the fuel pump.”

[SEPP Comment: Many of the EPA’s “savings” are imaginary.]

“For car makers, reconsidering the review of emissions and fuel-economy standards opens the door to potentially rolling back costly environmental regulations after several companies were targeted in the last few months by Twitter missives from the president over their investments outside the U.S.

“Auto makers have spotlighted U.S. commitments, and in some cases changed foreign-investment plans. Mr. Trump in turn touted their moves, even some that had been long-planned and weren’t necessarily responses to his criticisms.

“At the same time, auto makers pushed to undo the EPA’s final determination, made a week before Mr. Trump was inaugurated, that locked in future emissions targets.

“The process hadn’t been expected to be completed until April 2018. The agency regulates tailpipe emissions and often expresses future targets in terms of fuel economy.”


With the Heartland Conference (ICCC-12) this week, there will be no TWTW next week. Due to other commitments that require refraining from public comments which may be misconstrued as suggesting policy, this TWTW is short and its comments restrained. Responses to correspondence will be limited. Thank you.
