Weekly Climate and Energy News Roundup #341

Brought to You by www.SEPP.org, The Science and Environmental Policy Project

By Ken Haapala, President

Quote of the Week: “On what principle is it that with nothing but improvement behind us, we are to expect nothing but deterioration before us?” – Lord Macaulay, [H/t Matt Ridley]

Number of the Week: Up to a 50% increase in efficiency?

On to Chile: Some seem to be disappointed with the outcome of the 24th Conference of Parties (COP-24) of the United Nations Framework Convention on Climate Change (UNFCCC) in Katowice, Poland, in a coal mining district. Rather than adopting hard, fast rules for the implementation of the Paris Agreement, the delegates adopted vague generalities and promised to do more. Reading through the “bureaucratic speak” of the concluding remarks by the UN Secretary General, António Guterres, read by Patricia Espinosa, Executive Secretary of the UNFCCC, the stated goals were not accomplished. The Secretary General wrote:

“I’d first like to thank the Presidency of the COP for the enormous efforts it deployed to organize this 24th session in Katowice, Poland.

“I also want to acknowledge the tireless work of Patricia Espinosa, the Executive Secretary of the UNFCCC [UN Framework Convention on Climate Change], as well as of her staff throughout this session.


“And I of course want to thank all Member States for their commitment and dedication which was once again demonstrated through countless long hours of work here during the last few days.


“Katowice has shown once more the resilience of the Paris Agreement – our solid roadmap for climate action.


“The approval of the Paris Agreement Work Programme is the basis for a transformative process which will require strengthened ambition from the international community. Science has clearly shown that we need enhanced ambition to defeat climate change.


“From now on, my 5 priorities will be: ambition, ambition, ambition, ambition and ambition.


“Ambition in mitigation. Ambition in adaptation. Ambition in finance. Ambition in technical cooperation and capacity building. Ambition in technological innovation.


“Ambition will be at the centre of the Climate Summit that I will convene in September 2019.


“And ambition must guide all Member States as they prepare their Nationally Determined Contributions (NDCs) for 2020 to reverse the present trend in which climate change is still running faster than us.


“It is our duty to reach for more and I count on all of you to raise ambitions so that we can beat back climate change.” [Boldface added]

Not only did the Secretary General not bother to attend, but he spoke of raising ambitions. For what purpose? Beating back climate change? Driving the world to the “ideal climate” of the Little Ice Age, when cold, wet European summers led to starvation for millions? If increasing carbon dioxide (CO2) caused this change, we all should be thankful. But it is unlikely that CO2 had a significant effect on global temperatures. Any effect must be established by hard evidence, not numerical models based on speculative assumptions. See links under Defending the Orthodoxy, After Paris, After Paris! – COP-24, and https://www.un.org/sg/en/content/sg/statement/2018-12-15/secretary-generals-remarks-the-conclusion-of-the-cop24-0


Five Policy Questions: On his blog, Roy Spencer, the co-developer of the method of measuring temperature trends from satellites, asks five big questions regarding government policy. Three questions deal with science policy regarding global warming / climate change; two deal with energy policy. The three on science policy are:

“1) Is warming and associated climate change mostly human-caused?

2) Is the human-caused portion of warming and associated climate change large enough to be damaging?

3) Do the climate models we use for proposed energy policies accurately predict climate change?”

To these three questions, TWTW would add: Which dataset more accurately and directly measures greenhouse gas warming – atmospheric temperature trends or surface temperature trends?

Spencer gives his answers and more fully describes them in his book “Global Warming Skepticism for Busy People.” He states: “Regarding the first question, I might concede it is indeed possible most of the warming since the 1950s is human-caused.”

SEPP Chairman emeritus Fred Singer may disagree with this concession by Spencer. He has written that the warming since about 1980 shown in the surface record is largely man-made in a different sense: it is artificial, a combination of poor surface data collection, changes in land use, and surface data manipulation. Further, surface data points cover only a very small part of the earth’s surface. Therefore, the surface dataset does not, for the most part, reflect global climate change, and it does not record the effects of CO2, which, at any rate, occur in the atmosphere.

Spencer discusses the growing discrepancy between satellite measurements and surface measurements. This discrepancy is one of the several examples Singer uses to assert that the late 20th century warming in the surface dataset is largely artificial.

Of the five questions, the two energy policy questions Spencer poses are:

“4) Would the proposed policy changes substantially reduce climate change and resulting damage?

5) Would the policy changes do more good than harm to humanity?”

In preparing an answer to these two energy policy questions, TWTW asks: What technology can replace hydroelectric, coal, and natural gas power plants to deliver the reliable electricity that modern civilization needs? The only solution today is nuclear. Wind and solar are inherently highly variable. The proposed, but not yet existent, solution to these uncontrollable energy sources is storage: batteries, compressed gas in underground caverns, flywheels, and the like. Yet whatever storage eventually becomes good enough for grid-scale use will always favor conventional power over variable power, because a storage system for wind and/or solar must be able to hold many days’ worth of energy.

By contrast, if additional conventional 24/7 power plants are installed to provide more than baseload power, the excess energy generated during times of low demand can be stored and released during times of high demand. Such a storage system would have to hold only about a quarter of one day’s demand, rather than several days’ or weeks’ worth.
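The storage-sizing argument above can be made concrete with a back-of-the-envelope sketch. All the inputs here (a 1,000 MW average demand, a 5-day wind/solar lull) are hypothetical assumptions chosen for illustration, not figures from TWTW:

```python
# Illustrative storage-sizing comparison (hypothetical numbers).
avg_demand_mw = 1_000                              # assumed average grid demand
daily_demand_mwh = avg_demand_mw * 24              # energy used per day

# Wind/solar case: storage must cover "many days" of calm or overcast
# weather. Assume a 5-day lull as one plausible design case.
lull_days = 5
storage_wind_solar_mwh = daily_demand_mwh * lull_days

# Conventional 24/7 case: storage only shifts energy within the day --
# roughly a quarter of one day's demand, per the text.
storage_conventional_mwh = daily_demand_mwh * 0.25

print(storage_wind_solar_mwh)                              # 120000
print(storage_conventional_mwh)                            # 6000.0
print(storage_wind_solar_mwh / storage_conventional_mwh)   # 20.0
```

Under these assumed numbers, the variable-power grid needs a storage system roughly twenty times larger than the conventional one, which is the point of the comparison.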

Spencer gives his response to these questions, see links under Challenging the Orthodoxy.


Deficiencies in the IPCC Special Report: Writing for the Global Warming Policy Foundation, Ray Bates, a former professor of Meteorology at the Niels Bohr Institute and former senior scientist at NASA’s Goddard Space Flight Center, discusses some of the deficiencies he found in the special report SR1.5, issued by the IPCC in October in the lead-up to COP-24. Among the deficiencies is that SR1.5 deviated from the IPCC Fifth Assessment Report (AR-5, 2013), the latest full assessment from the IPCC, without the evidence needed to justify the change.

“The central attribution statement of Working Group I in the Fifth Assessment was as follows:

‘It is extremely likely that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in greenhouse gas concentrations and other anthropogenic forcings together.’

“This statement did not necessarily attribute all the observed post-1950 warming to anthropogenic effects, nor did it attribute the substantial early 20th century warming (1910–1945) to such effects. In contrast to this caution, SR1.5 portrays all the global warming observed since the late 19th century as being human-induced (see Figure 1). This major departure from the Fifth Assessment is presented without any rigorous justification.”

Also, Bates takes exception to the frequently used term “Lines of evidence”:

“A phrase much used by the IPCC is ‘lines of evidence’. Implicit in this phrase is the realistic acknowledgement that there is much in climate science that is not known with certainty and which perhaps never can be. Something that is known with certainty is that the concentrations of carbon dioxide and other trace gases that trap heat radiation at infrared wavelengths in the atmosphere are increasing as a result of human activities. Balancing this, it is also known that the dominant greenhouse gas (GHG) in the atmosphere is naturally occurring water vapour, which is far more abundant than the trace gases. When air rises and cools, the water vapour it contains condenses into clouds, which affect the greenhouse properties of the atmosphere in major ways. Quantifying cloud radiative effects with precision, however, is beyond the capability of present-day climate science. For as long as this remains the case, modelling the climate system’s response to increasing GHG emissions will remain an area of uncertainty.”

Richard Lindzen has raised similar issues with the treatment of clouds and water vapor. Further, the concept of lines of evidence is used in US reports and in the EPA finding that CO2 and other greenhouse gases endanger human health and welfare. Lines of evidence are not hard evidence.

Bates discusses two papers, published after AR-5, which are ignored in SR1.5: 1) ‘Climate scientists open up their black boxes to scrutiny’, by Paul Voosen and 2) ‘The art and science of climate model tuning’, by Hourdin et al. Bates states:

“These papers point out that relatively small changes to the parameter settings in the representations of subgrid-scale physical processes in global climate models can lead to large changes in the models’ rate of warming in response to increasing GHGs. The Voosen paper reports an example in which modifying a poorly-determined parameter controlling how fast fresh air mixes into clouds changed the climate sensitivity – the equilibrium warming resulting from a doubling of carbon dioxide levels – of the model in question from 3.5°C to 7°C. The Hourdin et al. paper makes it clear that what modellers do in practice is to tune their models empirically so that they reproduce the observed 20th century warming, while giving a value of equilibrium climate sensitivity that lies in ‘an anticipated acceptable range’. It is known that there is no unique tuning that will give any particular desired result.”

Among the conclusions, Bates states:

“The SR1.5 report represents a very significant departure from previous IPCC reports in the direction of increased alarm regarding global warming, particularly as compared with the Fifth Assessment. No rigorous justification for this departure has been provided.

“In reality, since the Fifth Assessment considerable evidence has accumulated suggesting that global warming is more of a long-term threat than a planetary emergency. This evidence consists mainly of observational results suggesting lower climate sensitivity (i.e. less warming in response to any given increase in greenhouse gas concentrations) and results indicating a greater contribution from natural variability to explaining observed global temperature trends. The IPCC has not passed on this evidence to policymakers in its SR1.5 report.”

Some of the comments by Bates apply to US efforts as well. See links under Challenging the Orthodoxy.


Selling Fear: Michael Connolly, Ronan Connolly, and Imelda Connolly, together with Willie Soon and Patrick Moore, have issued a report on the Greenpeace business model, which one can interpret as selling fear. (Patrick Moore was a founder of Greenpeace and departed some years ago.) The model can be summarized as:

1. “Invent an ‘environmental problem’ which sounds somewhat plausible. Provide anecdotal evidence to support your claims, with emotionally powerful imagery.

2. Invent a ‘simple solution’ for the problem which sounds somewhat plausible and emotionally appealing, but is physically unlikely to ever be implemented.

3. Pick an ‘enemy’ and blame them for obstructing the implementation of the ‘solution’. Imply that anybody who disagrees with you is probably working for this enemy.

4. Dismiss any alternative ‘solutions’ to your problem as ‘completely inadequate’”.

According to the authors:

“This model has been very successful for them, with an annual turnover of about $400 million ($0.4 billion). Although technically a “not for profit” organization, this has not stopped them from increasing their asset value over the years, and they currently have an asset value of $270 million ($0.27 billion) – with 65% of that in cash, making them a cash-rich business. Several other groups have also adopted this approach, e.g., Sierra Club, Friends of the Earth, WWF and the Union of Concerned Scientists.”

The report goes through several Greenpeace campaigns, such as Climate Change, Forests & Oceans, and Plastics. The overall purpose of such campaigns is not to educate, but to persuade. Understanding and detail may get in the way of the goals of persuading others and raising money.

The entire report is a valuable guide to how organizations can profit by selling fear. The appendices give further detail, especially the highly influential writings of Chris Rose on “How to Win Campaigns.” Parts of the writings by Rose reminded Ken Haapala of the 1928 book by Edward Bernays, “Propaganda.” Bernays was called the “Father of Modern Advertising.” But after WWI, Bernays did not sell fear. See links under Challenging the Orthodoxy.


New EPA Rules: In the Federal Register, the EPA announced new rules for greenhouse gas standards for New, Modified, and Reconstructed Stationary Sources: Electric Utility Generating Units. These are particularly important for coal-fired utilities. The previous administration wrote the old rules to prevent construction of High Efficiency, Low Emission (HELE) coal-fired power plants. Besides being more efficient, HELE plants reduce emissions of nitrogen oxides (NOx), sulfur dioxide (SO2), and particulate matter (PM), as well as carbon dioxide.

Under the old rules, the limits on CO2 were so low that HELE plants could not be constructed in the US. Such plants are being constructed in Europe, China, Japan, and elsewhere, and the technology continues to improve. By operating the steam cycle at very high pressures and temperatures (supercritical and ultra-supercritical conditions), the temperature difference between the heat source and the sink (exhaust) is increased, raising the theoretical efficiency of the system (the Carnot cycle). See link under EPA and other Regulators on the March.
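The Carnot-cycle point can be illustrated with a short calculation. The temperatures below (roughly 540°C steam for an older subcritical plant, 650°C for an advanced ultra-supercritical plant, and a 30°C condenser-side sink) are illustrative assumptions, not plant specifications, and real plants achieve well below the Carnot ceiling:

```python
# Sketch of the Carnot limit: raising the source (steam) temperature
# raises the theoretical maximum efficiency. Temperatures are illustrative.

def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Theoretical maximum efficiency, eta = 1 - T_cold / T_hot (in kelvin)."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# Subcritical steam (~540 C) vs. advanced ultra-supercritical (~650 C),
# both rejecting heat near 30 C.
print(round(carnot_efficiency(540, 30), 3))  # 0.627
print(round(carnot_efficiency(650, 30), 3))  # 0.672
```

The Carnot figure is an upper bound; actual plant efficiencies (about 33% for the US fleet, approaching 50% for advanced ultra-supercritical designs, per the Number of the Week below) sit well under it, but they move in the same direction as the source temperature rises.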


Number of the Week: Up to a 50% increase in efficiency? In the US, coal-fired power plants have an efficiency of about 33%. Advanced Ultra-Supercritical power plants, such as GE Steam H with temperatures reaching 650°C (1200°F), are approaching 50% efficiency. If reached, this would be a relative increase in efficiency of about 51%.
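The arithmetic behind the Number of the Week, as a quick check (the 33% and 50% figures come from the text; the fuel-savings view is an added implication):

```python
# Going from ~33% to ~50% thermal efficiency.
old_eff = 0.33
new_eff = 0.50

# Relative increase in efficiency.
relative_increase = (new_eff - old_eff) / old_eff
print(round(relative_increase * 100, 1))   # 51.5 -> "about a 51% increase"

# The same gain viewed as fuel (and hence CO2) per unit of electricity:
# a 50%-efficient plant needs only 33/50 of the coal per kWh.
fuel_ratio = old_eff / new_eff
print(round((1 - fuel_ratio) * 100, 1))    # 34.0 -> ~34% less coal per kWh
```

In other words, the same headline gain can be read either as a 51% increase in output per unit of fuel or as a roughly one-third cut in coal burned (and CO2 emitted) per kilowatt-hour.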



1. How America Broke OPEC

Lessons from the U.S. rise to be the world’s largest oil producer.

Editorial, WSJ, Dec 14, 2018


SUMMARY: The word “Broke” may be too extreme. However, the power of OPEC has been greatly diminished. The editorial begins by reminding readers of “peak oil,” then continues with:

“…So much for that. The U.S. the other week for the first time in 75 years became a net petroleum exporter as the Organization of the Petroleum Exporting Countries wrangled over how to respond to America’s growing energy bounty.


“U.S. crude production has surged 20% in a year and nearly tripled in a decade thanks to advances in hydraulic fracturing and horizontal drilling. American output is rising at the fastest rate in a century. Earlier this year the U.S. eclipsed Saudi Arabia and Russia as the world’s largest oil producer.


“For nearly six decades OPEC has dominated oil markets by setting production quotas among its 15 members. In late 2014, OPEC flooded the market with oil in an effort to break U.S. drillers who were burning cash on mounds of debt. As oil prices fell below $40 a barrel in 2015-2016, many wildcatters folded or were absorbed by larger producers.


“But the survivors became more efficient. Technology—including drones with thermal imaging to detect leaks along with improvements in horizontal drilling—boosted productivity. Over the last five years production per rig has more than tripled in the Permian Basin and quadrupled in North Dakota’s Bakken Shale. While the Bakken rig count has fallen by 70%, output has increased by a third.


“Most American oil refineries have processed heavier crudes, which depressed prices for lighter, sweeter grades produced in the new wells. But in late 2015 the GOP Congress expanded shale-oil’s market by lifting the export ban on crude in return for Barack Obama’s demand to extend renewable energy tax credits. U.S. crude exports have since soared to 3.2 million barrels a day.


“Many U.S. producers say they can turn a profit at $50 a barrel and even as low as $30 in the Permian’s most productive regions. Yet most OPEC members need prices ranging between $70 and $90 per barrel to balance their budgets. The cartel scaled back output in 2016, but shale producers roared back as prices recovered. America’s shale gusher has presented a quandary for OPEC and especially its largest member, Saudi Arabia, which faces large budget deficits as it works to contain Iranian influence in the Middle East.


“Earlier this year, the Saudis obliged President Trump by increasing output to prevent prices from soaring with the reimposition of U.S. sanctions on Iran. Even so oil prices hit a four-year high in early October. But they have since declined 30% amid weakening world economic forecasts, sanctions exemptions and surging U.S. production.


“OPEC and Russia last week agreed to scale back production collectively by 1.2 million barrels a day, but the meeting exposed the cartel’s cracks. Qatar quit amid hostilities with the Saudis. Small producers carped they were too insignificant to affect global supply. Algeria produces one million barrels per day, which is as much as U.S. output has increased in five months.


“Saudi Arabia, Russia and allied producers agreed to shoulder the bulk of the cuts while Libya, Iran and Venezuela received exemptions. Some in the media claim the Saudis defied Mr. Trump’s pleas to keep oil prices low, yet U.S. shale producers are likely to benefit from OPEC’s cuts by capturing more market share.


“One of the biggest constraints on U.S. production has been a distribution bottleneck. Hence West Texas Intermediate now sells at a $8 to $9 discount to Brent crude on the world market. But next year three pipelines capable of delivering two million barrels of Permian crude to the Gulf Coast are expected to come online. In 2020 two more pipelines that can carry two million barrels a day are expected to be completed.


“Oil companies are also racing to build more export terminals to handle the supply gusher, which isn’t likely to stop anytime soon. The U.S. Geological Survey reported recently that the Permian’s Delaware Basin holds more than twice as much oil and 18 times as much natural gas as the heavier-drilled Midland region.


“Barack Obama, hilariously, is now claiming credit for the shale boom. ‘You know that whole suddenly America’s like the biggest oil producer . . . that was me, people,’ he said last month at Rice University. But drilling leases on federal land declined 28% during his two terms amid new restrictions on land use. Drilling skyrocketed on private land, despite attempts by his regulators to block pipelines, slow down approvals, and impose higher costs on production.


“The Trump Administration is expediting pipeline and terminal permitting and opening new federal land to drilling. Last year’s tax reform unlocked Alaska’s Arctic National Wildlife Refuge. The Interior Department recently scaled back needless Obama protections for the sage grouse, which will allow drilling on nine million acres in oil-rich states. Leases are being snapped up at auction, even in areas where recoveries are now low and expensive. As technology advances, many investors expect the break-even price of production to fall.


“Politicians in the past have sought to secure American energy independence with price controls, ethanol mandates and the oil export ban. But they and OPEC should note that America owes its new energy prosperity to industry innovation, private property, and the free market.”


2. Only Good Management Can Prevent Forest Fires

There’s nothing new about catastrophic blazes. It’s how nature has always dealt with overgrowth.

By Tom McClintock, WSJ, Dec 17, 2018



The California Congressman writes:


“Pundits and politicians have taken to calling the rising incidence of catastrophic wildfire ‘the new normal.’ But California’s experience in the 21st century is neither new nor abnormal. It is, in fact, the old normal. The devastation unfolding today is how nature manages forests. Like an untended garden, an abandoned forest will grow until it chokes itself to death. Nature deals with morbid overcrowding through drought, disease, pestilence and ultimately catastrophic wildfire.


“Scientists studying charcoal deposits in California estimate that prehistoric wildfires destroyed between 4.5 million and 11.9 million acres a year. When Juan Cabrillo dropped anchor in San Pedro Bay in October 1542 (the height of the Santa Ana fire season), he promptly named it the ‘Bay of Smoke.’


“Our modern sensitivities reel at the devastation of the Camp Fire, which recently incinerated 153,000 acres, wiped out the entire town of Paradise, and claimed at least 86 lives. Yet in 1910 the ‘Big Burn’ in Idaho and Montana consumed three million acres, wiped out seven towns, and killed 87 among a far smaller and sparser population.


“The U.S. Forest Service had formed only five years earlier, driven by scientific breakthroughs in the understanding of forest ecology. The first wave of American conservationists didn’t watch helplessly as the cycle of catastrophic overpopulation followed by catastrophic wildfire wiped out entire forests. Instead, they believed, management could keep forests healthy and resilient for generations.


“Excess timber comes out of a forest in two ways—it gets carried out or burned out. For much of the 20th century, harvesting excess timber produced thriving forests by matching tree density to the ability of the land to support it. Foresters designated surplus trees, and loggers bid for the right to remove them at auction, with the proceeds going to the U.S. Treasury. These revenues were then put back into forest management and shared with local communities.


“What went wrong? In the 1970s, Congress passed a series of laws subjecting federal land management to time-consuming and cost-prohibitive environmental regulations. Instead of generating revenues, forest management now costs the government money. As a result, timber harvested from federal lands has declined 80%, while acreage destroyed by fire has increased proportionally.


“A half-century of environmental regulation hasn’t helped the forests thrive. A typical acre in the Sierra can support roughly 80 mature trees, but the current density is more than 300 trees. A single fully grown tree can draw 100 gallons of water from the soil on a hot day. Drought kills overcrowded forests quickly.


“The environmental left blames climate change. Yet this doesn’t explain the dramatic difference between federal lands and private forests that practice scientific forest management. The boundary lines can often be seen from the air because of the condition of the forests. How clever of the climate to decimate only the lands hamstrung by these environmental laws.”


After discussing that trees absorb CO2, the Representative concludes:


“Today’s environmental laws have restored the old normal, making drought, disease, pestilence and fire a constant scourge of our forests. A healthy forest that is maintained and preserved for the enjoyment of future generations is an abnormal condition produced by modern forest management. Ironically, in the name of improving the environment, we have surrendered our forests to a policy of neglect, which, as it turns out, isn’t benign.”

