Weekly Climate and Energy News Roundup #309


By Ken Haapala, President, The Science and Environmental Policy Project

Brought to You by www.SEPP.org

California Litigation, General: The public nuisance lawsuits by San Francisco and Oakland against oil companies continue to attract attention from those interested in carbon dioxide (CO2)-caused global warming. Global warming has now been generalized into climate change, as promoted by John Holdren, President Obama’s science advisor. The change implies both warming and cooling, although no one has advanced a credible hypothesis for how carbon dioxide causes global cooling, other than by its absence.

Previous TWTWs discussed the filings by the two San Francisco Bay cities in the case, which is now before the US District Court for the Northern District of California. The judge has ordered the parties to give a tutorial answering eight specific questions. In addition, various parties have filed amicus curiae (friend of the court) briefs. Of particular interest for TWTW are two briefs: one filed on behalf of three distinguished physicists, Professors William Happer, Steven Koonin and Richard Lindzen; the second filed on behalf of Christopher Monckton, et al. This week, TWTW discusses the brief by the three professors, along with a minor, but valuable, criticism of the Monckton brief by Roy Spencer; it will discuss the Monckton brief more fully next week.

Also, a slide show was filed by Chevron on behalf of the defendants. That will be discussed briefly, emphasizing the strategy the oil companies appear to be taking. See links under Challenging the Orthodoxy.

********************

Quote of the Week. “It is error only, and not truth, that shrinks from inquiry.” –Thomas Paine [H/t William Readdy]

Number of the Week: Up 1.4%

********************

Three Professors: In their filing, Professors Happer, Koonin and Lindzen (Three Profs) accept the data used by the UN Intergovernmental Panel on Climate Change (IPCC) in its Fifth Assessment Report (AR5) and in the Climate Science Special Report (CSSR) by the US Global Change Research Program (USGCRP), and accept the evidence presented in those reports. However, they demonstrate that the conclusions in the reports are not established and are, at best, premature. Some skeptics of human-caused global warming may be disappointed by this approach because it ignores the fact that atmospheric data are far superior to, and more direct than, the surface data used in the reports.

But, from a legal standpoint, this may be a solid approach, because the Three Profs need not argue before a judge why satellite data is superior, or why global climate models are deficient. Instead, they assert:

“Our overview of climate science is framed through four statements:

“1. The climate is always changing; changes like those of the past half-century are common in the geologic record, driven by powerful natural phenomena

2. Human influences on the climate are a small (1%) perturbation to natural energy flows

3. It is not possible to tell how much of the modest recent warming can be ascribed to human influences

4. There have been no detrimental changes observed in the most salient climate variables and today’s projections of future changes are highly uncertain.”

To substantiate the first statement, the Three Profs note that, unfortunately, the common practice in climate science is to present graphs without uncertainty bars, which, when added, show significant uncertainties. Further, recent warm periods have been marked by El Niño conditions; yet recent warming is similar to early 20th-century warming. They also include the historic record and mention the last interglacial period (the Eemian), “when it was 2 C warmer than today and the sea level was 6 meters [20 feet] higher.”

To substantiate the second statement, the Three Profs use figures on energy flow and radiative forcing appearing in the CSSR (Figs. 2.1 & 2.3).

To substantiate the third statement, the Three Profs go into a bit of detail on the building of global climate models, bringing up the use of sub-grid-scale parameters. They state:

“While these subgrid-scale parametrizations can be based upon observations of weather phenomena, there is still considerable judgment in their formulation. So the models are not, as one often hears, ‘just physics’ since the parameters in each must be ‘tuned’ to reproduce aspects of the observed climate.”

They identify another major difficulty in model development, one that Fred Singer and others have called circular reasoning:

“A second major problem is that there is no unique tuning that reproduces the historical climate data. Since aerosol cooling plays against GHG warming, a model with low aerosol and GHG sensitivities can reproduce the data as well as a model with high sensitivities. As a result, the GHG sensitivity is today uncertain by a factor of three (as it has been for forty years), therefore enlarging the uncertainty in any projection of future climates.”

The IPCC and others have defended this faulty reasoning. To express the problem in algebra: the equation X minus Y equals 3 has an infinite number of solutions, because only the difference between X and Y is constrained; both terms can be extremely large or extremely small. If one is to use models to estimate the warming from greenhouse gases (GHG), one must independently establish the value for aerosols, which the climate modelers have failed to do.
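The under-determination can be shown in a few lines of code. This is a toy linear sketch with invented numbers, not any published climate model: two very different parameter pairs reproduce the same target equally well, so matching the historical record alone cannot pin down the GHG term.

```python
# A toy illustration (not any actual climate model) of non-unique tuning:
# if only the net warming X - Y is constrained, the individual terms
# X (GHG warming) and Y (aerosol cooling) are not determined.
# All numbers here are invented for illustration.

def net_warming(ghg_warming, aerosol_cooling):
    """Net response = GHG warming minus aerosol cooling (arbitrary units)."""
    return ghg_warming - aerosol_cooling

TARGET = 3.0  # the observed quantity both "models" are tuned to match

low_sensitivity = net_warming(ghg_warming=4.0, aerosol_cooling=1.0)
high_sensitivity = net_warming(ghg_warming=12.0, aerosol_cooling=9.0)

# Both tunings reproduce the target exactly, yet the implied GHG term
# differs by a factor of three (4 vs. 12) -- the uncertainty the brief cites.
assert low_sensitivity == TARGET == high_sensitivity
```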

Further, the Three Profs state:

“A third problem is that the models must reproduce the natural variabilities of the climate system, which we’ve seen are comparable to the claimed anthropogenic changes. Climate data clearly show coherent behaviors on multi-annual, multi-decadal, and multi-centennial timescales, at least some of which are due to changes in ocean currents and the interaction between the ocean and the atmosphere. Not knowing the state of the ocean decades or centuries ago makes it difficult to correctly choose the model’s starting point. And even if that were possible, there is no guarantee that the model will show the correct variability at the correct times.”

Establishing the starting point for human influence on the earth’s climate, a system dominated by two fluids with changing exposure to a changing sun and subject to other external influences, is a herculean task. The claim that the starting point is the start of the industrial revolution, or the start of a network of instrumental measurements (US, 1880s), is not sufficient.

To substantiate their fourth statement, the Three Profs cite the low confidence the IPCC assigns to its understanding of changes in weather events since 1951, including floods, droughts, severe weather events, cyclones, etc. One does not create certainty by compounding uncertainty. The Three Profs specifically discuss heat waves, sea level rise, and tropical cyclones. They conclude with:

“To summarize this overview, the historical and geological record suggests recent changes in the climate over the past century are within the bounds of natural variability. Human influences on the climate (largely the accumulation of CO2 from fossil fuel combustion) are a physically small (1%) effect on a complex, chaotic, multicomponent and multiscale system. Unfortunately, the data and our understanding are insufficient to usefully quantify the climate’s response to human influences. However, even as human influences have quadrupled since 1950, severe weather phenomena and sea level rise show no significant trends attributable to them. Projections of future climate and weather events rely on models demonstrably unfit for the purpose. As a result, rising levels of CO2 do not obviously pose an immediate, let alone imminent, threat to the earth’s climate.”

After this, the Three Profs follow with answers to the eight questions. It will be interesting to learn how the judge will address this amicus brief. See links under Challenging the Orthodoxy.

********************

Spencer – Use Clear Terminology: For years, meteorologist William Kininmonth of Australia has asserted that incorrect terminology is being used by the IPCC, which leads to vague or sloppy thinking. The issue of carbon dioxide-caused warming should be studied from the perspective of energy flows through the atmosphere. In assessing the amicus brief of Christopher Monckton, et al., Roy Spencer suggests the terms “feedback” and “forcing” are vague and poorly understood. They are not needed for describing the climate system or for modeling that system. The issue is energy flows: is the system in energy equilibrium?

Spencer asserts:

“…How modern 3D coupled ocean-atmosphere climate models work does not depend upon the feedback concept.

“What they DO depend upon is energy conservation: If the system is in energy equilibrium, its average temperature will not change (that’s not precisely true, because it makes little sense energetically to average the temperature of all ocean water with the atmosphere, and there can be energy exchanges between these two reservoirs which have vastly different heat capacities. Chris Essex has written on this). The point is that the total heat content of the system in Joules stays the same unless an energy imbalance occurs. (Temperature is focused on so intensely because it determines the rate at which the Earth sheds energy to outer space. Temperature stabilizes the climate system.)”

The climate models are essentially weather models, stabilized to run 100-year projections.

“Nowhere do the IPCC models invoke, use, assume, or otherwise depend upon any feedback equations. Those equations are just greatly simplified approximations that allow us to discuss how the climate system responds to an imposed energy imbalance. If somebody has published a paper that incorrectly explains the climate system with a feedback equation, that does not invalidate the models.” [Boldface added]

“Feedbacks in the IPCC models are diagnosed after the model is run; they are not specified before it is run.”
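Spencer’s point that temperature sets the rate at which the earth sheds energy to space can be illustrated with the textbook zero-dimensional energy balance: at equilibrium, absorbed sunlight S0(1 − albedo)/4 equals emitted longwave radiation. This is the standard classroom sketch, not the 3D coupled models Spencer describes; the effective emissivity of 0.61 is simply an assumed value chosen to recover the observed mean surface temperature of roughly 288 K.

```python
# Zero-dimensional energy balance: a body warms or cools until outgoing
# longwave radiation (emissivity * sigma * T^4) balances absorbed sunlight.
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2
ALBEDO = 0.30      # planetary albedo (fraction of sunlight reflected)

def equilibrium_temp(emissivity):
    """Temperature (K) at which emission balances sphere-averaged absorption."""
    absorbed = S0 * (1.0 - ALBEDO) / 4.0   # W m^-2, averaged over the sphere
    return (absorbed / (emissivity * SIGMA)) ** 0.25

t_bare = equilibrium_temp(1.0)         # blackbody earth: about 255 K
t_greenhouse = equilibrium_temp(0.61)  # effective emissivity tuned to ~288 K
```

Because emission grows as the fourth power of temperature, any energy imbalance drives the temperature back toward equilibrium, which is the stabilizing role Spencer describes.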

Embodied in Spencer’s comments is the issue of the starting point, discussed by the Three Profs (see above). According to NASA-GISS ice core data, CO2 levels increased from 285 parts per million (ppm) in 1850 to 311 ppm in 1950, or by 26 ppm over those 100 years. Yet there were pronounced warming periods, such as 1910 to 1940, similar to the late 20th-century warming, even though CO2 increased by only 11 ppm, compared with an increase of over 90 ppm from 1950 to today. (Note: there was a modest cooling from 1940 to about 1975, even though CO2 increased by 21 ppm.) There appears to be no strong justification for accepting any particular starting point. See links under Challenging the Orthodoxy and https://data.giss.nasa.gov/modelforce/ghgases/Fig1A.ext.txt
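The weak correspondence described above can be checked with quick arithmetic on the figures cited in the text (the ppm increments come from the paragraph; the period boundaries are approximate):

```python
# CO2 growth rates over the periods discussed above, using the ppm
# increments cited in the text (NASA-GISS ice core / instrumental series).
periods = {
    "1910-1940 (pronounced warming)": (11, 30),  # ppm increase, years
    "1940-1975 (modest cooling)": (21, 35),
    "1950-2017 (late warming)": (90, 67),        # "over 90 ppm" in the text
}

for label, (ppm, years) in periods.items():
    print(f"{label}: {ppm / years:.2f} ppm/yr")

# The cooling period (about 0.60 ppm/yr) saw faster CO2 growth than the
# early warming period (about 0.37 ppm/yr), which is the point being made.
```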

********************

Oil Company Strategy? The filing by BP, et al. included a two-part slide presentation for the Tutorial and another slide set giving a timeline of major developments in climate science. Some journalists reported that the oil companies were accepting the assertions of the IPCC. Skeptics may be disappointed by the presentation.

The presentation was divided into three parts, following the court’s Notice:

“The first part will trace the history of scientific study of climate change, beginning with scientific inquiry into the formation and melting of the ice ages, periods of historical cooling and warming, smog, ozone, nuclear winter, volcanoes, and global warming.”

This part included the covers of major reports by the IPCC and the USGCRP, a 1965 quote from President Lyndon Johnson that humans are changing the composition of the atmosphere, and the growth in publications on climate science and climate change. It included the early science of the greenhouse effect by Jean-Baptiste Joseph Fourier (1768-1830), as well as the growth in climate modeling.

“The second part will set forth the best science now available on global warming, glacier melt, sea rise, and coastal flooding.”

It presented slides on temperature variations, glacier melt, sea level rise, CO2 emissions, the spaghetti maze of climate model projections and a warning of dire sea level rise in San Francisco Bay Area. This part concluded with two slides:

“San Francisco Bond Disclosures (2017)”

“The City is unable to predict whether sea-level rise or other impacts of climate change or flooding from a major storm will occur, when they may occur, and if any such events occur, whether they will have a material adverse effect on the business operations or financial condition of the City and the local economy.”

“Oakland Bond Disclosures (2017)”

“The City is unable to predict when seismic events, fires or other natural events, such as sea rise or other impacts of climate change or flooding from a major storm, could occur, when they may occur, and, if any such events occur, whether they will have a material adverse effect on the business operations or financial condition of the City or the local economy.”

The presentation concluded with a Timeline of Climate Change Science showing that the greenhouse effect has long been known. The timeline showed that the IPCC AR5 and the 2014 USGCRP report were followed by the municipal bond disclosures cited above.

The federal courts may improperly defer to government agencies on issues of science. But, as discussed in the February 24 TWTW, attorney Richard Epstein asserts that the courts take fiduciary responsibilities seriously on issues regarding selling bonds to the public.

It is interesting to speculate on the comfort level the bond counsel and the bondholders have with the current litigation. See links under Litigation Issues – California Cities v. Oil Companies.

********************

Repeal Endangerment Finding: The group known as the Concerned Household Electricity Consumers Council (CHECC) submitted a proposal to the EPA in response to EPA’s request for comments on its rulemaking to replace the Obama Administration’s Clean Power Plan. CHECC repeated its petition to reconsider and repeal the Endangerment Finding. EPA veteran Alan Carlin summarizes the press release by CHECC. See links under Challenging the Orthodoxy.

********************

Externalities: In his blog, Energy Matters, energy analyst Euan Mearns is examining the external costs and benefits of various forms of generating electricity. He states:

“The economics term externality is a cost or benefit accrued by a third party from the actions of others where the third party did not choose to acquire said costs or benefits. The term has been widely adopted by the environmental lobby to describe negative impacts of energy production systems. What is all too often overlooked are the externalised benefits the same energy production systems provide. This post aims to summarise both internal and external costs and benefits of 12 electricity production systems employing 12 different measures.”

It was under this concept that economist Nicholas Stern calculated enormous, speculative costs to the British public from carbon dioxide emissions, promoting passage of the UK Climate Change Act of 2008. As usual, politicians and economists do not pay for their mistakes.

It appears that Mearns is making a rigorous effort to measure the costs and benefits of 12 electricity generating systems using 12 metrics. One metric that appears to be missing is the external benefit of increased carbon dioxide. As the NIPCC reports and CO2 Science show, agriculture, the environment, and humanity are greatly benefiting from increasing CO2. See links under Challenging the Orthodoxy – NIPCC, Review of Recent Scientific Articles by CO2 Science, and Energy Issues – Non-US.

********************

Neo-Colonialism? Writing in Master Resource, environmental scientist Vijay Jayaraj explains why, in his view, developing countries do not object to the Paris agreement and the weak science on which it is based: they fear damaging trade with the EU, which stipulates it will not ratify trade pacts with any country that does not ratify the Paris agreement. This is similar to what occurred after the US banned DDT and efforts were made to ban it worldwide; millions died of preventable malaria. See link under After Paris!

********************

Number of the Week: Up 1.4%. According to the International Energy Agency, CO2 emissions rose 1.4% in 2017, after being flat for three years. See links under Problems in the Orthodoxy.

********************

ARTICLES:

1. How Pennsylvania Slashed Coal Emissions Without Alienating Industry

New regulation gave power plants flexibility to cut smog-forming emissions in half while remaining efficient

By Kris Maher, WSJ, Mar 21, 2018

https://www.wsj.com/articles/how-pennsylvania-slashed-coal-emissions-without-alienating-industry-1521633601

SUMMARY: The journalist writes:

Coal-fired power plants in Pennsylvania cut smog-forming emissions by more than half last year, in a rare regulatory effort that has won support from both industry officials and environmentalists.

Emissions of nitrogen oxides, or NOx, at the state’s six power plants that burn newly mined coal exclusively fell 60% last year from the year earlier to 23,133 tons.

At the 50-year-old Keystone Generating Station in the rural southwestern part of the state, emissions of nitrogen oxides fell 54% to 6,095 tons last year from the prior year, even though the plant burned more coal in 2017.

Industry experts credit a state regulation for the reductions. The rule, which took effect in January 2017, lowered the rate at which power plants and other sources can emit NOx. For power plants, it requires use of a potentially costly pollution control, in which ammonia is injected to reduce NOx, but only during times of high power usage, when the method is most cost-effective.

Vince Brisini, director of environmental affairs at Olympus Power LLC, an owner of the big Keystone plant and another nearby coal-burning plant, said he hadn’t ever seen emissions fall so rapidly across a state. He also said the regulation provides enough flexibility for power plants to run cost-effectively.

“This regulation will not force any coal-fired units to retire,” Mr. Brisini said.

The article shows a chart indicating that nitrogen oxide emissions were highest in Pennsylvania among seven upper-midwest states in 2014 and have since dropped by more than two-thirds, leaving the state second to Illinois. Although the conversion from coal to natural gas helped in part, the journalist states further:

In Pennsylvania, a rule known as the Reasonably Available Control Technology II was implemented so that the state could meet the federal 2008 standard of 75 parts per billion of ozone within its own borders.

The Pennsylvania rule requires power plants to operate existing pollution controls more often in a way that factors in the chemistry of electricity generation and allows the plants to continue to run efficiently, say industry officials.

Under the rule, companies are operating at lower emissions rates at times of high demand, including on hot summer days when they are most likely to contribute to ozone, industry experts say. Before, companies were allowed to emit NOx at higher rates, and some had banked allowances enabling them to emit even more.

“The beauty of the rule that Pennsylvania adopted is that it found a way to strike a balance,” said Dave Flannery, legal counsel for the Midwest Ozone Group. He said requiring the use of controls that use ammonia during periods of low demand, for example, would have increased costs dramatically. The association’s members include power generation and industrial companies.

Of course, Northeastern states, which have eliminated coal-fired power plants and have high electricity costs, complain that Pennsylvania is not doing enough. The journalist concludes with a quote from Mr. Brisini:

“’This rule was written to achieve the necessary environmental outcome,’ he said. ‘It was not written with the idea that you want to force people out of business.’”

*****************

2. Fine-Tune Your B.S. Detector: You’ll Need It

In the digital age, misinformation—from nonsense to lies—spreads faster than ever and is becoming an area of serious research

By Elizabeth Bernstein, WSJ, Mar 19, 2018

https://www.wsj.com/articles/fine-tune-your-b-s-detector-youll-need-it-1521471721

“Do you have a good B.S. detector? You need one in our digital age.

“The skill of spotting false information—rubbish, nonsense and, yes, fake news—is so important these days that scientists have begun serious research on it. They’re attempting to quantify when and why people spread it, who is susceptible to it, and how people can confront it.

“This month in Atlanta, at the annual conference of the Society for Personality and Social Psychology, a group of psychologists and other scientists presented a symposium on their research. The title? “Bullshitting: Empirical and Experiential Examinations of a Pervasive Social Behavior.”

“B.S. is a form of persuasion that aims to impress the listener while employing a blatant disregard for the truth, the researchers explained. It can involve language, statistics and charts and appears everywhere from politics to science. This definition closely adheres to the one presented by the philosopher and Princeton emeritus professor Harry Frankfurt in his now-classic 2005 book “On Bullshit.” Dr. Frankfurt explored how B.S. is different than lying because liars know the truth and push it aside while B.S.ers don’t necessarily care about the truth at all.

“Of course this isn’t new. But false information moves faster and farther these days, thanks to social media. A new study conducted by researchers at Massachusetts Institute of Technology, published earlier this month in the journal Science, analyzed the spread of 126,000 rumors tweeted by 3 million people over more than 10 years and found that false news spreads faster than truth. “We have reached epidemic levels of information pollution, and we need to do something about it,” says Jevin West, a professor of information science at the University of Washington. Dr. West co-created a class launched last year at the university, “Calling Bullshit,” that teaches students how to spot and refute the way data, such as statistics and charts, can be manipulated to make false arguments. More than 60 schools have requested permission to use the materials to set up classes of their own, Dr. West says.

“Some people spread false information unknowingly. But others simply don’t care if what they’re posting is untrue, Dr. West says, and pass along the information as a way to signal their views and values to their group. Philosophers call this tribal epistemology.

“Website algorithms often favor salacious stories. (YouTube came under fire last month for the way its recommendation algorithm promotes conspiracy-theory videos aimed at viewers on both the left and the right.) And millions of bots—computer programs that can appear to be real people—also spread false information across the internet.

After some discussion, the journalist provides some tips, such as: Check the source; if it sounds too good to be true, it probably is. Ask questions. Ask for evidence. And pay attention to people who discount evidence, a red flag.

TWTW comment: The advice may apply for those wading through bureaucratic science as well.
