Weekly Climate and Energy News Roundup #311

By Ken Haapala, President, The Science and Environmental Policy Project

Brought to You by www.SEPP.org

Adjusting Data: SEPP Chairman emeritus Fred Singer has an essay in The Hill, a newspaper aimed at Capitol Hill, namely the US Congress. In the essay, Singer recognizes the importance of adjusting data, but there is a right way and a wrong way to go about it. Singer uses actions by the National Oceanic and Atmospheric Administration (NOAA) and its National Climatic Data Center (NCDC) to exemplify both ways. NCDC is now called the National Centers for Environmental Information (NCEI). It is entrusted with “preserving, monitoring, assessing, and providing public access to the Nation’s treasure of climate and historical weather data and information.” According to the American Institute of Physics, NOAA’s enacted budget for Fiscal Year 2018 is $5.9 billion, up 4% from FY 2017.

Singer believes that NOAA and NCDC (NCEI) have received both justified and unjustified criticism for their treatment of “the Nation’s treasure.” He writes:

“It is important for the public to gain some perspective on such changes before indulging in wild accusations. Equally important, NOAA must use more transparency and not only announce data adjustments, but explain them so that reasonable people of goodwill will understand.”

He explains that adjustments to data for moving weather stations are completely understandable, but adjustments made to data immediately prior to the Paris Conference are not; they were an effort to explain away “the pause.” Further, Singer believes that the warming shown in the US record in the last decades of the 20th century and the beginning of the 21st century is unjustified and is inconsistent with other records of surface temperature.

See links under Challenging the Orthodoxy, Defending the Orthodoxy, https://www.ncdc.noaa.gov/, and https://twitter.com/FYIscipolicy/status/982372591309590529/photo/1?ref_src=twsrc%5Etfw&ref_url=https%3A%2F%2Fwww.aip.org%2Ffyi%2F2017%2Fwhite-house-proposes-deep-funding-cuts-noaa

******************

Quote of the Week. “The ultimate measure of a man is not where he stands in moments of comfort and convenience, but where he stands in times of challenge and controversy.” Martin Luther King Jr.

Number of the Week: 0.013%

******************

Adjusting Data Correctly: According to Singer, to maintain public confidence and assure accuracy when adjusting data, an organization needs to be transparent and to use all appropriate available data to cross-check the work. In short, frequent testing is needed. It appears that the team at the Earth System Science Center (ESSC) of the University of Alabama in Huntsville (UAH) has done that.

Some time ago UAH reported that the satellite identified as NOAA-14 (operating in the 1990s and 2000s) was giving questionable readings. This led to spurious warming trends in the four datasets compiled from satellite data: UAH, RSS (Remote Sensing Systems), NOAA, and UW (University of Washington). Orbital drift is a problem with all satellite measurements, but the problem appeared to be worse with NOAA-14 than with any other satellite.

As the ESSC team was preparing the latest version of the UAH dataset, they recognized that the problem with NOAA-14 was becoming severe and implemented two important changes: they stopped using data from NOAA-14 in 2001, and they implemented an algorithm (a defined computational procedure) to limit the error from readings by other satellites. [Algorithms are useful in correcting errors if the errors are consistent; if they are not, their use is doubtful.] A key to realizing there was a problem with NOAA-14 was that it was showing greater warming than NOAA-15, which was launched in the late 1990s.
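A minimal sketch of the idea, using synthetic numbers rather than the ESSC team’s actual algorithm: when two satellites overlap in time, the difference between their readings exposes a consistent relative error, which can then be modeled and removed.

import numpy as np

rng = np.random.default_rng(0)
months = 36                               # hypothetical 3-year overlap
t = np.arange(months)
signal = 0.01 * t                         # shared true anomaly, deg C

noaa15 = signal + rng.normal(0, 0.05, months)                 # stable reference
noaa14 = signal + 0.002 * t + rng.normal(0, 0.05, months)     # spurious warm drift

# The difference series isolates the relative error; a consistent (here,
# linear) error can be fitted and removed. An erratic error cannot.
diff = noaa14 - noaa15
coef = np.polyfit(t, diff, 1)             # fitted offset and drift
corrected = noaa14 - np.polyval(coef, t)
print(f"spurious drift removed: {coef[0] * 120:.3f} C/decade")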

After they built the latest dataset (Version 6.0), the ESSC team analyzed their earlier version as well as the other three datasets. In this analysis they used datasets from U.S. weather balloons that had not changed software or instruments for the major part of the NOAA-14 record. These data came from NOAA, the University of Vienna, and the University of New South Wales.

These data were coupled with global weather “reanalysis” data from the European Centre for Medium-Range Weather Forecasts, the Japan Climate Research Group, and NASA. Reanalysis data are very important in correcting and updating weather forecasts and, hopefully, in improving numerical weather models. This effort gave the ESSC team weather balloon data from 564 stations around the world.

The work was published in the International Journal of Remote Sensing and is available online. Based on their analysis, the global (near-global) atmospheric warming trend is 0.07 to 0.13°C per decade, and the tropical (20°S-20°N) trend is +0.10 ± 0.03°C per decade. This is where the distinct human fingerprint, so prominently emphasized by Mr. Santer in the Second Assessment Report (AR-2) of the UN Intergovernmental Panel on Climate Change (IPCC), should be.

The importance of the study is illustrated in one sentence:

“The satellite-monitored layer for this study is commonly referred to as the mid-troposphere (TMT) because the peak of energy received by the satellite originates in the mid to upper troposphere with the main weighting function (96% of the signal) going from the surface to 70 hPa.” [To about 60,000 feet, or 18,000 meters.]

The authors estimate that the warming trend of the (nearly) global bulk atmosphere layer is 0.10 ± 0.03°C per decade. This is where the greenhouse effect occurs. See links under Challenging the Orthodoxy.
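For readers interested in the mechanics, here is a minimal sketch of how a linear trend and its uncertainty are commonly estimated from a monthly anomaly series by ordinary least squares. The series below is synthetic; the paper’s ±0.03°C figure comes from real satellite, balloon, and reanalysis records.

import numpy as np

rng = np.random.default_rng(1)
n_months = 456                            # roughly Dec 1978 - Dec 2016
t_years = np.arange(n_months) / 12.0
anomaly = 0.010 * t_years + rng.normal(0, 0.15, n_months)   # 0.10 C/decade + noise

slope, intercept = np.polyfit(t_years, anomaly, 1)
resid = anomaly - (slope * t_years + intercept)
# Standard error of the slope, ignoring autocorrelation (which real
# analyses must account for; it widens the uncertainty).
se = np.sqrt(resid.var(ddof=2) / ((t_years - t_years.mean()) ** 2).sum())
print(f"trend: {slope * 10:.3f} +/- {2 * se * 10:.3f} C/decade (2-sigma)")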

******************

What Does It Mean? The analysis by the ESSC team is useful in evaluating the work of the IPCC, the US Global Change Research Program (USGCRP), and their followers. For that reason, one can expect that it will be ignored by those entrenched in Bureaucratic Science. The 40-year icon of CO2-caused global warming, from the Charney Report, is that a doubling of CO2 will result in a warming of 3°C, plus or minus 1.5°C. The data record discussed above runs from 1979 (December 1978) to December 2016, during which time the atmospheric CO2 concentration rose from 335 ppm to 404 ppm, an increase of 20.6%. This works out to 0.54% per year over the 38-year record. At that rate it would take about 190 years to double CO2.

Assuming the entire calculated warming trend of 0.1°C per decade is from CO2 and from indirect influences caused by CO2 (positive feedbacks such as water vapor), the 38-year trend indicates that a doubling of CO2 would require almost 190 years and cause a warming of less than 2°C. Given natural variation, the warming trend is hardly cause for alarm.
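The arithmetic can be checked directly. A minimal sketch in Python, assuming (as the text does) a simple linear rate of CO2 increase rather than compound growth:

c0, c1, years = 335.0, 404.0, 38          # ppm in Dec 1978 and Dec 2016
rise_pct = 100 * (c1 - c0) / c0           # -> 20.6%
rate_pct_per_yr = rise_pct / years        # -> 0.54% per year
ppm_per_yr = (c1 - c0) / years            # -> about 1.8 ppm per year
years_to_double = c0 / ppm_per_yr         # -> about 184, i.e. "about 190 years"

trend_per_decade = 0.1                    # deg C per decade, from the ESSC analysis
warming_at_doubling = trend_per_decade * years_to_double / 10
print(rise_pct, rate_pct_per_yr, years_to_double, warming_at_doubling)
# warming_at_doubling is about 1.8 deg C, i.e. less than 2 deg C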

The issue is further complicated because the earlier part of the atmospheric record includes a cooling from aerosols emitted by volcanoes, and the later part includes a warming from elevated sea surface temperatures during El Niños. These effects can be removed from the data using widely accepted statistical techniques such as those employed by Wallace, et al. Using such techniques would reveal that the supposedly dangerous, “unprecedented” atmospheric warming is insignificant.
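As an illustration of the kind of statistical adjustment described above (a sketch, not Wallace et al.’s actual procedure), one can regress a temperature series on an ENSO proxy and remove that component before computing the trend. All series below are synthetic stand-ins:

import numpy as np

rng = np.random.default_rng(2)
n = 456                                   # months, roughly Dec 1978 - Dec 2016
t = np.arange(n) / 12.0                   # time in years
enso = np.sin(2 * np.pi * t / 4.0)        # toy stand-in for an ENSO index
temp = 0.005 * t + 0.2 * enso + rng.normal(0, 0.1, n)

# Fit temp = a*t + b*enso + c, then subtract the fitted ENSO component.
X = np.column_stack([t, enso, np.ones(n)])
a, b, c = np.linalg.lstsq(X, temp, rcond=None)[0]
adjusted = temp - b * enso
print(f"raw trend:      {np.polyfit(t, temp, 1)[0] * 10:.3f} C/decade")
print(f"adjusted trend: {a * 10:.3f} C/decade")

Real analyses would also include a volcanic aerosol index among the regressors.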

Twenty years ago, Richard Lindzen wrote a paper questioning whether CO2 could cause significant warming. In his conclusion, he stated:

“Indirect estimates, based on response to volcanos, suggest sensitivity may be as small as 0.3–0.5°C for a doubling of CO2, which is well within the range of natural variability.”

Of course, the paper has been largely ignored by bureaucratic scientists who prefer to alarm the public rather than accurately discuss science. See links under Challenging the Orthodoxy.

******************

Exxon’s Secret Science Revealed? A reporter for Legal News Line, Daniel Fisher, discovered Exxon’s “secret science” claimed in the Public Nuisance complaint by the cities of Oakland and San Francisco against oil companies. It involved a group formed by the American Petroleum Institute called the Global Climate Coalition (GCC). The plaintiff (city) lawyers claimed a memo by the GCC contained evidence of scientific knowledge that Exxon was trying to hide from the public. According to the reporter, the secret was a summary of the IPCC AR-2 (1995), which included Mr. Santer’s mysterious distinct human fingerprint, the hot spot, which is still missing, or barely visible, even 20 years later.

Former SEPP Chairman, the late Frederick Seitz, called the process involved in inserting the terminology of a distinct human fingerprint the worst abuse of the peer review process he had seen in 60 years of American science. Numerous times, Fred Singer and TWTW have brought up the missing hot spot. As stated in the new ESSC paper, a weak hot spot, regardless of cause, indicates that the earth is less sensitive to greenhouse gases than claimed by the IPCC:

“Lower trends would suggest relatively modest sensitivity of the climate system to extra greenhouse gas forcing, while higher trends would support greater sensitivity. Thus, the trend magnitudes are critical, for example, to understanding the response of the climate system to enhanced forcing. In this paper, our focus will be on the credibility of the datasets and associated trends, suggesting reasons for their differences and offering a best estimate based on multiple, independent efforts.”

A search of the docket for the District Court in the case did not reveal an amended filing by San Francisco (perhaps it had not been entered yet), but it did reveal an amended filing by Oakland. The statements of the reporter are supported. Among other things, the amended filing stated:

“In February 1996, an internal GCC presentation summarized findings from the 1995 IPCC Second Assessment report and stated that the projected temperature change by 2100 would constitute ‘an average rate of warming [that] would probably be greater than any seen in the past 10,000 years.’” [Note that the rate of warming from 20,000 to 10,000 years ago is more interesting.]

Is one to assume that by quoting the IPCC, one accepts its highly questionable science? The amended complaint further states: [The numbers are paragraph numbers.]

110 “Between 1998 and 2014, Exxon paid millions of dollars to organizations to promote disinformation on global warming. During the early- to mid-1990s, Exxon directed some of this funding to Dr. Fred Seitz, Dr. Fred Singer, and/or Seitz and Singer’s Science and Environmental Policy Project (“SEPP”) in order to launch repeated attacks on mainstream climate science and IPCC conclusions, even as Exxon scientists participated in the IPCC. Seitz, Singer and SEPP had previously been paid by the tobacco industry to create doubt in the public mind about the hazards of smoking. Seitz and Singer were not climate scientists.”

111 “Exxon’s promotion of fossil fuels also entailed the funding of denialist groups that attacked well-respected scientists Dr. Benjamin Santer and Dr. Michael Mann, maligning their characters and seeking to discredit their scientific conclusions with media attacks and bogus studies in order to undermine the IPCC’s 1995 and 2001 conclusion that human-driven global warming is now occurring.”

SEPP strongly demands that the Cities produce evidence for their accusations, which SEPP asserts are false.

The filing concludes with maps showing projected sea level rise in 2050. They show the Oakland Airport and parts of the wealthy community of Alameda underwater. Has the City informed the bond holders and the residents that their securities and homes may be unsaleable because they will be submerged? See links under Litigation Issues – California.

******************

In Defense of Oakland and San Francisco: To substantiate their claims, the cities used Myles Allen of the Environmental Change Institute, School of Geography and the Environment, and a Professor in the Department of Physics at the University of Oxford. He is a long-time advocate of climate alarm, an author of IPCC reports, and involved with the Attribution of Climate-related Events (ACE) initiative.

A few slides in his presentation stand out:

· “Evidence that a detectable signal was not needed to make predictions.” [One can make predictions based on nothing.]

· “Warming is unequivocal.” [See Singer’s comments above.]

· “Formal comparison of expected responses to known drivers (‘fingerprints’) allowed the null-hypothesis of negligible human influence to be rejected at the 95% confidence level (P<0.05) back in the 1990s.”

· “Human-induced warming is now 1°C ± 0.15°C, about 80% due to CO2.”

A number of slides addressed sea level rise, including increasing ice loss in the cryosphere.

On Climate Etc., Judith Curry summed up her review of the evidence that sea level rise is accelerating with the statement: “The concern about sea level rise is driven primarily by projections of future sea level rise.”

In short, the fears of dire global warming, accelerating sea level rise, etc. are based on global climate models that fail basic tests of their ability to predict atmospheric temperatures. Apparently, even the Maldives, the poster child of dire sea level rise, are no longer concerned: they are building a runway to handle the world’s largest passenger plane. See links under Challenging the Orthodoxy, Defending the Orthodoxy, Changing Seas, and Below the Bottom Line.

******************

Reverse Osmosis for Tidewater? The fear of salt water intrusion in coastal communities plays into the hands of those who use highly questionable models to claim major sea level rise in the future. For some US communities, entities of NOAA produced studies greatly exaggerating this risk while ignoring the real problem: the land is sinking due to ground water extraction. By ignoring the real threat and promoting a false one, these entities of NOAA have demonstrated that they no longer serve the public interest or the public benefit.

In the past, coastal communities had one alternative for solving the problem of sinking land from ground water extraction: build reservoirs and pipelines, either nearby or at considerable distance. For example, Los Angeles and New York City rely on piped water. An alternative has now been developed that appears to be cost-effective: desalination using reverse osmosis.

The concept has been around for some time, but reverse osmosis can be expensive for desalination because algae thrive in the filters, forcing expensive replacements. The Israelis are widely expanding desalination through reverse osmosis by using porous lava rock to prefilter the water, removing the algae. Carlsbad, California, implemented a similar system, using layers of anthracite (hard coal), gravel, and sand for filtration. After over a year of operation, the operating costs seem reasonable: under $7 per 1,000 gallons. See links under Other News that May Be of Interest.
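For comparison with costs quoted elsewhere, a quick unit conversion, assuming the figure is as reported:

cost_per_1000_gal = 7.0        # dollars, reported upper bound
gal_per_m3 = 264.172           # US gallons in one cubic meter
cost_per_m3 = cost_per_1000_gal / 1000 * gal_per_m3
print(f"~${cost_per_m3:.2f} per cubic meter")   # about $1.85 per cubic meter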

******************

He Must Go! EPA Administrator Scott Pruitt has been a lightning rod for criticism of President Trump, as if Mr. Trump needed one. Mr. Pruitt has declared that the EPA will no longer base regulations on science that is not transparent (secret science), has eliminated the long-established practice of channeling moneys from government litigation against private companies to politically favored special interest groups, and is now proposing to soften the fuel economy standards for automobiles, a relic of the fear of running out of oil. No wonder many green politicians and green organizations are demanding he must go. See Article # 1 and links under Change in US Administrations.

******************

Junk Electricity? Power expert Donn Dears has been writing essays on how wind and solar are disrupting the electrical grid. The problems stem from government mandates and the auction system under which the grid operator accepts delivery of electricity. The term dispatchable is important: it means that the grid operator can dispatch, that is transmit, electricity to the consumer when needed. Driven by nature, wind and solar power are not dispatchable (though efforts are being made to make industrial solar dispatchable in appropriate regions).
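A toy illustration, not any real market’s dispatch rules: the operator must schedule controllable plants, in merit order, to cover whatever demand remains after weather-driven wind and solar output. All numbers are hypothetical.

demand = [900, 950, 1000, 1100]           # MW, by hour
wind_solar = [300, 50, 400, 120]          # MW, weather-driven, not dispatchable

# Dispatchable fleet in merit order: (name, capacity in MW)
fleet = [("nuclear", 400), ("gas_cc", 400), ("gas_peaker", 300)]

for hour, (d, ws) in enumerate(zip(demand, wind_solar)):
    residual = max(d - ws, 0)             # load the operator must cover
    schedule = {}
    for name, cap in fleet:
        take = min(cap, residual)
        schedule[name] = take
        residual -= take
    status = "OK" if residual == 0 else f"SHORTFALL {residual} MW"
    print(hour, schedule, status)

If wind and solar output drops, the residual load can exceed the dispatchable fleet’s capacity; the operator, not the weather, must make up the difference.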

The grid can be looked upon as an energized system serving all on it, consumers and producers alike. It is similar to a nervous system serving all the organs in the human body. Solar and wind power do not reliably serve all the consumers and producers on the grid. Thus, they can be looked upon as a form of junk food: pleasing to some senses, but neither needed by nor beneficial to the entire system. They can be called junk electricity. See links under Energy Issues – US.

******************

Number of the Week: 0.013%. Another dire emergency of sea level rise from the “collapsing” West Antarctic ice sheet arose this week from a paper in Nature Geoscience. As stated in previous TWTWs, the “collapse” is in geological terms of thousands of years, and the ice sheet sits on the Antarctic fault / rift with over 100 geothermal hot spots (active or inactive volcanoes). It is similar to the mid-Atlantic rift that runs through Iceland, providing hot springs, geothermal warming, and electricity to that island. On her web site, Jo Nova calculated that the area of ice discussed in the latest crisis amounts to 0.013% of Antarctica’s ice-covered area. The fraction by volume is far smaller: what is smaller than minuscule? See link under Changing Cryosphere – Land / Sea Ice.
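A rough check, assuming an Antarctic ice-sheet area of about 13.7 million square kilometers (the 0.013% figure is Jo Nova’s, not computed here):

ice_area_km2 = 13.7e6          # assumed total ice-covered area of Antarctica
fraction = 0.013 / 100
print(f"{fraction * ice_area_km2:,.0f} km^2")   # about 1,800 km^2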

******************

ARTICLES:

1. Coffee Won’t Kill You, But CAFE Might

While these downsized cars are more fuel-efficient, they are also less crashworthy.

By Sam Kazman, WSJ, Apr 4, 2018

https://www.wsj.com/articles/coffee-wont-kill-you-but-cafe-might-1522880249

The general counsel of CEI argues against the federal corporate average fuel economy (CAFE) standards because people are being killed. He writes:

“On Monday EPA Administrator Scott Pruitt announced he is re-examining the stringent standards set by the Obama administration in 2012. This might finally bring some honesty to the issue of CAFE’s lethal effects and push the safety issue to the forefront of the debate over government efficiency mandates. Or it might not.

“To call it a coverup isn’t hyperbole. CAFE kills people by causing cars to be made smaller and lighter. While these downsized cars are more fuel-efficient, they are also less crashworthy. In 1992 in Competitive Enterprise Institute v. NHTSA, a lawsuit my organization brought with Consumer Alert, a federal appeals court ruled that the agency had “obscured the safety problem” through a combination of “fudged analysis,” “statistical legerdemain” and “bureaucratic mumbo-jumbo.” In the court’s view, nothing in the record “appears to undermine the inference that the 27.5 mpg standard kills people.”

“How many people? A 1989 Harvard-Brookings study estimated the death toll at between 2,200 and 3,900 a year. Similarly, a 2002 National Academy of Sciences study estimated that CAFE had contributed to up to 2,600 fatalities in 1993. This was at a relatively lenient CAFE level of 27.5 miles per gallon. Under what the Obama administration had in store, CAFE would soon approach levels twice as stringent.

“These inconvenient truths should have led the government to change its approach to CAFE. At least the standards didn’t get worse for about a decade throughout the 1990s, despite environmentalist demands for a stricter—and therefore more lethal—approach. But then CAFE was swept up in climate-change politics.

“Advocates of stringent standards claim that automotive technologies have advanced since that 1992 court ruling, making vehicle mass less significant. But the basic relationship between size and safety has not changed. The Insurance Institute for Highway Safety, which closely monitors crashworthiness, still provides the same advice it has been giving for years: “Bigger, heavier vehicles are safer.”

“CAFE advocates like Consumer Reports treat lighter cars as merely a question of comfort, not crashworthiness. Car makers and dealers may express concerns about safety in the abstract, but considerations of politics and marketing make them hesitant to discuss hard numbers.

“In his announcement, Mr. Pruitt proved admirably blunt in characterizing the Obama CAFE standards as based on “politically charged expediency” and assumptions “that didn’t comport with reality.” Let’s hope he’ll be similarly candid about CAFE’s risks. A lethal program that’s been in effect for decades deserves one thing above all—an accounting.”

******************

2. ‘Meltdown’ Review: Flirting With Disaster

Thanks to dense networks and the complacency of groupthink, small glitches can cascade into catastrophic failures. David A. Shaywitz reviews “Meltdown” by Chris Clearfield and András Tilcsik.

By David A. Shaywitz, WSJ, Apr 2, 2018

https://www.wsj.com/articles/meltdown-review-flirting-with-disaster-1522708270

SUMMARY: Clearfield and Tilcsik are, respectively, a former derivatives trader and a business school professor. They wrote about the failures at Three Mile Island and of financial firms using computer-driven trading. The authors state that such failures stem from two variables:

“The first is complexity: the extent to which a system is linear and observable (like an assembly line) or interconnected and largely invisible. The second is coupling: the degree to which a system possesses ‘slack’—allowance for the time and flexibility to manage problems. Our determination to increase complexity and wring out inefficiencies, the authors warn, moves us into a danger zone and set us up for calamity.”

The reviewer continues:

“Avoiding disasters requires more than just speaking up, of course. ‘Dissent makes no difference if no one listens,’ Messrs. Clearfield and Tilcsik remind us, but listening proves to be difficult. When your beliefs are challenged, the authors write, your body reacts as if you suddenly spotted a wild animal—’your heart beats faster and your blood pressure rises.’ This physiological response aligns with decades of psychological studies showing subjects adjusting their opinions to match the group.

“Managers who overcome these instincts and encourage divergent opinions, the authors say, are more likely to avoid disaster. After a National Transportation Safety Board study of airplane crashes revealed that most happened when the more senior pilot was in command—generally because the less experienced officer was reluctant to point out any errors he observed—a new training program focused on changing hierarchical cockpit norms was launched and ultimately embraced by the industry.

“Perhaps the most important lesson of ‘Meltdown’ is captured by a series of studies performed by Evan Apfelbaum and his colleagues at MIT. They found that, as much as we’re predisposed to agree with a group, our willingness to disagree increases dramatically if the group is diverse; deference dissipates. Homogeneity may facilitate ‘smooth, effortless interactions,’ according to the authors, but diversity drives better decisions.

“Organizations can keep themselves honest by leveraging ‘strangers,’ individuals who understand enough to be relevant but are removed enough to see things differently. NASA’s Jet Propulsion Laboratory, for example, embeds designated outsiders—JPL engineers with an independent line of reporting—within teams to provide a fresh perspective and flag problems.

“Despite occasionally having the feel of a business journal article that was extended against its will into a book-length text, ‘Meltdown’ effectively conveys why addressing systemic failures is both difficult and essential: difficult because it’s so much more comfortable to rely on gut instinct and trust familiar colleagues than to insist on structured approaches and solicit the views of others; essential because we are moving into the danger zone and need all the help we can get.”
