Weekly Climate and Energy News Roundup #324

Brought to You by www.SEPP.org The Science and Environmental Policy Project

By Ken Haapala, President

Group Think-Bureaucratic Science: Last week’s TWTW discussed Judith Curry’s review of a rather remarkable paper by retired MIT professor Carl Wunsch, who participated in the 1979 report “Carbon Dioxide and Climate: A Scientific Assessment,” headed by Jule Charney. The findings of the Charney Report have become the core reasoning for the UN Intergovernmental Panel on Climate Change (IPCC), the US Global Change Research Program (USGCRP), and many US government actions, including the EPA’s illogical finding that carbon dioxide endangers human health and welfare.

Yet the Charney Report presents no hard evidence that carbon dioxide (CO2) will cause warming beyond the modest amount demonstrated in numerous laboratories since the 1920s. Instead, the Charney Report relies on speculative findings from five modeling groups, using the extremes as the upper and lower bounds of its estimate. The characteristics of the participants described by Wunsch, as reported by Curry, exemplify what has been termed groupthink. They include:

1) Tremendous self-confidence, leading to a sense of entitlement and of belonging to an elite community of experts;

2) An unusually monolithic community, with a strong sense of consensus, whether driven by the evidence or not, and an unusual uniformity of views on open questions;

3) A disregard for and disinterest in the ideas, opinions, and work of experts who are not part of the group;

4) A tendency to interpret evidence optimistically, to believe exaggerated or incorrect statements of results and to disregard the possibility that the theory might be wrong. This is coupled with a tendency to believe results are true because they are ’widely believed,’ even if one has not checked (or even seen) the proof oneself; and, perhaps most importantly,

5) The failure to constantly test the basic tenets, instead to work hard to buttress them.

These characteristics were revealed in the Climategate emails and are found in government-funded climate science, on which the US has spent over $40 billion according to government reports, while strong, hard evidence contradicting the findings of the Charney Report has been largely ignored. See last week’s TWTW, and links under Challenging the Orthodoxy and Defending the Orthodoxy.

***************

Quote of the Week: “The image of a dangerous world has never been broadcast more effectively than it is now, while the world has never been less violent and more safe.” – Swedish statistician, the late Hans Rosling in “Factfulness” [H/t Roger Pielke, Jr. See Article # 1]

Number of the Week: 50 Million Gallons a Day

***************

Saving the World: As if it were a follow-up, on August 1 the New York Times Magazine issued a digital special edition, with photographs and videos, on a segment of the history of climate science titled “Losing Earth: The Decade We Almost Stopped Climate Change.” This illustrates, probably unintentionally, the extent to which groupthink has spread into segments of the public.

Decades of laboratory experiments show that all atmospheric gases affect the flow of radiant energy from the sun to the earth (including its atmosphere), and from the earth to space. Those that slow the flow of infrared radiation, which is not visible, from the earth to space are called greenhouse gases; they warm the planet, particularly at night. By far, water vapor is the most important greenhouse gas in both concentration and absorption efficiency. Carbon dioxide is a minor greenhouse gas in both concentration and absorption efficiency. It has long been recognized that the absorption ability of CO2 is limited and that, even at pre-industrial levels, CO2 was approaching saturation, where the effect of any increase in concentration is insignificant. (A graph of heating effect versus CO2 concentration is highly logarithmic, becoming almost horizontal at current CO2 levels; a rough illustration of that shape follows.)
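
As a rough illustration of that logarithmic shape (this sketch is not from the Charney Report or the NYT piece; it uses the commonly cited simplified approximation for CO2 radiative forcing, ΔF ≈ 5.35 ln(C/C0) W/m², with illustrative round-number concentrations):

```python
import math

def co2_forcing_wm2(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing relative to a reference concentration
    (about 280 ppm pre-industrial), using the common approximation
    dF = 5.35 * ln(C / C0), in watts per square meter."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Illustrative concentrations: pre-industrial, roughly current, one doubling, two doublings
for c in (280, 410, 560, 1120):
    print(f"{c:5d} ppm -> {co2_forcing_wm2(c):4.2f} W/m^2")

# Each doubling adds the same ~3.7 W/m^2, so each added ppm contributes less
# than the one before -- the flattening curve described in the text.
```

The point of the sketch is only the shape of the curve, not any particular sensitivity estimate.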

The NYT’s presentation opens with pictures of melting ice, destroyed homes (perhaps by a tornado), and flooded residences, interspersed with black screens bearing the following captions:

· Thirty years ago, we had a chance to save the planet.

· The science of climate change was settled.

· The world was ready to act.

Then it gives an editor’s note by Jake Silverstein:

“This narrative by Nathaniel Rich is a work of history, addressing the 10-year period from 1979 to 1989: the decisive decade when humankind first came to a broad understanding of the causes and dangers of climate change. Complementing the text is a series of aerial photographs and videos, all shot over the past year by George Steinmetz. With support from the Pulitzer Center, this two-part article is based on 18 months of reporting and well over a hundred interviews. It tracks the efforts of a small group of American scientists, activists and politicians to raise the alarm and stave off catastrophe. It will come as a revelation to many readers — an agonizing revelation — to understand how thoroughly they grasped the problem and how close they came to solving it.” [Boldface added]

The history discusses some of the early participants in the realization that industrialization was increasing global CO2 concentrations. But it fails to discuss the laboratory evidence that CO2 is a minor greenhouse gas. Instead, it focuses on the participants in the Charney Report, particularly modelers such as Jim Hansen. The Charney Report does not present observational data, evidence; it presents speculation from models. As Wunsch later wrote:

“From one point of view, scientific communities without adequate data have a distinct advantage: one can construct interesting and exciting stories and rationalizations with little or no risk of observational refutation.”

The NYT narrative then goes into politics and how the usual villains are sabotaging “true science.” Those curious about the personalities involved may find it worth reading. But the skeptic may ask: why does this history stop in 1989?

One possible answer is that March 1990 saw the publication of a paper by Roy Spencer and John Christy, “Precise Monitoring of Global Temperature Trends from Satellites.” Initially, the paper was well received, and Spencer and Christy were given significant honors. But soon the climate establishment realized that the atmospheric temperature trends did not support the modeling in the Charney Report, which had been adopted by the IPCC, formed in 1988. And, driven by the fear of carbon dioxide-caused warming, the US government was opening the spending floodgates for “climate science.” Spencer and Christy were shunned, their work vilified and “discredited.” Small errors in the orbit calculations were found (the satellites had no thrusters to maintain precise orbits); these errors were acknowledged and promptly corrected, as rigorous science requires. Yet editors of western journals, such as Science, declared they would no longer accept articles questioning the established science – a science built on speculation, not evidence.

The NYT’s history of climate science is reminiscent of the formation of modern science, natural philosophy, particularly astronomy. The heliocentric system of planetary motion proposed by Copernicus (1473-1543), in place of an earth-centered system, had no hard evidence substantiating its adoption. Yet those who formed modern science had immense patience in observation and great boldness in forming hypotheses. And they understood the necessity of testing those hypotheses. Decades after Copernicus, Galileo (1564-1642) accepted the heliocentric system and turned the newly invented telescope to the heavens to collect data substantiating the hypothesis. But his views were limited by the concept that planetary orbits must be perfect circles. Using the observations of Tycho Brahe (1546-1601), Galileo’s contemporary Kepler (1571-1630) greatly simplified the heliocentric scheme by developing three laws of planetary motion, including elliptical orbits.

All these men were vilified for questioning views held since ancient times. It fell to Newton (1643-1727) to bring together the views of Galileo and Kepler, showing in his “Mathematical Principles of Natural Philosophy” that the same laws of motion apply to objects on earth and to celestial bodies. Newton improved on the telescope, which he used to test his hypotheses against evidence. Of course, Newton was criticized for this work, and he famously stated:

“I can calculate the motion of heavenly bodies, but not the madness of people.”

If the NYT’s piece, “Losing Earth: The Decade We Almost Stopped Climate Change,” is to be considered a history, it is best likened to a history of astronomy before the telescope and careful observations. See links under Challenging the Orthodoxy and Defending the Orthodoxy.

***************

Frederick Seitz Memorial Award: During the luncheon of The Heartland Institute’s “America First Energy Conference 2018,” on August 7 at the Hilton New Orleans Riverside Hotel, SEPP will be honored to present the Frederick Seitz Memorial Award to Dr. Roy Spencer. Spencer was a Senior Scientist for Climate Studies at NASA’s Marshall Space Flight Center when he and Dr. John Christy received NASA’s Exceptional Scientific Achievement Medal for their global temperature monitoring work with satellites. (Note the change in schedule.)

Few deserve an award for exceptional courage in the quest for knowledge as much as Roy Spencer. (John Christy received the award in 2016.) We thank him for his important work. See the commentary above; for conference information see http://americafirstenergy.org/

***************

Need For Both – Models and Observations: Consulting Meteorologist Anthony Sadar has an article in the Washington Examiner emphasizing that modeling is a vital part of modern science and may be replacing the traditional “scientific method,” which consists of observation, hypothesis, and testing, with rigorous testing of a hypothesis eventually leading to a “theory.”

Sadar points out that mathematical modeling is essentially an investigative tool that can greatly benefit our understanding of complex nature. But the trend may be disturbing. Sadar quotes from “A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming” by Professor Paul Edwards:

“…a supporter of the ‘consensus’ view of climate change, asserts that ‘Everything we know about the world’s climate — past, present, and future — we know through models.’ He also notes that ‘without models, there are no data.’

 

“Models have become integral to modern scientific practice. In many fields, Edwards says ‘computer models complement or even replace laboratory experiments; analysis and simulation models have become principal means of data collection, prediction, and decision making.’”

One is tempted to ask: In what decade does Professor Edwards dwell? As explained above, since 1990 we have had data on atmospheric temperature trends, with the record going back to late 1978, almost 40 years. These data directly contradict the atmospheric temperature trends calculated by all but one of the many climate models used by the IPCC. Further, these data show that the models greatly overestimate temperature trends in the critical tropics at every elevation from the surface to about 60,000 feet (18 km).

Models are important to modern scientific research. But they must be rigorously tested against physical evidence, not similar models. If they test poorly, they have little value in prediction. And elaborate schemes built on such models are little better than elaborate schemes of planetary motion developed by medieval scientists to justify the earth-centered universe. See links under Challenging the Orthodoxy, especially the 2018 paper by Christy, et al. p. 3600, fig. 18.

***************

Southern California’s Problem: The fear of carbon dioxide and the fear of nuclear energy are putting the economic future of Southern California in doubt. The region appears headed toward dramatically higher electricity rates. A significant portion of the electricity delivered to the Los Angeles Department of Water and Power (LADWP) comes from the Navajo Generating Station, which the Obama administration stated needed significant upgrades. LADWP has withdrawn from a plan to make the upgrades, probably with the intent of obtaining power elsewhere, but where? Wind and solar are not reliable sources.

The nuclear power plants in southern California are scheduled to be closed, giving greens and politicians their dream of making California nuclear-free. What is not discussed is how this dream affects the major interconnector transmitting power between Southern California and the Pacific Northwest, the Pacific DC Intertie, which takes advantage of the differing power demand patterns of the two regions.

During winter, the northern region needs power for heating, while consumption is low in the southern region. During summer, the power needs of the north are reduced while the south needs power for air conditioning. There is also a day-night exchange to balance out nuclear power generation at night, when there is an excess. Effectively, the hydropower of the Columbia River system acts as a pumped-storage system for southern California via the Pacific DC Intertie. The line capacity is 3,100 megawatts, almost half the peak capacity of the LADWP electrical system and enough to serve almost three million households, as the rough check below suggests.
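
A back-of-the-envelope check on the “almost three million households” figure (my arithmetic, assuming an average household load on the order of 1 kilowatt; not an LADWP calculation):

```python
# Rough sanity check of the households-served figure quoted above.
intertie_capacity_mw = 3_100      # Pacific DC Intertie capacity, megawatts
avg_household_load_kw = 1.0       # assumed average household load, kilowatts

households = intertie_capacity_mw * 1_000 / avg_household_load_kw
print(f"~{households:,.0f} households")   # ~3,100,000, i.e. roughly three million
```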

The power sent north comes from nuclear power plants scheduled to be closed, and it lessens the demands on the largest hydroelectric system in the US, operated by the Bonneville Power Administration. How the loss of power from the south will affect operations is not clear.

Now, the LADWP has proposed using Hoover Dam as a pumped-storage system for solar and wind power. How states such as Wyoming, Utah, and Colorado, which have drainage systems feeding the Colorado and are parties to the 1920s Colorado River Compact, will react is anyone’s guess. Further, the Colorado River once flowed mightily into the Sea of Cortez, but no more. How Mexico will react is another open question. Roger Andrews provides an analysis of the practicality of the LADWP proposal. See link under California Dreaming.

***************

Number of the Week: 50 Million Gallons a Day. The Carlsbad Desalination Plant has announced that it delivers nearly 50 million gallons of fresh, desalinated drinking water per day to the people of San Diego County. It uses Israeli-designed technology that prefilters the water with inexpensive material to remove algae, which can clog the expensive filters used in reverse osmosis to remove the salt.

As Roger Bezdek has written, Tidewater Virginia, home to the largest naval base in the world, is suffering from land subsidence. The primary cause is groundwater extraction from the Virginia coastal plain aquifers. The total extraction is about 120 million gallons per day, a little more than twice the drinking water delivered by the Carlsbad plant.

It would make far more sense to address the problem directly by using desalination to eliminate groundwater extraction for urban uses than to pretend the problem is caused by fossil fuels, the elimination of which will do nothing; the rough arithmetic below indicates the scale involved. See links under Other News that May Be of Interest.
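
A back-of-the-envelope comparison of the two figures quoted in this item (my arithmetic only, not from Bezdek or the Carlsbad announcement):

```python
import math

carlsbad_output_mgd = 50        # Carlsbad plant delivery, million gallons per day
tidewater_extraction_mgd = 120  # Virginia coastal plain groundwater extraction, million gallons per day

ratio = tidewater_extraction_mgd / carlsbad_output_mgd
print(f"Extraction is {ratio:.1f} times Carlsbad's output")               # 2.4 times
print(f"About {math.ceil(ratio)} Carlsbad-scale plants could replace it")  # 3
```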

**********************

ARTICLES:

1. Some Good News—About Natural Disasters, of All Things

In half a century, the average number of annual fatalities declined more than 80%.

By Roger Pielke Jr., WSJ, Aug 3, 2018

https://www.wsj.com/articles/some-good-newsabout-natural-disasters-of-all-things-1533331596

The professor of environmental studies at the University of Colorado, Boulder, whose most recent book is “The Rightful Place of Science: Disasters and Climate Change” (CSPO, 2018), continues to use data, hard evidence, to counter those who predict ever-increasing natural disasters, usually attributed to increased CO2. After introducing the above Quote of the Week, that the earth is becoming safer, Pielke writes:

“…A case in point: natural disasters. The earth will always be volatile, but despite recent fires, volcanoes and hurricanes, humanity currently is experiencing a stretch of good fortune when it comes to disasters.

“It’s difficult to be ‘factful’ about disasters—the vivid trauma of each event distracts observers from the long-term decrease in destructiveness. But climate activists make the problem worse by blaming every extreme weather event on human-caused climate change, hoping to scare people into elevated concern.

 

“Disasters certainly continue to cause catastrophic damage across the globe. The annual cost of disasters has doubled since reliable accounting of all events world-wide began in 1990, rising from about $100 billion to $200 billion a year in 2017 dollars.

 

“But it’s deceptive to track disasters primarily in terms of aggregate cost. Since 1990, the global population has increased by more than 2.2 billion, and the global economy has more than doubled in size. This means more lives and wealth are at risk with each successive disaster.

 

“Despite this increased exposure, disasters are claiming fewer lives. Data tracked by Our World in Data shows that from 2007-17, an average of 7,000 people each year were killed by natural disasters. In the decade 50 years earlier, the annual figure was more than 37,000. Seven thousand is still far too many, but the reduction represents enormous progress.

 

“The material cost of disasters also has decreased when considered as a proportion of the global economy. Since 1990, economic losses from disasters have decreased by about 20% as a proportion of world-wide gross domestic product. The trend still holds when the measurement is narrowed to weather-related disasters, which decreased similarly as a share of global GDP even as the dollar cost of disasters increased.

 

“The decrease in disaster damage isn’t a surprise, because as the world population and economy have grown, the incidence of the most damaging extreme events has hardly changed. The Intergovernmental Panel on Climate Change reported in 2014 that there has been no increase in hurricanes, floods, droughts or tornadoes within the past 30 years. And 2018 is on track to have the lowest losses from disasters as a share of global GDP since 1990.

 

“It is then no surprise that the climate-disaster scare campaign has been ineffective at swaying public opinion. Gallup reported earlier this year that 63% of Americans worried a ‘great deal’ or ‘fair amount’ about climate change—the same level as in 1989, when the question was first posed. But though popular worry hasn’t boiled over, the public debate around climate change has become more politicized, more partisan and less ‘factful.’

 

“In place of today’s unproductive scare campaign, activists and the media should facilitate debate on the merits of actual climate-policy proposals, such as a carbon tax or improved flood defenses. Carbon dioxide emissions have indeed contributed to a global temperature increase and may yet influence extreme weather, so the public and policy makers must decide the best ways to reduce emissions and increase society’s resilience to extreme weather.

 

“The U.S. has a long way to go in this regard. Last year Hurricanes Harvey, Irma and Maria together caused more than $300 billion of damage. Among other issues, the storms revealed the lack of proper planning and infrastructure in Houston and the unpreparedness of the federal government in Puerto Rico.

 

“Improving resilience to disasters will be easier if it is based on evidence. That means acknowledging both the progress made so far and the risks and vulnerabilities that lie ahead. As Rosling advises: ‘Factfulness, like a healthy diet and regular exercise, can and should become part of your daily life. . . . You will make better decisions, stay alert to real dangers and possibilities, and avoid being constantly stressed about the wrong things.’ It’s good advice.”

********************

2. Trump’s Car Freedom Act

Easing fuel-mileage rules is a boon to auto makers and consumers.

Editorial, WSJ, Aug 3, 2018

https://www.wsj.com/articles/trumps-car-freedom-act-1533337130?mod=hp_opin_pos1

SUMMARY: The editorial begins with:

“The Trump Administration’s deregulation is improving consumer choice and reducing costs from health care to appliances. Its proposed revisions Thursday to fuel economy rules continue this trend to the benefit of car buyers, not that you’d know it from the political hyperventilation.

 

“Corporate average fuel economy (Cafe) standards are a relic of the 1975 Energy Policy and Conservation Act, which sought to reduce oil consumption by requiring manufacturers to produce more efficient cars. But the law has outlived its purpose as shale hydraulic fracturing has made the U.S. the world’s largest oil producer.

 

“Regulators aren’t clairvoyant, but the Obama bureaucrats were acutely blind—perhaps willfully so—to economic and technological trends in 2012 when they set a fleetwide average benchmark of 54.5 miles a gallon by 2025. The Environmental Protection Agency assumed unproven technologies would be widely adopted, but many have stalled or combusted. Dual-clutch transmissions resulted in a sudden loss of power and throttle, for example.

 

“The EPA projected that oil prices would be about $125 a barrel today and “high-cost petroleum liquids projects” in unstable regions and biofuels would be among the “most important components” of new supplies. Production in Venezuela and Libya has plunged, yet oil prices are about $70 per barrel as U.S. shale drillers increase output.

 

“Americans prefer bigger cars, which makes it harder for automakers to meet the escalating Cafe targets. SUVs and pick-ups make up about two-thirds of vehicle sales. Incremental improvements in fuel efficiency are also becoming more costly. Carmakers should be able to achieve the standards over the next couple of years due to credits for technologies like low-leakage air conditioning systems.

 

“But automakers would have to sell hundreds of thousands of electric cars—or buy credits from those that do—to meet future Cafe targets. And consumers aren’t buying electric cars en masse despite subsidies that can amount to $10,000 a car in California. Former CEO Sergio Marchionne estimated that Fiat Chrysler lost $20,000 on each electric car it sold. Carmakers then must raise prices on SUVs and pick-ups.

 

“As prices rise to meet the new standards, consumers would also wait longer to replace their cars. The average age of a car is approaching 12 years, up from about 8.5 in 1995. Newer cars are more efficient and safer, so longer vehicle turnover could result in more traffic fatalities and increased CO2 emissions.

 

“Enter Thursday’s Trump Administration proposal to freeze—not roll back—fuel economy standards at the current 2020 target of 37 miles a gallon. Credits would disappear, eliminating market distortions.”

The editorial closes with a discussion of possible political and judicial actions, including California suing over the loss of its special regulatory consideration.

