Weekly Climate and Energy News Roundup #290

By Ken Haapala, President, The Science and Environmental Policy Project

Brought to You by www.SEPP.org

Academic Threats? On November 3, the US Global Change Research Program (USGCRP) released what may be the final climate report of the Obama Administration. The USGCRP was established in 1989 by an executive order of President George H.W. Bush and was “mandated by Congress in the Global Change Research Act (GCRA) of 1990 to assist the Nation and the world to understand, assess, predict, and respond to human-induced and natural processes of global change.” It comprises 13 Federal agencies and had a 2016 enacted budget of $2.6 billion and a 2017 requested budget of $2.8 billion. [These numbers are out of date, but more recent figures were not found in a search of its website.] The current executive director is Michael Kuperberg, who was appointed by President Obama in July 2015.

The preliminary report was leaked to certain favored groups, such as the New York Times, months ago. Based on this leaked report, CATO’s Patrick Michaels suggests that the new report will ignore critical issues regarding climate change. A major issue is that the widely used global climate models are overestimating atmospheric warming, with the average model prediction running 2.5 to 3 times the warming that is occurring. A second major issue discussed by Michaels is the divergence between the observed and predicted values of atmospheric temperature change with increasing altitude (the lapse rate), particularly over the tropics. Again, the model estimates are about twice what is occurring.

Both issues indicate that there is something very wrong with the widely accepted greenhouse gas theory as presented in the 1979 report of the US National Academy of Sciences, known as the Charney Report, which is the premise of the global climate models. Its speculated estimate of the temperature increase for a doubling of carbon dioxide (CO2), 3 degrees C plus or minus 1.5 degrees C (50%), has persisted, with minor variation, through 38 years of reports by the UN Intergovernmental Panel on Climate Change (IPCC) and the USGCRP.
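For reference, the Charney estimate can be written compactly as an equilibrium sensitivity to a doubling of CO2. The logarithmic scaling in the second relation is the standard approximation generally used with such estimates; it is included here only as an illustration, not taken from the report itself.

```latex
% Charney (1979) equilibrium sensitivity range, as cited above
\Delta T_{2\times\mathrm{CO_2}} = 3\,^{\circ}\mathrm{C} \pm 1.5\,^{\circ}\mathrm{C}
% Standard logarithmic approximation (an assumption, for illustration only):
% warming scales with the number of doublings from concentration C_0 to C
\Delta T \approx \Delta T_{2\times\mathrm{CO_2}} \cdot \frac{\ln(C/C_0)}{\ln 2}
```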

Writing in the Wall Street Journal, Steve Koonin argues that the new report demonstrates the need for a Climate Red Team to counter the climate establishment. Judith Curry has posted the critical portions of Koonin’s arguments on Climate Etc.

Over the next several weeks, TWTW will be reviewing the USGCRP report, focusing on how well the USGCRP fulfilled a critical part of its mandate: to assess and predict natural processes of global change. If we do not understand the natural processes, we cannot hope to understand the human influence on them. Natural threats to humans from climate change are real, particularly global cooling. Human influence on these threats may be academic or real (practical). Areas of concern will include agriculture, ocean chemistry, sea level rise, temperature change (atmosphere v. surface), etc.

It is useful to recall the distinction Admiral Rickover made to Congress in 1953 between academic and practical nuclear reactors:

“An academic reactor or reactor plant almost always has the following basic characteristics: (1) It is simple. (2) It is small. (3) It is cheap. (4) It is light. (5) It can be built very quickly. (6) It is very flexible in purpose. (7) Very little development will be required. It will use off-the-shelf components. (8) The reactor is in the study phase. It is not being built now.

“On the other hand, a practical reactor can be distinguished by the following characteristics: (1) It is being built now. (2) It is behind schedule. (3) It requires an immense amount of development on apparently trivial items. (4) It is very expensive. (5) It takes a long time to build because of its engineering development problems. (6) It is large. (7) It is heavy. (8) It is complicated.

“The tools of the academic designer are a piece of paper and a pencil with an eraser. If a mistake is made, it can always be erased and changed. If the practical-reactor designer errs, he wears the mistake around his neck; it cannot be erased. Everyone sees it.

“The academic-reactor designer is a dilettante. He has not had to assume any real responsibility in connection with his projects. He is free to luxuriate in elegant ideas, the practical shortcomings of which can be relegated to the category of “mere technical details.” The practical-reactor designer must live with these same technical details. Although recalcitrant and awkward, they must be solved and cannot be put off until tomorrow. Their solution requires manpower, time and money.

“Unfortunately for those who must make far-reaching decisions without the benefit of an intimate knowledge of reactor technology, and unfortunately for the interested public, it is much easier to get the academic side of an issue than the practical side. For a large part those involved with the academic reactors have more inclination and time to present their ideas in reports and orally to those who will listen. Since they are innocently unaware of the real but hidden difficulties of their plans, they speak with great facility and confidence. Those involved with practical reactors, humbled by their experiences, speak less and worry more.

“Yet it is incumbent on those in high places to make wise decisions and it is reasonable and important that the public be correctly informed. It is consequently incumbent on all of us to state the facts as forthrightly as possible.” [Boldface added]

See links under Challenging the Orthodoxy, Defending the Orthodoxy, http://www.globalchange.gov/about, and http://ecolo.org/documents/documents_in_english/Rickover.pdf


Quote of the Week. “If you are going to sin, sin against God, not the bureaucracy. God will forgive you, but the bureaucracy won’t.” Hyman Rickover, Admiral USN

Number of the Week: One Bonneville, or Two Niagaras, or Five Hoovers


Significant Omission in TWTW: The October 28 TWTW discussed that the earth’s climate is partially determined by the movement of two dynamic fluids: 1) the atmosphere; and 2) the oceans. Fluid dynamics is not thoroughly understood; thus, the actions of these fluids cannot be clearly defined. Richard Lindzen, Alfred P. Sloan Professor of Meteorology, Emeritus, at MIT, wrote to TWTW pointing out a profound omission: the rotation of the globe alone is sufficient to set up turbulence in these fluids that may take years to many centuries to become evident, even if the energy from the sun is constant. The turbulence applies to both the atmosphere and the oceans.

The earth’s climate will vary even without orbital variations and solar variability, which further complicate the understanding of the climate.

TWTW deeply appreciates such corrections.

**************

ENSO – Natural or CO2 Enhanced? The El Niño Southern Oscillation (ENSO) is a pressing topic because 2015 and 2016 were warm years, as warm as, or slightly warmer than (though within the range of error of the instrumentation), the previous warm year of 1998. Both periods were marked by strong El Niños. The 1998 El Niño was followed by a strong La Niña, the cooling phase of ENSO that may or may not follow an El Niño. Thus far, a La Niña has not followed the recent El Niño, giving rise to speculation that global temperatures may settle at a level higher than that of the 1999 to 2015 period, called the hiatus. Indeed, in reporting the October atmospheric temperature calculations from the University of Alabama in Huntsville (UAH), Roy Spencer remarks that both he and John Christy “are a little surprised that the satellite deep-layer temperature anomaly has been rising for the last several months.”

In recent months, physicist Donald Rapp, author of “Assessing Climate Change: Temperatures, Solar Radiation, and Heat Balance”, has suggested that if temperatures remain above those of the 1999 to 2015 period, it may be a signal of a human-induced CO2 influence on climate. Conversely, such an increase may be a manifestation of natural change set up years ago, or of an increase in the total solar energy reaching the earth, which has not been properly measured, as suggested by physicist Nir Shaviv (below). See links under Challenging the Orthodoxy and Measurement Issues – Atmosphere.

**************

A Debate: Physicist Nir Shaviv posted his comments from a debate at the Cambridge Union, the oldest debating club in the world, going back to 1815. The proposition was awkward: “This house would rather cool the planet than warm the economy”. But it gave Shaviv an opportunity to express his concern about the lack of evidence supporting the arguments that, if governments do not address global warming / climate change by forcing reductions in CO2 emissions, the results will eventually be catastrophic.

Shaviv’s arguments are mostly logical and easy to follow. He believes that the solar influence is greatly underestimated and that the 1979 Charney Report greatly overestimates the influence of CO2. The so-called “missing heat” has already escaped.

In discussing the solar influence, Shaviv brings up his 2008 paper showing that the oceans can be used to measure the total solar influence. In that paper, he asserts that the evidence suggests that “the total radiative forcing associated with solar cycles variations is about 5 to 7 times larger than just those associated with the TSI (total solar irradiance) variations, thus implying the necessary existence of an amplification mechanism, although without pointing to which one.” The three sets of records are: 1) the net heat flux into the oceans over five decades; 2) the rate of sea-level change based on tide gauges over the 20th century; and 3) sea-surface temperature variations. There are issues with each of these records. But the claimed “missing heat” said to be hiding in the oceans may be solar energy that was not properly measured and is not related to CO2. See links under Challenging the Orthodoxy and Questioning the Orthodoxy.
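For a sense of scale, a back-of-the-envelope sketch of what a 5-to-7-fold amplification of the TSI variation implies is given below. The input values are commonly cited round numbers assumed for illustration; they are not taken from Shaviv’s paper.

```python
# Rough scale of the solar-cycle forcing implied by a 5-7x amplification of TSI.
# All input values are commonly cited round numbers, assumed for illustration only.

TSI = 1361.0            # W/m^2, approximate total solar irradiance
cycle_fraction = 0.001  # ~0.1% variation over a solar cycle
albedo = 0.3            # Earth's approximate planetary albedo

# Global-mean forcing from the TSI variation alone:
# divide by 4 for sphere geometry, multiply by (1 - albedo) for reflected sunlight.
tsi_forcing = TSI * cycle_fraction * (1 - albedo) / 4.0   # ~0.24 W/m^2

for amplification in (5, 7):
    print(f"{amplification}x amplification -> ~{amplification * tsi_forcing:.1f} W/m^2")
# Roughly 1.2 to 1.7 W/m^2 of cyclical forcing, versus ~0.24 W/m^2 from TSI alone.
```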

**************

Number of the Week: One Bonneville, or Two Niagaras, or Five Hoovers. The California “duck curve” illustrates the time-of-day problem created by politicians mandating wind and solar power. These alternative sources produce well at mid-day, but poorly between 5 and 9 pm, when demand is greatest. The politicians in Sacramento “solved” their problem by passing a law requiring the utilities to solve it. In an interview, the CEO of Edison International said its operating unit, Southern California Edison, can comply with the new law requiring greenhouse gas emissions to be cut 40% by 2030 by adding “10 gigawatts of energy storage and developing another 30 gigawatts of renewable energy.” Since the profits of regulated utilities are a percentage of allowable costs, if allowable costs go up, profits go up. The only one hurt is the consumer. [Denmark has failed in trying to have consumers adjust their living patterns to use electrical power only when it is available.]

Storing electricity has vexed utilities for over 100 years. What is required to provide 10 gigawatts of dispatchable, non-fossil-fuel electricity storage, that is, storage that can be turned on, off, and controlled as needed? Nuclear requires too much time, and solar and wind are non-dispatchable. One Bonneville, or two Niagaras, or five Hoovers?

Hydropower is the only dispatchable, non-nuclear, non-fossil-fuel source of electrical power that can be ramped up and down daily, though the wear and tear on the turbines will increase maintenance costs significantly. The largest hydropower operator in the US is the Bonneville Power Administration.

“The Bonneville Power Administration, a division of the U.S. Department of Energy, sells the output of 29 federal hydroelectric dams in the Columbia River Basin; two in the Rogue River Basin of Southern Oregon; one non-federal nuclear power plant, the Columbia Generating Station near Richland, Washington; and several small non-federal power plants.

“The federal system has a total capacity — capacity is the maximum generating potential — of 17,462 megawatts [17.5 gigawatts] and an energy output — energy is the normal power production — of 9,871 average megawatts. Hydropower accounts for 80 percent of the capacity and 67 percent of the energy provided by Bonneville.”

Since the system’s average energy output is well below its capacity, a few more dams may be needed to deliver a dependable 10 gigawatts, but that is insignificant to the academic power planner.

“The Columbia River is uniquely situated as a hydropower river, in that it flows through multiple mountain ranges on its 1,214-mile journey to the sea, and these engorge the river and its tributaries with millions of acre-feet of snowmelt runoff every year. Also, the Columbia drops at a fairly uniform rate of about two feet per mile, and much of its course is through solid rock, carved by repeated floods at the end of the last Ice Age. The rocky canyons provide excellent footing for dams.”

Lake Roosevelt, the reservoir behind Grand Coulee Dam, the Columbia River’s first impoundment in the US, stretches about 150 miles upstream to the border with Canada. Based on Google Maps, the distance from Grand Coulee to the last dam, Bonneville Dam at the Washington/Oregon border, is about 300 miles. Thus, the impoundments along the Columbia River alone stretch about 450 miles.

Where would the politicians in Sacramento like the duplicate to be placed?

Alternatively, there is the Niagara River.

“The Niagara River is one of the world’s greatest sources of hydroelectric power. The beauty of its wild descent from Lake Erie to Lake Ontario attracts millions of visitors each year. During its short course (56 km), the river drops 99 metres, with much of the spectacular plunge concentrated in a 13 km stretch of waterfalls and rapids.

“Today the churning river provides the driving force for almost 2 million kilowatts of electricity from a number of power plants on the Canadian side. The three largest are Sir Adam Beck Niagara Generating Station Nos. 1 and 2 and the nearby pumping-generating station.

“On the American side of the border, down river from the Falls, the Robert Moses Niagara Power Plant and the Lewiston Pump Generating Plant together generate more than 2.4 million kilowatts of electricity.”

http://www.infoniagara.com/attractions/hydro_power/index.aspx

Together, the Canadian and American plants produce a little less than 4.5 gigawatts, so at least two Niagaras would be required to provide 10 gigawatts. Where should they be located?

Alternatively, there is the Hoover Dam.

“Hoover Dam rises 726.4 feet above bedrock, equivalent to a 60-story building. The base of the dam is 660 feet thick, equivalent to the length of two city blocks. It is 45 feet thick at the crest, and the length of the crest from canyon-wall to canyon-wall, is 1,244 feet, nearly one-quarter of a mile.

“Presently, Hoover Dam can produce over 2,000 megawatts of capacity…”

http://www.powerauthority.org/hoover-dam/

The reservoir, Lake Mead, extends about 112 miles up the Colorado River. Five dams and reservoirs of this size would be required to provide 10 gigawatts of back-up electricity. Where would the politicians in Sacramento mandate that these be placed?
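A minimal sketch of the arithmetic behind the Number of the Week, using the capacity figures quoted above (the rounding is ours):

```python
# How many of each facility would be needed to supply 10 gigawatts,
# using the nameplate figures quoted above (rounded).
target_gw = 10.0
facilities = {
    "Bonneville federal hydro (80% of 17,462 MW capacity)": 17.462 * 0.80,  # ~14.0 GW
    "Niagara, Canadian plus US plants": 2.0 + 2.4,                          # ~4.4 GW
    "Hoover Dam": 2.0,                                                      # ~2.0 GW
}
for name, gw in facilities.items():
    print(f"{name}: ~{gw:.1f} GW -> {target_gw / gw:.1f} needed for {target_gw:.0f} GW")

# Average energy output is lower than capacity: hydro supplies about 67% of
# Bonneville's 9,871 average megawatts, which is why a few more dams may be needed.
bonneville_avg_gw = 9.871 * 0.67
print(f"Bonneville average hydro output: ~{bonneville_avg_gw:.1f} GW")
# Roughly one Bonneville, a bit more than two Niagaras, or five Hoovers.
```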

Perhaps the politicians in Sacramento confuse 10 gigawatts of electricity storage with 10 gigabytes of computer memory storage. The latter can be put on a chip the size of a thumbnail.

This entire exercise demonstrates that practical thinking about green energy is as rare today as it was about nuclear reactors when Admiral Rickover wrote over 60 years ago.

**************

ARTICLES:

1. Utility Touts Electrification to Meet California Climate Goals

Southern California Edison releases vision for how the state can comply with new law that requires greenhouse gas emissions to be cut 40% by 2030

By Russell Gold, WSJ, Oct 31, 2017

https://www.wsj.com/articles/utility-touts-electrification-to-meet-california-climate-goals-1509457320

SUMMARY: The reporter states:

“California can meet its ambitious goals for slashing greenhouse gas emissions, but it will require a massive shift to electric vehicles, car charging stations and renewable energy, one of the state’s biggest power companies says in a new analysis.

“Southern California Edison, an operating unit of Edison International, plans Tuesday to release its vision for how the state can comply with a new law that requires greenhouse gas emissions to be cut 40% by 2030.

“In an interview, Pedro Pizarro, president and chief executive officer of Edison International, said the most ‘efficient and affordable’ path involves increased electrification of the economy, through a ‘robust, modern electric grid’. That would mean more investment by Southern California Edison, paid for by its 15 million customers, as well as California’s other regulated utilities.

“‘It is certainly technologically achievable,’ Mr. Pizarro said, adding, ‘This is a big task and we have to get going now.’

“There were other paths to cutting emissions, but Southern California Edison said increased reliance on electricity was better than developing a hydrogen-based energy system or relying on biogas. It was the simplest, quickest way to lower emissions ‘while minimizing costs to consumers and the economy,’ the company said.

“Among the targets the company suggests need to be reached by 2030 include adding 10 gigawatts of energy storage and developing another 30 gigawatts of renewable energy.

“The utility also said the state’s drivers needed to transition from gasoline- and diesel-powered vehicles to ones that run on electricity. The company said the simplest path to reach the state goal requires 24% of the state’s automobile fleet—seven million cars—to go electric. About one million charging stations would need to be built in the next dozen years.

“To put that into context, there were about 134,000 electric vehicles registered in California at the beginning of this year, according to the state’s Department of Motor Vehicles.”

“Mr. Pizarro said that while reaching the greenhouse gas reduction goal will require an all-out effort, the cost to utility customers would be manageable….”

The reporter states that the estimates do not include the cost of forcing car owners to switch to electric vehicles.
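As a rough check on the scale of the numbers quoted above (the division and rounding below are ours, not the reporter’s):

```python
# Scale of the proposed electrification, from the figures quoted in the article.
ev_target = 7_000_000      # electric cars needed, 24% of the state's fleet
evs_today = 134_000        # EVs registered in California at the start of the year
chargers = 1_000_000       # charging stations "in the next dozen years"
years = 12

fleet = ev_target / 0.24   # implied total fleet size
print(f"Implied total fleet: ~{fleet / 1e6:.0f} million vehicles")
print(f"EV growth required: ~{ev_target / evs_today:.0f}x today's registrations")
print(f"Charging stations: ~{chargers / years:,.0f} per year for {years} years")
```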

*****************

2. Global Oversupply of Grains Puts a Squeeze on Giant Processors

As low prices persist, the situation grows more difficult for the grain-trading giants

By Jacob Bunge and Jesse Newman, WSJ, Nov 2, 2017

https://www.wsj.com/articles/global-oversupply-of-grains-puts-a-squeeze-on-giant-processors-1509620400

SUMMARY: As the government-funded scientists in Washington announce academic threats from global warming, farmers in the US are facing real threats from falling grain prices caused by increased grain production in tropical countries. After an introduction, the reporters state:

“Farmers have spent the past few years cutting their spending to cope with a global glut of crops. Now it’s commodity traders’ turn.”

Archer Daniels Midland Co. (ADM) and Bunge Ltd., “among the companies that dominate world-wide grain trading and processing, said this week that they are slashing hundreds of millions of dollars in annual spending and restructuring operations to navigate a world awash in corn, soybeans and wheat.

“For grain traders, ‘this has been a humbling year,’ said Soren Schroder, Bunge’s chief executive, in an interview Wednesday after the company reported a decline in quarterly profit.

“Five years of back-to-back bumper crops in markets across the globe have kept grain prices low and upended traditional dynamics in the farm sector. Trading giants like ADM, Bunge and Cargill Inc., which buy farmers’ crops to market and process, are being squeezed.

“A glut of corn is fueling greater exports from Argentina, Brazil and Ukraine, and squeezing U.S. trading companies.

“Chicago-based ADM on Tuesday said its quarterly net income dropped by 44%, weighed down by declining U.S. grain exports, as cheap Brazilian corn undercut U.S. shipments.”

The article continues with some details. But it does not discuss whether the increasing yields may be associated with increasing CO2.

*****************

3. The Acid-Rain Playbook Worked Beneficially

Reasonable regulations have been successful in reducing emissions of SO2 and NOx from power generation. As a result of these reductions, air quality has improved.

Letter by James S. Falcone Jr., WSJ, Oct 31, 2017

https://www.wsj.com/articles/the-acid-rain-playbook-worked-beneficially-1509476811

The writer takes exception to Rupert Darwall’s claim that the “acid rain” playbook is identical to the current CO2 program. After an introduction, the author states:

“It is uncontestable that the burning (oxidation) of organic matter with relatively high amounts of sulfur and nitrogen produce heat, SOx and NOx along with COx and water among other more minor compounds. The acidic oxides of sulfur, nitrogen and carbon all produce more acidic water when dissolved. SOx and NOx are far more acidic than carbon dioxide. In fact, as he states, natural water is slightly acidic due to carbon dioxide, but the effects of SOx and NOx can and have made rain, snow, particulate matter and water from 10 to more than a 1,000 times more acidic with negative consequences for living systems. The effects of these byproducts of fossil-fuel use in power generation were extremely well documented and severe throughout Europe and North America. Any “hysteria” was likely a result of media representations, not the workings of science. Nature is deaf and blind to human logic and consensus.

“The 2011 National Acid Precipitation Assessment Program Report to Congress clearly states that reasonable regulations have been successful in reducing emissions of SO2 and NOx from power generation. As a result of these reductions, air quality has improved, providing significant human health benefits, and acid deposition has decreased to the extent that some acid-sensitive areas are beginning to show signs of recovery.”

