Weekly Climate and Energy News Roundup #381

The Week That Was: October 19, 2019, Brought to You by www.SEPP.org

By Ken Haapala, President, Science and Environmental Policy Project

Quote of the Week “The human mind is not capable of grasping the universe. We are like a little child entering a huge library. The walls are covered to the ceilings with books in many different tongues. The child knows that someone must have written these books. It does not know who or how. It does not understand the languages in which they are written. But the child notes a definite plan in the arrangement of the books … a mysterious order which it does not comprehend, but only dimly suspects.”— Albert Einstein

Number of the Week: 2,352 cubic feet, about 30 tons, with 20,000 vacuum tubes using 150 kilowatts of power and with five million hand-soldered connections.

NASA-GISS Audit: A search of the websites of the major US government-funded global modeling entities revealed an April 5, 2018, audit by NASA’s Office of Inspector General (IG) of NASA’s management of the Goddard Institute for Space Studies (GISS). Under the heading “What We Found,” the IG report states, in part:

“GISS is a major contributor in helping NASA meet its Earth science research goals, in particular the Agency’s effort to improve the ability to predict climate change by better understanding the roles and interactions of the ocean, atmosphere, land, and ice in the climate system. In addition to climate modeling and maintenance of publically available climate-related datasets, the Institute’s major efforts include research in atmospheric chemistry, astrobiology, aerosols, and water isotopes.

“Apart from its substantial scientific contributions and contrary to NASA policy, we found that 43 of 66 (65 percent) new GISS scientific publications publicly released from October 2015 through September 2017 were not approved by GISS or Goddard officials prior to release. NASA policy requires numerous reviews and approvals before scientific information can be publically released. These procedures – which include a technical review, export control review, a series of supervisory approvals and, if needed, a legal review – are designed to ensure the accuracy of scientific information released to the public and to prevent inadvertent release of sensitive information. Moreover, we found inadequate NASA guidance related to the independence and qualifications of the initial approver in the technical review process and other practices not in conformance with best practices.”

Under the heading “What We Recommended,” the report states, in part:

“In order to ensure accurate scientific information is released to the public and to prevent sensitive information from inadvertent release, we recommended NASA’s Chief Information Officer and the Chief of GISS ensure all NASA and GISS generated publications complete a thorough and independent pre-publication review and approval process prior to release.”

The IG also found funding irregularities, but for the purposes of TWTW those are not important. Note that the report discusses only whether publications released from October 2015 through September 2017 underwent extensive review by NASA. The IG’s findings prompted a search of the GISS website for updates to several important and highly questionable GISS studies – mainly on sea level rise and on the incorporation of space-age technology into the understanding of the atmosphere in the GISS global climate model. See links under Challenging the Orthodoxy.

*******************

NASA-GISS Increasing Sea Level Rise: As discussed on pages 16 to 19 of the 2008 report of the Nongovernmental International Panel on Climate Change (NIPCC), James Hansen, then director of NASA-GISS, came out with some wild predictions of 21st-century sea level rise. In his essay “The Threat to the Planet” in The New York Review, July 13, 2006, Hansen asserted a sea level rise of up to 600 cm (20 feet) by the end of the century. Eventually, sea levels would rise by up to four times that amount, 2,400 cm (80 feet) higher. He claimed that was the sea level about 3 million years ago (before our current period of ice ages). Hansen claimed that the sea level increases are driven by increases in carbon dioxide (CO2).

The current NASA-GISS web site links to Hansen’s essay and to a GISS paper by Russell et al., projecting increases in the rate of sea level rise. The abstract of the Russell paper states:

“Sea level has been rising for the past century, and coastal residents of the Earth will want to understand and predict future sea level changes. In this study we present sea level changes from new simulations of the Goddard Institute for Space Studies (GISS) global atmosphere-ocean model from 1950 to 2099. The free surface, mass conserving ocean model leads to a straightforward calculation of these changes. Using observed levels of greenhouse gases between 1950 and 1990 and a compounded 0.5% increase in CO2 after 1990, model projections show that the global sea level measured from 1950 will rise by 61 mm in the year 2000, by 212 mm in 2050, and by 408 mm in 2089. By 2089, 64% of the global sea level rise will be due to thermal expansion and 36% will be due to ocean mass changes. The Arctic Ocean will show a greater than average sea level rise, while the Antarctic circumpolar regions will show a smaller rise in agreement with other models. Model results are also compared with observed sea level changes during the past 40 years at 12 coastal stations around the world.”

According to the Russell paper, in the 50 years from 1950 to 2000 sea levels would increase by 61 mm (2.4 inches); in the next 50 years, by another 151 mm (6 inches); and in the following 39 years, by another 196 mm (7.7 inches) – an ever-increasing rate. The earth has not experienced such an increase in the rate of sea level rise since it was coming out of the last Ice Age, when the great ice sheets covering the northern part of the Northern Hemisphere melted. Does NASA continue to endorse such papers? See links under Defending the Orthodoxy and https://pubs.giss.nasa.gov/docs/2006/2006_Hansen_ha06110e.pdf
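For readers who want to check the arithmetic, here is a minimal Python sketch. The cumulative figures of 61, 212, and 408 mm come from the Russell abstract quoted above; the per-period increments, inch conversions, and implied rates per decade are simple derived quantities, not figures from the paper.

# Cumulative sea level rise (mm), measured from 1950, per the Russell et al. abstract
cumulative = {2000: 61, 2050: 212, 2089: 408}

MM_PER_INCH = 25.4
prev_year, prev_mm = 1950, 0
for year, mm in sorted(cumulative.items()):
    rise = mm - prev_mm              # rise over this period, mm
    years = year - prev_year
    rate = rise / years * 10         # implied rate, mm per decade
    print(f"{prev_year}-{year}: {rise} mm ({rise / MM_PER_INCH:.1f} in), ~{rate:.0f} mm per decade")
    prev_year, prev_mm = year, mm

Run as written, it yields roughly 12 mm per decade for 1950 to 2000, 30 mm per decade for 2000 to 2050, and 50 mm per decade for 2050 to 2089 – the ever-increasing rate described above.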

*******************

NASA-GISS – Atmospheric CO2: Principal Control Knob Governing Earth’s Temperature: More disturbing is that NASA-GISS still references a 2010 paper published in Science claiming that the earth’s temperatures are principally controlled by CO2. In part, the entry on the NASA-GISS web site reads:

“The bottom line is that atmospheric carbon dioxide acts as a thermostat in regulating the temperature of Earth. The rapid increase in atmospheric carbon dioxide due to human industrial activity is therefore setting the course for continued global warming. Because of the large heat capacity of the climate system, the global surface temperature does not respond instantaneously to the sharp upturn of the carbon dioxide thermostat, which at this moment stands at 386.80 ppm compared to the normal interglacial maximum level of 280 ppm. Since humans are responsible for changing the level of atmospheric carbon dioxide, they then also have control over the global temperature of the Earth. Humans are at a difficult crossroad. Carbon dioxide is the lifeblood of civilization as we know it. It is also the direct cause fueling an impending climate disaster. There is no viable alternative to counteract global warming except through direct human effort to reduce the atmospheric CO2 level.

“The basic physics for the present study is rooted in the high precision measurements documenting the rise of atmospheric carbon dioxide and other greenhouse gases as fully described in the IPCC AR4 report, and in the comprehensive HITRAN database (Rothman et al. 2009) of atmospheric absorption data. The radiative transfer calculations involve well-understood physics that is applied to the global energy balance of the Earth, which is maintained by radiative processes only, since the global net energy transports must equal zero. This demonstrates the nature of the terrestrial greenhouse effect as being sustained by the non-condensing GHGs, with magnification of the greenhouse effect by water vapor and cloud feedbacks and leaves no doubt that increasing GHGs cause global warming.” [Boldface added.]

At best, the study is dated; at worst, it is misleading. We have not seen a significant, steady rise in atmospheric temperatures over the past 40 years, even as the atmospheric concentration of CO2, measured at Mauna Loa Observatory, has gone from 337 parts per million (ppm) to about 410 ppm – roughly a 22% increase. This is not to say that CO2 has no greenhouse effect, but the temperature effect of the rise in CO2 over the past 40 years has been modest. Prior to that, we simply do not know, because there were no comprehensive measurements of atmospheric temperatures. Surface temperatures are poor proxies for the influence of CO2; they are far from complete and are altered by other influences, such as changes in land use.

However, the paper is interesting for two reasons: 1) the temperature-forcing mechanism it presents, and 2) its recognition of the HITRAN database.

“It is this sustained warming [from CO2] that enables water vapor and clouds to maintain their atmospheric distributions as the so-called feedback effects that amplify the initial warming provided by the non-condensing GHGs, and in the process, account for the bulk of the total terrestrial greenhouse effect.”

As the paper’s graphic describes, both clouds and water vapor have a strong positive greenhouse effect, which increases significantly as warming from CO2 (or from any other cause) increases. The positive feedback from water vapor is described as twice that from clouds. Yet atmospheric measurements are not picking up the strong positive influence from water vapor and clouds. Thus, the paper is highly questionable and should be so marked.

That the paper recognizes the HITRAN database is important because HITRAN is one of the two libraries of compiled data from actual measurements of the greenhouse effect, the other being MODTRAN. As described in the June 22 and other TWTWs, the HITRAN database, though not perfect, is far superior to surface temperatures for gaining knowledge of the greenhouse effect. NASA-GISS has no excuse for not updating its research using HITRAN and adjusting its models accordingly. See links under Defending the Orthodoxy and previous TWTWs on The Greenhouse Effect.
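For readers who wish to examine the HITRAN line data themselves, a minimal sketch follows using the HITRAN Application Programming Interface (the hapi Python package distributed by hitran.org). The choice of the CO2 15-micron band, the table name, and the local folder name are illustrative assumptions, not part of the TWTW discussion, and the fetch call requires an internet connection.

# Minimal sketch: download CO2 absorption-line data from HITRAN with hapi
from hapi import db_begin, fetch, getColumn

db_begin('hitran_data')   # local folder where downloaded line tables are cached (name is arbitrary)

# CO2 is HITRAN molecule 2; fetch lines of its main isotopologue (1)
# in the 15-micron bending band, roughly 600 to 750 cm^-1.
fetch('CO2_15um', 2, 1, 600, 750)

nu = getColumn('CO2_15um', 'nu')   # line positions, cm^-1
sw = getColumn('CO2_15um', 'sw')   # line intensities at the reference temperature
print(f"Retrieved {len(nu)} CO2 lines between 600 and 750 cm^-1")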

*******************

US Modeling Efforts: In addition to GISS, the IG report lists five other major US climate modeling efforts. They are: 1) Department of Energy, ACME V1, which “Supports the Department of Energy’s energy planning and computational resource needs.” This model is being upgraded to E3SM, the Energy Exascale Earth System Model; 2) NOAA GFDL, CM3, which focuses on “Long-term climate change research advancing NOAA’s mission goal to understand and predict changes in climate,” and has been discussed previously; 3) NASA GMAO, GEOS5, which focuses on “Data assimilation products for short-term weather, longer seasonal forecasts, and re-analyses”; 4) NCAR, CESM1, which focuses on “Long-term climate change research”, and was discussed in the August 31 TWTW; and 5) NCEP, CFS V1 & V2, which focuses on “Operational data assimilation products for short-term weather, longer seasonal forecasts, and re-analyses.”

The weather models are important because they give an idea of the limitations of using modified weather models for modeling climate. In upcoming TWTWs, the NCAR, GISS, and DOE models will be discussed to evaluate how well they incorporate space-age technology – that is, data gathered since the 1979 Charney Report, which presented no physical evidence justifying its speculations about what was occurring in the atmosphere because no comprehensive measurements of the atmosphere existed at the time. We have far more systematic information on the atmosphere today, and to ignore this information is to ignore the principles of the Scientific Method. See links under Model Issues.

*******************

Climate Limits: Writing on her website, Climate Etc., Judith Curry presents her reasoned responses to questions from a reporter. Her conclusion to the essay bears repeating:

“Bottom line is that these [Intergovernmental Panel on Climate Change (IPCC)] timelines are meaningless. While we have confidence in the sign of the temperature change, we have no idea what its magnitude will turn out to be. Apart from uncertainties in emissions and the Earth’s carbon cycle, we are still facing a factor of 3 or more uncertainty in the sensitivity of the Earth’s climate to CO2, and we have no idea how natural climate variability (solar, volcanoes, ocean oscillations) will play out in the 21st century. And even if we did have significant confidence in the amount of global warming, we still don’t have much of a handle on how this will change extreme weather events. With regards to species and ecosystems, land use and exploitation is a far bigger issue.

“Cleaner sources of energy have several different threads of justification but thinking that sending CO2 emissions to zero by 2050 or whenever is going to improve the weather and the environment by 2100 is a pipe dream. If such reductions come at the expense of economic development, then vulnerability to extreme weather events will increase.

“There is a reason that the so-called climate change problem has been referred to as a ‘wicked mess.’” See link under Challenging the Orthodoxy.

*******************

Is There a Statistician in the House? Since Mr. Mann’s notorious hockey-stick, climate advocates and the publications that favor them have published papers containing important statistical errors, often disguised in a statistical morass. Ross McKitrick, one of the two statisticians who exposed the hockey-stick, has advocated that important statistical work include a review by a statistician competent in the type of statistics being used. That competence is necessary to clearly identify the errors and, where possible, correct them.

For example, last week Gregory et al. published a paper claiming that the sensitivity of the earth’s climate to changing CO2 can be derived from studies of historical climate change. This week, statistician Nic Lewis, writing in Climate Etc., explains why the study is bunk. The mistakes may be honest, but it is clear that the authors did not understand the weaknesses of the approaches they took.

On another statistical issue, Anthony Watts has an hour-long audio interview with Steve McIntyre, the other statistician who helped expose Mr. Mann’s hockey-stick. The interview reflects the caliber of the man. See links under Challenging the Orthodoxy.

*******************

Regulated Utilities: US citizens often forget that their electricity and natural gas probably come from a regulated utility. In regulated states, utilities must abide by electricity rates set by state public utility commissions, with the regulators appointed by politicians. This type of system is often considered a monopoly because of its limitations on consumer choice. However, its benefits include stable prices and long-term certainty.

Many investors like regulated utilities because they assure long-term profits – virtually a guaranteed return on investment. A significant problem arises when regulators, or politicians, place other burdens on the utility, such as not allowing sufficient expenses for maintenance, or making other concerns, such as the environment, a priority over safe delivery of reliable electricity or natural gas at the lowest possible price. Such a problem is occurring in California. See links under California Dreaming and Article #1.

*******************

Number of the Week: 2,352 cubic feet, the size of a large school bus, about 30 tons, with 20,000 vacuum tubes using 150 kilowatts of power and with five million hand-soldered connections – Eniac, the first programmable computer. What’s in your smartphone? See Article #2.

*******************

ARTICLES

1. PG&E CEO Says It Could Impose Blackouts in California for a Decade

Bill Johnson makes the disclosure in a hearing at which California officials blast PG&E’s shutoffs this month

By Katherine Blunt, WSJ, Oct 18, 2019

https://www.wsj.com/articles/pg-e-ceo-says-it-could-impose-blackouts-in-california-for-a-decade-11571438206?mod=hp_lead_pos3

TWTW Summary: The journalist states:

“PG&E Corp.’s chief executive said Friday that it could take as long as 10 years for the company to improve its electric system enough to significantly diminish the need to pull the plug on customers to reduce the risk of sparking fires.

“Bill Johnson, who joined the company in May, made the disclosure at a California Public Utilities Commission hearing where the panel’s president, Marybel Batjer, sharply criticized the company’s ‘inadequate execution’ of a shut-off in which it turned off power to large portions of Northern California for more than two days last week.

“The commission convened an emergency meeting to examine PG&E’s handling of the massive blackout, which left roughly two million people in the dark and created widespread havoc from the Bay Area to the northern reaches of the state. Several of the company’s top executives were summoned to detail the problems and take questions from regulators.

“‘I can tell you that you guys failed on so many levels on fairly simple stuff,’ Ms. Batjer said.

“The agency earlier this week ordered PG&E to address numerous problems with its strategy for such blackouts, known as public safety power shut-offs. It condemned the company’s failure to provide maps and other critical information to residents and local officials ahead of the shut-off. PG&E’s website crashed for two days during the blackout, and its call centers were overwhelmed.

“Mr. Johnson on Friday apologized for the hardships caused by the shut-off but defended the company’s decision to implement it, noting that none of its power lines sparked fires, even though strong winds in certain areas caused damage to its system.

“PG&E, which provides gas and electricity to 16 million people, shut off the power to more than 700,000 homes and businesses in anticipation of strong winds that could have increased the chances of its power lines sparking fires. The company’s equipment has sparked 19 major fires during windy periods in 2017 and 2018, mostly because vegetation blew into live wires.

“PG&E isn’t the only California utility to deploy shut-offs to mitigate wildfire risks. Edison International’s Southern California Edison and Sempra Energy’s San Diego Gas & Electric also cut power recently in response to windy conditions. But PG&E is the only U.S. utility to have initiated a weather-related blackout on such a large scale.

“The decision drew the ire of legislators and local officials who have called on PG&E to act more prudently in enacting future shut-offs. A group of Northern California governments, including Napa and Sonoma counties, on Thursday filed a scathing brief with the utilities commission that berated PG&E for its lack of preparedness.

“‘The experience of working with PG&E to effect real changes to its de-energization program has been like battling the Hydra,’ it read. ‘This has got to stop.’

“For now, the shut-offs will continue as PG&E scrambles to trim trees near power lines and upgrade equipment across its 70,000-square-mile service territory, after a protracted drought this decade turned millions of acres of forest into a tinderbox.

“Another major fire tied to PG&E’s equipment would likely drive the company to insolvency. It sought bankruptcy protection in January, citing more than $30 billion in liability costs stemming from the 2017 and 2018 fires, which collectively killed more than 100 people.

“At the meeting Friday, commissioners questioned the company’s commitment to its customers and how long it anticipates deploying its shut-off strategy on such a large scale.

“Mr. Johnson said the utility is working to limit the scope of future shut-offs by trimming more trees and installing technology to enable the shutdown of smaller, more targeted portions of the grid. But he estimated it will take as long as a decade before its shut-offs will have ‘ratcheted down significantly.’”

The article concludes with promises by PG&E to reduce the length of shut-offs from 5 days to 48 hours.

*******************

2. How Steam and Chips Remade the World

Cheap energy powered an economic revolution in the 18th century, and cheap information in the 20th.

By John Steele Gordon, WSJ, Oct 18, 2019

https://www.wsj.com/articles/how-steam-and-chips-remade-the-world-11571436222

[SEPP Comment: Internal to the microchip are not only a huge number of transistors but also the wiring.]

TWTW SUMMARY: The author of “An Empire of Wealth: The Epic History of American Economic Power” writes:

“Not all inventions are equal. While the wheeled suitcase was a great idea, it didn’t change the world. But invent a technology that causes the price of a fundamental economic input to collapse, and civilization changes fundamentally and quickly. Someone born 250 years ago arrived in a world that, technologically, hadn’t changed much in centuries. But if he had lived a long life, he would have seen a new world.

“For millennia there had been only four sources of energy, all expensive and limited: human muscle, animal muscle, moving water and air. Thomas Newcomen invented the steam engine in 1712, but its prodigious fuel consumption severely limited its utility. Then in 1769 James Watt greatly improved it, making the engine four times as fuel-efficient. In 1781 he patented the rotary steam engine, which could turn a shaft and thus power machinery.

“The steam engine could be scaled up almost without limit. The price of energy began a steep decline that continues to this day.

“As factories came to be powered by steam, the price of goods fell, sending demand to the sky. When the steam engine was mounted on rails, overland transportation of people and goods became cheap and quick for the first time in history. Andrew Jackson needed nearly a month to get from Nashville, Tenn., to Washington by carriage for his inauguration in 1829. By 1860 the trip took two days.

“Quick overland transportation made national markets possible. Businesses seized the opportunity and benefited from economies of scale, sending demand up even more as prices declined further. The collapsing cost of wire and pipes allowed such miracles as the telegraph, indoor plumbing and gas lighting to become commonplace. Enormous new fortunes came into being.

“The age dominated by cheap energy lasted until the mid-20th century. Steam had been supplanted by electricity and the internal combustion engine, but someone from 1860 would have mostly recognized the technology of 1960, however dazzling its improvement. Then another world-changing technology emerged.

“Before the 1940s the word ‘computer’ referred to people, usually women, who calculated such things as the trajectories of artillery shells. The first programmable computer, Eniac, was powered up for work on Dec. 10, 1945. It was 2,352 cubic feet, the size of a large school bus, with 20,000 vacuum tubes that sucked up 150 kilowatts of power. But it was 1,000 times as fast as any electromechanical calculator and could calculate almost anything.

“Transistors soon replaced vacuum tubes, shrinking the size and power requirements of computers dramatically. But there was still a big problem. The power of a computer depends not only on the number of transistors, but also the number of connections among them. Two transistors require only one connection, six require 15, and so on.

“While transistors could be manufactured, the connections had to be made by hand. Eniac had no fewer than five million hand-soldered connections. Until the “tyranny of numbers” could be overcome, computers would, like Newcomen’s steam engine, remain very expensive and thus of limited utility.

“Had every computer on earth suddenly stopped working in 1969, the average man would not have noticed anything amiss. Today civilization would collapse in seconds. Nothing more complex than a pencil would work, perhaps not even your toothbrush.

“What happened? In 1969 the microprocessor—a computer on a silicon chip—was developed. That overcame the tyranny of numbers by creating the transistors and the connections at the same time. Soon the price of storing, retrieving and manipulating information began a precipitous decline, as the price of energy had two centuries earlier. Computing power that cost $1,000 in the 1950s costs a fraction of a cent today.

“The first commercial microprocessor—the Intel 4004, introduced in 1971—had 2,250 transistors. Today some microprocessors have a million times as many, making them a million times as powerful but only marginally more expensive.

“Microprocessors began to appear everywhere. Today’s cars have dozens of them, controlling everything from timing fuel injection to warning when you stray out of your lane. Even money is now mostly a plastic card with an embedded microprocessor.

“As the railroad was for the steam engine, the internet is the microprocessor’s most significant subsidiary invention. It revolutionized retailing, news distribution, entertainment, communication and much more.

“Like cheap energy, cheap information has created enormous new fortunes, ineluctably increasing wealth inequality. But also like cheap energy, the source of those fortunes has given nearly everybody a far higher standard of living.”

The author concludes with:

“A man from half a century ago would surely regard the now-ubiquitous smartphone as magic.”
