Weekly Climate and Energy News Roundup #233

The Week That Was: July 23, 2016 – Brought to You by www.SEPP.org

By Ken Haapala, President, Science and Environmental Policy Project

Political Fads: Roy Spencer has written a 22-page booklet, “A Guide to Understanding Global Temperature Data,” published by the Texas Public Policy Foundation and available on the web at no cost. The booklet covers some of the scientific research that demolishes a number of fashionable beliefs about global warming/climate change. First and foremost is the fad, seized upon by some politicians, that global warming skeptics are funded, or paid off, by Exxon or other oil companies. The recent antics by certain Senators and state attorneys general failed to present convincing evidence. In fact, it was later claimed that the purpose of the investigations was to find evidence – as if Exxon’s IRS filings were not available or not reviewed. The US government has spent over $40 billion since 1993 on what it calls climate science. Comparable spending by Exxon could not be hidden.

Unlike the US government, which has not undergone a full audit since the 1990s, stockholder-held corporations are subject to rigorous audits. Indeed, a February 26, 2015 report by the U.S. Government Accountability Office (GAO) states that major impediments, uncertainties, and material weaknesses prevent an effective audit of the federal government’s finances. The IRS and the SEC would not permit such a report from Exxon.

To his credit, Roy Spencer lists the organizations that have funded his research: NASA, NOAA, the Department of Energy (DOE), and the Department of Transportation (DOT). At least some of the government spending on climate science goes to objective science.

The main questions Spencer addresses are:

“1) Does an increasing CO2 level mean there will be higher global temperatures?

2) Can global temperatures go up naturally, even without rising CO2 levels?

3) How are temperature data adjusted?

4) Are global temperatures really going up? If so, by how much?

5) Is warming enough to be concerned about? Is warming necessarily a bad thing?

6) Could the warming be both natural and human-caused?

7) Why would the climate models produce too much warming?

8) What is climate sensitivity?

9) Don’t 97 percent of climate researchers agree that global warming is a serious man-made problem?

10) Haven’t ocean temperatures been going up, too?

11) What does it mean when we hear “the highest temperature on record”?

12) Is there a difference between weather and climate?

13) Why would climate science be biased? Isn’t global warming research immune from politics?”

Spencer’s responses to these questions are consistent with what has been written in TWTW. Spencer concludes:

“It should be clear that the science of global warming is far from settled.

“Uncertainties in the adjustments to our global temperature datasets, the small amount of warming those datasets have measured compared to what climate models expect, and uncertainties over the possible role of Mother Nature in recent warming, all combine to make climate change beliefs as much faith-based as science-based.

“Until climate science is funded independent of desired energy policy outcomes, we can continue to expect climate research results to be heavily biased in the direction of catastrophic outcomes.”

See links under Challenging the Orthodoxy and http://www.gao.gov/products/GAO-15-341R

###################################################

Quote of the Week: “Beware of false knowledge; it is more dangerous than ignorance.” – George Bernard Shaw [H/t William Readdy]

###################################################

Number of the Week: Increase by 1 part per 10,000 or 0.01%

###################################################

Energy Flow: In his booklet, Spencer discusses the poorly labeled greenhouse effect as follows:

“The temperature of anything you can think of can be increased in one of two ways: (1) by adding more energy (e.g., turn up the stove top to warm a pot of water; turn up the furnace in your house), or (2) by reducing energy loss (e.g., put a lid on the uncovered pot of water as it is heated; add insulation to your walls).

“For the Earth’s climate system, the energy input is sunlight, while the energy loss is through infrared (heat) radiation emitted by the surface and atmosphere to the cold depths of outer space. Infrared radiation is the radiant heat you feel at a distance from a fire, and is emitted by all solid objects and by some gases in the atmosphere.

“Carbon dioxide is a so-called “greenhouse gas,” an admittedly misleading name for the gases which are good absorbers and emitters of infrared (IR) radiation. Water vapor is by far the most important greenhouse gas, while CO2 and methane have lesser influences.

“In the case of global warming theory, the extra CO2 we have added to the atmosphere is believed to have reduced the rate at which the Earth loses infrared radiation to space by about 1 percent, based upon theoretical calculations backed up by laboratory measurements. It’s like covering the pot of water on the stove slightly more with a lid, or adding a little more insulation to the walls of a house.”

According to this explanation, the net effect of increasing the greenhouse effect (adding CO2 to the atmosphere) is to slow the net flow of infrared radiation into space. Those who live in the desert experience this reduction of radiation to space: on cloudless nights with high humidity (water vapor is a greenhouse gas), temperatures do not cool as rapidly as on cloudless nights with low humidity. See links under Challenging the Orthodoxy.
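To see the scale involved, consider a back-of-the-envelope Stefan–Boltzmann calculation (TWTW’s illustration, not from Spencer’s booklet; the 288 K mean surface temperature and the no-feedback assumption are ours):

```python
# Back-of-envelope sketch: if the rate of infrared loss at a fixed
# temperature drops by 1% (the figure quoted above), how much must the
# emitting temperature rise to restore the energy balance?
# Assumptions: ~288 K mean surface temperature, emission ~ T^4, no feedbacks.
T0 = 288.0          # assumed mean surface temperature, kelvin
reduction = 0.01    # assumed 1% cut in infrared loss

# Restoring the original flux requires T_new^4 * (1 - reduction) = T0^4,
# so T_new = T0 / (1 - reduction)**0.25
T_new = T0 / (1.0 - reduction) ** 0.25
print(f"Warming needed to restore balance: {T_new - T0:.2f} K")
# ~0.72 K before feedbacks -- the feedbacks are where the real dispute lies
```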

***************

El Niño – Southern Oscillation: The influence of the El Niño – Southern Oscillation (ENSO) on global temperatures is coming to the fore. It is clear that ENSO influences surface temperatures and, with a time lag, atmospheric temperatures. Using surface temperatures, some entities are declaring the hottest month/year on record without differentiating between naturally caused and human-caused warming. There is limited understanding of the causes of ENSO, El Niños, and La Niñas. Since El Niños and La Niñas are short-term events, lasting a few years at most, the UN Intergovernmental Panel on Climate Change (IPCC) has largely dismissed them, although their frequency may have significant impacts on both surface and satellite temperatures over a much longer term.

On his web site, under “The Pause Is Back, It Never Went Away,” Paul Homewood gives a good presentation of the current status using satellite temperatures from RSS. One may disagree with drawing a trend line between just two points on the temperature graph, 1998 and 2015, but the line illustrates the issue that the pause in temperature rise may recur. Further, Homewood illustrates the time lag between ENSO events and atmospheric temperatures. This major issue of naturally caused temperature variations has been largely brushed aside by the IPCC and its followers.

Fortunately, the Earth System Research Laboratory of NOAA has maintained a web site showing historic La Niña events since 1950, historic El Niño events since 1950, and a short description of the Multivariate ENSO Index (MEI). The MEI is the Laboratory’s effort to monitor ENSO. It is based on six variables over the tropical Pacific: sea-level pressure (P), zonal (U) [east-west] and meridional (V) [north-south] components of the surface wind, sea surface temperature (S), surface air temperature (A), and total cloudiness fraction of the sky (C). The MEI is calculated back to 1950.
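NOAA describes the MEI as the leading combined principal component of these six fields. A schematic sketch of that kind of calculation appears below (illustrative only; NOAA’s actual procedure uses seasonally normalized, spatially filtered fields, and the random series here are mere stand-ins):

```python
import numpy as np

# Schematic ENSO-style index: the leading principal component of several
# standardized variables. Illustrative only -- the random series below are
# stand-ins for P, U, V, S, A, and C over the tropical Pacific.
rng = np.random.default_rng(0)
months = 120
X = rng.standard_normal((months, 6))   # hypothetical monthly fields

# Standardize each variable, then take the first principal component.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
_, _, vt = np.linalg.svd(Xs, full_matrices=False)
index = Xs @ vt[0]                     # leading PC: the index time series
print(index[:5])
```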

A 2009 paper by McLean, de Freitas, and Carter calculated that the time lag between ENSO events and tropospheric temperatures is distributed around a center of about 7 months. Section 1.4.1 of the 2013 report of the Nongovernmental International Panel on Climate Change (NIPCC) addressed some of the issues concerning ENSO. The report found that three claims arising from global climate models are at variance with reality. The claims are: 1) global warming will increase the frequency of ENSO events; 2) global warming will increase their intensity; and 3) weather-related disasters will be exacerbated under El Niño conditions.
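For readers curious how such a lag is estimated, a minimal sketch of lagged-correlation analysis follows (synthetic series built with a known 7-month lag for demonstration, not the McLean et al. data):

```python
import numpy as np

# Minimal sketch: estimate the lead-lag between an ENSO index and
# tropospheric temperature by scanning lagged correlations.
# Synthetic data with a built-in 7-month lag, not the McLean et al. series.
rng = np.random.default_rng(1)
n, true_lag = 600, 7
enso = rng.standard_normal(n)
temp = np.roll(enso, true_lag) + 0.5 * rng.standard_normal(n)

def lagged_corr(x, y, lag):
    """Correlation of x leading y by `lag` steps."""
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

best = max(range(13), key=lambda k: lagged_corr(enso, temp, k))
print("best-fit lag (months):", best)   # recovers ~7 on this synthetic data
```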

One cannot conclude what will happen with the current ENSO event, or even whether it will result in a strong La Niña, as in 1998-99. These natural events have an important influence on climate change, which affects humanity, but they are largely ignored by government-funded entities. Certainly, a beginning would be to explore whether there is a relationship between earlier MEI estimates and atmospheric temperatures measured by weather balloons, which go back to 1959. See links under Challenging the Orthodoxy – NIPCC, Measurement Issues – Atmosphere, and Changing Weather.

***************

Antarctic Peninsula – Posterchild of Doom? Much has been written about the warming of the Antarctic Peninsula and the melting of the West Antarctic Ice Sheet. The implication is that global warming caused by human-emitted carbon dioxide (CO2) is rapidly melting the ice sheet, which will result in rapid sea level rise.

Researchers with the British Antarctic Survey recently reported that the Peninsula has been cooling for the past 20 years. The researchers attribute this finding to the stabilization of the ozone hole and changing wind patterns, and stated that the absence of 21st century warming on the Antarctic Peninsula is consistent with natural variability. Please note that TWTW did not double-check the ozone hole claim, which should apply to the October to November period, and that the link to the article in Nature did not work on July 24. See links under Changing Cryosphere – Land / Sea Ice.

***************

Energy Subsidies: Energy subsidies are a concept that has been distorted and abused to justify other subsidies. For example, petrostates have long discounted their fuels to domestic consumers. Wind advocates have used this discounted pricing by petrostates as a justification for wind subsidies in the US.

Ross McKitrick, along with Steve McIntyre, exposed the statistical bias in the model used by Mr. Mann in his now-broken hockey-stick graph. Recently, McKitrick wrote a working paper on energy subsidies and the inefficiencies they create. The abstract reads:

“Governments around the world have pledged to eliminate or sharply reduce subsidies to energy firms in order to increase economic efficiency and reduce environmental externalities. Yet definitions of subsidies vary widely and, as a result, estimates of their global magnitude vary by orders of magnitude. I review why energy subsidies are so difficult to define and measure. I show why some non-standard measures are very poor proxies for subsidy costs and in fact may vary inversely with them. In particular, recent attempts to treat unpriced externalities as subsidies yield especially misleading results. In general, energy subsidies as conventionally understood do exist but only comprise a small portion of some very large recently-reported estimates, the bulk of which are indirect measures that may have little connection with actual costs to governments or allocational inefficiencies.”

The paper uses language familiar to economists with some quantitative background; thus, it may be challenging for some readers. One main point is that efforts to adjust for indirect subsidies, real or perceived, may lead to significant distortion and to more inefficient subsidies. US federal efforts to calculate the social cost of carbon are a glaring example. See link under Subsidies and Mandates Forever.

***************

Storing Wine: Forget the French, with their age-old caves in Burgundy and Bordeaux, or the tens of millions of bottles of Champagne in caves in and around Reims. They don’t know how to store wine; but the Department of Energy will teach you – and them. The DOE has issued 159 pages of regulations defining procedures for miscellaneous refrigeration products – including wine chillers. See links under Below the Bottom Line.

***************

Number of the Week: 1 part per 10,000, or 0.01%. According to calculations by Roy Spencer and others, based on observations from Mauna Loa, over the past 60 years humans have increased the concentration of CO2 in the atmosphere by about 1 part per 10,000, or 0.01%, of the total atmosphere. Yet politicians are calling for investigations of those who doubt this minuscule increase is the primary cause of global warming/climate change, and are labeling the skeptics “deniers.” See links under Suppressing Scientific Inquiry – The Witch Hunt and Challenging the Orthodoxy.
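The arithmetic is straightforward (the Mauna Loa values below are approximate figures assumed for illustration):

```python
# Arithmetic behind the Number of the Week; the Mauna Loa readings are
# approximate values, assumed here for illustration.
co2_1958_ppm = 315.0   # approximate reading near the start of the record
co2_2016_ppm = 404.0   # approximate 2016 reading

rise_ppm = co2_2016_ppm - co2_1958_ppm          # ~89 ppm
fraction = rise_ppm / 1_000_000                 # ppm -> fraction of atmosphere
print(f"{rise_ppm:.0f} ppm = {100 * fraction:.3f}% of the atmosphere")
# ~0.009%, i.e., roughly 1 part per 10,000
```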

***************

ARTICLES:

Letter to New York Times July 13, 2016 (Unpublished)

By S. Fred Singer, Arlington, Virginia

“In his New York Times op-ed, July 12, “Another Inconvenient Truth: It’s Hard to Agree How to Fight Climate Change”, John Schwartz does a fine job of describing the splintering of the community of Global Warming alarmists over issues like nuclear energy and natural gas (methane). He suggests but doesn’t spell out how major enviro-groups have become “big business,” with executive salaries approaching the half-million-dollar mark.

“His story mentions Al Gore’s objection to nuclear energy, based on its high cost. But is Gore unaware that mass construction in factories of safer, modular reactors would lower costs by a factor of ten or more; regulatory pre-approval could eliminate the enviro-caused delays that have been largely responsible for the current high cost?

“Schwartz states that Bill McKibben has argued that the potency of methane as a greenhouse gas might make it worse than coal.

“While methane is certainly a greenhouse gas, its effect on climate is negligible. Doesn’t McKibben know that naturally occurring atmospheric water vapor not only overlaps the relevant infrared absorption bands of methane, but also is about 10,000 times more abundant?

“Wind and solar can certainly produce electricity, but are notoriously unreliable and require costly back-up power – supplied by stand-by generators using methane or nuclear energy. In 2008, then-candidate Obama promised that electricity prices would ‘sky-rocket.’ European experience with wind and solar seems to have proven him correct.”

The writer is emeritus professor of environmental sciences at the University of Virginia and was the founding director of the US weather satellite service.

**************

2. The Marvel of Electricity

Although the grid’s vulnerability to cyberattack and sabotage gets the headlines, the No. 1 cause of power outages is foliage. Almost as problematic are squirrels.

By R. Tyler Priest, WSJ, July 15, 2016

http://www.wsj.com/articles/the-marvel-of-electricity-1468616241

The review of Gretchen Bakke’s book “The Grid” opens with:

“Without electricity, life shuts down. Buildings and streets go dark. Computers and smartphones die. Televisions flicker off. Food perishes in refrigerators. Even money, which is stored, traded and monitored electronically, becomes inaccessible. Yet power outages occur regularly—whether caused by storms, computer bugs or overgrown trees like the ones near Akron, Ohio, that in 2003 triggered a blackout in eight Northeast states and the Canadian province of Ontario. Fifty million people were without power for two days, making it the largest blackout in U.S. history. Lost business revenue amounted to $6 billion.

“Power outages may be regular in the United States, but they are not frequent. On average, Americans can depend on power 99.7% of the time. Perhaps this is why most of us are blithely incurious about the source of the electricity that powers our lives, safely counting on it to come out of the socket in the wall. Beyond the wall lie three synchronized regional grids of electrical lines spanning the continental United States and large parts of Canada. Nearly 20,000 power generators housed in more than 7,500 powerhouses convert primary energy sources—coal, natural gas, nuclear fission, flowing water, sunlight, the wind—into electric current, which is then transmitted across more than 160,000 miles of high-voltage power lines to some 60,000 substations. From there, transformers step down the voltage and send it through millions of miles of distribution lines to keep the lights on, the fridge cold and the information streaming into our homes and offices.

“The U.S. electrical grid is one of the engineering marvels of modern history. But it is aging and under stress. As Gretchen Bakke explains in “The Grid: The Fraying Wires Between Americans and Our Energy Future,” a vast network that arose to provide electricity in a centralized and standardized fashion is “being colonized by a new logic: little, flexible, fast, adaptive, local.” The large utilities no longer enjoy a monopoly over the power that they spread across the grid. They are obliged to transmit electricity produced by independent generators and to purchase power from a growing number of wind and solar installations, which, on cloudy or calm days, can subject the system to massive fluctuations in electricity generation.

“A cultural anthropologist at McGill University who has studied the subject of state failure in Eastern Europe, Ms. Bakke investigates in “The Grid” what she fears is another failing system. She begins by summarizing electric power’s history in the United States, describing how the system emerged not from a preconceived plan but through business and regulatory improvisation. The key figure was Samuel Insull (1859-1938), the father of centralization. This former secretary to Thomas Edison consolidated the many small, local and intermittent power generators in the Midwest into a utility behemoth, Commonwealth Edison. Unlike its predecessors, it was able to deliver power around the clock and at greatly reduced prices. Progressive Era reformers encouraged the rise of “natural monopolies”—mostly investor-owned or “private”—granting them exclusive control over large service areas in exchange for tight regulation. “You know these,” writes Ms. Bakke, “you likely pay a bill to one of them every month. They have names like ConEd, National Grid, PG&E, PEPCO, Xcel, Entergy, and Southern Company. These are the direct descendants of the system that Insull built in Chicago.”

After a brief review of the federal government’s involvement in the 1930s, the review continues:

“After World War II, the utilities expanded to meet the building boom and with increasingly efficient power plants even lowered rates for customers. High-voltage power lines were erected to carry power across greater and greater distances, connecting the regional systems together to provide backup during emergencies—and enabling the selling of excess power to neighboring utilities where it was needed. Two large regional “interconnections” emerged, in the East and West, with a third in ever-independent Texas, whose private utilities avoided interstate connections and thus federal regulation.

“What makes centralized power cheap and accessible also makes it vulnerable. Electricity must be consumed the instant it is generated. It is not a gas, liquid or solid but a force that cannot be bottled up. Despite determined efforts, we still have not developed an efficient battery capable of collecting electricity at a grid scale. “Rather than storing excess power we use generation to ‘balance’ generation,” explains Ms. Bakke. “Every time a cloud goes by and diminishes solar output for a second or two, we burn some fossil fuels to generate enough little jolts of electricity to even out the electron flow.” “Wheeling” electricity (the industry’s term for long-distance transportation) across regional networks to get where it’s needed requires coordination and automatic controls that can fail. Generating plants must maintain backup power, often natural-gas turbines, to meet “peak load” on hot summer days, even if those turbines remain idle the rest of the year. The interconnected grid is therefore marked by redundancies and inefficiencies. It can also succumb to cascading outages. Although the grid’s vulnerability to cyberattack and sabotage captures headlines, Ms. Bakke finds that the No. 1 cause of outages is foliage. Almost as problematic are squirrels, which gnaw through power-line insulation and burrow into substation equipment.

“Grid instability, however, is more than a physical problem. It is also a result of policy changes that began in the 1970s. The Public Utilities Regulatory Policy Act, or Purpa—passed in 1978 in the wake of the oil shock—was intended to promote energy conservation and increase domestic supply. Section 210, little noticed by industry lobbyists at the time, required utilities to purchase electricity from small power producers and cogeneration units (which recycle the heat produced by industrial activity into electricity). According to Ms. Bakke, Purpa “opened the door to honest-to-goodness competition in electrical power generation. As bidding auctions between small power providers gradually became the most effective way to integrate new forms of generation, and the companies making it, Purpa helped to prove that bigger wasn’t better and that monopoly-governed, vertically integrated, government-regulated megacompanies were far and away not the best way to make and manage American power.”

“The Energy Policy Act of 1992 was aimed at diversifying U.S. fuel sources and mandated competition in the wholesale power market. It was part of the larger deregulatory trend of the 1980s and ’90s. The act directed the Federal Energy Regulatory Commission (successor to the Federal Power Commission) to compel utilities to unbundle generation from transmission and provide fair access to their power lines for smaller companies. Although not mentioned by Ms. Bakke, the deregulation of natural-gas prices in the late 1980s, which encouraged gas exploration and made it an affordable means of electricity generation, set the stage for the 1992 act.

“Competition was transforming the grid. Utilities suddenly faced financial pressures, with sometimes disastrous results. Ms. Bakke notes, for example, that First Energy of Ohio, under budget pressure, reduced its tree-trimming schedule in 2001, which may have contributed to the line interference that caused the 2003 Northeast blackout. Wholesale electricity also became a tradable commodity, with its price set by supply and demand rather than by regulatory diktat, giving rise to companies, such as Enron, that specialized in energy trading and market manipulation. Sometimes the best price was far from the point of generation. The ensuing wheeling has overburdened high-voltage transmission lines, while opposition from property owners has often blocked the construction of additional ones.

“The growth of wind and solar power, encouraged by tax credits, presents further challenges to managing load on the grid because of their intermittency and tendency to produce the most power when it is least needed. Then there’s “grid defection,” a growing trend of small communities—such as colleges, military bases and even whole towns—that generate, store and distribute their own electricity on microgrids. These usually work only part of the time, and so utilities lose customers during some periods but must maintain the capacity and transmission facilities to deliver power to those same customers whenever they decide they want it.

“Ms. Bakke shows that investor-owned utilities, which serve 68% of the U.S. population, no longer control the relationship between generation and consumption. It might be premature to suggest, as Ms. Bakke does, that these utilities are on the verge of a “death spiral.” But she is correct that they are desperate to manage demand. Innovations such as smart meters (which allow the utilities to monitor your energy use in real time), plug-in electric cars (which are essentially just great big batteries) and time-of-use pricing strategies (which charge customers more at times of high demand) could enable operators to predict, shift and smooth-out electricity use.

“There are major hurdles, though. Many Americans worry that smart-meter monitoring is an invasion of privacy and could even give the utilities control of how much power each consumer uses. The idea that plug-in electric cars could provide a form of mass storage (“vehicle-to-grid”) is predicated on enormous infrastructure investments and the bulk purchase of plug-ins whose sales are today in decline. And new pricing structures involve legal and regulatory issues.”

According to the reviewer, the book does not address central issues such as retiring main power plants and “decarbonizing” the grid. Instead: “Ms. Bakke rightly concludes that the task will seem less daunting if we can find a way to place a concrete value on conservation and efficiency. ‘This is the real story behind contemporary grid reform,’ she declares: ‘not just valuing electrons made by unusual producers, but valuing electrons that we never needed to make at all—the saved power that we shouldn’t even notice has gone missing.’”

[SEPP Comment: The critical questions of who decides what electricity is needed or what determines need are not discussed by the reviewer or, apparently, by the author.]

**************

3. The Rise of the Kluges

From the electrical grid to Toyota’s software to online dating sites, the systems we live by are inelegant messes that no one fully understands.

By Amir Alexander, WSJ, July 19, 2016

http://www.wsj.com/articles/the-rise-of-the-kluges-1468968418

[SEPP Comment: Illustrates possible problems with complex models, including global climate models.]

SUMMARY: In reviewing Samuel Arbesman’s new book “Overcomplicated: Technology at the Limits of Comprehension,” the reviewer points out that certain events occurring in software systems cannot be explained by a simple cause. Events such as “United Airlines grounded its flights because of a computer glitch; the New York Stock Exchange halted trading when its system went haywire; and The Wall Street Journal’s website inexplicably went dark. These failures were not, as some suspected, the result of a nefarious attack by Russian hackers, but to Mr. Arbesman they were not meaningless chance either. They were an early warning of the fragility of the systems that we rely on—a possible portent of disasters that lie ahead.”

“The problem, as Mr. Arbesman, who trained as a computational biologist, argues convincingly, is that our systems have grown too complex to handle. In the past, when a system suffered a catastrophic failure we could find the specific cause. When the Challenger shuttle exploded in 1986, a committee of experts zeroed in on the culprit: the O-rings used to seal the solid-fuel tank, which lost their flexibility in cold weather.

“But two decades later, when Toyota was facing its own technological Dunkirk, no simple answers were forthcoming. After a number of deaths were apparently caused by its vehicles accelerating uncontrollably, Toyota’s software was examined by outside experts in a lawsuit against the company. Their conclusion was less than reassuring: At least some of the deaths were caused not by the failure of any particular part but by the unpredictable interaction of different pieces of software. Not exactly what Toyota owners wanted to hear.

“And Toyota, Mr. Arbesman argues, is not an outlier. From the electrical grid to Internet dating sites, the systems we live by have become “kluges”—overly complicated, inelegant, cobbled-together messes. Even experts can no longer fully understand or control them. And as long as this is the case, the rate of catastrophic failures will only keep increasing.

“What, then, is to be done? The obvious solution is to simplify our systems, but that is next to impossible. Once a system is in place it inevitably undergoes a process of accretion—feature is added to feature and new layers to old ones. Over time, systems are connected to other systems, and each learns to deal with an increasing number of “edge cases”—rare occurrences that barely affect the overall performance of the system but nevertheless have to be accounted for. And so, bit by bit, a kluge is born.

“If this was where “Overcomplicated” had ended, it would make for a depressing read. Driven by inexorable forces, we have become trapped in the Entanglement, an all-encompassing interconnected web that we neither understand nor control. Yet Mr. Arbesman is no doomsday prophet. While we cannot reverse the Entanglement, he argues, we can learn to live with it. The secret is to give up on the idea that we can or should fully understand a system by reducing it to an elegant model. That approach has worked wonders for physicists, as when Isaac Newton formulated the mathematical principle of universal gravity, thereby explaining everything from the fall of a stone to the motion of the planets. But for our world, this approach, elegant and powerful though it is, simply will not work.

“Our only hope, Mr. Arbesman argues, is to approach the Entanglement much as a biologist approaches the natural world. Biologists do not look for grand formulations. They conduct experiments and carefully observe. Over time they learn a great deal about the natural world, which becomes more predictable and even somewhat more controllable. Yet they will never delude themselves into believing that they can completely understand biological systems or capture life in a formula.”

[SEPP Comment: The issues discussed above about the grid are separate from the threat of an electro-magnetic pulse (EMP), natural or man-made, disabling critical components such as transformers. As suggested, problems arise from unexpected sources when complex systems, such as mathematical computer models, are constructed from different components, each well-tested. It is not known if the components will integrate well together to produce reliable results. Such integration must be tested.]
