Weekly Climate and Energy News Roundup #193

The Week That Was: August 22, 2015 – Brought to You by www.SEPP.org

By Ken Haapala, President, Science and Environmental Policy Project

Administration’s Power Plan: Independent analysts continue to provide details of the Obama Administration’s politically named “Clean Power Plan” (CPP). These studies make clear that the only forms of new electrical power generation the administration considers “clean” are solar and wind. Electric power generation from fossil fuels is condemned by the administration. Hydroelectric generation is out of favor, as explained by ex-EPA official Alan Carlin. There are no plans for federally supported new dam construction in the US. In fact, the thrust has been to tear down existing dams in the name of the environment.

Nuclear energy, which produces no carbon dioxide (CO2), is not an option. The administration mothballed the nuclear waste repository at Yucca Mountain and has not offered an alternative. Indeed, in 2009 the EPA published in the Federal Register a rule limiting radiation doses from Yucca Mountain for up to 1,000,000 years after it closes, demonstrating the absurd durations over which the administration considers its edicts enforceable. Biomass burning on a large scale would require clearing the forests, as was done in the eastern US in the 18th and 19th centuries, which would be politically unacceptable.

This leaves only solar and wind as the major sources of electrical power generation. Both are unreliable, erratic, and expensive. The Administration’s concept would be more appropriately termed the unreliable power plan.

Even with its plans to prevent new, reliable electrical-power generation, the administration’s plan falls far short of the goals set by Mr. Obama, according to a report by the Institute for 21st Century Energy of the U.S. Chamber of Commerce.

“Even with these fairly generous estimates, these measures, which include some programs that haven’t even been announced yet, would fall about 800 MMTCO2 [Million Metric Tons of CO2], or 45%, short of the president’s goal. How does [the] administration intend to plug the remaining gap? It hasn’t said. When asked by the Financial Times about the holes in the administration’s INDC [Intended Nationally Determined Contributions pledged for the UN-Conference of Parties (COP 21) in Paris in December], White House official Rick Duke chose to deny [the] existence of a problem and instead change the subject: ‘Our numbers are quite clear. It’s other countries where we see more opportunities to clarify what the plans are.’” Boldface added.

We need other countries to define what our plans are? What the administration will do to fill the 45% shortfall is anyone’s guess. The report indicates that major industries should be on the alert. “Still, seeing as the entire industrial sector emitted a little over 800 MMTCO2 in 2013, even very steep cuts by industry won’t deliver nearly what’s needed”, according to the US Chamber.
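
The arithmetic behind the Chamber’s figures can be checked directly: if the 800 MMTCO2 shortfall is 45% of the president’s goal, the implied total reduction target follows. A quick sketch (only the two quoted numbers are taken from the report; the derived values are simple back-of-the-envelope arithmetic):

```python
# Back out the implied total reduction target from the two figures
# quoted by the U.S. Chamber report.
shortfall_mmt = 800      # MMTCO2 short of the goal (quoted)
shortfall_share = 0.45   # fraction of the goal unmet (quoted)

implied_goal = shortfall_mmt / shortfall_share   # total reduction implied
covered = implied_goal - shortfall_mmt           # reductions the plan does cover

print(round(implied_goal))  # -> 1778 MMTCO2 implied total target
print(round(covered))       # -> 978 MMTCO2 covered by announced measures
```

Seen this way, the quoted comparison is stark: the unmet 800 MMTCO2 is roughly equal to the entire industrial sector’s 2013 emissions.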

Terry Jarrett, a former commissioner of the Missouri Public Service Commission, observed: “And if you’re skeptical of the threat posed by man-made CO2 in an ever-changing climate, then you’ll likely balk at the stunning price tag for this new set of rules, which the U.S. Chamber of Commerce estimates at an annual cost of $51 billion in lost GDP and 224,000 jobs lost.”

One can quibble about the numbers, but the direction is clear: the Administration is willing to damage an already weak economy (real growth of about 2% per year during the Administration) in order to fight global warming/climate change – an enemy so ill-defined that the Administration has failed to grasp the natural causes of climate change. See links under The Administration’s Plan – Independent Analysis, and The Administration’s Plan – Push-Back.

###################################################

Quote of the Week: “Making dire predictions is what environmental groups do for a living, and it’s a competitive market, so they exaggerate.” – Matt Ridley, WSJ

###################################################

Number of the Week: 119%

###################################################

Needed Research: On his web site, Roy Spencer, co-founder of the method of measuring atmospheric temperatures by satellites, the only comprehensive, virtually global measurements existing, reported that: “As part of a DOE grant we are testing climate models against satellite observations, particularly regarding the missing ‘hotspot’ in the tropics, that is, the expected region of enhanced warming in the tropical mid- and upper troposphere as the surface warms. Since 1979 (the satellite period of record), it appears that warming in those upper layers has been almost non-existent, despite some surface warming and increases in low-level humidity.”

It is unclear whether “we” refers to the entire group that reports global temperatures, based at the University of Alabama in Huntsville.

The research is much needed. In its Second Assessment Report (AR2 – 1996), the UN Intergovernmental Panel on Climate Change (IPCC) erroneously asserted the “hot spot” was the distinct human fingerprint, which it is not. In 2007, Douglass, Christy, Pearson and Singer found the “hot spot” exists in the models, but not in observations. No one has been able to produce data establishing the “hot spot.” The issue is more fully discussed at: http://multi-science.metapress.com/content/k7v3v4173346317x/. Yet, it is a critical part of the EPA’s 2009 finding that human greenhouse gas emissions, particularly CO2, endanger human health and welfare. Without EPA’s finding, the Administration has no legal or scientific basis for severely restricting CO2 emissions as prescribed in its CPP. See link under Challenging the Orthodoxy.

*******************

Balancing the Load: One of the topics avoided by the promoters of wind and solar, including government officials, is the need for balancing the load on the electrical grid – that is, roughly equating consumption with generation. Too much electricity generated at one time will blow transformers, capacitors, and other devices designed to give the system stability. The system will fail and it may require some time before it can be repaired. Too little electricity generated at one time results in brown-outs, black-outs and other forms of failure. The load must be balanced constantly, and utility companies do so by engaging electricity providers, daily, on an as-needed basis. The electricity so provided is often far more expensive than electricity provided consistently. Conversely, excess electricity must be dumped at low prices.

The only major form of electricity storage in general use is pumped-hydro storage. This usually involves pumping water from a reservoir at one elevation to a second reservoir several hundred feet higher. From the second reservoir, the water can be drawn down through hydroelectric turbines to generate power when needed. In general, the system loses about 20 to 30% of the available power and requires large reservoirs. The largest such facility is in Bath County, Virginia. Unfortunately, EPA clean water regulations are making new construction of such facilities very difficult, even where geologically feasible.
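
The physics of pumped-hydro storage can be sketched with a short calculation: recoverable energy is roughly density × gravity × head × volume × round-trip efficiency. The reservoir volume and head below are hypothetical illustrative values, not figures for any real facility; the 75% efficiency reflects the 20–30% loss range noted above:

```python
# Illustrative sketch of pumped-hydro energy storage (hypothetical numbers).
RHO = 1000.0  # water density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def recoverable_mwh(volume_m3, head_m, round_trip_eff=0.75):
    """Energy returned to the grid per full drawdown, in MWh.

    round_trip_eff=0.75 reflects the 20-30% loss figure above.
    """
    joules = RHO * G * head_m * volume_m3 * round_trip_eff
    return joules / 3.6e9  # convert J -> MWh

# Example: a hypothetical 10 million m^3 upper reservoir with a 350 m head
print(round(recoverable_mwh(10e6, 350)))  # -> 7153 MWh per drawdown
```

The cube of the volume term in practice explains why such facilities need very large reservoirs to matter at grid scale: halving either the volume or the head halves the stored energy.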

On her web site, Jo Nova has graphs showing the erratic nature of Australian wind energy production in July and the first half of August. Similar patterns are found elsewhere, such as West Denmark: http://www.emd.dk/el/ and the Pacific Northwest (Bonneville Power Authority): http://transmission.bpa.gov/Business/Operations/Wind/baltwg.aspx. Zero values are not unusual. From August 17 to August 23, 2015, wind power generation at Bonneville varied from zero to over 4,000 MW, most of the time near the bottom.

No amount of government edicts or regulations will stabilize the wind. In the US, the Administration’s and EPA’s power plan suppresses stable, reliable forms of electricity generation in favor of erratic and unreliable solar and wind; yet, other regulations by the EPA and Administration suppress the ability to stabilize erratic electrical power so generated. See links under Alternative, Green (“Clean”) Solar and Wind

*******************

Capacity Factors: Another topic that promoters of solar and wind seldom discuss is capacity factors, a measure of how reliably a facility actually generates power. Promoters often cite nameplate capacity instead, making statements of maximum capacity such as “the facility will provide enough electricity to power 500,000 homes.” But nameplate capacity is not particularly meaningful if the facility will power 500,000 homes only 5 minutes a day. Preston Cooper of the Manhattan Institute discusses capacity factors of various energy sources in the U.S. By far, in 2013, the greatest average capacity factor was 90.9% for existing nuclear, meaning that the nuclear plants remain on line, generating electricity, over 90% of the time. Of course, nuclear is being suppressed by the Administration.
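
The distinction is simple arithmetic: capacity factor is actual generation divided by what the plant would produce running flat-out at nameplate capacity for the whole period. A minimal sketch (the plant size and annual generation below are hypothetical, chosen only to reproduce the 90.9% nuclear figure cited):

```python
def capacity_factor(mwh_generated, nameplate_mw, hours=8760):
    """Actual generation divided by the generation a plant would
    deliver running at full nameplate capacity for the whole period
    (8,760 hours in a non-leap year)."""
    return mwh_generated / (nameplate_mw * hours)

# Hypothetical example: a 1,000 MW nuclear plant generating
# 7.96 million MWh in a year
cf = capacity_factor(7_963_000, 1000)
print(f"{cf:.1%}")  # -> 90.9%, matching the figure cited for nuclear
```

The same formula applied to a wind farm with a capacity factor near 30% shows why “enough electricity to power 500,000 homes” overstates typical output by a factor of three or more.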

The greatest capacity factor for renewables is geothermal at 67.2%. Certainly, geothermal works well in Iceland, but few urban areas are built where geological plates are separating. There are few locations for geothermal in the US.

Biomass burning, at 67.1%, has a higher capacity factor than coal (steam turbine) at 58.9%, or natural gas (combined cycle) at 50.3%. Unfortunately, these statistics can be misleading. Biomass is used largely at paper-making and wood-pulp plants, where the waste is burned on site for electricity. Outside the paper and wood industries, biomass means little. Since coal is used more for base load, it shows a higher capacity factor than natural gas, which is often used for less efficient shoulder- and peak-load service.

When used alone, the capacity factors of erratic sources of electricity such as solar and wind are misleading because the power is not always available when needed. See links under Alternative, Green (“Clean”) Solar and Wind.

*******************

The Hiatus Again? Science Magazine published an article by Kevin Trenberth of the National Center for Atmospheric Research (NCAR), a well-known member of the climate establishment that helps generate the IPCC reports. Reading past the puffery, the article uses global mean surface temperatures to assert a staircase of rising temperatures since 1920: first a rise from 1920 to 1940, then a stable period (hiatus) to 1975, then a rise to 1998, then another inflection point reflecting a lower rate of rise from 1998 to present (all dates are approximate). Trenberth asserts that, other than “human-induced climate change,” the greatest driver of the temperature variation is the El Niño–Southern Oscillation in the Pacific Ocean. “The year 1998 was the warmest on record in the 20th century because of the 1997–1998 El Niño, the biggest such event on record.”

There are several issues with this analysis. One, a more traditional analysis would have the initial warming from about 1910 to 1940, followed by a modest cooling to 1975, followed by an increase to 2003 (the 1997–98 El Niño year is ignored), followed by the current hiatus. The rate of the first warming is about the same as the rate of the second warming. This goes to the central point: what is the cause of the first warming? CO2 emissions were very low. A secondary point: is the cause of the second warming period different from the cause of the first? The IPCC claims the second warming was caused by human greenhouse gas emissions, but offers no compelling evidence.

Another significant issue is the great inconsistency (since 1979) between global mean surface temperatures, with their frequent adjustments, and the far more comprehensive satellite data, independently supported by measurements from weather balloons. Would Science Magazine have published a similar study by Roy Spencer and John Christy using their data? All this undermines EPA’s claimed evidence that greenhouse gas emissions, particularly CO2, endanger public health and welfare. For the paper and other criticisms, see links under Defending the Orthodoxy.

*******************

Oil Glut? Watching those who predicted that oil prices would never fall back away from their predictions is more fun than watching those who predicted unprecedented and dangerous global warming back away from theirs. With the first group, their predictions did not become part of US national policy. With the second group, their predictions are becoming an economically damaging part of US national policy.

Now, with the first group, instead of the world running out of oil, some analysts are forecasting an oil glut with the price dropping to $30 per barrel. The governments of petro-states, whose existence depends on high oil prices, should be worried.

With the second group, western governments are insisting their policies are correct and the danger of human-caused global warming/climate change is established, regardless of the lack of evidence. The citizens of these countries, whose well-being depends on a rational, properly functioning government, should be worried. See Articles # 1 & # 2 and links under Energy Issues – Non-US.

*******************

Merchants Again? Jo Nova and Luboš Motl reported that the film, “Merchants of Doubt,” failed at the box office but is now being considered for schools. In his youth, Motl lived under Communism and identifies the film as propaganda. In the US, Edward Bernays, “the father of modern advertising,” “pioneered the scientific technique of shaping and manipulating public opinion, which he called ‘engineering of consent.’” During World War I, Bernays was an integral part of the US Committee on Public Information, which sold the war to the US public as necessary “to make the world safe for democracy.”

Bernays titled the opening chapter of his 1928 book, Propaganda, “Organizing Chaos,” with the opening paragraph stating:

“The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.”

Based on the book Merchants of Doubt by Oreskes and Conway and the recent film, one can ask: who are the masses that need to be manipulated? Could it be those who believe the book and the film? See links under Communicating Better to the Public – Use Propaganda and Communicating Better to the Public – Use Propaganda on Children.

*******************

Number of the Week: 119%. The 2014 Annual Report by Energinet.dk, the government-owned company that transmits electrical power in Denmark, estimates that by 2020 it will have the capacity to import 8,000 MW of electricity, when needed. The expected maximum electricity consumption will be about 6,700 MW. Thus, the country will be able to import 119% of its expected maximum electricity consumption even if there is zero production from the 6,900 MW of wind turbine and solar cell capacity and zero production from the 5,700 MW of central and local power station capacity. How many countries have such generous neighbors as Sweden and Norway, which allow sharing, largely via pumped-hydro storage? Of course, the capacity is not free. No wonder Denmark has the highest consumer electricity costs in the EU! See link under Alternative, Green (“Clean”) Solar and Wind.
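
The 119% figure is straightforward arithmetic from the report’s two capacity numbers:

```python
# Denmark's projected 2020 import capability vs. peak demand,
# from the Energinet.dk figures quoted above.
import_capacity_mw = 8000  # capacity to import, when needed
peak_demand_mw = 6700      # expected maximum consumption

ratio = import_capacity_mw / peak_demand_mw
print(f"{ratio:.0%}")  # -> 119%
```

In other words, even with every domestic wind turbine, solar panel, and power station idle, imports alone could exceed peak national demand by nearly a fifth.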

###################################################

ARTICLES:

Please note that articles not linked easily or summarized here are reproduced in the Articles Section of the full TWTW that can be found on the web site under the date of the TWTW.

1. A Crude Victory

Banning America from exporting oil while allowing Iran to do it.

Editorial, WSJ, Aug 14, 2015

http://www.wsj.com/articles/a-crude-victory-1439592923

SUMMARY: The editorial points out that the Administration’s decision to allow oil exports to Mexico is technically a swap – some US oil for some Mexican oil. The editorial writers assert Washington forbidding exports of oil makes no sense: “when America is becoming one of the world’s leading oil and gas producers. Today the main effect of the ban is to discourage some American producers from drilling for more supply, while leading others to get around the ban by exporting their oil in the form of refined gasoline and diesel, which can be exported.”

“There are also good strategic reasons for lifting the ban. It’s absurd to keep American oil producers from selling crude oil on global markets at a moment when the U.S. is about to lift the limits imposed by sanctions on the Iranians. Expanding the supply of U.S. oil would also provide allies in Eastern Europe with alternatives to their current oil dependence on Vladimir Putin’s Russia.

“The great irony is that President Obama, who has not been shy about asserting executive power, has the authority to allow American crude exports without Congress. But he’s made no move to do so…” and remains quiet on a bill eliminating the ban on exports.

****************

2. Canadian Oil-Sands Producers Struggle

Drop in crude prices, high costs challenge whether deposits can be extracted at a profit

By Chester Dawson, WSJ, Aug 19, 2015

http://www.wsj.com/articles/oil-sands-producers-struggle-1440017716

SUMMARY: “Canada’s high-cost oil-sands producers are struggling as oil prices sink to fresh six-year lows, and even the most efficient drillers are losing money on every barrel they produce at current prices, according to a report published Wednesday. [Not linked.]

“Canadian oil-sands production has grown 30% in the past five years but the recent price slump has hit producers’ bottom lines and forced them to suspend development of new projects.

“Western Canadian heavy crude costs more to extract than other oil sources because it must be separated from deposits of sand. It also trades at a discount to other crudes, in part because of the distance it must be transported from remote boreal forests in Alberta.

“Benchmark West Texas Intermediate oil cost less than $41 a barrel in Wednesday trading, which although at multiyear lows was still well above the Western Canadian Select average of around $24 a barrel.

“More than half of current oil-sands production can’t break even unless WTI crude-oil prices rise above $44 a barrel, according to a TD Securities Inc. report published Wednesday.”

“While about 45% of oil-sands production comes from strip mines, the remainder is tapped via horizontally or vertically drilled wells. Operators pump steam into these wells to melt deposits of crude embedded in sand using techniques called steam-assisted gravity drainage, or SAGD, and cyclic steam stimulation, or CSS. Current prices may no longer allow operators to cover the costs involved in extracting those deposits of heavy crude, or bitumen.

“Oil sands too deep for strip mines that must be tapped by drilling wells account for about 80% of Canada’s remaining reserves, which form the world’s third-largest source of undeveloped oil. But chronic cost overruns amid the pressure of lower oil prices are calling into question how much of those reserves can be recovered profitably.

“Despite lower prices, the Canadian Association of Petroleum Producers expects oil-sands output to continue to grow another 30% through 2020 as multibillion-dollar projects already under construction start producing.

“All the oil-sands crude—most of which is exported to the U.S.—is forcing rival heavy-crude producers like Venezuela, Mexico and Colombia to lower their prices to compete.”
