Weekly Climate and Energy News Roundup #282

The Week That Was: August 26, 2017 – Brought to You by www.SEPP.org

By Ken Haapala, President, Science and Environmental Policy Project

###################################################

Quote of the Week. “The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he already knows, without a shadow of a doubt, what is laid before him.” – Leo Tolstoy, 1894

###################################################

Number of the Week: 1,000,000 atomic bombs exploding per day

###################################################

History of Climate Change: In the second edition of “Climate, History, and the Modern World,” climate change research pioneer H.H. Lamb expressed disappointment with the direction the Climatic Research Unit (CRU) at the University of East Anglia was taking. Lamb had worked diligently to establish the unit to understand the causes of climate change, both warming and cooling, before any undue influence from increased atmospheric carbon dioxide (CO2) could be found. Lamb feared global cooling and had studied abrupt climate change through changes in pollen and other proxy records. Lamb’s depiction of temperature change from proxy records in central England in the first report of the UN Intergovernmental Panel on Climate Change (IPCC) became an icon.

Unfortunately, the CRU and the IPCC became overwhelmed by the fashionable belief that CO2 will be a major cause of climate change. This belief appeared to be backed up by the 1979 Charney Report published by the US National Academy of Sciences. Quickly forgotten was that the Charney Report makes clear that its projections of significant change from CO2 were based on speculation by climate modelers – not established by physical evidence, hard data.

There were no comprehensive calculations of global temperatures available in 1979, although the collection of the necessary data by satellites had just begun. The technique of using the collected satellite data to estimate temperatures was published by Roy Spencer and John Christy in the early 1990s. Without comprehensive global atmospheric temperatures, there can be no calculation of the direct influence of CO2 on temperatures, an influence which occurs in the atmosphere. Surface instrument data record only a possible secondary effect, and they are extremely limited, largely land-based, and confined to westernized locations.

Also unfortunately, the IPCC’s promotion of Mr. Mann’s hockey-stick had a negative impact on the use of proxy data to understand climate history. The hockey-stick had several deficiencies, including improper mathematical manipulation, failure to properly calibrate instrument data with proxy data, and improper elimination of data that did not support the hypothesis (“cherry-picking”). In the West, even well-conducted proxy studies are now under a cloud.

Many western scientific societies have succumbed to political pressure from the IPCC and others, but fortunately not all. The Chinese Academy of Sciences is to be congratulated for publishing a study of proxy data covering the past 2,000 years.

The proxy data come from tree rings, lake sediments, ice cores, stalagmites, corals, and historical documents and show four distinct warm periods, or epochs, over the past 2,000 years. They show significant warming and cooling and changes in precipitation. There appear to be three distinct multi-year cycles. Generally, warm periods are associated with prosperous times. The current warm period is comparable to those of 981 to 1100 AD and 1201 to 1270 AD. The current warm period is associated with the El Niño–Southern Oscillation (ENSO) and the Atlantic Multidecadal Oscillation, while cold periods were associated with sunspot minima. The difference between warm and cold periods is about 1.3°C (2.3°F). See links under Challenging the Orthodoxy.

***************************

New Technique? Using artificial neural networks (ANNs), a form of machine learning (big data and artificial intelligence), Australians Jennifer Marohasy and John Abbot deconstructed 2,000 years of proxy data to reconstruct what temperatures may have been in the absence of human CO2 emissions. The proxy data they use include tree rings and coral cores. They state that the proxy records show cycles of warming and cooling within a band of 2°C.

Interestingly, they date the Medieval Warm Period “from AD 986 when the Vikings settled southern Greenland, until 1234 when a particularly harsh winter took out the last of the olive trees growing in Germany” and the “end of the Little Ice Age as 1826, when Upernavik in northwest Greenland was again inhabitable – after a period of 592 years.”

Inhabited by Inuit, Upernavik is the northernmost town in Greenland with a population over 1,000 and the northernmost point in Greenland where Norse runic characters have been found, on a stone. Marohasy states: “the modern inhabitation of Upernavik also corresponds with the beginning of the industrial age. For example, it was on 15 September 1830 that the first coal-fired train arrived in Liverpool from Manchester: which some claim as the beginning of the modern era of fast, long-distant, fossil-fuelled (sic) fired transport for the masses…So, the end of the Little Ice Age corresponds with the beginning of industrialization.” [local spelling]

They show a graph of the match between the ANN projections and the proxy temperatures from 1880 to 2000. Based on the analysis, they conclude that the influence of industrialization (CO2 emissions) has been on the order of 0.2°C, not the approximately 1°C claimed by the IPCC. It remains to be seen whether this technique holds up to independent analysis. See links under Challenging the Orthodoxy.
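For readers curious what such an approach looks like in practice, here is a minimal, hypothetical sketch: a small neural network is trained only on proxy values before a cutoff year and then run forward, so its projection reflects only the natural cycles it has learned. The synthetic series, window length, cutoff year, and network size are all illustrative assumptions; this is not Marohasy and Abbot’s code or data.

    # Illustrative sketch only: an autoregressive neural network trained on a
    # synthetic "proxy" temperature series, then projected forward past a cutoff
    # year to suggest what natural variability alone might look like.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    years = np.arange(2000)                                  # hypothetical 2,000-year record
    rng = np.random.default_rng(0)
    proxy = (0.7 * np.sin(2 * np.pi * years / 950)           # two slow cycles plus noise,
             + 0.3 * np.sin(2 * np.pi * years / 210)         # staying within a ~2 C band
             + 0.1 * rng.normal(size=years.size))

    window, cutoff = 50, 1880                                # train only on pre-1880 data
    X = np.array([proxy[i - window:i] for i in range(window, cutoff)])
    y = proxy[window:cutoff]
    model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0).fit(X, y)

    # Project forward from the cutoff using only the network's learned natural cycles.
    history = list(proxy[cutoff - window:cutoff])
    projection = []
    for _ in range(cutoff, years.size):
        nxt = model.predict(np.array(history[-window:]).reshape(1, -1))[0]
        projection.append(nxt)
        history.append(nxt)

    # The gap between observed post-1880 values and this projection is the quantity
    # the authors attribute to industrial-era (CO2) influence.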

***************************

Hurricane Harvey: The record-breaking period of almost 12 years without a major hurricane, Category 3 or above, hitting the US is over. Hurricane Harvey made landfall on the Texas coast, between Port Aransas and Port O’Connor (east of Corpus Christi), on Friday night. Harvey was a strong, slow-moving storm, Category 4 at landfall, and the National Weather Service predicted a storm surge of 9 to 13 feet (2.7 to 4 meters) and heavy rainfall of 15 to 30 inches (38 to 76 cm), with up to 40 inches (102 cm) in some locations.

Fittingly, Roger Pielke Jr. cautioned against drawing long-term conclusions from short-term weather trends when he wrote:

“The world has had a run of good luck when it comes to weather disasters. That will inevitably come to an end. Understanding loss potential in the context of inexorable global development and long-term climate patterns is hard enough. It is made even more difficult with the politicized overlay that often accompanies the climate issue.”

On August 9, the National Weather Service upped its forecast for the season to 14 to 19 named storms; 5 to 9 hurricanes, with sustained winds of at least 74 mph (33 m/s; 64 knots; 119 km/h) (Category 1 or higher); and 2 to 5 major hurricanes, with sustained winds of at least 111 mph (96 knots; 178 km/h) (Category 3 or higher).
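The wind-speed thresholds quoted above are straightforward unit conversions from miles per hour; a quick sketch of the arithmetic (conversion factors only, nothing forecast-specific):

    # Unit conversions for the sustained-wind thresholds quoted above.
    MPH_TO_KNOTS = 0.868976
    MPH_TO_KMH = 1.609344
    MPH_TO_MS = 0.44704

    for label, mph in [("Category 1 (hurricane)", 74), ("Category 3 (major hurricane)", 111)]:
        print(f"{label}: {mph} mph = {mph * MPH_TO_KNOTS:.0f} knots = "
              f"{mph * MPH_TO_KMH:.0f} km/h = {mph * MPH_TO_MS:.0f} m/s")
    # 74 mph  -> ~64 knots, ~119 km/h, ~33 m/s
    # 111 mph -> ~96 knots, ~179 km/h, ~50 m/s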

One of the reasons for the revised forecast of more hurricanes is the absence of an El Niño, which inhibits the formation of hurricanes. The contrast between 2016 and 2017 is ironic. The two hottest years in the atmospheric record are 1998 and 2016, both strong El Niño years. The EPA and other government groups claim that CO2-caused warming endangers human health and welfare. Yet El Niños, which cause increases in global temperatures, occur with global wind patterns that inhibit the formation of hurricanes, which are truly destructive. See links under Seeking a Common Ground and Changing Weather.

***************************

Changing Sea Levels: With Hurricane Harvey, no doubt we will see many more articles on the dangers of sea level rise, including government reports from NOAA and NASA claiming that short-term trends are indicative of long-term trends. As stated in the May 13 TWTW, the recently deployed GRACE satellites may be calibrated incorrectly, attributing sea level rise to presumed melting in West Antarctica that may be more appropriately attributed to the melting of the great ice sheets that once covered much of the Northern Hemisphere.

Three recent studies cast further doubt on forecasts of a significant increase in sea level rise: two for the Atlantic coast and one for California. The Virginia Beach – Norfolk – Newport News area of Virginia may be one of the most vulnerable areas in the US to sea level rise. The Metropolitan Statistical Area (MSA) has a population of 1.7 million and has a large military presence, a large ice-free harbor, shipyards, coal piers, miles of waterfront property and beaches, and significant industries.

Frequently, warnings of CO2-caused climate change resulting in increased sea level rise appear in the local papers, prompting Roger Bezdek to study the problem. He states:

 

“At the Sewells Point tidal station in Norfolk, Virginia, rising sea levels have been recorded since 1927: Sea level at Sewells Point rose at an average rate of 4.4 mm/yr. from 1927 to 2006, with a 95 percent confidence interval of ±0.27 mm/yr.”

“It is important to get the cause of local sea level rise correct. For governments to identify an incorrect cause becomes a colossal waste of time and money. Far too often “climate change” is thrown about as the cause, but is meaningless and wasteful.”

The measured increase of 17 inches (44 cm) per century is considerably greater than the generally accepted rise of 7 to 8 inches (18 to 20 cm) per century worldwide. Three local conditions may contribute to this increase: groundwater extraction resulting in aquifer-system compaction, geological conditions from a past meteor impact, and tectonic effects from the past ice age. Bezdek considers the latter two insignificant, but considers the former a significant problem that has been known for over 40 years.

“The two areas where subsidence rates were the most rapid roughly coincide with groundwater pumping centers at Franklin and West Point. Measurements of land subsidence are currently made at Continuously Operating Reference Stations (CORS) in the region. The National Geodetic Survey has computed velocities for three of these stations between 2006 and 2011 and found an average subsidence rate of 3.1 mm/yr (12 inches per century).”

In Bezdek’s view, the primary problem of groundwater extraction and aquifer subsidence is solvable, as it was solved in the Houston-Galveston, Texas area.
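A quick back-of-the-envelope conversion shows how the quoted rates compare; the subtraction of subsidence from the tide-gauge trend at the end is an illustrative calculation, not a figure from Bezdek’s study.

    # Convert the quoted rates (mm/yr) to inches per century and estimate how much
    # of the Sewells Point trend local subsidence alone could account for.
    MM_PER_INCH = 25.4

    def mm_per_yr_to_inches_per_century(rate_mm_per_yr):
        return rate_mm_per_yr * 100 / MM_PER_INCH

    tide_gauge_rate = 4.4   # mm/yr at Sewells Point, 1927-2006 (quoted above)
    subsidence_rate = 3.1   # mm/yr average at regional CORS stations (quoted above)

    print(mm_per_yr_to_inches_per_century(tide_gauge_rate))   # ~17 inches per century
    print(mm_per_yr_to_inches_per_century(subsidence_rate))   # ~12 inches per century
    # Illustrative residual if subsidence is subtracted from the tide-gauge trend:
    print(mm_per_yr_to_inches_per_century(tide_gauge_rate - subsidence_rate))  # ~5 inches per century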

Another study, by Arnoldo Valle-Levinson et al., of the Miami area finds:

“Tide gauge records reveal comparable short-lived, rapid SLR [Sea Level Rise] accelerations (hot spots) that have occurred repeatedly over ~1500 km stretches of the coastline during the past 95 years, with variable latitudinal position. Our analysis indicates that the cumulative (time-integrated) effects of the North Atlantic Oscillation determine the latitudinal position of these SLR hot spots, while a cumulative El Niño index is associated with their timing. The superposition of these two ocean-atmospheric processes accounts for 87% of the variance in the spatiotemporal pattern of intradecadal sea level oscillations.”

Wind variability, not CO2, plays an important role in relative sea level rise.
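The 87% figure is the share of variance explained when the two cumulative indices are used together as predictors. Here is a minimal sketch of how such a number is computed, using synthetic stand-ins for the cumulative NAO and El Niño indices rather than the study’s data.

    # Minimal sketch of "variance explained" by two predictors, using synthetic
    # stand-ins for the cumulative NAO and El Nino indices (not the study's data).
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    n = 95                                    # e.g., 95 years of tide gauge records
    cum_nao = rng.normal(size=n).cumsum()     # hypothetical cumulative NAO index
    cum_enso = rng.normal(size=n).cumsum()    # hypothetical cumulative El Nino index
    sea_level = 0.6 * cum_nao + 0.4 * cum_enso + rng.normal(scale=0.5, size=n)

    X = np.column_stack([cum_nao, cum_enso])
    r_squared = LinearRegression().fit(X, sea_level).score(X, sea_level)
    print(f"variance explained (R^2): {r_squared:.2f}")  # high here by construction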

A study by Albert Parker and Clifford Ollier finds:

“that the sea level rise estimates by a local panel for California as well as the IPCC for the entire world are up to one order of magnitude larger than what is extrapolated from present sea level rise rates and accelerations based on tide gauge data sets.

“As the evidence from the measurements does not support the IPCC expectations or the even more alarming predictions by the local California panel, these claims and the subsequent analyses are too speculative and not suitable for rigorous use in planning or policy making.”
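The kind of extrapolation Parker and Ollier describe amounts to fitting a rate and an acceleration to a tide-gauge record and carrying the fit forward. Here is a hypothetical sketch with an invented record, not their data or code.

    # Fit a quadratic (level + rate + acceleration) to a synthetic tide-gauge record
    # and extrapolate it to 2100. The record below is invented for illustration.
    import numpy as np

    years = np.arange(1920, 2018)
    t = years - years[0]
    rng = np.random.default_rng(2)
    gauge_mm = 2.0 * t + rng.normal(scale=15, size=t.size)   # ~2 mm/yr trend plus noise

    # np.polyfit returns [a, b, c] for a*t^2 + b*t + c; 2*a is the acceleration.
    a, b, c = np.polyfit(t, gauge_mm, deg=2)
    t_end, t_2100 = t[-1], 2100 - years[0]
    rate_now = 2 * a * t_end + b                             # mm/yr at the end of the record
    rise_to_2100_mm = (a * t_2100**2 + b * t_2100 + c) - (a * t_end**2 + b * t_end + c)
    print(f"fitted rate ~{rate_now:.1f} mm/yr, acceleration ~{2 * a:.3f} mm/yr^2, "
          f"extrapolated rise to 2100 ~{rise_to_2100_mm / 10:.0f} cm")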

As stated in the NIPCC reports, to understand local sea level rise, one must examine local conditions, not regional or global models of what is speculated to happen. See links under Challenging the Orthodoxy – NIPCC and Changing Seas.

***************************

Why Have US CO2 Emissions Fallen? Writing in Energy Matters, Roger Andrews follows up on a prior paper by him and another by Euan Mearns, trying to develop a better estimate of this important matter. Many analysts have offered simple answers to this question based on a single factor, such as fracking, renewables, or the recession. These answers do not try to assess the relative importance of multiple factors.

Using US Energy Information Administration (EIA) data, Andrews finds that between 2007 and 2015 total annual US CO2 emissions decreased by 740 million tons (12%). Drilling deep into the EIA data, he finds “that 35% of this decrease was caused by natural gas replacing coal in electricity generation, 30% by lower fuel consumption in the transportation sector, 28% by renewables replacing coal in electricity generation and 7% by other factors.” These estimates do not include the impact of the recession and, as Andrews bluntly admits, the estimates are speculative and cannot be easily confirmed.

Using his conversion factors of CO2 emitted per TWh by generation type, one can calculate that, in general, coal produces 2.55 times more CO2 per TWh generated than natural gas. [A terawatt-hour (TWh) is a measure of electrical energy equal to 10 to the 12th power watt-hours.]
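To illustrate the sort of calculation involved, here is a sketch using round emission factors typical of coal and combined-cycle gas generation; the numbers are assumptions for illustration, not Andrews’ exact conversion factors, but they give a ratio close to the 2.55 quoted above.

    # Illustrative CO2-per-TWh comparison. The emission factors are assumed round
    # numbers, not Andrews' conversion factors.
    COAL_MT_CO2_PER_TWH = 1.0    # ~1.0 million tons CO2 per TWh of coal generation (assumed)
    GAS_MT_CO2_PER_TWH = 0.4     # ~0.4 million tons CO2 per TWh of gas generation (assumed)

    print(COAL_MT_CO2_PER_TWH / GAS_MT_CO2_PER_TWH)   # ~2.5, close to the 2.55 quoted above

    # Emissions avoided if a given amount of coal generation is displaced by gas:
    displaced_twh = 500                               # hypothetical displaced generation
    avoided_mt = displaced_twh * (COAL_MT_CO2_PER_TWH - GAS_MT_CO2_PER_TWH)
    print(avoided_mt)                                 # million tons of CO2 avoided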

Later, writing in the comments section, Andrews states:

“…the underlying purpose of this post was to obtain an estimate of how much of the recent reduction in US emissions could be attributed to renewables as opposed to ‘market’ forces. According to the results 28% can be attributed to renewables if we ignore the impacts of the 2008-9 recession, and arguably as little as 12% if we don’t. Either way this doesn’t represent much bang for the ~200 billion bucks the US has so far spent on developing renewable energy.” [Andrews lives in Mexico.]

See links under Energy Issues – US.

***************************

Greenpeace and RICO: Prior to the last election, certain politicians suggested using the US Racketeer Influenced and Corrupt Organizations Act (RICO) against individuals whose views they do not like and whom they call “climate deniers” and “anti-science.” The environmental industry did not make a loud outcry against such action.

Now the builders of the Dakota Access Pipeline have sued Greenpeace for $300 million in damages, citing RICO. Under RICO, the damages award would triple. Greenpeace US attorney Tom Wetterer said that the suit was “not designed to seek justice, but to silence free speech through expensive, time-consuming litigation. This has now become a pattern of harassment by corporate bullies.”

The same can be said about politicians who threatened to use RICO. Nonetheless, the issue of damage to pipeline property remains open, and government officials refused to maintain order during the protests. What recourse do private citizens and companies have in such circumstances? See links under Litigation Issues.

***************************

Word Counting and Evidence: Geoffrey Supran and Naomi Oreskes, at the Department of the History of Science, Harvard University, performed a textual content analysis (word counting) of Exxon documents. In the abstract of their paper they state:

“This paper assesses whether ExxonMobil Corporation has in the past misled the general public about climate change. We present an empirical document-by-document textual content analysis and comparison of 187 climate change communications from ExxonMobil…” [Boldface added.]

Apparently, at Harvard word counting is now considered empirical evidence in science. SEPP is still waiting for the hard, empirical evidence that CO2 is the primary cause of global warming, as the IPCC and many government officials and academics have claimed. Content analysis does not suffice, especially from the IPCC.

But what can one expect from a Harvard historian who so poorly represented the writings of Paul Samuelson, perhaps the most outstanding graduate of Harvard in economics and the first US Nobel laureate in economics? For years, Samuelson maintained that the economy of the Soviet Union was comparable to that of the US because its military was comparable. In attacking others, Oreskes claimed that it was common knowledge that the Soviet military was weak and could not be sustained because its economy was weak.

This could be a very instructive learning moment – don’t trust authorities in one subject if they assert claims, with no hard evidence, in another. See links under Defending the Orthodoxy.

***************************

Number of the Week: One million atomic bombs. Roy Spencer calculated that the energy released by a hurricane such as Harvey amounts to more than 1,000,000 Hiroshima-sized atomic bombs exploding per day. See link under Changing Weather.
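The comparison rests on the latent heat released as a hurricane’s rainfall condenses. A rough order-of-magnitude sketch with illustrative inputs (not Spencer’s exact numbers) shows how a figure of that size arises.

    # Order-of-magnitude sketch: latent heat released by one day of hurricane
    # rainfall, expressed in Hiroshima-bomb equivalents. All inputs are illustrative.
    import math

    rain_m_per_day = 0.05            # ~2 inches of rain per day, storm-wide average (assumed)
    rain_field_radius_m = 400_000    # ~400 km rain-field radius (assumed)
    latent_heat_j_per_kg = 2.5e6     # latent heat of condensation of water vapor
    hiroshima_j = 6.3e13             # ~15 kilotons of TNT

    area_m2 = math.pi * rain_field_radius_m ** 2
    water_kg = rain_m_per_day * area_m2 * 1000.0     # 1 m^3 of water is ~1000 kg
    energy_j_per_day = water_kg * latent_heat_j_per_kg

    print(energy_j_per_day / hiroshima_j)            # on the order of 1,000,000 per day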

***************************

ARTICLES:

1. Cuomo’s Natural Gas Blockade

New York’s Governor is raising energy costs for millions of Americans.

Editorial, WSJ, Aug 23, 2017

https://www.wsj.com/articles/cuomos-natural-gas-blockade-1503529234

SUMMARY: The editorial states:

“The U.S. shale boom has lowered energy prices and created hundreds of thousands of jobs across the country. But those living in upstate New York and New England have been left in the cold by New York Gov. Andrew Cuomo, whose shale gas blockade could instigate an energy crisis in the Northeast.

 

“Progressives once hailed natural gas as a “transition fuel” to renewables like solar and wind, but now they are waging a campaign to “keep it in the ground.” New York is ground zero. First, Mr. Cuomo banned hydraulic fracturing (i.e., fracking), and now he’s blocking natural gas pumped in other states from reaching Northeast markets.

 

“The Empire State’s southern tier overlays the rich Marcellus and Utica Shale formations, among the most productive drilling regions in the country. Shale fracking has been an economic boon for Appalachia—and could have lifted upstate New York. Since 2010 natural gas production has surged 520% in West Virginia, 920% in Pennsylvania and 1880% in Ohio.

 

“Mr. Cuomo’s predecessor David Paterson imposed a moratorium on fracking in 2010. After winning re-election in 2014, Mr. Cuomo started laying the ground for a White House bid and made the ban permanent. Between 2010 and 2015, New York’s natural gas production plunged by half—which has translated into fewer jobs as well as less royalties for landowners and revenue for local governments.

 

“Last year the Governor compounded the economic damage by blocking the 120-mile Constitution pipeline transporting natural gas from Pennsylvania to upstate New York and New England. Although the Federal Energy Regulatory Commission (FERC) approved the pipeline in 2014, Mr. Cuomo’s Department of Environmental Conservation conducted a separate review and denied a water-quality permit putatively because the developers hadn’t provided sufficient information.

 

“Constitution’s developers challenged the denial in federal court. While the Clean Water Act lets states perform their own environmental reviews, New York appears to have abused its discretion. Last week the Second Circuit Court of Appeals deferred to state regulators while leaving a door open for the pipeline companies to challenge the timeliness of the state review in the D.C. Circuit Court of Appeals.

 

“While Constitution isn’t dead, environmentalists say the appellate-court decision will give New York and other states more latitude to block pipelines, which is no idle threat. Two major pipelines in the Northeast under development will need state approvals, and developers pulled two others in the past two years amid regulatory obstacles in New England.

 

“All of this is ominous since the region desperately needs more natural gas to make up for lost power from the impending shutdown of nuclear and coal plants. New England’s Independent System Operator projects that 14% of the region’s electric generation capacity will be retired within three years and says more pipelines are needed for grid stability.

The editorial discusses the premature closure of the Indian Point nuclear plant and that the governor has no back-up plan. Then it continues:

“Speaking of which, about a quarter of households in New York, 45% in Vermont and 65% in Maine still burn heating oil—which is a third more expensive than natural gas and produces about 30% more carbon emissions per million Btu. Yet many can’t switch due to insufficient natural gas and pipeline infrastructure.

 

“Mr. Cuomo’s natural gas blockade is harming residents and businesses throughout the Northeast while raising carbon emissions that he claims are imperiling the planet. The likely Democratic presidential aspirant may hope to ride this record to the White House, but millions of Americans are already paying a high price for his policies.”

***************************

2. Why Georgia Sticks With Nuclear Power

It’s a hedge against a low-carbon future—and much more.

By Tim Echols, WSJ, Aug 17, 2017

https://www.wsj.com/articles/why-georgia-sticks-with-nuclear-power-1503011785

SUMMARY: Mr. Echols is a member of Georgia’s Public Service Commission. He writes:

“Georgia’s decision to continue building two new nuclear reactors—the only commercial ones now in development in the U.S.—means my state stands alone. Vermont’s Yankee plant went offline in 2014, and Massachusetts’ Pilgrim Station is scheduled to close in 2019. The company behind two half-finished reactors in South Carolina may abandon the project.

 

“Georgia has been down this road before. The first two reactors at the Vogtle Electric Generating Plant near Augusta were completed in 1987 and 1989, in the aftermath of the 1979 Three Mile Island accident. What was supposed to be a $1 billion project turned into an $8 billion one. Still, it was a great deal for ratepayers, delivering low-cost power for decades.

 

“Today, finishing the Vogtle plant’s two new Westinghouse AP1000 reactors is the right call—for their owners, including Southern Co., as well as for Georgia and the U.S.

 

“Diversifying the energy supply makes sense, because no one knows what the future holds. The U.S. could institute a carbon tax, or even regulate frackers out of a job. No matter what happens, nuclear reactors will ensure Georgia’s electric rates stay competitive.

 

“They also will keep the U.S. from forfeiting its nuclear leadership. As other states have decommissioned reactors without replacing them, the world has begun looking to nations like China and Russia. The World Nuclear Association reports China is increasing its nuclear generation capacity 70% by 2021 and will surpass U.S. output by 2030. The only way for America to continue setting international standards for safety and security is to invest in reactors and technology.”

Mr. Echols concludes with his firm commitment to the plant and the need for it.

