Weekly Climate and Energy News Roundup #218

The Week That Was: March 12, 2016 – Brought to You by www.SEPP.org

By Ken Haapala, President, Science and Environmental Policy Project

Quest for Precision: One of the characteristics of scientific activity is the quest for precision in describing the physical world. Precision in understanding the error, or uncertainty, of one’s knowledge is an example of this quest. In some of his many essays on the philosophy of modern science, Bertrand Russell, a prolific writer, used the ability to articulate the uncertainty of knowledge as an example of what separates a scientist from an ideologue. The scientist defines the certainty of his findings within empirically established boundaries. For example, a finding may be within plus or minus 5%, using rigorous procedures that are well established. The ideologue is certain, absolutely, without boundaries of error.

Another issue is false precision, that is, presenting numerical data in a manner that implies greater precision than the instrumentation, the procedures used, or current knowledge can support. A common example is combining high-precision data with low-precision data and reporting only the error range of the high-precision data. This practice gives others the illusion of greater understanding and encourages overconfidence in the accuracy of the results. Scientists and engineers have various techniques to correct for false precision.
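
A minimal numerical sketch of the point (the measurements below are hypothetical, not drawn from any dataset discussed here): when a precise and an imprecise measurement are combined, the uncertainty of the result is dominated by the less precise term, so quoting the result with only the high-precision error overstates what is actually known.

    import math

    # Hypothetical measurements: (value, one-sigma error)
    precise = (20.000, 0.005)   # e.g. a laboratory instrument
    imprecise = (15.0, 1.5)     # e.g. a rough field estimate

    total = precise[0] + imprecise[0]

    # Propagated uncertainty of a sum: independent errors add in quadrature
    combined_error = math.sqrt(precise[1]**2 + imprecise[1]**2)

    print(f"Honest result:   {total:.1f} +/- {combined_error:.1f}")    # 35.0 +/- 1.5
    print(f"False precision: {total:.3f} +/- {precise[1]:.3f}")        # 35.000 +/- 0.005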

Writing in American Thinker, physicist Tom Sheahen describes the enormous steps taken over the past 90 years to develop the precise instruments needed to discover and measure gravitational waves, a quest that has gone on for 100 years, since Einstein published the General Theory of Relativity. The detection of these waves shows that space-time deforms and bends, as predicted by the theory of General Relativity. This is one more verification, a very complex and delicate one, of the theory that overthrew the classical mechanics developed by Isaac Newton. Near the end of the 19th century, classical mechanics was considered a certainty.

The resources, including over $600 million, needed to measure gravitational waves were not suddenly thrown together. They came from decades of effort with many intermediate steps supporting the theory. An early step was measuring the bending of light during a solar eclipse in 1919. Other intermediate steps followed. The multi-decade process illustrates another characteristic of modern, empirical science – it is built step-by-step by empirical verification.

The discovery of gravitational waves illustrates one reason why SEPP does not accept the global climate models used by the UN Intergovernmental Panel on Climate Change (IPCC). These models are also relied upon by the US Global Change Research Program (USGCRP) and by the EPA in its endangerment finding that increasing greenhouse gases, primarily carbon dioxide (CO2), endanger human health. The models have not been empirically verified or validated. In general, the models greatly overestimate the warming of the atmosphere, where the greenhouse effect takes place. There is no logical reason to assume they are able to predict future global warming from CO2 emissions in the way General Relativity was used to predict gravitational waves. There is no logical basis for using these models to establish government policy, especially energy policy. See Article # 1.

###################################################

Quote of the Week: “It is not what the man of science believes that distinguishes him, but how and why he believes it. His beliefs are tentative, not dogmatic; they are based on evidence, not on authority or intuition.” ― Bertrand Russell

###################################################

Number of the Week: 16, 45, and 63 times more expensive.

###################################################

Sea Level Rise – Seeking Greater Imprecision? As shown in the first report of the Nongovernmental International Panel on Climate Change (NIPCC), Nature, Not Human Activity, Rules the Climate (2008), the first four assessment reports (ARs) of the IPCC showed a progressive reduction in the maximum estimates of global sea level rise to 2100. In AR-1 (1990) the range was 10 centimeters (cm) to 367 cm (4 to 144 inches). In AR-2 (1995) it was 3 to 124 cm (1 to 49 inches); in AR-3 (2001) it was 11 to 77 cm (4 to 30 inches); in AR-4 (2007) it was 18 to 59 cm (7 to 23 inches). (The draft of AR-4 had it at 14 to 43 cm, or 6 to 17 inches.) AR-4 was current when the EPA Endangerment Finding was announced.

Based on historical data, Fred Singer of NIPCC estimated the rise will be 18 to 20 cm (7 to 8 inches). These estimates are global. The rise and fall of local sea levels depend upon numerous local conditions such as land subsidence and plate tectonics.

It is quite jarring, then, to read a 2009 NOAA report (updated in 2015) stating: “Scientists are very confident that global mean sea level will rise at least 8 inches (0.2 meter) but no more than 6.6 feet (2.0 meters) by 2100.” The upper bound is inflated dramatically, without justification.

One cannot state that the NOAA estimate is wrong, any more than one can state that a claim that sea level rise will be between minus 2 meters and plus 10 meters is wrong. But one can state that some US government entities are misleading the public by inflating their reports. Such reports are being used by federal government representatives to insist that local officials must account for the inflated estimates when making land-use decisions in low-lying areas, which include extensive areas of the southeastern US near the coast.

A separate issue is the claim of an acceleration in sea level rise based on two sets of data: one dataset from historic tide gauges, and a second from satellite measurements. The two sets of measurements are not fully calibrated against each other. To claim an acceleration in sea level rise by jumping from one dataset to the other, without fully explaining the effects of the jump, is misleading. A NOAA claim that the pace of sea level rise has increased since 1990, owing to acceleration and to glacier and ice sheet melting, is also questionable. See links under Challenging the Orthodoxy – NIPCC and Changing Seas.
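
A small numerical sketch (using made-up numbers, not actual tide gauge or satellite records) shows why splicing two uncalibrated records is misleading: a constant offset between the two instruments appears as a spurious acceleration when a curve is fitted to the combined series.

    import numpy as np

    years = np.arange(1900, 2021)
    true_rate = 1.8  # mm/yr, a steady (non-accelerating) rise for illustration
    sea_level = true_rate * (years - 1900)

    # Suppose the second instrument (post-1993) reads 10 mm higher than the first
    # because the two records were never fully cross-calibrated.
    spliced = sea_level + np.where(years >= 1993, 10.0, 0.0)

    # Fit a quadratic; twice the t^2 coefficient is the apparent acceleration.
    coeffs = np.polyfit(years - 1900, spliced, 2)
    print(f"Apparent acceleration: {2 * coeffs[0]:.4f} mm/yr^2 (true value is 0)")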

*******************

Unreliable and Costly: Last week, TWTW discussed an experiment in the Canary Islands to make the electricity on one of the islands independent of fossil fuels and dependent only on renewable generation, a combination of wind power with pumped hydro as back-up. After eight months of operation, the renewable system generated only about 32% of the electricity used, while diesel generators provided the remainder. It appears that, at best, the renewable system will generate a maximum of 50% of the total electricity needed. The major shortcoming is not the wind turbines, but inadequate storage in the upper reservoir that supplies water to generate electricity when the wind fails for prolonged periods. Since the islands are quite arid and depend on desalination plants, nature cannot be relied upon to fill the reservoirs when needed. The estimated costs of the entire system were not available, much less the costs of expanding existing upper reservoirs.
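
A rough back-of-envelope sketch shows why the upper reservoir is the binding constraint. The energy stored is the mass of water times gravity times the height difference between the reservoirs, so covering even a few windless days for a modest island load requires a very large volume of water. The figures below are hypothetical placeholders, not the actual parameters of the Canary Islands project.

    # Hypothetical island parameters, for illustration only
    average_load_mw = 7.0     # average electrical demand
    lull_hours = 72           # prolonged period with little wind
    head_m = 650              # height difference between reservoirs
    efficiency = 0.85         # round-trip losses on discharge
    g = 9.81                  # m/s^2
    rho = 1000                # kg/m^3, density of water

    energy_needed_j = average_load_mw * 1e6 * lull_hours * 3600
    volume_m3 = energy_needed_j / (rho * g * head_m * efficiency)
    print(f"Water required in the upper reservoir: {volume_m3:,.0f} m^3")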

Using data from the US Energy Information Administration (EIA) and EurObserv’ER, Ed Hoskins of the UK makes some rough comparisons of the capital costs needed to generate one Gigawatt (GW) of electricity from onshore wind, offshore wind, and on-grid solar photovoltaic (pv). The variance in estimated capacity factors is striking. The EIA estimate is 36% for onshore wind, 38% for offshore wind, and 25% for on-grid solar pv. The EurObserv’ER measured capacity factor (2014) is 21.8% for combined onshore and offshore wind, and 12.1% for solar pv. The Renewable Energy Foundation’s measured capacity factor for the UK (2002-2015) is 22.4% for onshore wind, 24.9% for offshore wind, and 9.8% for solar pv.
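
Capacity factor is simply the energy actually delivered over a period divided by what the nameplate rating could deliver running flat out. A short sketch of the arithmetic, using illustrative numbers only:

    def capacity_factor(energy_delivered_gwh, nameplate_gw, hours=8760):
        """Fraction of the theoretical maximum annual output actually generated."""
        return energy_delivered_gwh / (nameplate_gw * hours)

    # Illustrative: 1 GW of UK onshore wind delivering about 1,960 GWh in a year
    print(f"{capacity_factor(1960, 1.0):.1%}")   # ~22.4%, the Renewable Energy Foundation figure quoted above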

The differences in the capacity factors for solar pv are not that surprising. In general, Europe is at more northern latitudes than the US, and northern Europe has cloudy weather. Certainly, solar pv there makes far less sense than installing solar pv in the southwestern US. Yet Germany has spent far more on solar pv than either Italy or Spain. All too often, measured capacity factors are ignored by promoters of wind or solar, both in and out of government.

For comparison among alternative generating types, the EIA data give an estimated levelized cost of electricity per megawatt-hour for each type of generation, using 2013 fuel costs. For example, natural gas-fired conventional combined-cycle had a capacity factor of 87%, a levelized capital cost of $14.4 per megawatt-hour, fixed operation and maintenance of $1.7, variable operation and maintenance (including fuel costs) of $57.8, and transmission investment of $1.2, for a system total of $75.2. As illustrated, in 2013 fuel costs were the greatest component.
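
The levelized cost is just the sum of these per-megawatt-hour components; a minimal check of the figures quoted from the EIA data:

    # EIA levelized cost components for conventional combined-cycle gas, $/MWh, 2013 fuel costs
    components = {
        "levelized capital": 14.4,
        "fixed O&M": 1.7,
        "variable O&M incl. fuel": 57.8,
        "transmission investment": 1.2,
    }
    total = sum(components.values())
    print(f"System total: {total:.1f} $/MWh")  # ~75.1, versus the 75.2 quoted after EIA rounding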

Using parity of one euro to one dollar and estimates of 2016 fuel prices, along with actual EurObserv’ER capacity factors, Hoskins makes very rough estimates of the cost of renewables per gigawatt of generation. The approach needs to be refined, but it gives some idea of the enormous expense of solar- and wind-generated electricity as compared with natural gas. Further, the approach does not account for the inherent unreliability of solar and wind and the fact that they cannot be called upon to produce when needed.
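
The general form of such a comparison can be sketched as follows: capital cost per gigawatt of nameplate capacity, divided by the measured capacity factor, gives a cost per gigawatt of electricity actually delivered, which is then compared with the gas-fired figure. The numbers below are placeholders to show the arithmetic, not Hoskins’ actual inputs, and fuel and operating costs are omitted here, so these illustrative ratios do not reproduce his 16, 45, and 63 figures.

    # Placeholder figures for illustration only (not Hoskins' actual inputs)
    capital_cost_per_gw = {      # billions per GW of nameplate capacity
        "gas combined cycle": 1.0,
        "onshore wind": 1.6,
        "offshore wind": 4.5,
        "solar pv": 2.5,
    }
    capacity_factor = {
        "gas combined cycle": 0.87,
        "onshore wind": 0.218,
        "offshore wind": 0.218,  # EurObserv'ER reports onshore and offshore combined
        "solar pv": 0.121,
    }

    def cost_per_delivered_gw(tech):
        # Capital cost scaled up by how little of the nameplate capacity is actually used
        return capital_cost_per_gw[tech] / capacity_factor[tech]

    baseline = cost_per_delivered_gw("gas combined cycle")
    for tech in capital_cost_per_gw:
        print(f"{tech}: {cost_per_delivered_gw(tech) / baseline:.1f}x gas")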

Hoskins concludes: “If the objectives of using Renewables were not confused with ‘saving the planet’ from the output of Man-made CO2, their actual cost in-effectiveness and inherent unreliability would have always ruled them out of any consideration as means of electricity generation for any developed economy.” See Number of the Week and link under Questioning European Green.

*******************

Additions and Corrections: The February 27 TWTW contained the statement: According to historian Bernie Lewin, the hot spot was invented, without empirical support, at the 1995 UN Intergovernmental Panel on Climate Change (IPCC) scientific conference in Madrid by Benjamin Santer and promoted by IPCC Chairman John Houghton. This became a turning point for IPCC. https://enthusiasmscepticismscience.wordpress.com/2015/11/21/remembering-madrid-95-a-meeting-that-changed-the-world-2/#more-1838

It should have stated: According to historian Bernie Lewin, terming the hot spot “the distinct human fingerprint” was invented, without empirical support, at the 1995 UN Intergovernmental Panel on Climate Change (IPCC) scientific conference in Madrid by Benjamin Santer and promoted by IPCC Chairman John Houghton. This became a turning point for the IPCC because it permitted the claim that “the balance of evidence points towards a discernible human influence on global climate.”

https://enthusiasmscepticismscience.wordpress.com/2015/11/21/remembering-madrid-95-a-meeting-that-changed-the-world-2/#more-1838

*******************

Number of the Week: 16, 45, and 63 times more expensive. Although his analysis could use refinement, Ed Hoskins demonstrates that, with today’s natural gas prices and the actual capacity factors measured in Europe, onshore wind, offshore wind, and solar pv are roughly 16, 45, and 63 times more expensive than combined-cycle natural gas, respectively. See link under Questioning European Green.

###################################################

ARTICLES:

1. The Discovery of Gravitational Waves

By Tom Sheahen, American Thinker, Mar 6, 2016

http://www.americanthinker.com/articles/2016/03/the_discovery_of_gravitational_waves.html

[SEPP Comment: Explaining the complexity and extreme delicacy of the instrumentation to detect gravitational waves. See commentary above.]

*******************

2. U.S. Targets Oil, Gas Wells to Cut Methane Emissions

U.S., Canada commit to reducing emissions by 2025

By Amy Harder, WSJ, Mar 10, 2016

http://www.wsj.com/articles/u-s-to-target-existing-oil-gas-wells-in-effort-to-cut-methane-emissions-1457586241

SUMMARY: According to the reporter:

The U.S. and Canada will commit to cut methane emissions from oil and gas by between 40% and 45% below 2012 levels by 2025, a commitment the Obama administration has previously made.

The U.S. Environmental Protection Agency, already working on rules cutting methane emissions from oil and gas wells not yet drilled, will begin devising regulations for existing wells and aims to release in April draft requirements for companies to provide information about equipment, emissions and control technologies from a broad range of oil and gas activities, including production, transmission, processing and storage, EPA Administrator Gina McCarthy said Thursday.

The step, being taken under the Clean Air Act, indicates the agency is preparing to write a regulation that is likely to affect hundreds of thousands of existing wells across the U.S.

The EPA is unlikely to complete a regulation before Mr. Obama leaves office at the end of 2016. Any proposal the EPA issues would likely stay on track if a Democrat wins the White House, while a Republican administration would likely withdraw it. Ms. McCarthy wouldn’t say Thursday whether the EPA would propose a rule before Mr. Obama leaves office.

The EPA says the oil and gas sector’s methane emissions account for almost 30% of all U.S. methane emissions, second to agricultural sources, which account for roughly 36%.

