Brought to You by www.SEPP.org The Science and Environmental Policy Project
By Ken Haapala, President
Sea Level Rise – What is Measured? Last week’s TWTW had an interview with Richard Lindzen with a statement that was questioned by some readers. The paragraph with the statement is:
“Since 1979 we have been able to measure sea level itself with satellites. However, the accuracy of such measurements depends critically on such factors as the precise shape of the earth. While the satellites show slightly greater rates of sea level rise, the inaccuracy of the measurement renders the difference uncertain. What the proponents of alarm have done is to accept the tide gauge data until 1979, but assume that the satellite data is correct after that date, and that the difference in rates constitutes ‘acceleration.’ They then assume acceleration will continue leading to large sea level rises by the end of this century. It is hard to imagine that such illogical arguments would be tolerated in other fields.” [Boldface added]
The issue is not the accuracy of the satellite sea surface data, but what is being measured, and how it is being interpreted (adjusted) by government entities. Several months ago, Ken Haapala attended a talk to the Physical Society of Washington, given by a spokesman for NASA, with senior NASA-Washington staff attending. The speaker claimed that sea level rise was accelerating, as shown by satellite measurements. Haapala asked how the satellite measurements were calibrated, because tidal gauges do not show an acceleration. The answer was that the satellite data were calibrated using data from three drilling rig platforms in geologically stable areas of the globe.
The answer was unsatisfactory. In Euclidean geometry, three points may define a plane, but they are insufficient to define the surface of an uneven globe, much less the surface of a globe that is undulating.
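The point can be illustrated with a short sketch (the surface and the numbers here are purely illustrative, not actual geodetic data): a plane fit through three calibration points reproduces those points exactly, yet can be far off everywhere else on an undulating surface.

```python
import math

def plane_through(p1, p2, p3):
    """Return coefficients (a, b, c) of the plane z = a*x + b*y + c
    passing through three points."""
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = p1, p2, p3
    # Solve the 2x2 system for a and b via Cramer's rule, then recover c.
    det = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
    a = ((z2 - z1) * (y3 - y1) - (z3 - z1) * (y2 - y1)) / det
    b = ((x2 - x1) * (z3 - z1) - (x3 - x1) * (z2 - z1)) / det
    c = z1 - a * x1 - b * y1
    return a, b, c

# An undulating "surface": height varies sinusoidally with position.
surface = lambda x, y: math.sin(x) + math.cos(y)

# Three calibration points on that surface.
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
a, b, c = plane_through(*[(x, y, surface(x, y)) for x, y in pts])

# The plane matches exactly at the three calibration points...
for x, y in pts:
    assert abs((a * x + b * y + c) - surface(x, y)) < 1e-9

# ...but can be badly wrong elsewhere on the undulating surface.
x, y = 3.0, 3.0
print("plane:", a * x + b * y + c, "surface:", surface(x, y))
```

Three points pin down one flat plane; they say nothing about the undulations between and beyond them, which is the calibration concern raised above.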
The NOAA web site on sea level rise contains a graph, from 1993 to 2017, showing that satellite data may be very precise, but contain information in addition to a general trend in sea levels. For example, in 2010–2012 sea levels fell, as they did in 1998–99. This was most likely due to the end of an El Niño, which results from a shift in wind patterns and causes surface temperatures to decline. El Niño winds cause water to pile up in the western Pacific, which can be seen in Asian tidal gauge data. We also see a decline since 2016, most likely due to the end of the 2016 El Niño. (It is unlikely it was due to US politics.)
The claimed acceleration became an issue around 2011, after the University of Colorado Sea Level Research Group claimed that a correction to sea level measurements was necessary for glacial isostatic adjustment (GIA):
“The correction for glacial isostatic adjustment (GIA) accounts for the fact that the ocean basins are getting slightly larger since the end of the last glacial cycle. GIA is not caused by current glacier melt, but by the rebound of the Earth from the several kilometer-thick ice sheets that covered much of North America and Europe around 20,000 years ago. Mantle material is still moving from under the oceans into previously glaciated regions on land. The effect is that currently some land surfaces are rising, and some ocean bottoms are falling relative to the center of the Earth (the center of the reference frame of the satellite altimeter). Averaged over the global ocean surface, the mean rate of sea level change due to GIA is independently estimated from models at -0.3 mm/yr. (Peltier, 2001, 2002, 2009; Peltier & Luthcke, 2009). The magnitude of this correction is small (smaller than the ±0.4 mm/yr. uncertainty of the estimated GMSL rate), but the GIA uncertainty is at least 50 percent. However, since the ocean basins are getting larger due to GIA, this will reduce by a very small amount the relative sea level rise that is seen along the coasts. To understand the relative sea level effects of global oceanic volume changes (as estimated by the GMSL) at a specific location, issues such as GIA, tectonic uplift, and self-attraction and loading (SAL, e.g., Tamisiea et al., 2010), must also be considered. For more discussion on the GMSL and how it relates to tide gauges, see the GMSL and tide gauge FAQs.”
The issue is: is this adjustment necessary, or even desirable? In its Fifth Assessment Report (AR5, 2013, 2014), the Intergovernmental Panel on Climate Change (IPCC) incorporated it starting in 1993, intensifying the claim that sea level rise was accelerating.
However, the rebound effect from the melting of the great northern ice sheets should have been occurring for at least 10,000 years, and it is unlikely to change anytime in the future. Thus, the sea level rise data taken by coastal tidal gauges, particularly those set in bedrock in tectonically stable areas, should already include any change in the holding capacity of the ocean basins. If so, a further adjustment is redundant and misleading. It is the relationship between the seas and the shore that is important, and that is what stable tidal gauges measure.
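The arithmetic of the adjustment is simple. A rough sketch, using the -0.3 mm/yr GIA figure and the "at least 50 percent" uncertainty from the quoted Colorado text; the 3.0 mm/yr raw altimeter trend is an illustrative number, not a measurement:

```python
# Back-of-the-envelope sketch of the GIA adjustment argument.
altimeter_trend = 3.0    # mm/yr, illustrative raw satellite trend
gia_rate = -0.3          # mm/yr, modeled sea level change due to GIA
gia_uncertainty = 0.5    # fraction: "GIA uncertainty is at least 50 percent"

# Applying the correction (subtracting the GIA rate) raises the reported rate:
adjusted_trend = altimeter_trend - gia_rate   # 3.3 mm/yr

# A tide gauge on stable bedrock measures the sea relative to the shore,
# so any change in basin capacity is already in its record; applying the
# same correction there would count the effect twice.
print(f"raw: {altimeter_trend} mm/yr, GIA-adjusted: {adjusted_trend} mm/yr")
print(f"GIA term: {gia_rate} +/- {abs(gia_rate) * gia_uncertainty} mm/yr")
```

The sketch shows the crux of the complaint: the correction moves the reported global rate up by 0.3 mm/yr, a term comparable to the stated measurement uncertainty itself.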
It is particularly disturbing that entities of NASA and NOAA have incorporated a GIA adjustment into tidal gauge data, which has resulted in outrageous claims of accelerating sea level rise where the hard data show a stable rate of rise. An excellent example is the myth that sea levels in North Carolina will extend 100 miles (160 km) inland from the present coastline in about 100 years, or that South Florida is about to be inundated. It is true that the coastline during the last interglacial was about 100 miles inland in North Carolina and that South Florida was inundated, but the process would take thousands of years, if it comes to pass, not a few decades. The “Tides and Currents” tables show sea level in Wilmington, NC, is rising 0.75 feet per century with a 95% confidence interval of +/- 0.11 feet per century (2.3 mm/yr +/- 0.34 mm/yr). This is based on hard data, not speculation from climate models. Accelerating sea level rise is another example of how poorly climate modelers in NASA and NOAA test their assumptions.
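The Wilmington conversion is easy to check. A minimal sketch of the unit arithmetic:

```python
# Convert the Wilmington, NC trend from feet per century to mm per year:
# 0.75 ft/century (+/- 0.11) should come out near 2.3 (+/- 0.34) mm/yr.
MM_PER_FOOT = 304.8
YEARS_PER_CENTURY = 100.0

def ft_per_century_to_mm_per_yr(rate_ft):
    return rate_ft * MM_PER_FOOT / YEARS_PER_CENTURY

rate = ft_per_century_to_mm_per_yr(0.75)   # ~2.29 mm/yr
ci = ft_per_century_to_mm_per_yr(0.11)     # ~0.34 mm/yr
print(f"{rate:.2f} mm/yr +/- {ci:.2f} mm/yr")
```

The figures quoted above are consistent: 0.75 ft/century works out to about 2.29 mm/yr, rounding to the 2.3 mm/yr stated.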
On her web site, Climate Etc., Judith Curry is beginning an open discussion on the issue of sea level rise. It will be interesting to read how that develops. See links under Changing Seas.
Quote of the Week: “I tell you this story to put in perspective the value of being a Republic and having the protections of the US Constitution. Fortunately, thanks to the internet I can tell Americans these stories, so they can put things in perspective and count and defend their blessings.” Tim Ball, Canadian, See Litigation Issues.
Case Dismissed: The public nuisance lawsuits by the cities of Oakland and San Francisco have been dismissed by the Federal judge overseeing the case. Federal District Judge William Alsup decided that the Executive and Legislative branches of government were better suited to addressing the issues involved with greenhouse gas emissions than the Judicial branch. No doubt, the decision disappointed those who wished to have certain cities involved in setting climate policy and to punish oil companies for products that greatly enhance prosperity and economic development.
The decision no doubt also disappoints those who wish to present the scientific case that humans are not the principal cause of global warming. The oil companies side-stepped the issue of the validity of the IPCC science, including claims of increasing sea level rise. However, the decision demonstrates the futility of seeking redress through the Judicial branch, particularly as long as EPA’s unsubstantiated endangerment finding, that carbon dioxide and other greenhouse gases endanger public health and welfare, stands.
According to the Bloomberg news service, the Oakland and San Francisco city attorneys are weighing a possible appeal. Oakland city attorney Barbara Parker stated:
“’Our lawsuit presents valid claims and these defendants must be held accountable for misleading the American people about the catastrophic risks to human beings and all forms of life on this planet caused by fossil fuel-driven global warming and sea level rise,’ she said in an email.”
Oakland claimed SEPP received moneys from Exxon to mislead the public on global warming but presented no evidence. Apparently, the attorney is incapable of checking IRS Form 990 and relies on rumor and innuendo instead. See links under Litigation Issues.
Law of the Air? On his web site, Bernie Lewin continues his history of several important episodes in the formation of the IPCC and the fear of carbon dioxide. Lewin discusses the Changing Atmosphere Conference Statement, issued on June 30, 1988, following a three-day conference in Toronto. The claim was that:
“Humanity is conducting an unintended, uncontrolled, globally pervasive experiment whose ultimate consequences could be second only to a global nuclear war.”
Other than the fact that carbon dioxide emissions were increasing, there was no solid evidence supporting the drastic claim that carbon dioxide is a major cause of increasing temperatures. Afterwards, the method for calculating comprehensive temperatures of the atmosphere was published, criticized, and corrected. These data are a bulwark against false claims of unprecedented global warming.
Lewin states that: “A ‘law of the air’ was envisaged on the model of a ‘law of the sea’, but progress towards agreement on that treaty remained bogged down in negotiations over the funding of ‘technological transfer’ to poorer countries.”
Amusingly, the law of the sea treaty stemmed from a CIA cover story featuring Howard Hughes and the Glomar Explorer, built to recover the sunken Soviet submarine K-129, lost in March 1968 in very deep water, at a depth of 16,500 feet (5,029 meters), northwest of Hawaii. The cover story included details on how the Glomar Explorer would gather precious metals on the ocean floor, which, it was claimed, lay in clumps waiting for the picking. Many countries believed the story and demanded a share of the bounty through the UN, even though some were landlocked. Thus, one can argue that the IPCC and the Paris agreement stem from a CIA cover story. See links under Challenging the Orthodoxy.
Past Fears: SEPP Chairman emeritus Fred Singer has a brief essay in the American Thinker reviewing the fear of a nuclear winter, and an associated fear of the burning oil wells in Kuwait leading to a cold period. See link under Challenging the Orthodoxy.
Tidal Power: The £1.3 billion alternative electricity scheme based on tidal power, to be located in a tidal lagoon at Swansea Bay, Wales, has been rejected by the UK government. Writing in Energy Matters, Roger Andrews covers the main deficiencies of the project, including its variable output of four power spikes a day. The differences in power output between spring tides and neap tides are considerable. (Spring tides occur when the Sun, Moon, and Earth are aligned; neap tides occur when the Sun and Moon are at right angles as seen from the Earth.)
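A minimal sketch of why spring and neap output differ so much, superposing the two dominant semi-diurnal tidal constituents (the amplitudes below are made up for illustration, not Swansea Bay values):

```python
import math

# Tidal height is dominated by two constituents: lunar M2 (period 12.42 h)
# and solar S2 (12.00 h). Illustrative amplitudes in metres:
M2_AMP, M2_PERIOD = 3.0, 12.42
S2_AMP, S2_PERIOD = 1.0, 12.00

def height(t_hours):
    """Sea surface height (m) as the sum of the two constituents."""
    return (M2_AMP * math.cos(2 * math.pi * t_hours / M2_PERIOD)
            + S2_AMP * math.cos(2 * math.pi * t_hours / S2_PERIOD))

# When the constituents are in phase (springs) the tidal range is twice the
# sum of the amplitudes; out of phase (neaps), twice the difference.
spring_range = 2 * (M2_AMP + S2_AMP)   # 8 m
neap_range = 2 * (M2_AMP - S2_AMP)     # 4 m

# Energy captured per tide in a lagoon scales roughly with range squared,
# so springs deliver about 4x the energy of neaps in this sketch.
print("spring/neap energy ratio:", (spring_range / neap_range) ** 2)
```

Two high tides a day, with generation on both the flood and the ebb, gives the four daily power spikes noted above; and since the energy per tide scales roughly with the square of the range, springs can deliver several times the output of neaps.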
Andrews exposes deficiencies in the government’s thinking:
“But there’s a problem with the government’s approach. The three technologies [nuclear, wind, and tidal] are being compared based on the lifetime cost of electricity regardless of when it’s generated, and the generation patterns are completely different.”
“But no one worried about the prohibitive amounts of energy storage needed to convert intermittent renewables into dispatchable power in those days. And they still don’t. Searching for ‘storage’ in the government’s Tidal Lagoon Programme Factsheet yields zero hits.”
Unfortunately, the costs of energy storage are often overlooked by those promoting and approving intermittent power. See links under: Alternative, Green (“Clean”) Energy — Other
Number of the Week: $3 per MMBTU? Over a year ago, TWTW estimated the total transportation cost of liquefied natural gas (LNG) from a pipeline on the Gulf Coast to a pipeline in Japan at about $6 per MMBTU (one million British Thermal Units (BTU)). These costs include liquefaction, transport, and re-gasification. The opening of the widened Panama Canal and technological improvements have lowered these costs. Based on reports from energy companies involved in LNG, it appears that current costs are less than one-half the previous estimate, or under $3 per MMBTU. See links under Energy Issues – Non-US.
1. World’s Biggest Battery Proposed in California
PG&E is asking for approval to build storage projects, including a battery behemoth to be owned by Vistra Energy Corp.
By Erin Ailworth, WSJ, June 29, 2018
SUMMARY: “A California utility is seeking permission to have a company build the world’s largest battery, joining a growing list of power companies investing in storing electricity.
“Pacific Gas & Electric Co., part of PG&E Corp. detailed plans for four storage projects totaling nearly 570 megawatts in a Friday filing to regulators, including a 300-megawatt battery installation at a natural-gas-fired power plant owned by Vistra Energy Corp.
“That battery, the largest installation of its kind, would be owned by Vistra, and it would be capable of running for four hours while putting out the same amount of power as a small natural gas plant.
“In addition, the utility is seeking approval for a 182.5 megawatt system built by Tesla Inc. that would be owned by PG&E, as well as two smaller projects. The California Public Utilities Commission must approve the storage projects.
“Cost estimates for all of the projects were not immediately available.
“’Recent decreases in battery prices are enabling energy storage to become a competitive alternative to traditional solutions,’ said Roy Kuga, vice president of grid integration and innovation at PG&E.
“A 100-megawatt battery installation built by Tesla in Australia is currently the world’s largest in operation.
After a discussion of other projects, the article continues:
“The storage projects proposed by PG&E come several months after the California Public Utilities Commission directed Pacific Gas & Electric Co., the state’s largest investor-owned utility, to solicit bids for renewable energy resources, including battery storage, to replace three costly fossil fuel plants.
“That directive has helped put pressure on natural-gas-fired power plants in California, which are finding it harder to compete as the state looks to satisfy its aggressive clean energy goals and as the cost of renewable resources continues to decline.
“PG&E says it expects the 182.5-megawatt system to come online by the end of 2020, pending the commission’s approval. The company did not estimate the full cost of the project but predicts it will look to request about $41.2 million from ratepayers for the project in the first year it is operational.
“The 300-megawatt battery project would be built at Vistra’s Moss Landing power plant, which has produced electricity since 1950. The battery would hook up to the grid via existing connections at the site, where two older generating units were retired at the end of 2016.
“Curt Morgan, chief executive of Vistra, said he sees more battery system investments in his company’s future, especially wherever those installations can take advantage of power infrastructure already in place.
“’I don’t think it’s any secret that California over time is trying to move away from fossil fuel power plants, so batteries are kind of a natural to provide power at the peak periods,’ he said.”
2. The Texas Well That Started the Fracking Revolution
Two decades ago, an engineer tried a new way to get gas out of the ground. Energy markets and global politics would never be the same.
By Russell Gold, WSJ, June 29, 2018
SUMMARY: “DISH, Texas—Twenty years ago this month, a well was drilled here that changed the world.
“Nothing at the time suggested the unassuming well in this rural town north of Fort Worth would hobble OPEC, the powerful oil cartel that had governed prices of the world’s most important commodity for more than a generation. Or that it would help turn the U.S. into a global energy exporter, or shuffle the geopolitical deck.
“But it did all of that—and more. The well used hydraulic fracturing to crack the incredibly tight shale rocks below. It fired the first shot in the fracking revolution—a blast soon felt in Riyadh, Tehran and Moscow.
“‘I had no idea it would cause so much change. I was just trying to keep my job,’ said Nick Steinsberger on a recent visit to the well pad. He was the engineer who obtained permission to try a new approach to completing the well that had been drilled a mile and a half deep into a thick grey wedge of rock known as the Barnett Shale.
“Mr. Steinsberger, now 54, called the experiment ‘my slick-water frack.’ It was the first commercially successful use of sand, water and chemicals, pumped into the shale under high pressure, to break open the rock and unleash the natural gas trapped inside. It was the beginning of modern fracking.
“‘It was a good well, cost $600,000 or $700,000,’ Mr. Steinsberger said, walking over the pad to the chain-link fence that surrounds the well. A sign identifies it as the S. H Griffin Estate 4.
“Today, most wells drilled in the U.S. use some variation of Mr. Steinsberger’s fracking technique. It has unleashed an unimaginable wealth of natural gas, gas liquids and crude oil, turning the U.S. from an energy pauper into a muscular exporter. It also started an often acrimonious environmental debate about the potential impacts and trade-offs of fracking.
“‘It is one of the most extraordinarily important, disruptive, technologically driven changes in the history of energy,’ said Ed Morse, global head of commodity research at Citigroup. ‘It was revolutionary for the U.S. economy and it was revolutionary geopolitically.’
“Mr. Steinsberger’s modest experiment demonstrated that the oil and gas industry had the tools to fracture the rocks where fossil fuels were slowly baked over the millennia. A huge trove of natural gas was accessible at an economical cost.
“It was such a novel idea that it spread slowly at first, as doubters couldn’t believe that anyone could successfully tap the source rocks. After a few years, more companies began to copy the wells drilled by Mr. Steinsberger’s employer, Mitchell Energy, the firm founded by the late George P. Mitchell.
“It started in the Barnett Shale. Then other gas-bearing shales were discovered. The Marcellus Shale in Appalachia turned out to be larger and more fecund than the Barnett.
“In 2008, more than a decade after Mr. Steinsberger’s well, the industry made another quantum leap: Not only could fracking liberate small natural gas molecules from rocks, it also worked on the longer hydrocarbon chains that make up crude oil. Companies such as EOG Resources Inc. began to drill and frack shales bearing crude oil and natural gas liquids in North Dakota and Texas. The technique has since spread to other countries such as Argentina.
“The proliferation of oil and gas production transformed the U.S. energy landscape. A looming dearth of natural gas had led companies to build import terminals. Now there is so much gas the U.S. exports the fuel around the world.
“The low-cost fuel has become the leading source of power generation in the U.S. Its rise has reshaped electricity markets, leading to the closure of more than 200 coal plants, as well as a number of nuclear plants. The Trump Administration’s current proposal to subsidize coal and nuclear plants is an indirect result of fracking.
“The impact on oil markets might be, if anything, more significant. U.S. oil production had fallen persistently for years, dropping below five million barrels a day. And then: fracking. This year, it hit a new all-time high, reaching 10.9 million barrels a day in June. It is now the world’s largest producer of crude and other valuable petroleum liquids, ahead of Russia and Saudi Arabia.
“The surge has weakened the Organization of the Petroleum Exporting Countries. Facing a growing supply of oil from the U.S., the group stumbled and fought over what to do. It unsuccessfully tried to crush frackers by ramping up production in 2014 to drive down the price of oil, before making its peace with them. Last week, the cartel’s members coordinated with Russia to produce more barrels to prevent oil prices from rising further. Shale output was outside of their control.
“The U.S. emerged as a newly confident energy powerhouse. It was no longer fearful that an embargo could maim its economy. This attitude was reflected in a more aggressive foreign policy, as shown by its willingness to take a tough negotiating posture with Iran.
“‘The fracking boom was the biggest energy story around the world. But it was also the biggest geopolitical story and the biggest environmental story,’ said Michael Webber, deputy director of the Energy Institute at The University of Texas at Austin.
“The proliferation of natural gas, displacing coal, helped the U.S. lower its overall greenhouse gas emissions by 13.4% in the last decade, while growing its gross domestic product, according to BP PLC’s Statistical Review of World Energy.”
The article discusses some downsides, such as traffic, earth tremors, unsubstantiated claims of water pollution and the splitting of the environmental movement on fracking.
“Meanwhile, fracking continues to evolve. Supersized fracks have become commonplace.
“Fracking uses grains of sand to prop open the newly formed cracks to allow gas or oil to flow out. While Mr. Steinsberger’s well required 229,000 pounds of sand, a large contemporary well might require 30 million pounds of sand. The amount of water needed has increased as well.
“The S. H. Griffin well has continued to produce gas for two decades. Over the years, more than 2.6 billion cubic feet have flowed out, worth some $8 million at today’s prices. A new well with a supersized frack can produce as much in a day as the original could in two months.
“The proliferation of large wells has kept gas below $4 per million British thermal units since December 2016, after topping $10 in 2008. Mr. Steinsberger, who still oversees eight to ten fracks a year, doesn’t see that changing for a long time.
“‘One day, there might be lasers shooting at the rock’ thousands of feet underfoot, he said. ‘I can’t predict that. But I can tell you natural gas prices will be low for the rest of our lives.’”