The Week That Was: August 1, 2020
Quote of the Week: “The right to search for the truth implies also a duty; one must not conceal any part of what one has recognized to be true.” – Albert Einstein. [H/t Michael Dourson]
Number of the Week: 33 to 1
July Summary Part IV; Changing Ocean Chemistry and Sea Levels: Three weeks ago TWTW reviewed Richard Lindzen’s new paper summarizing what we know with reasonable certainty, what we suspect, and what we know is incorrect about climate change, the greenhouse effect, temperature trends, climate modeling, ocean chemistry, and sea level rise. Key parts included:
1) The climate system is never in equilibrium.
2) The core of the system consists of two turbulent fluids interacting with each other and unevenly heated by the sun, which results in transport of heat from the equator towards the poles (meridional) creating ocean cycles that may take 1,000 years to complete.
3) The two most important substances in the greenhouse effect are water vapor and clouds, which are not fully understood and are not stable.
4) A vital component of the atmosphere is water in its liquid, solid, and vapor phases and the changes in phases have immense dynamic consequences.
5) Doubling carbon dioxide (CO2) creates a 2% disturbance to the normal flow of energy into and out of the system, which is similar to the disturbance created by changes in clouds and other natural features.
6) Temperatures in the tropics have been extremely stable. It is the temperature differences between the tropics and polar regions that are extremely important. Calculations such as global average temperature largely ignore this important difference.
Two weeks ago, TWTW used the work of William van Wijngaarden and William Happer (W & H) to summarize what we know with reasonable certainty, what we suspect, and what we know is incorrect about the greenhouse effect. Both gentlemen are experts in Atomic, Molecular, and Optical physics (AMO), which is far from simple physics, but it is necessary to understand how greenhouse gases interfere with (delay) the radiation of energy from the surface into space – that is, to understand the mechanisms by which the earth loses heat every night.
1) There is no general understanding of the greenhouse effect sufficient to develop elegant equations.
2) The optical depth or optical thickness of the atmosphere (its transparency) changes with altitude. The depth is measured in terms of a natural logarithm and, in this instance, relates to the distance a photon of a particular frequency can travel before it is absorbed by an appropriate molecule (one that absorbs and re-emits photons of that frequency).
3) Unlike other natural greenhouse gases, water vapor, the dominant greenhouse gas, is not evenly distributed in the atmosphere. [SEPP Comment: The variability of water vapor during the daytime and the formation of clouds from H2O, etc., combine to make theoretical computations of “climate” dynamics of any value impossible. Because H2O is known to be “all over the map,” the Charney Report recognized a decent calculation was impossible. So, it went down the erroneous path of ignoring H2O, assumed a CO2 value, and then came back later with a “feedback” argument to try to account for H2O. It didn’t work then, it doesn’t work now, and it won’t work in the future.]
4) There is a logarithmic relationship between greenhouse gases and temperature.
5) “Saturation” means that adding more molecules causes little change in Earth’s radiation to space. The very narrow range in which methane (CH4) can absorb and emit photons is already saturated by water vapor (H2O), the dominant greenhouse gas, below the tropopause, where the atmosphere is thick. Thus, adding methane has little effect on temperatures.
6) Their (W & H) calculations show that a doubling of CO2 will increase temperatures by no more than 1.5 °C – an upper bound.
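The logarithmic relationship in point 4 can be illustrated numerically. The sketch below uses the widely cited Myhre et al. (1998) approximation for CO2 radiative forcing, not the W & H line-by-line calculation; the 5.35 W/m² coefficient and the 0.4 K per W/m² sensitivity are assumptions chosen for illustration only.

```python
import math

# Illustrative only: the Myhre et al. (1998) logarithmic approximation for
# CO2 radiative forcing, NOT the van Wijngaarden & Happer calculation.
# The coefficient and sensitivity below are assumed values for this sketch.
def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing (W/m^2) from raising CO2 from c0_ppm to c_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

forcing_2x = co2_forcing(560.0)        # doubling: 5.35 * ln(2) ~ 3.7 W/m^2
sensitivity = 0.4                      # K per (W/m^2), an assumed value
warming_2x = sensitivity * forcing_2x  # ~1.5 K, of the order of the stated bound

print(f"Forcing for 2xCO2: {forcing_2x:.2f} W/m^2")
print(f"Implied warming:   {warming_2x:.2f} K")
```

Note the logarithm's consequence: each successive doubling adds the same forcing, so going from 560 to 1120 ppm adds no more than the first doubling did.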
Last week, TWTW reviewed the problems with models as discussed by established Japanese climate modeler Mototaka Nakamura and as demonstrated in a new paper by Ross McKitrick and John Christy. Previously, Tony Thomas summarized some of the main problems identified by Nakamura:
- Ignorance about large and small-scale ocean dynamics.
- A complete lack of meaningful representations of aerosol changes that generate clouds.
- Lack of understanding of drivers of ice-albedo (reflectivity) feedbacks: “Without a reasonably accurate representation, it is impossible to make any meaningful predictions of climate variations and changes in the middle and high latitudes and thus the entire planet.”
- Inability to deal with water vapor elements.
- Arbitrary “tunings” (fudges) of key parameters that are not understood.
Further, Nakamura rejects the IPCC concept that the influence of humans adding CO2 can be predicted by models. He states:
“I want to point out a simple fact that it is impossible to correctly predict even the sense or direction of the change of a system when the prediction tool lacks and/or grossly distorts important nonlinear processes, feedbacks in particular, that are present in the actual system.” [Boldface added.]
Nakamura further states that two major problems in the models are ocean flows (ocean circulation) and water in the atmosphere. Both problems are stated by Lindzen.
McKitrick and Christy tested the values calculated from 38 new CMIP6 models for the period 1979 to 2014 against datasets from three different types of observations: 1) four different sets of radiosonde (or sonde) data obtained from weather balloons; 2) four different sets of data obtained by microwave sensors onboard polar-orbiting satellites, which measure the intensity of microwave emissions from atmospheric oxygen that are directly proportional to temperature; and 3) four different datasets known as reanalyses, two from Europe, one from Japan, and one from the US (NASA).
The 12 datasets cover 35 years and have been available for at least 5 years. The three types of observational datasets are grouped tightly, both globally and for the tropics. For most of the models, the mean of the satellite observations falls below the lower bound of that model’s 95% confidence interval, indicating that the model cannot estimate atmospheric temperature trends. As Nakamura has written, the global climate models have no predictive value. The UN IPCC and its followers have clearly departed from the scientific method into the world of wild speculation.
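The kind of test McKitrick and Christy describe can be sketched as follows: fit a linear trend to a tropospheric temperature series from a model, form the trend's 95% confidence interval, and ask whether the observed trend falls inside it. The series and the observed trend below are synthetic, invented purely to show the mechanics of the comparison.

```python
import numpy as np

# Hypothetical sketch of a model-vs-observation trend test. The temperature
# series and the "observed" trend are invented for illustration only.
rng = np.random.default_rng(42)
years = np.arange(1979, 2015, dtype=float)          # 1979-2014 inclusive (36 points)
model_temp = 0.028 * (years - 1979) + rng.normal(0.0, 0.1, years.size)

# Ordinary least squares trend and its standard error.
slope, intercept = np.polyfit(years, model_temp, 1)
resid = model_temp - (slope * years + intercept)
dof = years.size - 2
stderr = np.sqrt((resid @ resid) / dof / np.sum((years - years.mean()) ** 2))

t_crit = 2.032                                      # two-sided 95% t value, dof = 34
ci = (slope - t_crit * stderr, slope + t_crit * stderr)

observed_trend = 0.012                              # hypothetical satellite value, K/yr
print(f"Model trend {slope:.3f} K/yr, 95% CI ({ci[0]:.3f}, {ci[1]:.3f})")
print("Observed trend inside CI:", ci[0] <= observed_trend <= ci[1])
```

When the observed trend falls below the interval's lower bound, as the newsletter says happens for most CMIP6 models against the satellite data, the model's trend is statistically inconsistent with observation.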
Changing Ocean Chemistry: As discussed in the July 14 TWTW, environmentalist Jim Steele has directly taken on the claims of ocean acidification, a lowering of the alkaline level (pH) of the oceans. The concept of pH was not created until the early 1900s and was refined in the 1920s. Yet climate modelers routinely claim pH has been declining since the 1850s. There is no way of knowing.
In general, the pH of the oceans is, and will remain, above 7, i.e. alkaline. The general range is 7.8 to 8.5. The term ocean acidification is a deliberate effort by some scientists to shock the public. Thus, they are deliberately engaged in propaganda, not science, a far too common practice among persons and organizations claiming to embrace science.
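Since pH is the negative base-10 logarithm of hydrogen-ion concentration, the scale itself makes the point: a minimal sketch showing that a drop from pH 8.2 to 8.1 raises the hydrogen-ion concentration by roughly 26% yet leaves seawater well on the alkaline side of neutral (pH 7).

```python
# pH = -log10([H+]), so [H+] = 10**(-pH). Values above 7 are alkaline.
def hydrogen_ion(ph):
    """Hydrogen-ion concentration (mol/L) for a given pH."""
    return 10.0 ** (-ph)

h_before = hydrogen_ion(8.2)
h_after = hydrogen_ion(8.1)
increase = h_after / h_before - 1.0   # ~0.26, i.e. about a 26% rise in [H+]

print(f"[H+] increase from pH 8.2 to 8.1: {increase:.1%}")
print("Still alkaline:", 8.1 > 7.0)
```

This is why "less alkaline" describes the chemistry more accurately than "acidification": even large relative changes in hydrogen-ion concentration leave ocean pH far above 7.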
Steele discusses recent alarm:
“For example, for nearly a decade the media has hyped the 2006-2008 die-off of larval oysters in hatcheries along Washington and Oregon. They called it a crisis caused by rising atmospheric CO2 and the only solution was to stop burning fossil fuels. But it was an understanding of natural pH changes that provided the correct solutions. Subsurface waters at a few hundred meters depth naturally contain greater concentrations of CO2 and nutrients and a lower pH than surface waters. Changes in the winds and currents periodically bring those waters to the surface in a process called upwelling. Upwelling promotes a burst of life but also lowers the surface water pH. Not fully aware of all the CO2 dynamics, the hatcheries had made 3 mistakes.
“First, they failed to recognize not all oyster species are well adapted to the low pH of upwelled water. The larvae of native Olympia oysters naturally survive intense upwelling events along the Washington coast because that species “broods” its larvae. The larvae initiate their shells protected inside their parents’ shells where pH is more controlled. However, the Olympia oysters were over-harvested into near extinction in the 1800s.
“So, fishermen imported the Japanese oyster, which is now the mainstay of the Washington and Oregon fisheries. Japanese oysters did not evolve within an intense upwelling environment similar to Washington’s coast. Each Japanese oyster simply releases over 50 million eggs into the water expecting their larvae to survive any mild changes in pH during initial shell formation. Hatcheries didn’t realize the Japanese oyster’s larvae had a 6-hour window during which the larvae’s initial shell development and survival was vulnerable to low pH.
“Second, because cooler waters inhibit premature spawning, hatcheries pumped cool water from the estuary in the early morning. As measured in coral reefs, photosynthesis raises pH during the day, but nighttime respiration drops pH significantly. By pumping early morning water into their tanks, they imported estuary water at its lowest daily pH. Finally, they failed to recognize natural upwelling events transport deeper waters with naturally low pH into the estuary, further lowering the pH of water pumped into their tanks.
“Now, hatcheries simply pump water from the estuary later in the day after photosynthesis has raised pH. Scientists also developed a metering device that detects intrusions of low pH waters, so hatcheries avoid pumping water during upwelling events. As for most shellfish, once the shell is initiated, a protective layer prevents any shell corrosion from low pH conditions. Problem easily solved and crisis averted!
“The simplistic idea that burning fossil fuels is causing the surface ocean to become more acidic is based on the fact that when CO2 interacts with water a series of chemical changes results in the production of more hydrogen ions which lowers pH. Unfortunately, all catastrophic analyses stop there. But living organisms then reverse those reactions. Whether CO2 enters the surface waters via the atmosphere or from upwelling, it is quickly utilized by photosynthesizing plankton which counteracts any “acidification”. A percentage of the organic matter created in the sunlit waters sinks or is actively transported to depths, further counteracting any surface “acidification”. Some organic matter sinks so rapidly, CO2 is trapped at depths for hundreds and thousands of years. The dynamics that carry carbon to ocean depths largely explain why the oceans hold 50 times more CO2 than the atmosphere.
“To maintain marine food webs, it is essential that upwelling bring sunken nutrients back into the sunlight to enable photosynthesis. Upwelling also brings stored CO2 and low pH water to the surface. Wherever upwelling recycles nutrients and lowers surface pH, the greatest abundance and diversity of marine life is generated.”
Thus, the public has been led to worry about non-existent “ocean acidification,” rather than informed about natural upwelling processes that make certain waters, including the US West Coast, very rich in marine life. Interestingly, field-based research just reviewed by CO2 Science reports that the Pacific oyster (Crassostrea gigas) native to Japan can adapt to the waters of the Northwest, but it may be only during normal upwelling periods, not extreme upwelling periods, such as what occurred in 2006-2008. See links under Challenging the Orthodoxy and Review of Recent Scientific Articles by CO2 Science.
Sea Level Change: Prevailing winds over long stretches of ocean push the water surface so as to affect sea level. When dominant winds change, measured sea levels change, particularly in the Western Pacific along Asia. Further, all too frequently, changes in instruments and how they are calibrated are not reported in papers wishing to cause alarm. Many journals, which claim to be peer-reviewed, readily report such alarming results. Using a century of sea level measurements at geologically stable Newlyn, Cornwall, England, the April 25 TWTW discussed how the failure to accurately report trends can mislead the public.
The Newlyn study scrupulously discusses how different instruments and different time frames give totally different trends. Figure 8 in the study link below shows these trends and the text states:
“The record of monthly MSL [Mean Sea Level] at Newlyn during the past century. The average rates of change of MSL for the complete record and for the recent period 1993–2014 are 1.8 [tidal gauge] and 3.8 mm/year [satellites] respectively and are shown by the black lines.
“However, the observed rate of sea level change at Newlyn over 1993–2014 has been much larger at 3.8 mm/year (we use 1993 somewhat arbitrarily for the start of the modern era in sea level monitoring as that was when precise altimeter information from space became available). This highest rate in the record may represent the start of a long-term acceleration in sea level due to climate change (Church et al. 2014), or simply be a feature of the decadal variability in MSL that has been evident throughout the Newlyn record (and indeed in all tide gauge records). Figure 8 shows that high rates were observed in previous 22-year periods, including those centered on approximately 1926, 1950, and 1980 (with rates of approximately 3 mm/year), with the lowest rates centered on 1934 and 1968 (approximately 0 mm/year), with such accelerations and decelerations in the record similar to those seen in other parts of the world (Woodworth et al. 2009b). The variability and long-term trend in the Newlyn MSL record are similar to those at Brest (Wöppelmann et al. 2006), although some differences become apparent in a detailed comparison (Douglas 2008), and at other stations in the North Sea area (Wahl et al. 2013)”
Note that 1.8 mm per year works out to about 7 inches per century, the rate NIPCC 2008 reported.
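As a quick arithmetic check on the tide-gauge figure just quoted, 1.8 mm per year sustained over a century converts as follows:

```python
# Convert the Newlyn tide-gauge rate (1.8 mm/year) to inches per century.
MM_PER_INCH = 25.4
rate_mm_per_year = 1.8
rise_inches_per_century = rate_mm_per_year * 100 / MM_PER_INCH

print(f"{rise_inches_per_century:.1f} inches per century")  # ~7.1
```

The satellite-era rate of 3.8 mm/year would, by the same conversion, give roughly 15 inches per century, which is why the choice of instrument and time window matters so much.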
All too frequently, “scientists” wishing to alarm the public create a graph superimposing one dataset onto another and truncate any divergence between the datasets. A paper published in Nature took sea level alarm even further, claiming “sea level rise is a well-accepted consequence of climate change.” Yet cooling is also climate change, and a new ice age would cause sea levels to drop.
The Nature paper then uses statistical techniques to claim a relationship between sea levels and carbon dioxide emissions and projects extreme sea level rise in many parts of the world. Paul Homewood compares the claims with what is actually occurring in the UK to demonstrate how absurd this paper is. See links under Challenging the Orthodoxy – NIPCC, Challenging the Orthodoxy, Changing Seas, and Communicating Better to the Public – Make things up.
Hydrogen Maybe? The President of MIT presented “MIT’s Plan for Action on Climate Change,” claiming there is a climate crisis and a need for breakthrough innovations. Ernest Moniz, former Secretary of the Department of Energy, co-authored an article on steps to secure the US electricity supply. “We also need to address the long-term electricity storage imperative, including support for building a hydrogen infrastructure and expanding agricultural production of renewable natural gas.”
Thanks to the Climate Change Act 2008, the UK is further along the path of changing its sources of electricity than the US. Nature Energy published a study by Malte Jansen et al. of Imperial College London. [Some readers may recognize that the same school produced the highly influential Covid-19 report, but there is no reason to think the two are related.] The news release for the study claimed that offshore wind power is now so cheap it could pay money back to consumers.
The Global Warming Policy Forum (GWPF) issued a report on the wind power study which addressed a major problem in the paper in Nature Energy – namely that auction prices do not necessarily reflect actual eventual costs. The GWPF cites other studies stating:
“Taken together these studies and the data sources they presented raised important and troubling questions about the effectiveness of the Contracts for Difference auctions in reducing decarbonisation costs, and indicate, as Hughes et al. have observed, that the system was being gamed.
“It is therefore as surprising as it is disappointing that Nature Energy has chosen to publish a study (M Jansen et al, “Offshore wind competitiveness in mature markets without subsidy”, Nature Energy, 27 July 2020) that attempts to take the discussion back to a more primitive and inadequate level of analysis, in which the bid prices at various auctions in Northern Europe are taken as a reliable indicator of underlying cost.”
Gaming is a common trick for contractors: bid low to get the deal, then hope to recoup the real costs through cost overruns, changes, etc. The GWPF report goes on to state:
“Anyone who wants to make claims about costs – whether of renewable energy or any other infrastructure service – must first collect and analyse data on actual costs, as Hughes et al (2017) and Hughes (2019) and Aldersey-Williams et al. (2019) did. The fact that Jansen et al. (2020) actually cite this work but ignore its implications is extraordinary and raises questions about the quality of peer-review at Nature Energy.
“The topic of wind power costs and particularly offshore wind costs is a live and important area of serious concern. Jansen et al’s paper is a retrograde step both methodologically and in its conclusions, and government cannot take comfort from its optimistic assertions. On the contrary, government should note that if enthusiasts for the offshore wind industry can do no better than Jansen et al. (2020) then there is clearly a serious problem with the underlying cost trends of this sector.”
National Grid, a UK energy company that also operates in New York, issued a report entitled “Future Energy Scenarios 2020 (FES).” As discussed by Paul Homewood, this report has a number of problems and appears to say only what the government wants to hear. It does not include realistic estimates of costs, or even whether excess wind power from overbuilding is saleable. This is a major problem occurring in Germany and Denmark: other countries do not want excess wind power, which may destabilize their grids.
What is particularly interesting is the part in the FES on hydrogen storage. “Hydrogen and carbon capture and storage must be deployed for net zero. Industrial scale demonstration projects need to be operational this decade.” There is no proof of concept on the cost of hydrogen storage. As Homewood writes about Natural Gas Reforming/Gasification, “Converting gas to hydrogen is an extremely energy inefficient process. Natural gas input of 654 TWh only produces 527 TWh of hydrogen, a loss of 20%. In my view, that is extremely optimistic, given that carbon capture would also have to be added” [to capture the carbon dioxide given off in the process.]
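Homewood's conversion-loss figure can be checked with a line of arithmetic; 654 TWh of natural gas in and 527 TWh of hydrogen out is a loss of about 19.4%, which he rounds to 20%:

```python
# Check the gas-to-hydrogen conversion loss quoted from Homewood:
# 654 TWh of natural gas input yields 527 TWh of hydrogen.
gas_in_twh = 654.0
hydrogen_out_twh = 527.0
loss_fraction = 1.0 - hydrogen_out_twh / gas_in_twh   # ~0.194

print(f"Energy lost in reforming: {loss_fraction:.1%}")
```

And as the bracketed note says, this figure excludes the further energy cost of carbon capture, so the real-world loss would be larger.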
As the green-new world is opening, it appears to include a great deal of speculation, similar to the speculation that accompanied President Carter’s claim that the US would run out of oil and natural gas around the year 2000. Some profited from what was economic misery for many. See links under Challenging the Orthodoxy, Defending the Orthodoxy, Energy Issues – Non-US, Alternative, Green (“Clean”) Energy — Storage
Vote for April Fools Award: The voting for SEPP’s April Fools Award will continue until August 10. Due to changes in schedules, no conferences will be held before then at which to announce the results. So, get your votes in now.
Number of the Week: 33 to 1. John Robson emphasizes a slide in the extensive ‘deck’ of slides posted by Michael Shellenberger.
It “compares the annual revenue of two major American alarmist entities, the Environmental Defense Fund and the Natural Resources Defense Council, and two of these lavishly funded skeptical outfits, the Competitive Enterprise Institute and the Heartland Institute. And guess what? The alarmists outspend the skeptics 33:1.”
See links under Funding Issues.
Trump Helps the Environment by Enraging Environmentalists
His plan to reform NEPA would speed replacement of old, dirty projects with cleaner new ones.
By Richard A. Epstein, WSJ, July 30, 2020
TWTW Summary: The author, a law professor at New York University, a senior fellow at the Hoover Institution, and a senior lecturer at the University of Chicago, describes the extent to which well-intended legislation has morphed into a bureaucratic nightmare benefiting lawyers and preventing environmental improvements.
The Trump administration recently published the first comprehensive revision of federal regulations under the National Environmental Policy Act of 1970. Environmental groups predictably denounced the initiative. Among its many detractors, the Wilderness Society insists that these regulations will ‘essentially gut’ NEPA by putting ‘polluters in charge of environment protection.’ This objection wholly overlooks NEPA’s deeply dysfunctional features.
From its inception in 1970, NEPA had two basic objectives: first, to require all new projects to receive a thorough and transparent vetting of potential environmental risks; second, to expand democratic participation in the review process via public hearings.
Five decades later, it is clear that NEPA has achieved neither. The most obvious sign of institutional distress is the long time—4.5 years on average—to complete the elaborate environmental impact statement before work can commence. Today’s NEPA behemoth is far from its 1970 origins, which is why the Trump administration’s update is overdue.
Environmentalist critics work on the flawed assumption that the longer the review period, the greater the environmental protection. But that’s untrue for the large majority of important projects. As I detail in a report for ConservAmerica, these new projects typically replace older, more dangerous projects and use superior technologies unavailable generations ago. When NEPA review delays a state-of-the-art pipeline, for instance, that requires greater shipment of fossil fuels by rail and truck, which is far more likely to cause major spills with extensive collateral damage.
As drafted, NEPA requires the government agency in charge of a review to consider ‘every significant aspect of environmental impact,’ a clear impossibility for complex multibillion-dollar projects. Typically the truncation of that open-ended inquiry leads officials to become preoccupied with small defects and overlook the major improvements in both consumer welfare and environmental safety that new roads, bridges, airports and other projects promise.
Ostensibly, the report released in July by the Trump administration is concerned only with NEPA’s extensive procedural provisions. It contains many useful proposals on how to coordinate and streamline a cumbersome process often divided haphazardly among multiple agencies. One set of needed changes is strict timelines and page limits on environmental impact statements to speed up and focus the review process.
But by far the most important proposal is to soften the devastating consequences that flow from any asserted NEPA violation. Courts have wrongly created a strong presumption that any deviation from NEPA’s exacting requirements, however trivial, requires that the permit be denied.
The author states that the long approval process for the Atlantic Coast Pipeline and Dakota Access Pipeline exemplifies the abuse typical to NEPA review. He goes on to state:
NEPA rules also deviate from sound judicial practice. An injunction is appropriate only when a plaintiff can show irreparable harm, which can’t be demonstrated solely by showing that the project developer does not yet have in place a perfect plan for containing oil spills that are unlikely to occur in the first place. Ironically, NEPA’s laborious process undercuts the statutory objective of making informed public decisions. Trying to decide everything at the initial stage of review requires speculation and invites errors in judgment. A far more sensible process would allow work to begin while these details are ironed out through project upgrades, backed by public and private inspections, strong liability protections and extensive insurance policies. These sensible precautions would sharply cut down both the frequency and severity of adverse environmental impacts.
As drafted, NEPA was intended to invite all segments of the public to submit comments to improve decision making. But in 1971, in Calvert Cliffs’ Coordinating Committee v. U.S. Atomic Energy Commission, the U.S. Circuit Court of Appeals for the District of Columbia invited a flood of new litigation by holding that any disappointed party may challenge an agency approval in federal court. Even if the bulk of informed opinion supports a new project, an extreme outlier can sue to stop it. NEPA includes no provision establishing a private right of action, but the practice has become so ingrained that it can’t be undone by regulation.
Congress should act to stop the hijacking of the permit process to block the use of fossil fuels throughout the economy. Informed, democratic decision making requires consistent environmental regulation, not a patchwork of dubious judicial decisions that turn NEPA into a legal swamp that now must be drained.