By Ken Haapala, President, Science and Environmental Policy Project
Brought to You by www.SEPP.org
Which Trends? Last week’s TWTW may have given a wrong impression to some readers. In discussing the estimate by Christy et al. of the rate of warming of the bulk atmosphere over the past 38 years, 0.10 ± 0.03°C per decade, that rate was used to project the warming from a doubling of CO2. What was not emphasized sufficiently strongly is that the projection embodies the highly speculative assumption that the rate will continue for about two centuries. Most assuredly, it will not.
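The arithmetic behind such a projection is simple, and a minimal sketch makes the embedded assumption explicit; the roughly-two-century doubling time is the speculative input the paragraph above warns about, not a measurement.

```python
# Back-of-envelope projection: warming at a doubling of CO2, assuming
# (speculatively) that the observed satellite-era rate persists unchanged.
rate_per_decade = 0.10   # deg C per decade (Christy et al. estimate)
decades_to_double = 20   # roughly two centuries for CO2 to double (assumed)

projected_warming = rate_per_decade * decades_to_double
print(f"Projected warming at doubling: {projected_warming:.1f} deg C")
```

The entire result hinges on the assumed doubling time; change that one input and the projected warming changes in proportion, which is the point of the caveat.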
The rate of warming or cooling will change constantly, and we cannot forecast which direction, how much, or why. The principal source of energy is the sun, providing irradiance, solar wind (plasma), and solar magnetism. The changing intensity of the sun, the changing orbit of the earth, and the slowly changing tilt of its rotation axis all affect the earth’s climate. Galactic cosmic rays may also play a role. These are external sources of instability and uncertainty.
Complicating the issue even further, the earth’s rotation produces gyres and flows in two dynamic fluids, the atmosphere and the oceans, resulting in a turbulent system. Oceans cover over 70% of the earth’s surface, and fluid dynamics is not well understood. Coupled with the rotation of the earth, these two dynamic fluids create internal variability in the climate system. The exchange of energy within or between the oceans and the atmosphere can cause one or the other to warm or cool without any change in the energy provided by the sun.
Thus, we have natural variation without human influence, some of which is poorly understood or misunderstood. For example, carbon dioxide (CO2) is the atmospheric gas most readily absorbed by water, including rainwater. Cold water absorbs gases more readily than warm water. When the oceans cool, they absorb more atmospheric CO2; when they warm, they release CO2, increasing its concentration in the atmosphere.
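The temperature dependence of gas solubility can be illustrated with Henry’s law; the constants below (a solubility of about 0.034 mol/(L·atm) at 25°C and a van ’t Hoff temperature coefficient of about 2400 K for CO2) are representative literature values, used here only for illustration.

```python
import math

# Henry's-law sketch: CO2 solubility in water vs. temperature, using the
# van 't Hoff form kH(T) = kH_ref * exp(C * (1/T - 1/T_ref)).
def co2_solubility(temp_c, kh_ref=0.034, c=2400.0, t_ref=298.15):
    """Approximate CO2 solubility in mol/(L*atm) at temp_c degrees Celsius."""
    t = temp_c + 273.15
    return kh_ref * math.exp(c * (1.0 / t - 1.0 / t_ref))

# Cold (5 C) water holds roughly 1.8x as much CO2 as warm (25 C) water.
ratio = co2_solubility(5.0) / co2_solubility(25.0)
print(f"Cold/warm solubility ratio: {ratio:.2f}")
```

Under these illustrative constants, cooling surface water noticeably raises its capacity to absorb CO2, consistent with the ocean uptake-and-release cycle described above.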
This variation in the composition of the atmosphere is seen in ice cores, such as those taken at Vostok in Antarctica. Many people believe the variation in CO2 caused the variation in temperatures recorded in the ice cores, declaring CO2 the cause of the temperature change. However, the correlation runs the other way: temperature change, caused by other mechanisms, preceded the change in CO2 concentrations. Careful examination of the ice cores at the end of the last warm period, the Eemian, shows that falling temperatures led falling CO2 concentrations for about 14,000 years. Both fall, but temperatures first.
Some of the difficulties of predicting / projecting global warming / climate change are discussed in a paper written by Jay Lehr, Kenneth Haapala, Patrick Frank, and Patrick Moore in addressing the eight questions asked by Judge William Alsup in the public nuisance lawsuit brought by Oakland and San Francisco against oil companies. The paper was published by The Heartland Institute, which published the reports of the Nongovernmental International Panel on Climate Change (NIPCC). See links under Challenging the Orthodoxy – NIPCC and Challenging the Orthodoxy.
Quote of the Week: “There is no more common error than to assume that, because prolonged and accurate mathematical calculations have been made, the application of the result to some fact of nature is absolutely certain.” – A.N. Whitehead [H/t Tim Ball]
Number of the Week: $1.00 per month
Exponential Functions: Exponential functions are ones in which the variable appears in the exponent, so that a quantity is repeatedly multiplied by a constant factor and grows ever faster. They are used frequently in reports of the Intergovernmental Panel on Climate Change (IPCC) and by advocates of these reports. For example, as discussed in last week’s TWTW, in his judicial tutorial on behalf of the California cities, Myles Allen, a lead author of the Fifth Assessment Report (IPCC, AR5, 2013 & 2014), used exponential functions to project future increases of sea level in the Bay Area.
This use is highly misleading. It gives the impression that the rate of sea level rise will grow without limit. Unless compelling reasons are given, any such projections should be heavily discounted. [Some investment analysts have considered the use of exponential functions sufficient reason to dismiss an investment prospectus.] Previously, the NASA-GISS (Goddard Institute for Space Studies) web site displayed an exponential function for 21st-century sea level rise. The bulk of the rise would occur in the last decade of the century, at a rate equal to or greater than the rate of rise during the melting of the great ice sheets that covered the northern part of the Northern Hemisphere during the last Ice Age. It made no sense. [Note that a recent search of the web site failed to uncover it.]
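To see why an exponential curve concentrates the rise at the end of the period, consider a purely hypothetical example (not GISS’s actual formula): a century of rise, normalized to a total of 1, with the rate of rise doubling every decade.

```python
import math

# Hypothetical exponential sea-level curve: the rise rate doubles every
# decade, normalized so the century total equals 1.0 (illustrative only).
def cumulative_rise(year, start=2000, end=2100, doubling_years=10.0):
    tau = doubling_years / math.log(2)  # e-folding time of the rate
    return (math.exp((year - start) / tau) - 1) / (math.exp((end - start) / tau) - 1)

# With a decadal doubling, about half of the century's total rise
# lands in the final decade alone.
final_decade_share = cumulative_rise(2100) - cumulative_rise(2090)
print(f"Share of the century's rise in the final decade: {final_decade_share:.0%}")
```

Under these assumed parameters, roughly half of the entire century’s rise occurs in the last ten years, which is exactly the pattern described above.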
In recent reports on local sea level rise, NOAA has resorted to similar tricks to deceive the public. Sea level rise projections of up to 7 feet (2 meters) in the Tampa-St. Petersburg, Florida, area are but one example. Such exaggerations are indicative of an irresponsible government agency.
Please note that decades of experiments in different laboratories clearly establish that the ability of carbon dioxide (CO2) to absorb infrared radiation, energy given off from the earth to space, is logarithmic. A logarithmic function is the inverse of an exponential one, and, like an exponential, it should only be used when supported by strong evidence. Unless there is compelling evidence indicating otherwise, TWTW will assume that adding CO2 to the atmosphere will modestly increase atmospheric temperatures, while recognizing that the CO2 influence may be overwhelmed by natural variation. See last week’s TWTW, “In Defense of Oakland and San Francisco” and referenced links.
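The logarithmic relationship is commonly written as ΔF = 5.35 × ln(C/C0) W/m², the widely used Myhre et al. (1998) approximation; a short sketch shows the diminishing effect of each increment of CO2.

```python
import math

# Widely used logarithmic approximation for CO2 radiative forcing
# (Myhre et al., 1998): dF = 5.35 * ln(C / C0), in W/m^2.
def co2_forcing(c_ppm, c0_ppm=280.0):
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each doubling adds the same ~3.7 W/m^2: going from 280 to 560 ppm has
# the same effect as going from 560 to 1120 ppm, i.e. diminishing returns
# per added molecule, the opposite of exponential growth.
print(f"280 -> 560 ppm:  {co2_forcing(560):.2f} W/m^2")
print(f"560 -> 1120 ppm: {co2_forcing(1120) - co2_forcing(560):.2f} W/m^2")
```

The approximation and its coefficient are standard in the literature; how much warming a given forcing produces is a separate, contested question and is not addressed by this sketch.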
NASA – No Data from Space? A review of NASA-GISS material produced two articles with James Hansen, former head of NASA-GISS, as the lead author. One was published in 1988, the other in 2008, twenty years later. Of particular interest was the use of atmospheric temperature data from satellite measurements – there was none. In 1988, the technique for using satellite observations to estimate temperatures had not been developed. But in 2008 a 29-year database existed. Yet it was ignored by NASA-GISS. Perhaps one can attribute this ignorance to inertia in bureaucratic science. See links under Defending the Orthodoxy.
Multi-Layered Defense: Our friends in the New Zealand Climate Science Coalition (NZCSC) spent months trying to get the Royal Society of New Zealand to substantiate the claim that mankind is causing dangerous global warming. They finally received an answer – a link to a February 17, 2014, web post by The Royal Society in the UK titled “Climate change: evidence and causes.”
The project background section states:
“The Royal Society and the US National Academy of Sciences, with their similar missions to promote the use of science to benefit society and to inform critical policy debates, offer this new publication as a key reference document for decision makers, policy makers, educators, and other individuals seeking authoritative answers about the current state of climate change science. The publication makes clear what is well established, where consensus is growing, and where there is still uncertainty. It is written and reviewed by a UK-US team of leading climate scientists. It echoes and builds upon the long history of climate-related work from both national science academies, as well as the newest climate change assessment from the United Nations’ Intergovernmental Panel on Climate Change.”
The answer to the key question: “How do scientists know that recent climate change is largely caused by human activities?” is:
“Scientists know that recent climate change is largely caused by human activities from an understanding of basic physics, comparing observations with models, and fingerprinting the detailed patterns of climate change caused by different human and natural influences.
“Since the mid-1800s, scientists have known that CO2 is one of the main greenhouse gases of importance to Earth’s energy balance. Direct measurements of CO2 in the atmosphere and in air trapped in ice show that atmospheric CO2 increased by about 40% from 1800 to 2012. Measurements of different forms of carbon (isotopes, see Question 3) reveal that this increase is due to human activities. Other greenhouse gases (notably methane and nitrous oxide) are also increasing as a consequence of human activities. The observed global surface temperature rise since 1900 is consistent with detailed calculations of the impacts of the observed increase in atmospheric CO2 (and other human-induced changes) on Earth’s energy balance.
“Different influences on climate have different signatures in climate records. These unique fingerprints are easier to see by probing beyond a single number (such as the average temperature of Earth’s surface), and looking instead at the geographical and seasonal patterns of climate change. The observed patterns of surface warming, temperature changes through the atmosphere, increases in ocean heat content, increases in atmospheric moisture, sea level rise, and increased melting of land and sea ice also match the patterns scientists expect to see due to rising levels of CO2 and other human-induced changes (see Question 5).
“The expected changes in climate are based on our understanding of how greenhouse gases trap heat. Both this fundamental understanding of the physics of greenhouse gases and fingerprint studies show that natural causes alone are inadequate to explain the recent observed changes in climate. Natural causes include variations in the Sun’s output and in Earth’s orbit around the Sun, volcanic eruptions, and internal fluctuations in the climate system (such as El Niño and La Niña). Calculations using climate models (see Learn about… computer models) have been used to simulate what would have happened to global temperatures if only natural factors were influencing the climate system. These simulations yield little warming, or even a slight cooling, over the 20th century. Only when models include human influences on the composition of the atmosphere are the resulting temperature changes consistent with observed changes.” [Boldface added.]
So, the quest for hard evidence is dismissed with bureaucratic science capped with circular reasoning and models that produce infinite solutions, while ignoring the atmosphere, where the greenhouse effect occurs.
See links under Challenging the Orthodoxy and Defending the Orthodoxy.
Real Threats: The electrical grid is an energized system serving all those on it, producers and consumers alike. It can be likened to a nervous system serving all the organs of the human body. Failure, for whatever reason, such as the South Australia “Black System” event or the US Northeast blackout of 2003, can be dangerous and expensive. In the US and southern Canada, independent system operators (ISOs) or regional transmission organizations (RTOs) are responsible for maintaining the grid at constant voltage and frequency. Frequency is kept to tight tolerances.
Serving about 65 million people, the PJM Interconnection is the largest ISO in the US, in terms of population. It serves parts or all of Delaware, Illinois, Indiana, Kentucky, Maryland, Michigan, New Jersey, North Carolina, Ohio, Pennsylvania, Tennessee, Virginia, West Virginia, and the District of Columbia.
During the harsh storm of December 27, 2017, to January 8, 2018, called the Bomb Cyclone, the PJM Interconnection almost failed. John Constable of the Global Warming Policy Foundation discusses an important study of this near failure, and the near failures of other ISOs in the Northeast and Midwest. Failure was prevented by coal-fired power plants increasing electrical generation by 63%, natural gas plants by 20%, nuclear by 5.3%, and seldom-used oil plants by 26% over planned generation. These forms of electrical generation are condemned by the environmental industry.
Generation from highly praised and highly subsidized wind and solar fell by 12%. So-called sustainable energy cannot be sustained in bad weather.
Modern civilization requires reliable electricity. Power engineers call it dispatchable: it can be dispatched when and where needed. Pretending that wind and solar can substitute for thermal electricity generation (coal, oil, gas, and nuclear) is promoting a myth, just like the myth promoted by the EPA that carbon dioxide emissions endanger public health and welfare. However, based on these myths, the environmental and energy policies of many in the federal government and many states are endangering public health and welfare. These are the real threats, not a gas needed for life. See links under Challenging the Orthodoxy and Energy Issues – Real Threats.
Clean Coal in China: China, and much of Asia, faces a real health threat from the smog produced by burning coal in the winter for heat, particularly in residences. The source of the threat has been misinterpreted by much of the western press as coming from coal-fired power plants.
China is a world leader in building modern, high-efficiency coal-fired power plants, which, when equipped with proper filters, emit few smog-causing compounds. Further, when Chinese leaders speak of “clean coal,” they are not referring to the absurd western concept of burying carbon dioxide in the ground at a cost of billions of dollars, which may work only in highly specialized circumstances. They are referring to modern, high-efficiency power plants. See Article # 2 and links under Energy Issues – Non-US.
Number of the Week: $1.00 per month. The largest hydro-electric system in the US is the Columbia-Snake River system operated by the Bonneville Power Administration. It continues to be attacked by environmental groups for maintaining dams built forty to eighty years ago. Federal courts have ruled that the dams built on the lower Snake River, hundreds of miles from the ocean, are a threat to sea life – killer whales and salmon. A group called the NW Energy Coalition produced a study claiming:
“Balanced portfolios of clean energy resources, including solar, wind, energy efficiency, demand-response, and storage can replace the power the four LSR Dams contribute to the Northwest region.”
Further, they claim that the portfolio of solar, wind, etc., will be reliable and will cost consumers only about $1.00 per month. Of course, the real objective is shutting down the complete system, regardless of the economic harm and the lives lost from flooding, etc. See links under Energy Issues – Real Threats.
1. Big Oil’s New Favorite Toy: Supercomputers
BP, others boost raw computing power in digital arms race to find oil and trim costs
By Sarah Kent and Christopher Mathews, WSJ, Apr 20, 2018
SUMMARY: For some years, designing programs for supercomputers has been one of the hottest employment opportunities for young, highly qualified programmers. The data collected by sensors on drilling heads are enormous. The problem is detecting clear signals in the complex data. The reporters start:
“Xukai Shen, a geophysicist working at BP, had a hunch he could solve a riddle that had vexed the company: whether there was a lot of oil hidden beneath a salt dome 7,000 feet underwater in the Gulf of Mexico. So he asked to use the company’s supercomputer exclusively for two weeks to check it out.
“Using an algorithm, the 33-year-old with a Stanford PhD harnessed the computer’s massive power last year to produce a clearer seismic image of what lay beneath. The result: a potentially massive oil find. With a clearer picture of the area, BP estimated 200 million barrels of crude lay hidden in the Atlantis oil field, a region the company had been plumbing for decades.
“’Basically, we found a field within a field,’ said Ahmed Hashmi, BP’s head of technology for exploration and production, during a recent tour of the company’s Houston supercomputer, as the machine hummed nearby.
“BP is now in love with beefy computer power—and it’s far from the only one in the oil patch. Italy’s Eni has built a computing facility the size of a soccer field outside of Milan, crediting its help in all its most recent oil and gas discoveries. France’s Total SA recently upgraded its Pangea supercomputer, nearly tripling its computing power.
“While big oil companies were early adapters of supercomputers, some have poured hundreds of millions into upgrades, and now possess some of the most powerful commercially owned computers on the planet.
After a bit of history, they continue:
“The computers are costly, but can reduce the oil exploration process by months and save companies tens of millions of dollars by avoiding misplaced wells. To harness their potential, the companies are increasingly seeking to compete with Silicon Valley firms for top data and computer scientists.
“’We’re going all in,’ said Bernard Looney, BP’s head of exploration and production. ‘We’re only scratching the surface today of what’s possible.’
“BP is in the middle of a five-year, $100 million investment in its Houston supercomputer. It’s built a 15,000-square-foot room in a 3-story, flood-proof building to house the titan, which currently takes up about 50% of the space and has the computing power of around 50,000 iPhone 7s.
After a brief discussion of changes in the oil market, which caused independent producers to cut back on their computer teams, the authors conclude with:
“BP says it’s already reaping the benefits of experiments with advanced technology.
“In Alaska, the company said it crunched 40 years of data on its operations, weather patterns and pipeline corrosion and found ways to maintain its 1,300 miles of pipelines in the state more efficiently by reducing on-site inspections.
“The physical inspections—as many as 100,000 locations a year—only found issues 2.5% of the time, and were difficult to perform in a state where temperatures can get so cold that workers can only be in the elements for minutes at a time. With the analysis, BP has managed to reduce inspections 25% and better predict corrosion, Mr. Looney said.
2. The New Science of Smog
Other things pollute the air more than gasoline exhaust does.
Editorial, WSJ, Apr 8, 2018
SUMMARY: In defending the Trump administration’s plans to roll back the fuel-mileage rules promulgated by the Obama administration, the editors cite recent studies:
“A recent study in the journal Science traced and measured volatile organic compounds (VOCs) in Los Angeles. In the presence of sunlight, these compounds react with NOx to form ozone and smog. Car exhaust was once a greater relative contributor of VOCs and NOx, but engines are now much cleaner.
“Researchers at the University of Colorado found that petroleum-based chemicals such as those found in deodorant, soap, hair spray, household cleaners, pesticides and other commercial products account for about half of VOCs emissions in industrial cities. Gasoline fuel and exhaust make up about 32%.
“Environmentalists have long blamed L.A.’s car-driving culture for its smog. But even if most gas-burning vehicles were replaced by electric cars, L.A. would still have a smog problem because of its pollutant-trapping topography and sunny weather.
“People could perhaps reduce pollution by showering and cleaning their homes less, but wait . . . trees and people also emit VOCs. The study didn’t consider the biological sources of VOCs, which are to blame for the blue haze in the Appalachian Mountains. When Ronald Reagan quipped that trees cause more pollution than automobiles do, he had a point.
“Another big source of pollution: Dirt. According to a recent study in the journal Science Advances, cropland and natural sources contribute up to 40% of California’s NOx emissions—about 10 times as much as the California Air Resources Board has estimated. Motor vehicles make up about 30%.”
The editors conclude by stating that the real target of those protesting the revised standards is carbon dioxide emissions, which do not create smog.