The Week That Was: September 19, 2020, Brought to You by www.SEPP.org
By Ken Haapala, President, Science and Environmental Policy Project
Quote of the Week: “It is one thing to impose drastic measures and harsh economic penalties when an environmental problem is clear-cut and severe. It is quite another to do so when the environmental problem is largely hypothetical and not substantiated by careful observations. This is definitely the case with global warming.” – Frederick Seitz, Introduction to Fred Singer’s Hot Talk, Cold Science (1999)
Number of the Week: 64%
NO TWTW NEXT WEEK: There will be no TWTW the week of September 26. TWTW will resume the weekend of October 3.
Greenhouse Effect – Critical for Life: Last week, TWTW discussed that in 1859, physicist John Tyndall began experiments on gases that interfere with the loss of electromagnetic energy (heat) from the surface of the earth to space. These gases, known as greenhouse gases, keep the earth warmer than it would be otherwise, particularly at night. Tyndall recognized that water vapor is the dominant greenhouse gas, and without it land masses would freeze at night, making vegetative growth virtually impossible.
William Happer and W. A. van Wijngaarden specialize in Atomic, Molecular and Optical Physics, the field that includes spectroscopy, the study of the interaction between matter and electromagnetic radiation. They used the high-resolution transmission molecular absorption database (HITRAN), which is used to simulate the transmission and emission of light in the atmosphere, to calculate the influence of a doubling of CO2 together with a 6% increase in water vapor in the atmosphere. They arrived at an upper bound of 1.5 degrees K (C), with the likely value around 1 degree C, which is significantly less than the lowest estimate by the UN Intergovernmental Panel on Climate Change (IPCC) and its climate modelers.
Happer and van Wijngaarden have estimated that without the greenhouse effect the temperature of the surface of the earth would be about 16 ºF (minus 9 ºC), well below freezing. Further, even assuming that liquid water could exist, at night the land masses would be bitterly cold, preventing any growth of vegetation.
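The minus 9 ºC figure above is Happer and van Wijngaarden's own estimate. For comparison, the classic textbook energy-balance calculation for an airless earth, sketched below with an assumed albedo of 0.30 and standard physical constants (none of these numbers are taken from their paper), gives a somewhat colder value of about 255 K (minus 18 ºC):

```python
# Textbook "effective temperature" estimate for an earth with no
# greenhouse effect. A sketch under assumed values, not the
# Happer/van Wijngaarden calculation itself.
SOLAR_CONSTANT = 1361.0   # W/m^2, solar irradiance at Earth's orbit
ALBEDO = 0.30             # fraction of sunlight reflected (assumed)
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4

# Absorbed sunlight, spread over the whole sphere, must balance
# blackbody emission: S * (1 - a) / 4 = sigma * T^4
absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4.0
t_kelvin = (absorbed / SIGMA) ** 0.25

t_celsius = t_kelvin - 273.15
t_fahrenheit = t_celsius * 9 / 5 + 32
print(f"Effective temperature: {t_kelvin:.0f} K "
      f"({t_celsius:.0f} C, {t_fahrenheit:.0f} F)")
```

The gap between this simple estimate and the figure Happer and van Wijngaarden cite reflects different assumptions, for example about how much sunlight an earth without clouds and water vapor would reflect.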
This work is the culmination of over 150 years of research, which included a number of blind alleys and dead ends. For example, in 1896 Svante Arrhenius very roughly estimated that doubling CO2 might cause a temperature rise of up to 5 to 6 ºC. Ten years later, he retracted that estimate, making a new estimate half that size. Others showed that even the later estimate was still too high.
There is disagreement among those who have studied the problem on the extent to which water vapor accounts for the greenhouse effect; estimates range from 75% to 90%. But there is no disagreement that it is the dominant greenhouse gas.
Further, what is vital in understanding the greenhouse effect is that as the concentration of a gas increases, its ability to cause a change in temperature diminishes toward a condition called “saturation,” and it is accurate as well as convenient to represent the change by a logarithmic curve. In the case of CO2, the effect of each increment begins to decline even below 100 parts per million (ppm), and at 400 ppm the influence of carbon dioxide (CO2) is close to full saturation, having little additional effect. Thus, enormous increases in CO2 are needed to have even a minor influence on temperature. Based on experiments and observations, a so-called “runaway greenhouse” on earth is impossible.
Unfortunately, some climate analysts who may not be well versed in mathematics may have confused a logarithmic relationship with its inverse, an exponential relationship. Exponential relationships between greenhouse gases and temperatures have been used in publications by the UN Intergovernmental Panel on Climate Change. For example, the Summary for Policymakers of the Fourth Assessment Report (SPM, AR4, 2007) showed exponential relationships between radiative forcing (an estimate of changes in temperatures) and carbon dioxide; methane; and nitrous oxide. In each instance, this was accomplished by grafting two different datasets, ice cores and atmospheric samples, onto one another without calibrating them. (pp 3 & 4)
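The logarithmic relationship can be illustrated with the widely used Myhre et al. (1998) approximation for CO2 radiative forcing. This formula is brought in here purely for illustration of diminishing returns; it is not a formula taken from TWTW or from the IPCC summary discussed above.

```python
import math

# Widely used logarithmic approximation for CO2 radiative forcing
# (Myhre et al. 1998): delta_F = 5.35 * ln(C / C0), in W/m^2.
# Used here only to illustrate diminishing returns per added ppm.
def co2_forcing(c_ppm, c0_ppm=280.0):
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each successive doubling adds the same fixed increment of forcing...
for c in (280, 560, 1120):
    print(f"{c:5d} ppm -> {co2_forcing(c):5.2f} W/m^2")

# ...so the marginal effect of one extra ppm keeps shrinking:
print(co2_forcing(281) - co2_forcing(280))  # increment of +1 ppm at 280 ppm
print(co2_forcing(401) - co2_forcing(400))  # smaller increment at 400 ppm
```

Because the curve is logarithmic, going from 280 to 560 ppm adds the same forcing as going from 560 to 1120 ppm, which is the opposite of exponential behavior.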
Water vapor creates a major problem in modeling the atmosphere, and the climate. It varies significantly by latitude, season, day, and hour. For example, the MODTRAN (MODerate resolution atmospheric TRANsmission) computer code, version 6, gives various values for concentrations of atmospheric water vapor. These include a high value for the tropics and different values for midlatitude summer, midlatitude winter, subarctic summer, subarctic winter, and the US standard atmosphere. The highest value is tropical, 5119 moles per unit at standard pressure and temperature; the lowest is subarctic winter, with 518 moles at standard pressure and temperature; the US standard value is 1762 moles.
These values, based on satellite observations, are used to estimate how the changing greenhouse gases in the atmosphere can distort electromagnetic radiation.
Thus, it is highly questionable whether climate modeling can be effective in estimating how the climate changes with changing greenhouse gases without rigorous testing against physical data of the atmosphere obtained through observations. The famous 1979 Charney Report presented no physical data supporting its estimates. See links under Challenging the Orthodoxy and https://www.ipcc.ch/site/assets/uploads/2018/02/ar4-wg1-spm-1.pdf.
Photosynthesis – Critical for Life: The second major greenhouse gas, carbon dioxide, is critical for life above, on, and near the surface of the earth. Using energy from the sun, the process of photosynthesis converts H2O and CO2 into plant life. That in turn becomes food for other life forms. Photosynthesis is present from simple one-cell life (such as some bacteria and blue-green algae) all the way to huge plants as big as redwoods.
Below about 200 meters (650 ft) in the oceans, sunlight is insufficient to provide the energy needed for photosynthesis. However, bacteria and other simple life forms use energy released by inorganic chemical reactions to produce food (sugar) in a process called chemosynthesis. Around geothermal vents, bacteria oxidize hydrogen sulfide, and a complex web of life has evolved around them. The floor of the Gulf of Mexico teems with life around hydrocarbon seeps, which release methane, asphalt, and other hydrocarbons. Thus far, it appears that only single-cell life such as bacteria can use chemosynthesis to produce food, while complex life such as mussels and tubeworms consumes the bacteria or their products.
Thus, the current demonization of carbon dioxide and the greenhouse effect is a demonization of the foundations of life as we generally recognize it. As the late Frederick Seitz stated in the Quote of the Week:
“It is one thing to impose drastic measures and harsh economic penalties when an environmental problem is clear-cut and severe. It is quite another to do so when the environmental problem is largely hypothetical and not substantiated by careful observations. This is definitely the case with global warming.”
We must ask: what are the careful observations used to justify the claims that increasing carbon dioxide is causing dangerous greenhouse gas warming? See links under Challenging the Orthodoxy and https://oceanexplorer.noaa.gov/facts/photochemo.html.
Nomination of David Legates: David Legates, Professor of Climatology at the University of Delaware, has been nominated to become Deputy Assistant Secretary of Commerce for Environmental Observation and Prediction at NOAA. An author or co-author of over 100 papers on climate, precipitation, and similar topics, Legates has openly questioned the rigor of US National Climate Assessment reports. For this, he has earned the wrath of several members of Congress, including the Chair of the House Committee on Natural Resources and the Chair of the House Subcommittee on Water, Oceans, and Wildlife. They wrote the Acting Under Secretary of Commerce for Oceans and Atmosphere, National Oceanic and Atmospheric Administration, a letter stating:
“Dr. Legates has testified before our Committee downplaying or downright refuting the anthropogenic drivers of our current climate crisis. He stated: ‘Climate has always changed, and weather is always variable due to complex, powerful natural forces. No efforts to stabilize the climate can possibly be successful.’ He went on: ‘…transition[ing] from fossil fuels to so called clean energy to protect us from climate change is a recipe for personal and economic disaster that will have virtually no impact on the Earth’s climate.’ NOAA’s own data, compiled over the 50 years of the agency’s existence, points to exactly the opposite. Specifically, the third National Climate Assessment coauthored by NOAA stated ‘Global climate is changing, and this is apparent across the United States in a wide range of observations. The global warming of the past 50 years is primarily due to human activities, predominantly the burning of fossil fuels.’
“Dr. Legates’ appointment is an extreme risk to the American public and an insult to the quality science and scientists at NOAA. This is yet another example of a disturbing trend in the infusion of a political agenda into science. Dr. Legates has gone on the record in opposition to sound science strictly for personal gain and the advancement of a political agenda. Such behaviors undermine the scientific integrity at NOAA and should not have a place in your agency.”
Apparently, these members of Congress believe that the climate system, which has never been in equilibrium, can be stabilized; that change is primarily due to the use of fossil fuels, even though doubling carbon dioxide causes a disturbance of less than 2% of atmospheric energy flows; and that the (US) Third National Climate Assessment (2014) is a standard of excellence for NOAA. It is useful to review this standard of excellence and its key findings.
It is generally recognized that photosynthesis increases with increasing carbon dioxide; the greening of the earth is well established. The slide on ragweed pollen seasons demonstrates that the 2014 National Climate Assessment report is trivial and should be so treated. The claimed decline in corn and soybean yields with increasing temperatures is directly contradicted by American farm yields and by the fact that tropical Brazil is a leading competitor in exports of corn (maize) and leads the US in exports of soybeans. See links under Defending the Orthodoxy and Change in US Administrations – Appointment.
Updated US National Assessment: Writing for the UK Global Warming Policy Foundation, Paul Homewood examines the latest US National Climate Assessment (2018). As a career accountant in industry, Homewood believes in data, not speculative models. Using data mainly from NOAA, he finds:
“• Average temperatures have risen by 0.15°F/decade since 1895, with the increase most marked in winter.
• There has been little or no rise in temperatures since the mid-1990s.
• Summers were hotter in the 1930s than in any recent years.
• Heatwaves were considerably more intense in decades up to 1960 than anything seen since.
• Cold spells are much less severe than they used to be.
• Central and Eastern regions have become wetter, with a consequent drastic reduction in drought. In the west, there has been little long-term change.
• While the climate has become wetter in much of the country, evidence shows that floods are not getting worse.
• Hurricanes are not becoming either more frequent or powerful.
• Tornadoes are now less common than they used to be, particularly the stronger ones.
• Sea-level rise is currently no higher than around the mid-20th century.
• Wildfires now burn only a fraction of the acreage they did prior to the Second World War.
In short, the US climate is in most ways less extreme than it used to be. Temperatures are less extreme at both ends of the scale, storms less severe and droughts far less damaging. While it is now slightly warmer, this appears to have been largely beneficial.”
When it comes to what is happening in the US, look at the data, not what NOAA and politicians say is happening. See links under Challenging the Orthodoxy.
Show Me Evidence! Australian Senator Malcolm Roberts from Queensland may be one of the loneliest politicians around. He is taking on Australia’s national science agency, CSIRO, the Commonwealth Scientific and Industrial Research Organisation, over its models and its claims about the dangers of global warming.
In so doing, he is posting a series of papers and interviews with noted scientists. The interviews include Astrophysicist Willie Soon, who is a director of SEPP; atmospheric temperature measurement pioneer John Christy; and Australian climate modeler David Evans. In his interview, Soon states that science is a wonderful tool but is often misused. He exposes what he calls “scientific malpractice” by Science Magazine and the Australian Broadcasting Corporation concerning the paper Marcott et al. (2013). This paper, based on proxy data, claimed that current warming is unprecedented, with temperatures higher than those of 90% of the entire Holocene (which began about 11,700 years ago). Current temperatures are close to, but do not yet exceed, the peak warming of the Holocene. The paper was quickly debunked but may appear again in a UN IPCC report.
Christy discussed models and how poorly they perform against actual atmospheric data: they are too sensitive to additional CO2 increasing the greenhouse effect, and they do not include the real processes by which the earth loses energy to space. Demonstrating the absurdity of the modeling and of efforts to justify such sensitivity, Christy stated that one model was run with the presumed conditions of the Early Eocene, 55 million years ago, discussed in last week’s TWTW. The model warmed the planet to 55 ºC (131 ºF).
He also stated that the “climate sensitivity” (the temperature rise due to a doubling of atmospheric CO2) in current models varies by a factor of three. It is hard to say that this type of climate science is based on physics when the models vary so widely. [TWTW note: Imagine calculating the lunar landing if gravity models varied by that much.] One of the problems is that the models are based on parameterizations and speculative theories (hypotheses), yet our ignorance of the climate system is enormous. For a Congressional hearing on controlling CO2, Christy used a model to remove all emissions from the US (made them all disappear), and the resulting change was less than the actual natural monthly variation.
Climate modeler David Evans questioned whether it is possible to simulate the earth’s atmosphere with the multi-layered grid system used in models. Weather values, such as temperature and humidity, are represented by numbers, but many key features, such as clouds and updrafts, are much smaller than the cells in the grid. TWTW would add that modeling water vapor, discussed above, and the three phases of water are other major problems. It would be interesting to see a US Congressional committee hearing conducted in the manner in which Roberts conducted his interviews. Far too often, members of Congress are more interested in parading their views than in learning what others think or what the actual facts are. See links under Challenging the Orthodoxy.
Urban Heat Island Effect: The urban heat island effect refers to the way urbanization raises measured surface temperatures. This may be one source of the models’ great overestimates of atmospheric temperature rise from increasing CO2. Two studies indicating that this effect is not properly accounted for in climate models are presented in the links under Review of Recent Scientific Articles by CO2 Science and Measurement Issues – Surface.
Significant Feat: Earlier, TWTW had little use for “The Limits to Growth” by the Club of Rome; the models were poorly designed and tested. But an article by energy writer Daniel Yergin changed this view. After George Mitchell of Mitchell Energy read The Limits, he became concerned about whether he could supply Chicago with natural gas, as he was contracted to do. According to Yergin:
“In 1981, he read the draft of a journal article by one of his geologists. The article offered a hypothesis that ran counter to what was taught in geology and petroleum engineering classes. It suggested that commercial gas could be extracted deep underground from very dense rock—denser than concrete. This was the source rock, the ‘kitchen’ in which organic material was ‘cooked’ for several million years and transformed into oil or gas. According to the textbooks, the oil and gas then migrated into reservoirs, from which it could be extracted.
“It was thought at the time that oil and gas might still remain in the shale but could not be produced on a commercial basis because they could not flow through the dense rock. The draft article disagreed. Mitchell, beset by worries about the contract for Chicago, became convinced that here might be the road to his company’s salvation. There had to be a way to prove the received wisdom wrong.”
The rest is history as seen by the financial success of hydraulic fracturing combined with horizontal drilling. See link under Oil and Natural Gas – the Future or the Past?
Number of the Week: 64%: Many alarmists are claiming greater storms and more severe weather. For the Atlantic, part of the problem is that NOAA is now naming storms that previously were ignored or not even known to exist. The most useful metric for estimating storms and their severity is Accumulated Cyclone (Hurricane) Energy. Paul Homewood looks at the data (Sep 9) and writes:
“While the Accumulated Cyclone Energy is running 28% above average in the Atlantic, globally it is well below, at 64%.
“Global hurricane numbers are also lower this year, as are the number of major hurricanes. There is no evidence of increasing hurricane frequency or intensity, despite the BBC’s attempts to mislead otherwise.”
Do not believe that what is happening in the Atlantic is happening everywhere. See links under Changing Weather and http://climatlas.com/tropical/
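For readers unfamiliar with the metric, Accumulated Cyclone Energy is conventionally computed from 6-hourly best-track records of maximum sustained wind. The sketch below uses made-up wind values for a hypothetical storm, not Homewood's data or any real storm:

```python
# Sketch of how Accumulated Cyclone Energy (ACE) is conventionally
# computed from 6-hourly maximum sustained winds, in knots.
def ace(six_hourly_winds_kt):
    # ACE = 1e-4 * sum of squared winds, counting only records at
    # tropical-storm strength (>= 35 kt) or above.
    return 1e-4 * sum(v ** 2 for v in six_hourly_winds_kt if v >= 35)

# Hypothetical storm track: spin-up, peak, and decay (illustrative only).
storm = [30, 40, 55, 70, 90, 85, 60, 45, 30]
print(f"ACE for this storm: {ace(storm):.2f} (units of 10^4 kt^2)")
```

Because the wind is squared, a few intense storms can dominate a season's total, which is why ACE is a better severity measure than a simple count of named storms.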
A California Town’s Fire-Protection Plans Hit Red Tape, Then it Burned to the Ground
The North Complex fire decimated Berry Creek just as work was to start on forest-thinning projects
By Jim Carlton, WSJ, Sep 17, 2020
The article starts:
“BERRY CREEK, Calif.—As the smoke from the nearby fire-devastated town of Paradise cleared in 2018, local officials were trying to gain approval of forest-thinning projects to help this mountain community avoid a similar fate.
“Nearly two years after they first applied for approval from the state of California, the contract for one of the projects went out for bidding on Sept. 4. At the same time, work was about to start on the other after a months-long wait caused in part by the coronavirus pandemic.
“Then just four days later, the North Complex fire roared through this community of 2,500, wiping out most of the homes and leaving at least 10 people dead, more than one-third of the total fatalities from wildfires this year in California.
“‘I tried to work with anyone who would listen to avoid what happened,’ Denise Bethune, a Berry Creek fire safety coordinator, said from her son’s residence in the nearby city of Chico, where she and her husband have been living since their two-bedroom home burned to the ground.
“Among the factors contributing to this year’s exceptional wildfire season are climate change, which causes higher temperatures and longer droughts, and poor forest management, according to scientific experts. State and local governments haven’t done enough to thin dry brush and dead trees that burn easily around populated areas, they say.
“Few places have felt the consequences more severely than Butte County, a rural area north of Sacramento where 19 large fires have scorched some 400,000 acres since 1999.
“Earlier this week, Loren Gill drove into what remains of Berry Creek in an SUV to pick through some of the rubble of what had been a park community center. Nearby lay the ruins of Berry Creek Elementary, with a sign of its mascot, a Native American brave, charred.
“‘This is unbelievable,’ Mr. Gill, 78 years old, said as he drove past dozens of leveled homes, some with only the chimneys still intact. ‘Just a nightmare.’
“The Butte County Fire Safe Council on Oct. 26, 2018, received an $836,365 grant from the California Department of Forestry and Fire Protection, or Cal Fire, to remove trees and brush from 234 acres. The plan was to create fire breaks along two of the main evacuation roads and for a development of homes around Lake Madrone.
“Getting approval to spend the money would take almost two years.”
After discussing who was to blame for the delay, and speculating about the extent to which the fire might otherwise have been contained, the article concludes:
“It took 17 months to get the final go ahead from Cal Fire in March. Then a new problem hit: the coronavirus pandemic, which Ms. Bethune said put some of that work on hold.
“After approval for the plan to create fire breaks along the evacuation roads came in July, Ms. DeAnda said her group had hoped to start both projects this fall and complete them within a month.
“‘We were just waiting on the contractor to get started,’ she said.
“The fire has effectively negated the need for that fuel work, now. The forest of pine and oaks surrounding the town was burned so severely that in some places only skeleton trees are left, devoid of leaves.
“Volunteer Fire Chief Reed Rankin, who lost his own home, said all he and other first responders could do was evacuate people ahead of a firestorm driven by 40 miles-per-hour winds.
“‘We’ve got to rethink how we are going to manage the forests,’ said Mr. Rankin, after sifting through the rubble of a destroyed fire station. ‘I’m still at a loss, still numb.’”