The Week That Was: October 26, 2019, Brought to You by www.SEPP.org
By Ken Haapala, President, Science and Environmental Policy Project (SEPP)
Alarmists in Local Media – Using Surface Data: The huge propaganda push by the UN Intergovernmental Panel on Climate Change (IPCC) on the need for “climate protection” has resulted in many strident claims in the local media, many becoming colorful slogans such as “climate crisis”, “climate chaos”, etc. Joseph D’Aleo, a Certified Consulting Meteorologist and a Fellow of the American Meteorological Society (AMS), has long addressed false ideas about climate change, both within the AMS and in public. D’Aleo was a founder of the Weather Channel and of WeatherBell Analytics, LLC, and is a pioneer in seasonal forecasts based on evidence and statistical modeling. As with many well-known skeptics who rebut the unsubstantiated claims that carbon dioxide is causing dangerous global warming, D’Aleo has been called a shill for oil companies and has suffered many other politically motivated attacks.
On his blog, ICECAP, D’Aleo discusses some of the challenges he has faced in public and how he has addressed them. The post links to “Alarmist Claim Rebuttals,” where he and others confront many of the myths produced by the IPCC with physical evidence – something sorely lacking in the IPCC summaries and in the US National Climate Assessment. Scientific knowledge is built with physical evidence, not with speculative mathematical models that fail basic testing against physical evidence. Among the many myths D’Aleo and his group address are heat waves, hurricanes, tornadoes, droughts and floods, wildfires, and the claim that carbon dioxide is a pollutant hazardous to health.
D’Aleo relates that in 1999 James Hansen of NASA recognized of the U.S. temperature record: “In the U.S., the warmest decade was the 1930s and the warmest year was 1934.” D’Aleo says that due to political pressure, the US surface record has since been adjusted to reflect a warming trend that does not exist in the measurements. The adjustment is disguised by various tricks such as:
“The alarmists in National Climate Assessment finessed the issue by creating a ratio of record highs to record lows. Both have been declining since the 1930s, but record lows are declining faster thanks to urban heat island nighttime temperature contamination. Colors were appropriately chosen to give the illusion heat records are rising rapidly although as we have shown they are declining.”
“Most of the warming in daytime average readings is with nighttime lows and related to urbanization….”
“Importantly, NOAA ‘adjustments’ to the data shown below have cooled the past, producing an upward trend where the Measured data linear trend would be basically flat. Clearly, this data adjustment yielded Reported data that increased the chance that subsequent months and years will routinely rank among the records.”
Comparison of the graph of USHCN temperature trends from 1890 to about 2016 with that of NOAA-adjusted data shows the stark difference between measured temperatures and reported adjusted temperatures. Adjusted temperatures early in the record are lower than measured temperatures by as much as 1.5 degrees F, giving a false impression of warming. (Tony Heller frequently cites specific measurements on his web site.)
As D’Aleo states:
“Overall, the urban heat island is a far more significant anthropogenic factor than CO2. You hear it every night on local TV forecasts. It was adjusted for in the first US data set in 1989. Tom Karl Director of NCDC said if they didn’t, an artificial warming of 6F/century would result. He was pressured to remove it in version 2 a decade ago to get the appearance of warming the government wanted.”
D’Aleo gives an excellent review of many of the assertions by those promoting “climate protection” that are contradicted by physical evidence. Also, he exposes a number of the false claims promoted by highly politicized sections of US government entities. His discussion on surface temperatures covers some of the reasons why TWTW does not rely on surface temperatures or on reports from these specific entities of NOAA and NASA. See links under Challenging the Orthodoxy and https://alarmistclaimresearch.files.wordpress.com/2019/05/ac-rebuttal-heat-waves-081819.pdf (p. 5).
The Greenhouse Effect – Equilibrium Climate Sensitivity (ECS): On his blog, Roy Spencer takes up the issue raised by statistician Nic Lewis about the paper by Gregory et al., which claims that the sensitivity of the Earth to changing CO2 can be derived from studies of historic climate change. Such studies face the problem of using statistical techniques to estimate the relationship between top-of-the-atmosphere changes in radiation, as measured by satellites, and temperature trends. As mentioned in the last TWTW, Lewis demonstrates that the statistical techniques used in the Gregory paper were weak and that its authors may not fully understand the weaknesses of the statistics they use.
Spencer discusses that ten years ago he and Danny Braswell:
“…published a series of papers pointing out that time-varying radiative forcing generated naturally in the climate system obscures the diagnosis of radiative feedback. Probably the best summary of our points was provided in our paper “On the diagnosis of radiative feedback in the presence of unknown radiative forcing” (2010). Choi and Lindzen later followed up with papers that further explored the problem.”
“The bottom line of our work is that standard ordinary least-squares (OLS) regression techniques applied to observed co-variations between top-of-atmosphere radiative flux (from ERBE or CERES satellites) and temperature will produce a low bias in the feedback parameter, and so a high bias in climate sensitivity. [Spencer states he provides a simple demonstration.] The reason why is that time-varying internal radiative forcing (say, from changing cloud patterns reflecting more or less sunlight to outer space) de-correlates the data (example below). We were objecting to the use of such measurements to justify high climate sensitivity estimates from observations.
“Our papers were, of course, widely criticized, with even the editor of Remote Sensing being forced to resign for allowing one of the papers to be published (even though the paper was never retracted). Andrew Dessler objected to our conclusions, claiming that all cloud variations must ultimately be due to feedback from some surface temperature change somewhere at some time (an odd assertion from someone who presumably knows some meteorology and cloud physics).
“So, even though the new Gregory et al. paper does not explicitly list our papers as references, it does heavily reference Proistosescu et al. (2018), which directly addresses the issues we raised. These newer papers show that our points were valid, and they come to the same conclusions we did — that high climate sensitivity estimates from the observed co-variations in temperature and radiative flux were not trustworthy.” [Boldface was italics in the original.]
Spencer writes the new Gregory paper is important. It:
“…is extensive and makes many good conceptual points which I agree with. Jonathan Gregory has a long history of pioneering work in feedback diagnosis, and his published research cannot be ignored. The paper will no doubt figure prominently in future IPCC report writing.”
Unfortunately, a huge problem remains: natural climate variation, which tends to be eliminated by the approaches used in many studies, as Gregory illustrates. Spencer writes:
“In other words, their methodology would seem to have little to do with determination of climate sensitivity from natural variations in the climate system, because they [Gregory] have largely removed the natural variations from the climate model runs. The question they seem to be addressing is a very special case: How well can the climate sensitivity in models be diagnosed from 30-year periods of model data when the radiative forcing causing the temperature change is already known and can be subtracted from the data? (Maybe this is why they term theirs a “perfect model” approach.) If I [Spencer] am correct, then they [Gregory] really haven’t fully addressed the more general question posed by their paper’s title: “How accurately can the climate sensitivity to CO2 be estimated from historical climate change?” The “historical climate change” in the title has nothing to do with natural climate variations.
“Unfortunately — and this is me [Spencer] reading between the lines — these newer papers appear to be building a narrative that observations of the climate system cannot be used to determine the sensitivity of the climate system; instead, climate model experiments should be used. Of course, since climate models must ultimately agree with observations, any model estimate of climate sensitivity must still be observations-based. We at UAH continue to work on other observational techniques, not addressed in the new papers, to tease out the signature of feedback from the observations in a simpler and more straightforward manner, from natural year-to-year variations in the climate system. While there is no guarantee of success, the importance of the climate sensitivity issue requires this.” [Boldface in original]
“And, again, Nic Lewis is right to object to their implicit lumping the Lewis & Curry observational determination of climate sensitivity work from energy budget calculations in with statistical diagnoses of climate sensitivity, the latter which I agree cannot yet be reliably used to diagnose ECS.”
Spencer goes on to give a simple demonstration of the problem of diagnosing feedbacks. This discussion goes to the critical issue: the physics is not simple and there is no generally accepted theory of the greenhouse effect, regardless of what the IPCC and its followers claim.
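The bias Spencer describes can be seen in a toy energy-balance model. The sketch below is TWTW's illustration of the argument, not Spencer's actual demonstration; the heat capacity, feedback parameter, and noise settings are arbitrary assumptions. Random internal radiative forcing (e.g., shifting cloud patterns) both drives temperature and contaminates the measured top-of-atmosphere flux, so an ordinary least-squares regression of flux on temperature comes out far below the true feedback parameter:

```python
import numpy as np

rng = np.random.default_rng(0)
C = 7.0        # assumed heat capacity (W yr m^-2 K^-1) - illustrative only
lam = 3.0      # assumed true feedback parameter (W m^-2 K^-1)
dt = 1.0 / 12  # monthly time step, in years
T = N = 0.0
Ts, Rs = [], []
for _ in range(1200):  # 100 years of monthly data
    # red-noise internal radiative forcing, e.g. random cloud variations
    N = 0.9 * N + rng.normal(0.0, 1.0)
    # temperature responds to the forcing minus the feedback
    T += dt * (N - lam * T) / C
    # measured TOA net flux anomaly: feedback response minus the forcing itself
    Rs.append(lam * T - N)
    Ts.append(T)

slope = np.polyfit(Ts, Rs, 1)[0]
print(f"OLS slope = {slope:.2f} vs true feedback = {lam:.1f}")
# the fitted slope falls well below lam, i.e. a low-biased feedback
# parameter and hence an inflated climate sensitivity estimate
```

Because the internal forcing N appears in both the temperature and the measured flux, the two series are de-correlated exactly as Spencer describes, and the regression cannot recover the true feedback of 3.0 no matter how long the record runs.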
The discussion of the Gregory paper on his blog led Spencer to discuss a more general question: Does the Climate System Have a Preferred Average State? To which TWTW would add: If It Does, Can Humans Comprehend It? This is an issue for the near future. See links under Challenging the Orthodoxy.
Issues With Atmospheric Temperature Trends: Issues with satellite measurements of atmospheric temperatures have been discussed since the 1990s. The most significant was orbital decay, which introduced a spurious cooling trend. Such errors were quickly corrected once confirmed. The scientists at the Earth System Science Center at the University of Alabama in Huntsville publicly release their findings monthly, making them, and the need for any corrections, open for public review and criticism.
In 2017, a group of climate scientists published a paper in Nature Geoscience explaining why they think models differ from observations in estimates of temperature trends in the troposphere (the part of the atmosphere where water vapor is the dominant greenhouse gas and where weather occurs). Climate scientist Zeke Hausfather, who as an environmental economist worked on the Berkeley Earth program evaluating land surface temperature records, published a review of the 2017 paper in Carbon Brief. Among the issues Hausfather discusses are:
“Much of our historical temperature data comes from weather stations, ships, and buoys on the Earth’s surface. Since 1979 temperature records of the atmosphere are also available from satellite-based microwave sounding units (MSU). These measure the “brightness” of microwave radiation bands in the atmosphere, from which scientists can estimate air temperatures.
“However, the bands measured by the satellite instruments cannot easily provide the temperature of a specific layer of the atmosphere. Researchers have identified particular sets of bands that correspond to the temperature of the lower troposphere (TLT) spanning roughly 0 to 10 km, the middle troposphere (TMT) spanning around 0 to 20 km and the lower stratosphere (TLS) spanning 10 to 30 km.
“Unfortunately, these bands tend to overlap a bit. For example, TMT estimates will include part of the lower stratosphere, while TLT estimates will include some surface temperature. These overlaps matter because different parts of the atmosphere are expected to react very differently to climate change.”
The article criticizes the UAH data for showing less warming than estimates from Remote Sensing Systems (RSS), two executives of which participated in the Nature study. Carl Mears, a co-founder of RSS, stated:
“’In general, I think that the surface datasets are likely to be more accurate than the satellite datasets. The between-research spreads are much larger than for the surface data, suggesting larger structural uncertainty’.”
No explanation for this claim is given, and it is questionable. As D’Aleo states above, nighttime warming from the urban heat island effect creates great uncertainty in the surface data. Hausfather’s essay concludes with:
“Ultimately, the [Nature] paper finds that while there is a mismatch between climate models and observations in the troposphere since the year 2000, there is little evidence to-date that the model/observation differences imply that the climate is less sensitive to greenhouse gases. The results suggest that while these short-term differences between models and observations are a subject of great scientific interest, it does not diminish the reality of long-term human-driven warming.”
Additional papers by Christy et al. present physical evidence, compared with models, that the difference is real and significant. See links under Models v. Observations and http://berkeleyearth.org/team/zeke-hausfather/
Sue Them All! The attorney general of New York has taken Exxon to court in what trial lawyer Francis Menton calls “A Serious Contender For Stupidest Litigation In The Country Goes To Trial.” Menton’s biography as a contributor to the Federalist Society states, in part:
“Francis J. Menton, Jr. is a partner in the Litigation Department and Co-Chair of the Business Litigation Practice Group of Willkie Farr & Gallagher LLP in New York. Mr. Menton specializes in complex and technical commercial litigation, principally contract and securities claims. He has a nationwide trial practice and has tried cases in state and federal courts including Colorado, Kansas, Massachusetts, Michigan, New York, Puerto Rico, Texas, Virginia, and Washington.”
Other candidates for Menton’s title of “Stupidest Litigation in the Country” include: the one that was “brought in Oregon by a group of minor children claiming a ‘constitutional right’ to a ‘stable climate,’ and seeking a nationwide injunction forcing the phase-out of all use of fossil fuels in the United States”; ones by municipalities in California and New York seeking compensation for sea level rise and other “climate harms.” Some of the cases have been dismissed, others are pending.
Menton concludes the essay with:
“But it’s not just that this case, now in its third iteration, has no discernible relationship to its supposed purpose of investor protection. (Are we really to believe that the NY AG is trying to help you make more money by investing in Exxon?) And then consider the category of supposedly addressing “climate change.” That is nowhere stated as a purpose of the litigation in its current configuration, and I would really challenge anyone to come up with an articulation of how this case might have anything to do with that cause. Other than, I suppose, “we hate Exxon.”
As with any litigation, it is impossible to predict with reliability how it will go. New York has a different set of rules than other states. Menton sat through part of the oral arguments and no doubt will report on developments. Note for disclosure: several years ago, Ken Haapala was on some telephone conference calls that included Menton. See links under Litigation issues and https://fedsoc.org/contributors/francis-menton
California Fires: Last week’s TWTW briefly discussed regulated utilities and how ill-considered policies implemented by politicians divert funds needed for maintenance and public safety to special causes, such as alternative electricity generation. On October 25, The Wall Street Journal published an editorial detailing some of the politicized demands politicians have placed on the now-favorite target of “corporate greed” – Pacific Gas and Electric (PG&E). This week, PG&E declared another blackout, cutting power to over 900,000 consumers. Another major fire, driven by fall winds, is threatening cities in Sonoma County, north of San Francisco.
According to the EIA, in 2017 the average retail price for electricity was 16.06 cents/kWh in California, while the national average was 10.48 cents/kWh. In the lower 48 states, California prices were exceeded only by those in New England. Customers in blackout-hit California have yet to feel the full effect of the state law mandating that utilities obtain 33% of electric generation from renewables such as wind and solar by 2020 and 60% by 2030. See links under California Dreaming, Article # 1, and https://www.eia.gov/electricity/state/
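A quick calculation on the EIA figures just cited shows how large the gap is – California's 2017 retail price ran roughly half again the national average:

```python
ca, us = 16.06, 10.48  # 2017 average retail electricity price, cents/kWh (EIA figures above)
premium = (ca / us - 1) * 100
print(f"California premium over the national average: {premium:.0f}%")
# prints: California premium over the national average: 53%
```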
Number of the Week: $11.69 billion, up 31%. According to a report in the District of Columbia newspaper The Hill, the Interior Department reported that revenue collected from energy production on public lands rose in Fiscal Year (FY) 2019 by 31 percent over FY 2018, to $11.69 billion. About $2.44 billion was disbursed to states and Indian tribes.
The increase can be attributed, chiefly, to opening public lands for oil and gas production, including hydraulic fracturing. Some of the increase may have come from more accurate accounting of fair market value in resource sales. Unfortunately, the news report did not link or identify the Department of Interior report.
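Working backward from the figures in The Hill’s report, the 31 percent increase implies FY 2018 revenue of roughly $8.9 billion – a back-of-envelope calculation by TWTW; the news report itself does not give the FY 2018 total:

```python
fy2019 = 11.69          # $ billion, FY 2019 revenue reported by Interior
fy2018 = fy2019 / 1.31  # undo the reported 31% year-over-year increase
print(f"Implied FY 2018 revenue: ${fy2018:.2f} billion")
# prints: Implied FY 2018 revenue: $8.92 billion
```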
Separately, the Oil and Gas Journal reported that:
“American gas end-users have realized $1.1 trillion in savings since 2008 as a result of increased gas production in the Shale Crescent USA region (Ohio, Pennsylvania, and West Virginia), according to a new economic analysis released Oct. 21.”
Americans are enjoying the benefits of plentiful, reliable, and affordable oil and gas, except in states that penalize such production, processing and use. See links under Washington’s Control of Energy, Oil and Natural Gas – the Future or the Past?, and https://www.eia.gov/electricity/state/
1. Fires and Blackouts Made in Sacramento
Newsom tries to deflect blame, but PG&E is the agent of his policies.
Editorial, WSJ, Oct 25, 2019
TWTW Summary: Amplifying a theme discussed in last week’s TWTW – that politicians and their appointed regulators have abridged the concept of a regulated utility, one serving the public and customers with reliable electricity and/or gas at the lowest possible cost – the WSJ editorial discusses specific examples of this political failure:
“After again shutting power to hundreds of thousands this week, California’s utility PG&E disclosed Thursday that it had discovered a broken jumper cable by the ignition site of a wildfire blazing across Sonoma County. The company has warned of more blackouts this weekend and perhaps for the next decade as it refurbishes its aging grid.
“Gov. Gavin Newsom is trying to deflect political blame. “It’s about dog-eat-dog capitalism meeting climate change. It’s about corporate greed meeting climate change. It’s about decades of mismanagement,” Mr. Newsom declared. But Democrats for years have treated PG&E as their de facto political subsidiary. The wildfires and blackouts are the direct result of their mismanagement.
“The state Public Utilities Commission is in charge of enforcing state safety laws and regulations, which can carry penalties of up to $50,000 per violation per day. Yet PG&E received no safety fines related to its power-grid management over the last several years. The commission has instead focused on enforcing the Legislature’s climate mandates.
“State law mandates that utilities obtain 33% of electric generation from renewables such as wind and solar by 2020 and 60% by 2030. Utilities must spend hundreds of millions of dollars each year to reduce the cost of green energy for low-income households. PG&E has prioritized political obeisance over safety.
“In 2018 PG&E spent $509 million on electric discounts for low-income customers in addition to $125 million for no-cost weatherization and efficiency upgrades for disadvantaged communities. Utilities also receive allowances from the state’s cap-and-trade program—$7.5 billion since 2012—to pay for other “ratepayer benefits” that reduce emissions.
“For instance, the Legislature in 2015 mandated that utilities spend $100 million annually on solar systems in low-income communities. This is on top of the $2.2 billion in customer rebates for rooftop solar installations, which utilities charged to ratepayers between 2007 and 2016. Under the state’s net-metering program, solar customers also get a break on their bills.
“Last year PG&E invested more than $150 million in battery storage and “sustainable” technologies, which was paid for by a special charge on ratepayers. PG&E is also spending $130 million over three years to install 7,500 electric-car charging stations and offers drivers a $800 “clean fuel” rebate.
“All of this has been part of a Democratic political strategy to use PG&E to advance their climate agenda without raising taxes. But Californians have instead paid through higher electric rates—PG&E rates are twice as high as in Oregon and Washington—while utilities have had to redirect capital and ratepayer revenue away from fortifying the grid and tree-trimming.
“Is it any wonder that electric equipment is malfunctioning? PG&E filed for bankruptcy in January amid tens of billions in liabilities for dozens of wildfires linked to its equipment. The utility says it doesn’t know if the failed jumper cable caused the Sonoma fire and that it had done repairs and inspections on the site.
“But PG&E customers are rightly furious. They’ve suffered inconvenience and financial losses due to power outages that start with little warning and may go on for days. Who can run a business or household this way? Sorry, kids, you’re going to have to do your homework by candlelight.
“Gov. Newsom is demanding that PG&E pay rebates to customers affected by the blackouts. The utility has declined, citing its bankruptcy debts, though it may have to follow the Governor’s orders if investors want to avoid getting wiped out. San Francisco has proposed buying some utility assets, and San Jose wants to turn it into a customer-owned cooperative.
“Democrats are accusing PG&E of putting profits over safety, but the utilities commission approves its return on equity based on what’s needed to attract private investment. Utility shareholders are typically older folks who rely on dividends for a reliable stream of income—not billionaire hedge funds.
“PG&E has prioritized serving its political overlords above all else. California’s return to the dark ages is a direct result of the Democratic political monopoly in Sacramento.” [Boldface added.]
2. Trump’s Gift to California
The EPA liberates more water from smelt and salmon regulations.
Editorial, WSJ, Oct 24, 2019
TWTW Summary: After starting with wildfires and rolling blackouts, the editorial states [Boldface added]:
“… the Trump Administration this week brought welcome relief to the Golden State by allowing more water to be sent to farmers and folks in the south. Will California liberals accept the deregulatory gift?
“Federal biological opinions designed to protect smelt and salmon have limited how much water can be exported from northern California through the Sacramento-San Joaquin River Delta. Environmentalists say the delta pumps can suck in fish and indirectly limit their food supply, increase contaminants and boost predator species.
“There’s little scientific evidence for any of this, but the Ninth Circuit Court of Appeals deferred to federal regulators. More than 100 billion gallons of water each winter—enough to sustain millions of households—have since been flushed into San Francisco Bay. During the recent seven-year drought, folks in Southern California experienced water restrictions while farmers in the fertile Central Valley—where nearly all of the country’s pistachios, walnuts, garlic and plums are grown—resorted to pumping groundwater.
“Increased groundwater pumping has caused land to subside and water to become contaminated. Environmental Protection Agency Administrator Andrew Wheeler last month wrote a letter to Gov. Gavin Newsom warning that 202 water systems in the state violate public health standards, including 67 that tested high for arsenic. Most are in low-income rural communities.
“Despite regulatory protections, smelt and salmon have continued to disappear. A state survey last fall found no smelt in the delta though some are probably still swimming around. In 2008 there were 23. Enter the Trump Administration, which this week revised the smelt and salmon biological opinions to help the fish recover while meeting demands for water.
“The new opinions allow pumping to be adjusted based on real-time monitoring of fish rather than time of year. Pumping will be reduced if scientists in the delta determine that fish are in danger. The U.S. Bureau of Reclamation will also withhold more water in Shasta Reservoir during wet years to help migrating salmon and spend $1.5 billion over the next decade for species recovery.
“Environmentalists are furious, but the new rules will especially benefit rural communities now represented by Democrats in Sacramento and Washington. Last month Mr. Newsom vetoed a bill that would have enshrined Obama-era environmental regulation in state law largely because it would have restricted his flexibility to increase water to farmers and to southern California. He should also send a thank you note to President Trump.”