Weekly Climate and Energy News Roundup #286

Brought to You by www.SEPP.org – The Science and Environmental Policy Project

By Ken Haapala, President, Science and Environmental Policy Project (SEPP)

Quote of the Week. “Do not feel absolutely certain of anything.” – Bertrand Russell, “A Liberal Decalogue” from his autobiography.

Number of the Week: 87 Seconds

Which Science – Empirical? Modern empirical science is a great accomplishment of civilization and a gift to it. It is a process for evaluating many diverse ideas, weeding out those ideas that fail necessary tests, and modifying those ideas that need improvement. Sometimes the process may take one hundred years or more. Such is the case of Einstein mathematically speculating the existence of gravitational waves – ripples in space and time created by the motion of massive objects in the universe.

The testing was daunting, and for four decades the National Science Foundation (NSF) has been allocating funds to support it. The Laser Interferometer Gravitational-wave Observatory (LIGO) is an elaborate experiment, costing some $1.1 billion and using highly sensitive instrumentation specially designed for the project. It involved the construction of twin facilities with arms 4,000 m (13,000 feet) long, sited about 1,900 miles apart. In September 2015, the equipment detected gravitational waves, substantiating indirect evidence first observed in 1974. The three scientists who led the development of LIGO were honored with the 2017 Nobel Prize in Physics. All involved, and the NSF, are to be praised for their diligence in funding this multi-decadal experiment, whose construction started in 1994. See links under Other Scientific News.


Which Science – Institutional, Bureaucratic? Unfortunately, sometimes the leaders of scientific institutions become stuck on an idea that needs significant modification, yet fail to modify it. Such was the case when Eric Davidson, President of the American Geophysical Union (AGU), and Marcia K. McNutt, President of the National Academy of Sciences (NAS), dismissed the proposed Red/Blue Team debate in “Red/Blue and Peer Review,” published in EOS, a publication of AGU. Previously, Ms. McNutt was editor-in-chief of Science magazine, the leading publication of the American Association for the Advancement of Science (AAAS).

The well-written essay by Davidson and McNutt may be very convincing to many readers, but it asserts a consensus that does not exist. Its history of greenhouse gas theory begins with Arrhenius in the 1890s, then jumps forward:

“Modern climate science started in the 1960s, when general circulation models under development were modified to incorporate the effects of CO2 and water vapor to understand their impact on climate [Forster, 2017]. Not long thereafter, scientists systematically considered what else might explain the new warming trends that started in the 1970s and that continue today.”

The essay mentions various hypotheses that have been tested and rejected and asserts:

“• theoretical calculations of the greenhouse effect based on well-known physics and chemistry


“• fingerprinting the detailed patterns of climate change caused by different human and natural influences, such as differences among regional patterns of land surface warming, ocean heat content, and sea ice extent that are consistent with an anthropogenic effect


“• growing confidence in globally distributed measurements of climate change and its impacts and greater skill in matching those observations to increasingly sophisticated computer models that include the various land, ocean, and atmospheric greenhouse gas sources, sinks, and feedbacks.”

The authors further assert:

“The scientific community has gradually shifted, on the basis of evidence, from being predominantly skeptical in the 1970s that the human fingerprint on climate could be demonstrated to today being convinced that there are no other plausible explanations besides the cumulative effect of the last 150 years of burning fossil fuel for the recent extent of changing climate. Natural climatic variation is ongoing, but it cannot explain the current speed and amount of observed change. The science does not stand still; studies on how clouds may moderate the rate of climate change and how aerosols (particulate pollution) affect clouds and can offset some warming, for example, are still areas of active research in which hypotheses are being tested and challenged with characteristic scientific skepticism.”

The essay briefly illustrates the work of the UN Intergovernmental Panel on Climate Change (IPCC), whose findings have been adopted by many US scientific bodies. Essentially, the essay asserts that a Red/Blue Team debate is unnecessary and concludes with:

“Our peer-review and Academy report processes are not flashy or entertaining, but they are inclusive and tried and true and have helped build great institutions of science. They provide the evidence-based analysis of climate science and all other scientific disciplines that are so important for informing the public policy decisions that we rely upon to protect our security, health, safety, environmental integrity, and economic prosperity.”

Overall, it is a powerful argument. But the skeptic may assert: show me your physical evidence that the principal cause of climate change is greenhouse gas emissions, particularly carbon dioxide, CO2. Models that have not been verified and validated are not physical evidence.

The omissions in the essay are especially telling. The rigor of the peer-review process is highly doubtful. In the early 1990s, Science published the seminal work of Roy Spencer and John Christy showing how data collected by satellites can be used to estimate atmospheric temperatures around the globe, except for the polar regions. Subsequently, minor errors in calculations were found, which were promptly corrected. Greenhouse gas warming, from water vapor and CO2, occurs in the atmosphere, where there has been little warming.

By the mid-1990s it became obvious that the atmosphere was not warming as the climate modelers had projected – namely, that a doubling of CO2 would lead to an increase in global surface temperatures of 3 degrees C, plus or minus 50% (a range of 1.5 to 4.5 degrees C) – as speculated in the Charney Report published by the NAS in 1979. The influence of CO2 on temperatures was less than speculated, even when amplified by water vapor. Yet, under editor Donald Kennedy, Science chose to ignore all work questioning this erroneous projection, thus biasing climate science as published in the US.

Rather than fully exploring possibilities that the total influence of CO2 may be far less than that speculated in the Charney Report, the IPCC has stuck with the estimate through five Assessment Reports, starting in 1990. Its reports are marred by their reliance on surface temperature data rather than the far more comprehensive atmospheric data, which cover where the actual greenhouse effect occurs. Its understanding of the influence of CO2 has not advanced in twenty-five years, and the science discussed in the essay has not advanced since the 1979 Charney Report – thirty-eight years ago. The estimate remains 3 degrees C, plus or minus 50%, and the so-called distinct human fingerprint is yet to be found. The differences between climate model forecasts and observations are increasing.

Ironically, EOS was among the first to publish a survey of the fabled 97% consensus, based on highly questionable classifications made after the survey was taken. Science published a fawning review of “Merchants of Doubt,” which personally attacked Frederick Seitz, a former President of the NAS, and three other scientists, based on accusations with little evidence. When Fred Singer, the only one of the four still living, wrote a rebuttal, Science refused to publish it, citing lack of space. Perhaps Mr. Davidson and Ms. McNutt are unaware of the history of these publications. [The late Frederick Seitz was Chairman of SEPP.]

If anything, this essay demonstrates the desperate need for a frank, open debate on the empirical science of climate change and on what work is needed to reach prudent policy conclusions on the use of fossil fuels. The essay’s omissions are what matter for our understanding; assertions of a false consensus are meaningless. See links under Defending the Orthodoxy.


TWTW Bias: Please note that TWTW recognizes an inherent bias in its reports. Generally, TWTW does not have the space or time to report science that supports vague claims that CO2 emissions will cause significant warming sometime in the future. Instead, it focuses on major issues regarding the claimed need to control CO2 emissions, particularly the lack of physical evidence. As one who critiqued the “state-of-the-art” energy models used by the US government in the 1970s to declare the earth would soon run out of oil, this author is very skeptical of accepting long-term projections from models that have not been thoroughly tested.


CLOUD at CERN: Dennis Avery, co-author with Fred Singer of “Unstoppable Global Warming,” discusses the need for the EPA to review its finding that greenhouse gases, mainly CO2, endanger public health and welfare. In addition to noting other “omissions” in forming the “consensus” on global warming, Avery states:

“Last year CERN (the multi-billion-dollar Centre for European Nuclear Research) told CERN Courier subscribers that all the climate models must be re-done. CERN reported that its CLOUD experiment had used its huge particle accelerator and a giant cloud chamber to demonstrate that the sun and cosmic rays are the real ‘mystery factors’ in earth’s climate. The research supports the contention that CO2 is only a bit player.”

The article in the CERN Courier reveals the thinking of those experimenters on the importance of understanding the role of clouds on climate:

“’This is a huge step for atmospheric science,’ says lead-author Ken Carslaw of the University of Leeds, UK. ‘It’s vital that we build climate models on experimental measurements and sound understanding, otherwise we cannot rely on them to predict the future. Eventually, when these processes get implemented in climate models, we will have much more confidence in aerosol effects on climate. Already, results from CLOUD suggest that estimates of high climate sensitivity may have to be revised downwards.’”

Very simply, the science advocated by the IPCC, NAS, AGU, etc. does not have an adequate understanding of the role of atmospheric clouds to draw any conclusions, much less the conclusions these organizations have made. See links under Challenging the Orthodoxy.


Objections from the Orthodoxy? Nature Communications published a paper on a possible alternative source of energy from natural evaporation. Perhaps unknowingly, the paper questions the standard explanation for energy flow from the surface to space (OLR, Outgoing Long-Wave Radiation) in the Kiehl-Trenberth model, which is generally accepted by the orthodoxy.

The opening line of the abstract states: “About 50% of the solar energy absorbed at the Earth’s surface drives evaporation, …” If this is correct, then the standard Kiehl-Trenberth diagram of the Earth’s Global Energy Budget and the coefficients used by the National Center for Atmospheric Research (NCAR), and others, have issues. The diagram shows far less than 50% of solar energy at the surface going to evaporation. If so, there is not a consensus even in the Orthodoxy.

Writing about the orthodoxy’s response to the Millar et al. paper, which asserted that there are emission budgets that may be consistent with limiting warming to 1.5 C (as discussed in the September 23 TWTW), David Whitehouse of GWPF sums up a major problem among those advocating control of CO2.

“The Nature Geoscience affair has mixed science and politics into an unsatisfactory cocktail. There is nothing wrong in writing papers on climate change that ask questions, that annoy advocates and alarmists. There is something demeaning for science when scientists try to ‘clarify’ a message that they made clear in the first place. There is something antiscientific about suppressing debate.”

See links under Defending the Orthodoxy and Questioning the Orthodoxy.


Flexibility Not Instability: The devastation caused by Hurricane Maria in Puerto Rico was compounded by the failure of local representatives of the Federal Emergency Management Agency (FEMA) to have enough fuel on hand to run the diesel generators providing electricity to critically needed hospitals and other medical facilities. Almost all the island lost electrical power, and with it, communications. Drinking water was in short supply, and sanitary sewer service was almost non-existent. Almost immediately, the finger-pointing began, as if major hurricanes can be stopped.

Based on reports, the ports are full of relief supplies, but the problem is distributing the cargo to those who need it – there is a shortage of trucks and fuel.

Add to this that about 50% of the economy is based on manufacturing, especially pharmaceuticals and electronics, which requires stable, reliable electricity. Unfortunately, after years of mismanagement, the government-owned electric utility, a monopoly, is in bankruptcy, and the territorial government has a large debt burden.

Mark Mills argues against the faddish claim that green electricity is the key to rebuilding the electricity infrastructure.

“But Puerto Rico didn’t go dark because of how it produces electricity. The power plants survived. The wires distributing power got destroyed.

“A greener grid would do nothing to minimize suffering after the next hurricane.”

The folly of building solar generation was demonstrated by TWTW reader Dick Hose, who provided photos of the library of the Marine Science Institute (MSI) in Port Aransas, Texas, where Harvey made landfall. The two-story main building is concrete, built by the Corps of Engineers in the early 1940s, with initial additions designed to withstand hurricanes – these facilities are near the shore of the Gulf of Mexico. Later additions included solar panels on the roof.

After Harvey, the University of Texas library system sent a response team, which discovered some wet flooring, but the books were dry, with no apparent mold. The photos show the skeletal remains of the mounting brackets for the solar panels. The panels are part of the storm debris.

It is highly doubtful that low-density, unreliable solar power is suitable for hardening an electrical grid where high winds can be expected. See Article #2 and links under Challenging the Orthodoxy and Changing Weather.


Number of the Week: 87 seconds. According to the final report of the Australian Energy Market Operator (AEMO), it took 87 seconds for the electrical system of South Australia to go black, with 850,000 customers losing power. The gas turbines producing electricity continued to operate. The claim by some that Australia’s gas exports contributed to the problem is “green smoke.”

Perhaps the Black System event prompted the Trump administration to request that the US Federal Energy Regulatory Commission (FERC) consider new rules to pay more to power plants that are resilient – producing reliable, stable electricity with a dependable fuel supply. See Article #3 and links under Change in US Administrations and Energy Issues – Australia.



1. First, They Came for the Biologists

The postmodernist left on campus is intolerant not only of opposing views, but of science itself.

By Heather Heying, WSJ, Oct 20, 2017


After a comment on professional football, the author continues:

The revolution on college campuses, which seeks to eradicate individuals and ideas that are considered unsavory, constitutes a hostile takeover by fringe elements on the extreme left. Last spring at the Evergreen State College, where I was a professor for 15 years, the revolution was televised—proudly and intentionally—by the radicals. Opinions not fitting with the currently accepted dogma—that all white people are racist, that questioning policy changes aimed at achieving “equity” is itself an act of white supremacy—would not be tolerated, and those who disagreed were shouted down, hunted, assaulted, even battered. Similar eruptions have happened all over the country.

What may not be obvious from outside academia is that this revolution is an attack on Enlightenment values: reason, inquiry and dissent. Extremists on the left are going after science. Why? Because science seeks truth, and truth isn’t always convenient.

The left has long pointed to deniers of climate change and evolution to demonstrate that over here, science is a core value. But increasingly, that’s patently not true.

The battle on our campuses—and ever more, in K-12 schools, in cubicles and in meetings, and on the streets—is being framed as a battle for equity, but that’s a false front. True, there are real grievances. Gaps between populations exist, for historical and modern reasons that are neither honorable nor acceptable, and they must be addressed. But what is going on at institutions across the country is—yes—a culture war between science and postmodernism. The extreme left has embraced a facile fiction.

Postmodernism, and specifically its offspring, critical race theory, have abandoned rigor and replaced it with “lived experience” as the primary source of knowledge. Little credence is given to the idea of objective reality. Science has long understood that observation can never be perfectly objective, but it also provides the ultimate tool kit with which to distinguish signal from noise—and from bias. Scientists generate complete lists of alternative hypotheses, with testable predictions, and we try to falsify our own cherished ideas.

Science is imperfect: It is slow and methodical, and it makes errors. But it does work. We have microchips, airplanes and streetlights to show for it.

In a meeting with administrators at Evergreen last May, protesters called, on camera, for college president George Bridges to target STEM faculty in particular for “antibias” training, on the theory that scientists are particularly prone to racism. That’s obvious to them because scientists persist in using terms like “genetic” and “phenotype” when discussing humans. Mr. Bridges offers: “[What] we are working towards is, bring ’em in, train ’em, and if they don’t get it, sanction them.”

Despite the benevolent-sounding label, the equity movement is a highly virulent social pathogen, an autoimmune disease of the academy. Diversity offices, the very places that were supposed to address bigotry and harassment, have been weaponized and repurposed to catch and cull all who disagree. And the attack on STEM is no accident. Once scientists are silenced, narratives can be fully unhooked from any expectation that they be put to the test of evidence. Last month, Evergreen made it clear that they wanted two of its scientists gone—my husband, Bret Weinstein, and me, despite our stellar reputations with the students they claimed to be protecting. First, they came for the biologists . . .

Science has sometimes been used to rationalize both atrocity and inaction in its face. But conflating science with its abuse has become a favorite trope of extremists on the left. It’s a cheap rhetorical trick, and not, dare I say, very logical.

Science creates space for the free exchange of ideas, for discovery, for progress. What has postmodernism done for you lately?


2. Rebuild Strong, Not Green, in Puerto Rico

Sell the bankrupt power authority and harden the grid.

By Mark P. Mills, WSJ, Oct 5, 2017


“Hurricane Maria devastated Puerto Rico’s electric grid, destroying half the island’s long-distance transmission lines and compromising most local distribution capacity. Virtually all cell towers went dark. Restoring service under these conditions would be a daunting challenge under any circumstances. But Puerto Rico Electric Power Authority filed for bankruptcy last July.

“Environmentalists are already lobbying for Prepa to build a greener grid, one less dependent on ‘old’ fuels like oil. But Puerto Rico didn’t go dark because of how it produces electricity. The power plants survived. The wires distributing power got destroyed.

“A greener grid would do nothing to minimize suffering after the next hurricane. What Puerto Rico and others need is a harder grid so that far fewer people are blasted back to the 19th century when disaster strikes and service can be restored faster after a blackout.

“Engineers can do it. Previous calamitous outages have pointed to solutions: stronger poles and wires; waterproofed substations with sturdier, higher walls; pre-emptive tree removal near wires; and, for essential parts of the system, buried wires. New classes of materials for radically stronger poles and wires are emerging, as is software that can model extreme events and radically improve system designs. Other ideas: low-power sensors that can operate in blackouts by scavenging power from nature and gather critical information for repair and recovery, and swarms of drones for rapid damage assessment.

“All this would cost far less than going green. And while the two paths are not mutually exclusive—hybrid solar-diesel emergency generators, for example—federal and state governments have spent hundreds of billions of dollars pushing for “smart” and green grids instead of resilient and restorable ones.

“Now is the time to build an extreme grid in Puerto Rico. Given the scale of the disaster, federal funding is essential, but will ignite predictable political squabbles. There is a way of approaching the problem that could benefit both Puerto Ricans and people living on exposed grids everywhere—that is, practically everyone.”

The author continues, advocating auctioning the utility to private organizations, with the promise of matching funds.

Mr. Mills is a senior fellow at the Manhattan Institute and faculty fellow at Northwestern University’s McCormick School of Engineering.


3. Energy Department Urges Pricing Shift That Could Bolster Coal, Nuclear

Proposal could pay some plants for their costs and a ‘fair return on equity,’ even though they sell into competitive markets

By Timothy Puko, WSJ, Sep 29, 2017


According to the article: The Trump administration is urging independent energy regulators to change how electricity is priced, proposing new rules that would bolster revenue for coal-fired and nuclear power plants.

The Energy Department is mandating that the Federal Energy Regulatory Commission consider new rules that would effectively raise power prices to pay more to plants considered more resilient. The department suggests nuclear and coal-fired plants as potential recipients, and charges FERC with tweaking electricity markets so they give more of a reward to plants that have at least three months of fuel on site and can run uninterrupted through extreme weather, disasters or other emergencies.

Nuclear and coal-burning technologies are probably the only ones able to meet the requirements, experts said. Under the proposal, eligible plants would get paid enough to cover their costs and a “fair return on equity” whenever they run, even though they sell into competitive markets.

FERC, which regulates wholesale power markets, is under no obligation to make these changes, only to consider them. FERC officials are reviewing the proposal, a spokeswoman said. Any changes made would likely be implemented by the grid operators that run deregulated power markets under FERC’s oversight.

The article continues with discussion of what may occur over the next two months when possible new rules are being considered.


