Weekly Climate and Energy News Roundup #400

The Week That Was: February 22, 2020, Brought to You by www.SEPP.org

By Ken Haapala, President, Science and Environmental Policy Project

Quote of the Week: “I never guess. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.” – Sir Arthur Conan Doyle. [H/t Eric Wagner]

Number of the Week: £108.5 million (about $140 million) in 2018

The Scientific Method: There appears to be no clear, widely accepted definition of science or the scientific method. Professor of Applied Mathematics and philosopher Christopher Essex considers science to be an adventure: a long game of generations, part of the ascent of Man, not just a fad invented in the 17th century. In an unpublished paper, “The Scientific Adventure,” written for the 100th anniversary of Einstein’s 1905 discoveries, he stated:

“Others try to embrace it as a recipe. They say, to be scientific, do this, then do that, but not the other way around. They talk of the scientific method as if there is just one; as if scientific discovery were clean, orderly and uncontroversial, supervised by grizzled elders of authority. But the search for scientific discovery is anything but. It is messy, contentious, factional, but also wondrous, inspired, and above all serendipitous. It is human.”

“… Scientists are human beings. They struggle and are flawed like anyone else, but their occasional successes and flashes of brilliance are a credit to us all.

“Many nonscientists, especially reporters and politicians, look to authority to decide scientific matters at the frontiers of knowledge. But there is no authority in scientific thinking, because it is thinking with your own head, instead of someone else’s. It is the resolute acceptance of each our own human fallibility and the conviction that we may rise above it by painstaking care, always checking what we think we know, putting our dearest presumptions to the test. No human is more dangerously fallible than an academic peacock tripping over his or her own plumage.”

“Appealing to authority in a wide-open intellectual frontier, where whole research fields may be hanging in the balance, presents a dilemma that few realize. What makes a person qualified to decide who is qualified? How do you know who is an expert and who is not without being an expert yourself? How do you know when there are no true experts at all? You don’t know. Appealing to authority does not work here. You must think with your own head. As soon as you do, the scientific journey begins for you.

“Scientific thinking is what has produced the collection of knowledge that we loosely call science. That collection is a legacy of the generations that belongs to all of us. It is the greatest human treasure. Even though we keep testing and checking it, we could not rediscover it or figure it out in an afternoon or a generation if we misplaced it. Yet, we sometimes ask children in classrooms to rediscover, unguided, structures that took humanity a thousand generations to figure out. Nonetheless when the labours of a thousand generations come to bear, focused white hot into the moments of our lives, we are transformed into something grander than our individual biology could ever allow.

“We mark an Einstein centenary for the wondrous year 1905. Some of Einstein’s famous five articles published that year were both controversial and slow to be accepted. That response, unlike the articles themselves, was business as usual. Science did not begin then, and it has not ended today. Many have trouble imagining that there is anything more to learn, as they fancy that someone, somewhere knows anything that can be imagined. But we can imagine a lot. We know very little even of that, never mind what we cannot imagine. The frontier of the unknown is close by—very close by. It is all around us, waiting.” [No link provided]

TWTW cannot improve on what Essex expressed, but may shorten it by stating:

Centuries of research and thinking have created a dynamic process for integrating new ideas and incorporating new physical evidence into concepts about the physical world. Through this process, some previous errors are eliminated but errors may remain, and improvements are welcome if properly explained and supported. Called the scientific method, this process is not exclusive to physics.

The scientific method requires repeated testing of concepts against all appropriate physical evidence gathered by observations and experiments. If the concepts fail to describe the evidence, the concepts must be changed or discarded. In short, science progresses by identifying error and forming new ideas that are subjected to repeated testing.

To TWTW, non-transparent manipulation of data—data tampering—by government entities is outrageous because it thwarts the ability to eliminate error. Instead, data tampering promotes error. In his posts, Tony Heller is providing a service to those who pursue scientific knowledge by exposing data tampering by government entities. See links under Challenging the Orthodoxy and Communicating Better to the Public – Make things up.


Obstructing Mid-range Weather (2-Week) Forecasting: Writing in his blog, ICECAP, Meteorologist Joe D’Aleo, the first Director of Meteorology at The Weather Channel, explains how he spent his career attributing changing weather to natural events, such as changes in El Niños and La Niñas affecting global weather. Temperature records are a critical part of understanding such relationships. Unfortunately, NOAA and NASA-GISS have been making many small alterations to the data records, making historic surface data unreliable. D’Aleo writes:

“In fact, the magnitude of their historical data adjustments, which removed their cyclical temperature patterns, are completely inconsistent with published and credible U.S. and other temperature data. Thus, despite current assertions of record-setting warming, it is impossible to conclude from the NOAA and NASA data sets that recent years have been the warmest ever.

“Models had become the principal tool to create data to fill in the large gaps that existed in the historical data as well as adjust the data. By mixing data with models you can more easily manipulate trends to better agree with theories. Professional societies and their journals became an important part of the climate cabal, controlling what gets published. With help from the compliant media every event is proclaimed to be our fault and made a reason to abandon low cost and reliable energy.” [Boldface in original]

D’Aleo goes on to discuss the real danger.

“Current climate policies – based on these unreliable temperature records – threaten our economic and national security interests. As the proposed climate policies grow more extreme, the consequences of allowing this record to remain unchallenged gravely threatens an onslaught of litigation based on the greenhouse gas endangerment finding. Importantly, this litigation imposes significant impediments to the mineral land leasing and pipeline infrastructure build out necessary to maintain and enhance energy independence and economic prosperity.

“Furthermore, the US financial sector has already dramatically curtailed its support of conventional energy source development in large part due to the continued calls for regulatory destruction of the fossil fuel industry based substantially on NOAA and NASA’s now invalidated global surface temperature records. This situation is putting our Nation’s energy security at grave risk – which means our economic and national security are also in great peril.”

See links under Challenging the Orthodoxy.


Inadequate Computer Resources? According to Meteorologist Cliff Mass, who is not a “global warming skeptic,” the “U.S. global numerical weather prediction has now fallen into fourth place, with national and regional prediction capabilities a shadow of what they could be.” US numerical weather prediction is operated under NOAA’s National Weather Service. In his blog, Mass writes:

There are several reasons for these lagging numerical weather prediction capabilities, including lack of strategic planning, inadequate cooperation between the research and operational communities, and too many sub-optimal prediction efforts.

But there is another reason of equal importance: a profound lack of computer resources dedicated to numerical weather prediction, both for operations and research.

The bottom line: U.S. operational numerical weather prediction resources used by the National Weather Service must be increased 10 times to catch up with leading efforts around the world and 100 times to reach state of the science.

Why does the National Weather Service require very large computer resources to provide the nation with world-leading weather prediction?

Immense computer resources are required for modern numerical weather prediction. For example, NOAA/NWS TODAY is responsible for running:

· A global atmospheric model (the GFS/FV-3) running at 13-km resolution out to 384 hours.

· Global ensembles (GEFS) of many (21) forecasts at 35-km resolution

· The high-resolution Rapid Refresh and RAP models out 36 h.

· The atmosphere/ocean Climate Forecast System model out 9 months

· The National Water Model (combined WRF and hydrological modeling)

· Hurricane models during the season

· Reanalysis runs (rerunning past decades to provide calibration information)

· Running the North American mesoscale model (NAM)

· Running the Short-Range Ensemble Forecast System (SREF)

This is not a comprehensive list. And then there is the need for research runs to support development of the next-generation systems. As suggested by the world-leading European Centre for Medium-Range Weather Forecasts (ECMWF), research computer resources should be at least five times greater than the operational requirements to be effective.

Mass notes that the European Centre is building a new $89 million computer center in Italy and that the UK Met Office is set to build a ‘billion-pound’ supercomputer ($1.3 billion). Mass believes the current US political leadership is willing to make US weather prediction first-rate. He further states:

“NOAA/NWS must develop a detailed strategic plan that not only makes the case for more computer resources but demonstrates how such resources will improve weather prediction. Amazingly, they have never done this. In fact, NOAA/NWS does not even have a document describing in detail the computer resources they have now (I know, I asked a number of NOAA/NWS managers for it–they admitted to me it doesn’t exist).

“With such a plan Congress should invest in the kind of computer resources that would enable U.S. weather prediction to become first rate. Ten times the computer resources (costing about 100 million dollars) would bring us up to parity, 100 times would allow us to be state of the science (including such things as running global models at convection-permitting resolution, something I have been working on in my research).” [Boldface in original.]

Mass goes on to conclude:

“U.S. citizens can enjoy far better weather forecasts, saving many lives and tens of billions of dollars per year. But to do so will require that NOAA/NWS secure vastly increased computer resources and reorganize weather model development and operations to take advantage of them.”

According to the BBC, in answer to the question “What will the supercomputer actually do?” the response was: “It’ll run what the Met Office calls its ‘digital twin’ of the Earth’s atmosphere, a highly detailed ‘model’ of everything from the winds to the temperatures to the pressures.”

And here is the problem. The US government has spent tens of billions of dollars on global climate models that cannot describe the atmosphere and how the atmosphere responds to increasing carbon dioxide (CO2). US global climate modeling is a failure. Further, as Joe D’Aleo explains above, by their actions NOAA and NASA-GISS have invalidated the global surface temperature records that US climate modelers use to test their models. Is there any reason to believe these agencies will not divert funds needed for improved weather prediction to serve their own ideological beliefs? See links under Seeking a Common Ground.


National Climate Assessment: According to E & E News, work is beginning on the Fifth National Climate Assessment (NCA) under the Office of Science and Technology Policy, which is overseen by Trump’s science adviser, Kelvin Droegemeier. The report is scheduled for release in 2022.

In the view of TWTW, the previous assessment released in November 2018, after President Trump took office, lacked the physical evidence to substantiate many of its claims. For example, it claimed that global warming would force movement of major US crop production farther north. The report ignored that the major world competitor for the two largest crop exports, maize and soybeans, is tropical Brazil. Why the US would become too hot to produce these crops needs to be explained.

The report in E & E News emphasized the views of the leaders of the previous National Climate Assessment and the issue of whether a red team / blue team approach would be beneficial. Apparently, the Trump administration was not pleased with the Fourth National Climate Assessment. According to its website, the US Global Change Research Program (USGCRP):

“USGCRP was established by Presidential Initiative in 1989 and mandated by Congress in the Global Change Research Act (GCRA) of 1990 to develop and coordinate ‘a comprehensive and integrated United States research program which will assist the Nation and the world to understand, assess, predict, and respond to human-induced and natural processes of global change.’”

Prior National Climate Assessment reports have ignored natural processes of global change and emphasized human influence, particularly CO2. According to the USGCRP website, the executive director is still Mike Kuperberg, who assumed office in July 2015, during the Obama Administration. Further, the last budget stated on the website is the President’s proposed budget for Fiscal Year 2019, which ended last September. Obviously, the website needs to be updated. See links under Change in US Administrations.


Data-Free Science? A relatively new publication, Molecular Brain, describes itself as “an open access, peer-reviewed journal that considers manuscripts on all aspects of studies on the nervous system at the molecular, cellular, and systems level, providing a forum for scientists to communicate their findings.”

In an editorial, the editor-in-chief of Molecular Brain, Tsuyoshi Miyakawa, describes how, of the 180 manuscripts he has handled since early 2017, he classified 41 as “Revise before review” and requested the authors provide their raw data. For nearly all of those 41 manuscripts, he was unable to obtain sufficient raw data. Based on the responses to his requests, Miyakawa concludes:

“…Thus, more than 97% of the 41 manuscripts did not present the raw data supporting their results when requested by an editor, suggesting a possibility that the raw data did not exist from the beginning, at least in some portions of these cases.

“Considering that any scientific study should be based on raw data, and that data storage space should no longer be a challenge, journals, in principle, should try to have their authors publicize raw data in a public database or journal site upon the publication of the paper to increase reproducibility of the published results and to increase public trust in science.” See links under Seeking a Common Ground.


Mission Creep: Often applied to military operations, mission creep involves gradual or incremental expansion of a mission by an organization. It is a characteristic of many bureaucracies, the UN being one. According to its website:

“UNESCO is the United Nations Educational, Scientific and Cultural Organization. It seeks to build peace through international cooperation in Education, the Sciences and Culture.”

Now UNESCO has added:

“UNESCO’s programmes contribute to the achievement of the Sustainable Development Goals defined in Agenda 2030, adopted by the UN General Assembly in 2015.”

As discussed by Donna Laframboise, under this “expanded mission” UNESCO has published a new report for sustainable development: “No Plan, No Planet.” How did the planet and humanity survive before the UN existed? See links under Defending the Orthodoxy and https://en.unesco.org/about-us/introducing-unesco.


Number of the Week: £108.5 million (about $140 million) in 2018. The 2018 budget for the European Centre for Medium-Range Weather Forecasts (ECMWF) was about $140 million. The FY 2018 enacted budget for the USGCRP was over 18 times that: $2,546 million. The fiscal year ended September 30, 2018. This amount “represents the funds self-identified by USGCRP agencies as their contributions to USGCRP research activities.” It does not represent amounts used by agencies that may benefit USGCRP. “For example, many of the satellite systems and observing networks that are foundational to USGCRP research were originally implemented by their sponsoring agencies for operational purposes, and thus typically are not included in the research crosscut.”

Included in the $2.5 billion budget was $320 million from the Department of Commerce, which includes NOAA and the National Weather Service. Other big line items were $1.5 billion from NASA, $250 million from the National Science Foundation, $240 million from DOE, and $175 million from the Department of Agriculture.
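The budget comparison above can be checked with a few lines of arithmetic. This is a minimal sketch using only the figures stated in the text; the variable names are the author's own labels, not official budget-document terms.

```python
# Quick check of the Number of the Week figures as given in the text
# (all amounts in millions of US dollars, FY 2018).
ecmwf_budget = 140.0    # ECMWF 2018 budget (about £108.5 million)
usgcrp_budget = 2546.0  # USGCRP FY 2018 enacted budget

ratio = usgcrp_budget / ecmwf_budget
print(f"USGCRP budget was {ratio:.1f} times the ECMWF budget")  # over 18 times

# Major agency line items cited in the text (millions of dollars):
line_items = {"Commerce (incl. NOAA/NWS)": 320, "NASA": 1500,
              "NSF": 250, "DOE": 240, "Agriculture": 175}
print(f"major line items total: ${sum(line_items.values())} million")
```

The line items sum to $2,485 million, close to but below the $2,546 million total, consistent with the text's note that smaller contributions are not itemized.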

The problem with NOAA/NWS is not a lack of available money but a lack of proper allocation of that money and proper oversight of its use. The country does not need $2.5 billion programs that claim to assess climate change yet cannot properly describe the atmosphere and rely on tampered data. See links under Seeking a Common Ground and https://www.globalchange.gov/about.



Can Solar Power Compete With Coal? In India, It’s Gaining Ground

Electricity from sunlight costs less, a hopeful sign for developing nations building out their power grids

By Phred Dvorak, WSJ, Feb 17, 2020


TWTW Summary: In a glowing report, the reporter discusses the virtues of solar power. He begins:

“In a dusty northwest India desert dotted with cows and the occasional camel, a solar-power plant is producing some of the world’s cheapest energy.

“Built in 2018 by India’s Acme Solar Holdings Ltd., it can generate 200 megawatts of electricity, enough to power all the homes in a middle-size U.S. town. Acme sells the electricity to distributors for 2.44 rupees (3.4 cents) a kilowatt-hour, a record low for solar power in India, a country that data trackers say has the world’s cheapest solar energy.

“More remarkable, the power costs less to generate in India than the cheapest competing fossil fuel—coal—even with subsidies removed and the cost of construction and financing figured in, according to the Indian government and industry trackers.

“Price-conscious Indian utilities are eager to snap up that power. ‘We are infamous for low cost,’ says Sandeep Kashyap, Acme’s president.

“Solar power has entered a new global era. The industry was long dependent on subsidies and regulatory promotions. Now, technological innovation and falling solar-panel prices have made solar power inexpensive enough to compete on its own with other fuel sources in some regions, when it comes to newly built plants. That could turbocharge growth of renewables in the global energy industry, especially in fast-growing Asian markets where much of the world’s energy infrastructure expansion will take place.

“Governments in many solar markets—including China, the biggest—are phasing out or reducing supports. Solar-plant development is going mainstream, with finance provided by global investors like Goldman Sachs Group Inc., Singaporean sovereign-wealth fund GIC and huge Western pension and private-equity funds.

“So far, the renewable-energy push hasn’t halted the growth of global energy emissions. But the success of countries like India in feeding their rising power demands with clean energy will still be key to blunting the growth of global challenges like pollution and climate change.

“The price declines in solar panels and the power they produce are jolting the industry. In the past decade, solar has grown from less than 1% of the world’s electric-power capacity to an estimated 9% by the end of this year, according to the International Energy Agency, an intergovernmental organization focused on energy policy. By 2040, the IEA expects that to grow to 24%, which would make solar the largest single energy source.

“India is at the forefront of the trend, with a cost of building solar capacity that has dropped 84% in eight years, according to the International Renewable Energy Agency, an intergovernmental organization focused on renewable energy. Other countries are close behind, with costs falling fast in Australia and China.

“India has increased the amount of solar power it has installed 10-fold in the past five years, to 32 gigawatts, and the government is hoping to triple that in the next few years—one of the fastest paces of growth anywhere. India’s prime minister last year said he wants 450 gigawatts of renewable energy including solar installed by 2030.

“If India manages that, which many analysts say is a real stretch, it would account for nearly all the additional electric capacity the country’s Central Electricity Authority has projected it would add by then, and more than the country’s total from all power sources now. India has pledged as a climate goal that 40% of its electric capacity will come from non-fossil fuels by 2030; the latest renewable targets would likely put that percentage at over half.

“In 2018, India’s ‘levelized’ cost of solar-power generation—an analysis removing the impact of direct subsidies and figuring in the costs of construction and financing for a new plant—fell to 14% below that of coal, the first time anywhere in the world that generating solar was cheaper than coal on that basis, according to international energy consulting firm Wood Mackenzie.

“India’s national energy plan doesn’t anticipate construction of new coal power plants for at least several years. Even state-controlled Coal India, one of the world’s largest coal-producing companies, in November said it planned a pilot solar project as it navigated a future with less coal.

“Across Asia, a region expected to account for two-thirds of the world’s new power demand during the next two decades, price declines will make wind and solar combined 17% cheaper than coal by 2030 on a levelized basis, says Wood Mackenzie. In India, solar generation will be almost 50% cheaper, it projects.

“‘This is a revolution in power generation costs,’ says Wood Mackenzie analyst Alex Whitworth. ‘What it means is there will be a lot more solar investment in India, and in countries like India.’

“Solar’s big problem: It generates power only when the sun shines. Wind power, similarly, works only with wind. So displacing fossil fuels could require cheaper ways to store energy. And the more renewables in the power-transmission grid, the more the grid will need to be rebuilt to accommodate those special characteristics.

“That inefficiency is why the IEA forecasts the amount of power solar generates to rise to only 11% of the world’s total by 2040, around half that of coal or natural gas.

“In India, which has some of the world’s best conditions for generating solar power, the mismatch is pronounced because demand for electricity swells after people go home and switch on air conditioners in the evening, when solar plants aren’t working.”

The article goes on to discuss coal-fired plants, then gets to what may become a major difficulty:

“Challenges will likely multiply when solar power in India’s grid rises from the current 9% to around 20% or 30%—a level at which it may start replacing conventional power plants, say experts like Rahul Tongia, a fellow at the India arm of think tank Brookings Institution.

“‘What happens after that point when the low-hanging fruit is done?’ says Mr. Tongia.

“India’s solar push started in 2010, when its government outlined plans for a modest boost in capacity during the next decade. Solar was a good fit for India’s growing energy needs. Plants are easy to build—essentially solar panels lined up in racks—and labor is inexpensive. India has big stretches of sparsely populated land and intense sun, good for vast spreads cranking out power.”

The article discusses building new transmission lines and similar issues but does not address the key question, “What happens after that point when the low-hanging fruit is done?” except to quote a solar power advocate:

“Experts like ReNew’s Mr. Sinha say it will likely be several years before India builds so much solar capacity that the technology’s daytime power surges and nighttime plunges could affect the country’s overall electricity supply. By that time, says Mr. Sinha, other new technologies such as batteries and systems for shipping electricity may be available to smooth out irregularities.”

Experts have been looking for a solution to grid-scale electricity storage for over 100 years, and the only one that works at scale thus far is pumped-hydro storage, which is suited to reliable power plants, not erratic ones.
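The WSJ summary above leans on “levelized” cost comparisons: construction and financing costs figured in, direct subsidies removed, lifetime costs divided by lifetime generation. A minimal sketch of such a calculation follows. The function, the discount rate, and all dollar figures are hypothetical assumptions for illustration; only the 200 MW nameplate size echoes the Acme plant in the article, and none of the numbers are from Wood Mackenzie or the Indian government.

```python
# Hedged sketch of a levelized cost of electricity (LCOE) calculation:
# discounted lifetime costs divided by discounted lifetime energy output.
# All input numbers below are illustrative assumptions, not sourced data.

def lcoe(capex, annual_opex, annual_mwh, years, discount_rate):
    """Levelized cost in $/MWh."""
    costs = float(capex)  # up-front construction cost, year 0
    energy = 0.0
    for t in range(1, years + 1):
        factor = (1.0 + discount_rate) ** t
        costs += annual_opex / factor   # discounted operating cost
        energy += annual_mwh / factor   # discounted generation
    return costs / energy

# Hypothetical 200 MW solar plant: ~25% capacity factor,
# 25-year life, 8% discount rate.
solar = lcoe(capex=150e6, annual_opex=2e6,
             annual_mwh=200 * 8760 * 0.25, years=25, discount_rate=0.08)
print(f"illustrative solar LCOE: ${solar:.0f}/MWh")
```

Note how sensitive the result is to the discount rate and assumed lifetime: that sensitivity is one reason levelized comparisons between solar and coal, such as the 14% figure quoted above, depend heavily on the analyst's financing assumptions.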

