Weekly Climate and Energy News Roundup #261

Brought to You by www.SEPP.org

By Ken Haapala, President, The Science and Environmental Policy Project

Major Climate Model Issues — Curry: In her paper “Climate Models for the Layman,” presented last week, Judith Curry discusses major issues with Global Climate Models (GCMs) and why the predictions/projections from them are not reliable. Some of these issues have been discussed by others, such as David Evans on Jo Nova’s blog, but the key points deserve repeating. In general, these issues occur in models used by the UN Intergovernmental Panel on Climate Change (IPCC) and its followers, such as the US Global Change Research Program, EPA, etc. The weaknesses in the procedures used are generally buried in details by the IPCC and largely ignored or dismissed by its followers. These weaknesses should be the center of discussion if the models are being considered for government policy.

In her discussion “What is a global climate model?” Curry states: “While some of the equations in climate models are based on the laws of physics such as Newton’s laws of motion and the first law of thermodynamics, there are key processes in the model that are approximated and not based on physical laws.” [Boldface added.] In its finding that greenhouse gases endanger human health and welfare, the EPA emphasizes that models are based on physical laws and ignores that the models are also based on approximations – educated guesses.

Curry further explains:

“Because of the relatively coarse spatial and temporal resolutions of the models, there are many important processes that occur on scales that are smaller than the model resolution (such as clouds and rainfall; see inset in Figure 1). These subgrid-scale processes are represented using ‘parameterisations’, which are simple formulas that attempt to approximate the actual processes, based on observations or derivations from more detailed process models. These parameterisations are ‘calibrated’ or ‘tuned’ to improve the comparison of the climate model outputs against historical observations. [Boldface added.]

“The actual equations used in the GCM computer codes are only approximations of the physical processes that occur in the climate system. While some of these approximations are highly accurate, others are unavoidably crude. This is because the real processes they represent are either poorly understood or too complex to include in the model given the constraints of the computer system. Of the processes that are most important for climate change, parameterisations related to clouds and precipitation remain the most challenging, and are responsible for the biggest differences between the outputs of different GCMs. [Boldface added.]

“GCMs are used for the following purposes:

Understanding how the climate system works: sensitivity experiments are used to turn off, constrain or enhance certain physical processes or external forcings (for example, carbon dioxide, volcanoes, solar output) to see how the system responds

Reproducing past climate states: understanding the causes of past climate variability and change (for example, how much of the change can be attributed to human causes, such as carbon dioxide, versus natural causes such as solar variations, volcanic eruptions, and slow circulations in the ocean).

Global climate change: simulation of future climate states, from decades to centuries, for example simulations of future climate states under different emissions scenarios.

Attributing extreme weather: prediction and attribution of the statistics of extreme weather events (for example, heat waves, droughts, hurricanes).

Regional climate change: projections of future regional climate variations to support decision-making-related adaptation to climate change.

Guidance for emissions reduction policies.

Social cost of carbon: the output from GCMs provides the raw data used to calculate the social cost of carbon. [The boldface was italics in the original.]

 

The specific objectives of a GCM vary with purpose of the simulation. Generally, when simulating the past climate using a GCM, the objective is to correctly simulate the spatial variation of climate conditions in some average sense. When predicting future climate, the aim is not to simulate conditions in the climate system on any particular day, but to simulate conditions over a longer period – typically decades or more – in such a way that the statistics of the simulated climate will match the statistics of the actual future climate.

There are literally thousands of different choices made in the construction of a climate model (for example, resolution, complexity of the submodels, or the parameterisations). Each different set of choices produces a different model having different sensitivities. Further, different modelling groups have different focal interests, for example long paleoclimate simulations, details of ocean circulations, nuances of the interactions between aerosol particles and clouds, or the carbon cycle. These different interests focus their limited computational resources on a particular aspect of simulating the climate system, at the expense of others.

Is it possible to select a ‘best’ model? Well, several models generally show a poorer performance overall when compared with observations. However, the best model depends on how you define ‘best’, and no single model is the best at everything.
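Curry’s point above about simulating statistics, rather than particular days, can be made concrete with a toy calculation. The Python sketch below is purely illustrative and is not from Curry’s report; the logistic map stands in for a chaotic system. Two runs that begin almost identically disagree completely about any particular “day,” yet their long-run statistics nearly coincide, which is the sense in which climate simulations aim to be right.

# Toy illustration: in a chaotic system, individual trajectories are
# unpredictable, but long-run statistics can still be stable.
# The logistic map x(n+1) = r * x(n) * (1 - x(n)) stands in for the climate.

def run(x0, r=3.9, n=100_000):
    xs = []
    x = x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

a = run(0.500000)   # one "climate simulation"
b = run(0.500001)   # a second run, perturbed in the sixth decimal place

# "Weather": the state on any particular step diverges completely.
print("step 50:", a[50], "vs", b[50])

# "Climate": the long-run means of the two runs nearly coincide.
print("long-run means:", round(sum(a) / len(a), 4), "vs", round(sum(b) / len(b), 4))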

It is certainly understandable that using models for better understanding how the climate system works and reproducing past climate states is a necessary first step. However: “Capturing the phenomena in hindcasts and previous forecasts is a necessary, but not sufficient, condition for the model to capture the phenomena in the future.”

The models have been “tuned” or “calibrated” to available 20th-century surface data. As shown in the 2008 report of the Nongovernmental International Panel on Climate Change (NIPCC, p. 9), even the available surface temperature data are sparse and erratic. The only temperature data with appropriate spatial density come from satellites, which are largely ignored by the IPCC and its followers. There is no logical reason to assume that the projections/forecasts from global climate models are reliable. Taking an average of forecasts from unreliable models does not produce a reliable forecast.
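That last claim deserves a worked example. The Python sketch below is hypothetical, with invented numbers rather than output from any actual GCM: three toy “models” are each calibrated to reproduce the same historical record, but each carries a structural error that acts only in the future. Averaging their forecasts averages the errors; it does not remove them.

# Toy illustration: averaging forecasts from models that share a common
# bias does not remove the bias. All numbers here are invented.

truth_rate = 0.5                                 # "true" trend per year
history = [truth_rate * t for t in range(51)]    # years 0..50

def tuned_rate(history):
    # Trivial "calibration": least-squares slope through the origin.
    num = sum(t * y for t, y in zip(range(len(history)), history))
    den = sum(t * t for t in range(len(history)))
    return num / den

rate = tuned_rate(history)          # every model recovers 0.5 exactly

# Each model also carries a future-only structural error, all of the
# same sign -- the shared-bias case.
shared_bias = [0.10, 0.15, 0.20]
year = 100
forecasts = [rate * year + b * (year - 50) for b in shared_bias]

print("truth at year 100:   ", truth_rate * year)               # 50.0
print("individual forecasts:", forecasts)                       # all too high
print("ensemble average:    ", sum(forecasts) / len(forecasts)) # still too high

If the models’ errors were independent and centred on zero, averaging would reduce them; when the models share a bias, for instance because they rely on similar parameterisations, the ensemble mean simply inherits it.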

When the results of these unreliable models, which have not been verified and validated, are applied to government policy, real economic harm occurs. The harm can be seen in carbon dioxide (CO2) emission reduction programs, the contrived “social cost of carbon”, exaggerated sea level rise, etc. Sharply increasing electricity costs in the U.K. and Germany are but one example. Upcoming TWTWs will discuss several other major issues concerning global climate modeling. See links under Challenging the Orthodoxy – NIPCC, Challenging the Orthodoxy, and Questioning European Green.

###################################################

Quote of the Week. There’s a mighty big difference between good, sound reasons and reasons that sound good. – Burton Hillis (William Vaughan, American columnist)

###################################################

Number of the Week: Up 22%, down 33% and 13%

###################################################

Sea Level Rise Forecasts – Compounding Unreliability: The February 25 TWTW linked to efforts in the City of Boston to prepare for sea level rise of up to 8.2 feet (2.5 meters) by 2100. Such an effort would cost billions of dollars. In January, NOAA’s Center for Operational Oceanographic Products and Services (CO-OPS) released a report laying out the basis of such claims: 1) it uses unreliable global climate models to speculate on future temperatures; 2) it uses these temperatures to project speculative estimates of future sea levels; then, 3) it adds the speculated “collapse of the West Antarctic Ice Sheet” to top off the speculation. The first two parts come from the IPCC; the third is added by CO-OPS.

As discussed in the January 22 TWTW, retired NASA meteorologist Thomas Wysmuller explored the correlation between CO2 and sea level rise and found no measurable linkage between sea level and CO2! “For the past 2,000 years, Sea Level rise was unchangingly linear, increasing between 1 & 1.5 mm/yr.” The maximum rise is about 6 inches per century (1.5 mm/yr over 100 years is 150 mm, roughly 5.9 inches). This linear rise has continued for the past 135 years, even though CO2 concentrations have increased by 38%.

As Wysmuller indicates on his web site, the high-end estimates of the IPCC and others exceed the average sea level rise per century that occurred over the roughly 7,000 years during which the great ice sheets covering much of northern Eurasia and most of North America melted. The great Laurentide Ice Sheet is no more, save for small remnants in Canada; of the Northern Hemisphere’s great ice sheets, only Greenland’s remains.

In November 2016, Wysmuller presented on the lack of linkage between CO2 and sea level rise at the 5th Annual World Congress of Ocean in Qingdao, China; in January he gave a similar presentation at NASA’s Johnson Space Center in Houston (not to be confused with NASA-GISS in New York City on Broadway).

However, the IPCC and CO-OPS assume a direct relationship between CO2 emissions and global mean sea level (GMSL), a relationship yet to be demonstrated. The CO-OPS report states: “The 0.3 m-2.5 m GMSL range for 2100 is discretized by 0.5-m increments and aligned with emissions-based, conditional probabilistic storylines and global model projections into six GMSL rise scenarios: a Low, Intermediate-Low, Intermediate, Intermediate-High, High and Extreme, which correspond to GMSL rise of 0.3 m, 0.5 m, 1.0 m, 1.5 m, 2.0 m and 2.5 m, respectively.” [Boldface added.]
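Restated as data, the quoted passage maps six scenario names onto GMSL values; converting to feet (a simple restatement of the quoted figures, nothing added) shows that Boston’s 8.2-foot planning number is exactly the CO-OPS “Extreme” scenario.

# The six CO-OPS GMSL-rise scenarios for 2100, as quoted above
# (values in metres of global mean sea level rise).
scenarios = {
    "Low":               0.3,
    "Intermediate-Low":  0.5,
    "Intermediate":      1.0,
    "Intermediate-High": 1.5,
    "High":              2.0,
    "Extreme":           2.5,
}

for name, metres in scenarios.items():
    print(f"{name:17s} {metres:.1f} m = {metres * 3.28084:.1f} ft")
# "Extreme" works out to 2.5 m = 8.2 ft, the figure behind the Boston
# planning effort discussed above.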

The “conditional probabilistic storylines” have yet to be empirically demonstrated. Further, the extreme case for CO-OPS is based on the “collapse” of the West Antarctic Ice Sheet, which may take thousands of years and may be due to geothermal warming from the Antarctic fault-rift system underlying it, unrelated to CO2. Such details are ignored by CO-OPS.

By contrast, the web site CO2 Science reports an August 2016 study by Phil Watson of the School of Civil and Environmental Engineering, University of New South Wales, Australia, which strongly questions claims of acceleration in sea level rise in the U.S., while recognizing regional differences in the data. The abstract states:

“The physics-based climate projection models are forecasting that the current global average rate of mean sea-level rise (≈3 mm/y) might climb to rates in the range of 10–20 mm/y by 2100. Most research in this area has centred on reconciling current rates of rise with the significant accelerations required to meet the forecast projections of climate models. The analysis in this paper is based on a recently developed analytical package titled “msltrend,” specifically designed to enhance estimates of trend, real-time velocity and acceleration in the relative mean sea-level signal derived from long annual average ocean-water-level time series. Key findings are that at the 95% confidence level, no consistent or substantial evidence (yet) exists that recent rates of rise are higher or abnormal in the context of the historical records available for the United States, nor does any evidence exist that geocentric rates of rise are above the global average. It is likely that a further 20 years of data will identify whether recent increases east of Galveston and along the east coast are evidence of the onset of climate change induced acceleration.” [Boldface added.]
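Watson’s “msltrend” package applies considerably more sophisticated time-series techniques than can be reproduced here. As a rough stand-in, and not msltrend’s actual method, the Python sketch below shows the simplest form of an acceleration test: fit a quadratic to an annual mean sea level record and ask whether the acceleration term is distinguishable from zero at the 95% level. The record is synthetic, a linear 1.5 mm/yr rise plus noise, invented purely for illustration.

import numpy as np

# Synthetic annual mean sea level record: a steady 1.5 mm/yr rise plus
# noise. Invented for illustration; not real tide-gauge data.
rng = np.random.default_rng(0)
years = np.arange(1880, 2016)
t = years - years.mean()                         # centre time for stability
msl = 1.5 * t + rng.normal(0.0, 15.0, t.size)    # millimetres

# Fit msl = c0 + c1*t + c2*t^2; the acceleration is 2*c2 (mm/yr^2).
X = np.column_stack([np.ones_like(t), t, t * t])
coef, *_ = np.linalg.lstsq(X, msl, rcond=None)

# Standard error of c2 from the usual OLS covariance estimate.
resid = msl - X @ coef
sigma2 = resid @ resid / (len(t) - X.shape[1])
se_c2 = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[2, 2])

accel = 2 * coef[2]
half_width = 1.96 * 2 * se_c2
print(f"acceleration = {accel:.4f} mm/yr^2,"
      f" 95% CI = ({accel - half_width:.4f}, {accel + half_width:.4f})")
# If the interval straddles zero, the record shows no statistically
# significant acceleration, which is the structure of Watson's finding
# for the U.S. tide-gauge records.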

Watson has another study, on European trends, for which the full text was available. He found no clear evidence of an acceleration trend in Europe either. He cited studies claiming that sea level rise and/or land subsidence will result in damage costing tens of billions of euros by 2080. The various graphs of what may happen make Mr. Mann’s hockey stick appear modest.

Future flooding from realistic sea level rise and land subsidence is a serious issue. But exaggerating the problem leads to loss of credibility and scorn for the entities that exaggerate. See links under Communicating Better to the Public – Make things up and Review of Recent Scientific Articles by CO2 Science.

*****************

The Mathematical Minister: Writing in The Telegraph, Rupert Darwall reports on an amazing comment concerning reliable electricity in the U.K. and how to stop price increases: “Greg Clark, the Business and Energy Secretary, is threatening government action whenever the markets are not working for consumers. As the Lords report makes clear, the real problem is government policy not working for consumers.

“In his evidence, Greg Clark claimed not to see conflict between security of supply, having more weather-dependent wind and solar, and cutting energy bills. Instead, he described it as an invitation “to see how we can solve them as simultaneous equations”. [Boldface added.]

“Pseudo-mathematical flannel can’t hide the fact that you can’t simultaneously maximise two variables, let alone three. You can’t maximise the amount of wind and solar while maintaining grid reliability and simultaneously drive electricity prices down. Something has to give – and it’s consumers who are picking up the rising bill for the Government’s Mission Impossible.”

Someday, simultaneous equations may successfully describe and forecast the erratic nature of sunshine and wind, but they certainly do not change that erratic nature. See link under Questioning European Green.

*****************

Number of the Week: Up 22%, down 33% and 13%. In the article mentioned above, Darwall reports that: “Between 2008 and 2015, the average electricity bill rose by 22pc [%]. Over the same period, the price of hydrocarbon fuels used by power stations fell sharply – coal down by 33pc and gas by 13pc. The rise in electricity prices is wholly attributable to government policies – and would be even higher if coal and gas prices had not fallen.”

According to the US Energy Information Administration, from 2008 to 2015 the average US residential electricity price increased 12%.

In 2009, the Democrats controlled the US Senate, yet the Senate did not pass President Obama’s cap-and-trade bill as a means of controlling CO2 emissions. The stated purpose of the bill was to address the dangerous global warming proclaimed in the IPCC’s Fourth Assessment Report (AR4, 2007). The Senators who stood up to the leadership should be thanked. Otherwise, Americans may be facing the same increasing electricity costs now faced by consumers in the U.K., whose Parliament passed the Climate Change Act of 2008. See link under Questioning European Green, Article # 2, and http://www.eia.gov/electricity/annual/html/epa_01_02.html

*****************

ARTICLES:

1. Draining the Regulatory Swamp

The Congressional Review Act is even better than we thought.

Editorial, WSJ, Feb 28, 2017

https://www.wsj.com/articles/draining-the-regulatory-swamp-1488328398

SUMMARY: After mentioning that some have criticized the current Congress as “do nothing”, the editorial states: “the House has already voted to repeal 13 Obama-era regulations, and President Trump signed his third on Tuesday. Now the GOP should accelerate by fully utilizing the 1996 Congressional Review Act.

“Republicans chose the damaging 13 rules based on a conventional reading of the CRA, which allows Congress to override regulations published within 60 legislative days, with simple (50-vote) majorities in both chambers. Yet the more scholars examine the law, which had only been used successfully once before this year, the clearer it is that the CRA gives Congress far more regulatory oversight than previously supposed.

“Spearheading this review is the Pacific Legal Foundation’s Todd Gaziano—who helped write the 1996 act—and the Heritage Foundation’s Paul Larkin. Their legal findings, and a growing list of rules that might be subject to CRA, are on http://www.redtaperollback.com.

“The pair argue, first, that the CRA defines “rule” broadly. The law relies on the definition in the Administrative Procedure Act, which includes any “agency statement” that is “designed to implement, interpret, or prescribe law or policy.” This includes major and minor rules as well as “guidance”—letters that spell out an agency’s interpretation of a law.

“This matters because President Obama’s regulators often ducked the notice and comment of formal rule-making by issuing “guidance” to act as de facto regulation. Examples include the guidance requiring transgender bathrooms in public schools, which the Trump Administration recently withdrew, or the 2011 guidance dictating how universities must handle sexual assault. The latter is ripe for CRA repeal.

“The second discovery is the law’s definition of when the clock starts on Congress’s time to review rules. The CRA’s opening lines require any agency promulgating a rule to present a “report” containing the rule’s text and definition. The CRA explains that Congress’s review period begins either on the date the rule is published in the Federal Register, or the date Congress receives the report—whichever comes later.

“Thus any rule for which any Administration (going back to 1996) failed to submit a report is fair game for CRA review and repeal. The Trump Administration can begin the clock merely by submitting a report to Congress.

“Our own search suggests past Administrations were fairly diligent about presenting reports for major rules. But a 2014 study by the Administrative Conference of the United States found at least 43 ‘major’ or ‘significant’ rules that had never been reported to Congress.

“The study estimated a further 1,000 smaller rules a year that agencies had failed to report. The study focused only on formal rules—not “guidance” that also requires a report to Congress under the CRA. Redtaperollback.com is offering tools so citizens can examine whether past rules have reports.

“A third discovery could be the most important. The opening words of the CRA read: ‘Before a rule can take effect’ the federal agency in question must submit a Congressional report. No one has tested the legal limits of this provision, but a fair reading suggests the Trump Administration could declare any rule for which a report has not been submitted to be null and void.

“The White House would be wise to start by simply directing federal agencies to catalog which rules have reports—and then devise a strategy with Congress. Some rules might deserve to stay on the books. Some bad rules might get reported to Congress for repeal under the CRA. Others could be declared null and void—which saves the trouble of formally reversing them. This last approach might appeal to Congressional Republicans who are fretting that a CRA crush is diverting them from health-care and tax reform.

“Democrats will howl in response to an aggressive use of the CRA, but the law was designed to impose penalties on agencies that failed to keep Congress informed. As Mr. Gaziano says, ‘the entire point of the CRA was to restore some minimal level of constitutional accountability over agencies that take a broadly worded statute as license to run wild.’

“The CRA is the most immediate tool Republicans have to reimpose democratic accountability on a lawless bureaucracy, and they should use it to the fullest.”

***************

2. The Carbon Tax Chimera

The Shultz-Baker proposal sounds better than it would work.

Editorial, WSJ, Feb 24, 2017

https://www.wsj.com/articles/the-carbon-tax-chimera-1487979109

SUMMARY: The editorial begins: “The climate may change but one thing that never does is the use of climate change as a political wedge against Republicans. Also never changing is the call from some Republicans to neutralize the issue by handing more economic power to the federal government through a tax on carbon. The risk is that Donald Trump takes up the idea, which would hurt the economy with little benefit to the environment.

“George Shultz and James Baker, the esteemed former secretaries of State, have joined a group of GOP worthies for a carbon tax and recently pressed the case in these pages. They propose a gradually increasing tax that would be redistributed to Americans as a “dividend.” This tax on fossil fuels would replace the Obama Administration’s Clean Power Plan and a crush of other punitive regulations. Energy imports from countries without a similar structure would face a tax at the border.

“A carbon tax would be better than bankrupting industries by regulation and more efficient than a “cap-and-trade” emissions credit scheme. Such a tax might be worth considering if traded for radically lower taxes on capital or income, or is narrowly targeted like a gasoline tax. But in the real world the Shultz-Baker tax is likely to be one more levy on the private economy. Even if a grand tax swap were politically possible, a future Congress might jack up rates or find ways to reinstate regulations.

“Another problem is the “dividend.” A carbon tax would be regressive, as the poor spend more of their income on gasoline and household energy. The plan purports to solve this in part by promising to return the tax to the American public. But the purpose of taxes is to fund government services, not shuffle money from one payer to another. No doubt politicians would take a cut to funnel into renewable energy or some other vote-buying program.

“The rebates would also become a new de facto entitlement with an uncertain funding future. A family of four would receive a $2,000 payout in the first year from a carbon tax, according to a report from the Climate Leadership Council, and that “amount would grow over time as the carbon tax rate increases.” But the point of taxing carbon is to emit less of it, and eventually revenues would decline as the tax rate rises. The public would then receive minimal or no help paying for energy the government made more expensive, and the progressives will try to make up the difference by raising other taxes.”

The editorial then discusses other alternatives, all of which raise the costs of energy to the consumer.

***************

3. Trump’s Clean Watershed

He orders the EPA to review Obama’s illegal waterways regulation.

Editorial, WSJ, Feb 28, 2017

https://www.wsj.com/articles/trumps-clean-watershed-1488328077

SUMMARY: The editorial states: “Speaking of deregulation (see nearby) [Article # 1], President Trump on Tuesday ordered the Environmental Protection Agency to reconsider an Obama Administration rule that seized control over tens of millions of acres of private land under the pretext of protecting the nation’s waterways. EPA chief Scott Pruitt will now follow due process to rescind one of his predecessor’s lawless rule-makings.

In 2015 the Obama EPA reinterpreted the Clean Water Act with a rule extending its extraterritorial claims to any creek, muddy farm field, ditch or prairie pothole located within a “significant nexus” of a navigable waterway. EPA defined significance broadly to include any land within the 100-year floodplain and 4,000 feet of land already under its jurisdiction, among other arbitrary delimitations.

Mr. Trump summed it up well, if not eloquently, when he said “it’s a horrible, horrible rule” and “massive power grab” that has “sort of a nice name, but everything else is bad.”

The rule would force farmers, contractors and manufacturers to obtain federal permits to put their property to productive use. After recent flooding in California, millions of more acres could come under EPA’s jurisdiction. Green groups could use the rule to block pipelines, housing projects or any development they don’t like. Farmers might be prohibited from using fertilizers that could flow downstream.

The Clean Water Act applies only to navigable waterways, but the EPA seized on the opening created by Justice Anthony Kennedy in the unfortunate 2006 Supreme Court case Rapanos v. U.S. that split 4-1-4. His controlling opinion invented the “significant nexus” standard that is a classic in judicial ambiguity and which the EPA used to expand government control over private property development.

The editorial then discusses the many lawsuits filed by states and other affected parties against the EPA and concludes: “EPA even acknowledged that the ‘science available today’ doesn’t support the regulation. Mr. Pruitt will be doing a national public service if he advises the Justice Department to withdraw the rule as an abuse of administrative power.”
