By Ken Haapala, President, The Science and Environmental Policy Project (SEPP), www.SEPP.org
Science or Dogma: In the 30 years between the 1979 Charney report to the National Academy of Sciences, which investigated the possible effects of increased carbon dioxide on the earth’s temperatures, and the 2009 EPA finding that carbon dioxide and other greenhouse gases endanger human health and welfare, government-funded climate studies have largely turned from empirical science to dogma – a belief system unsubstantiated by physical evidence.
The Charney panel included some of the nation’s best meteorologists and climate researchers, and the report recognized that laboratory tests demonstrated that the direct influence on global temperatures from doubling carbon dioxide would be minor – possibly unmeasurable.
The report also identified educated guesses – estimates – that the CO2 influence might be greatly enhanced by increases in water vapor – the dominant greenhouse gas. If correct, this positive feedback would greatly multiply any increase from CO2. The report recognized that the warming would occur in the atmosphere, and that we did not have comprehensive measurements of atmospheric temperatures. Thus, the hypothesis of significant atmospheric warming from increased water vapor could not be tested.
In March 1990, Science Magazine published a paper by Roy Spencer and John Christy describing a method of using data collected from NOAA polar orbiting weather satellites to comprehensively calculate atmospheric temperatures for virtually the entire globe, except for the extreme poles.
These data cover about 97 to 98 percent of the globe, including oceans, deserts, mountain ranges, jungles, etc. where there are few surface instruments. Initially, certain small errors in calculation were discovered, including orbital decay. These were acknowledged and corrected. This is how science advances.
These data, published monthly, are independently calculated by two other entities and are independently verified by four sets of weather balloon data using different instruments. The government-sponsored United Nations’ Intergovernmental Panel on Climate Change (IPCC), the US Global Change Research Program (USGCRP), and the EPA largely ignore the atmospheric data, which are far more comprehensive and better tested than the surface data.
Unfortunately, subsequent government-funded research went from properly testing the educated guesses (hypotheses) in the Charney Report to using them to create fear of global warming, now called climate change. Economically drastic programs and government policies have been justified based on these untested guesses.
From 1993 to 2016, the US government spent over $40 billion on what government entities classify as climate science – and this spending has produced no refinement of the 1979 Charney Report.
Independent scientists and climate researchers have produced far better estimates of the influence of CO2, based on empirical (scientific) observations. But, that research is not included in official government publications.
Public policies on energy and the environment should be based on the best available empirical science, not on incomplete studies, which have become dogma.
On March 29, the U.S. House Committee on Science, Space, and Technology held a hearing titled “Climate Science: Assumptions, Policy Implications, and the Scientific Method” featuring climate scientists John Christy, Judith Curry, Michael Mann, and Roger Pielke Jr., who recently left the field, in part because of abusive tactics by certain members of Congress. Comparing the written testimony of John Christy with that of Michael Mann provides a stark illustration of the difference between empirical science and scientific dogma. The testimony of Judith Curry and Roger Pielke Jr. will be discussed in a subsequent TWTW. See links under Challenging the Orthodoxy, Defending the Orthodoxy, and Seeking a Common Ground.
Quote of the Week. “There’s nothing wrong with ideas, with working hypotheses, but unsubstantiated glittering generalities are a waste of time.” — American geologist (John) David Love [H/t Lowell Ray Anderson]
Number of the Week: Over 20% per year
John Christy’s Written Testimony: The Professor of Atmospheric Science at the University of Alabama in Huntsville began his written statement with a summary of what constitutes empirical science and how it applies to the “official” science of the IPCC and its followers:
“’Science’ is not a set of facts but a process or method that sets out a way for us to discover information and which attempts to determine the level of confidence we might have in that information. In the method, a “claim” or “hypothesis” is stated such that rigorous tests might be employed to test the claim to determine its credibility. If the claim fails a test, the claim is rejected or modified then tested again. When the “scientific method” is applied to the output from climate models of the IPCC AR5, specifically the bulk atmospheric temperature trends since 1979 (a key variable with a strong and obvious theoretical response to increasing GHGs in this period), I demonstrate that the consensus of the models fails the test to match the real-world observations by a significant margin. As such, the average of the models is considered to be untruthful in representing the recent decades of climate variation and change, and thus would be inappropriate for use in predicting future changes in the climate or for related policy decisions.
“The IPCC inadvertently provided information that supports this conclusion by (a) showing that the tropical trends of climate models with extra greenhouse gases failed to match actual trends and (b) showing that climate models without extra greenhouse gases agreed with actual trends. A report of which I was a co-author demonstrates that a statistical model that uses only natural influences on the climate also explains the variations and trends since 1979 without the need of extra greenhouse gases. While such a model (or any climate model) cannot “prove” the causes of variations, the fact that its result is not rejected by the scientific method indicates it should be considered when trying to understand why the climate does what it does. Deliberate consideration of the major influences by natural variability on the climate has been conspicuously absent in the current explanations of climate change by the well-funded climate science industry.”
Christy proceeded to substantiate his assertions with physical evidence – no polls, beliefs, models, assumptions, etc. His testimony is a follow-up on prior testimony, such as the one on February 2, 2016, which has been cited numerous times in TWTW.
Using the Canadian Climate Model, Christy gives a pictorial representation of the so-called “hot-spot” where the modelers suggest the atmospheric warming should occur, centered over the tropics at about 10 km (33,000 feet), 250 to 200 mb of pressure. Christy outlines the area from the surface to 50,000 feet (15 km), making it clear where the pronounced atmospheric warming should occur, according to the modelers and the prevalent theory. By keeping his analysis below 50,000 feet, Christy avoids any confusion of the principal issue with stratospheric cooling, for which there is no generally accepted explanation.
Christy then shows that, in general, the global climate models (CMIP5), from 32 institutions, greatly overestimate the atmospheric warming where carbon dioxide (greenhouse gas) caused warming should occur. The number of simulations each institution contributes varies from one to eighteen. For the empirical data, Christy uses 3 different satellite datasets, 4 balloon datasets, and the average of 3 reanalysis datasets. The different types of datasets closely correspond with one another, in contrast to the average of the models, which greatly overestimates the observed warming.
In addition, Christy sought the advice of an econometrician, Ross McKitrick, who applied statistical tests to determine if the trends in the model time series and the observation time series are statistically different. They are, with a confidence greater than 99%. Very simply, the models fail to describe what is occurring.
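The kind of trend comparison described above can be illustrated with a back-of-the-envelope sketch. The code below is not McKitrick’s method – his published tests use autocorrelation-robust standard errors, and the trend values here are synthetic and purely illustrative – but it shows the basic idea of testing whether two fitted linear trends are statistically different:

```python
import numpy as np

def trend_with_se(y):
    """OLS trend per time step and its standard error for a 1-D series."""
    t = np.arange(len(y))
    slope, intercept = np.polyfit(t, y, 1)
    resid = y - (slope * t + intercept)
    # Classical slope standard error, assuming i.i.d. residuals
    se = np.sqrt(resid @ resid / (len(y) - 2) / ((t - t.mean()) @ (t - t.mean())))
    return slope, se

rng = np.random.default_rng(42)
months = np.arange(456)  # monthly values, 1979-2016

# Illustrative series: observations warm slowly, the model mean warms faster
obs = 0.0008 * months + rng.normal(0.0, 0.15, months.size)
model_mean = 0.0022 * months + rng.normal(0.0, 0.05, months.size)

b_obs, se_obs = trend_with_se(obs)
b_mod, se_mod = trend_with_se(model_mean)

# z-statistic for the difference between the two trends
z = (b_mod - b_obs) / np.hypot(se_obs, se_mod)
print(f"obs: {b_obs * 120:.3f} C/decade, model: {b_mod * 120:.3f} C/decade, z = {z:.1f}")
```

A z-value above about 2.58 rejects equal trends at the 99% level; with serially correlated monthly data the effective sample size is smaller than the nominal one, which is why formal tests rely on robust error estimates.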
Very interestingly, Christy reveals that buried in the Supplementary Material of Chapter 10 of the Fifth IPCC Assessment Report (AR5, 2013), without comment, are graphs that show the models overestimate atmospheric warming trends, particularly over the tropics. Christy was a reviewer of AR5 and insisted that the graphs be in the main text, but he was ignored. This is another example of how highly politicized the UN reports have become.
Christy simplifies the graphic and shows that the models better describe atmospheric temperature trends when the influences of greenhouse gases (GHGs) are eliminated than when they are included (Fig 5 of Christy’s Testimony).
“Incredibly, what Fig. 5 shows is that the bulk tropical atmospheric temperature change is modeled best when no extra GHGs are included – a direct contradiction to the IPCC conclusion that observed changes could only be modeled if extra GHGs were included.” [Boldface in original]
Christy describes the simple statistical model by Wallace, Christy, and D’Aleo, which outperforms the global climate models used by the IPCC. Then, he advocates the need for a government-funded “Red Team” to advocate opposing views, as is done in some government entities and private industry when critical issues are under consideration. [As a side note, the reports of the Nongovernmental International Panel on Climate Change (NIPCC) functioned as “red team” reports, but have been largely ignored or dismissed by “official” entities such as the EPA. See links under Challenging the Orthodoxy – NIPCC, Challenging the Orthodoxy, and Seeking a Common Ground.]
Michael Mann’s Written Testimony: The Distinguished Professor of Atmospheric Science at Penn State University states:
“at least 97% of scientists publishing in the field have all concluded, based on the evidence, that climate change is real, is human-caused, and is already having adverse impacts on us, our economy, and our planet.” John Cook et al 2016 Environ. Res. Lett. 11 048002
The Cook paper is a classic example of how to mislead the public with surveys: the survey’s authors themselves assigned classifications to scientists’ work, without the scientists’ consent.
Mann addresses the criticism of Tom Karl and the recent work at NOAA, Asheville:
“This fake news story was built entirely on an interview with a single disgruntled former NOAA employee, John Bates, who had been demoted from a supervisory position at NOAA for his inability to work well with others.”
Yet, many others have shown how NOAA, Asheville has distorted the historic temperature record in recent years.
“Bates’ allegations were also published on the blog of climate science denier Judith Curry (I use the term carefully—reserving it for those who deny the most basic findings of the scientific community, which includes the fact that human activity is substantially or entirely responsible for the large-scale warming we have seen over the past century— something Judith Curry disputes)” [Boldface added.]
In the subsequent oral testimony, Mr. Mann denied he called anyone a denier.
The above illustrates the thrust of the testimony, which can be summed up as accusations with little physical evidence.
Fittingly, Exhibit B of the testimony is the latest version of Mr. Mann’s hockey-stick, PAGES 2k, “published by a team of 78 scientists around the world using the most widespread paleoclimate database to date.” The graph superimposes instrument data on proxy data, yet the proxy data stop around 1990, failing to show any divergence between the two datasets.
Statistician Steve McIntyre of Climate Audit has written a number of highly critical comments on PAGES 2k, which included reconstructions that were inverted to give a false impression of warming where the actual proxy data showed cooling, along with various statistical tricks. See links under Seeking a Common Ground and https://climateaudit.org/?s=pages+2k
Remaining Issues: With the physical evidence presented by John Christy, it is past time to revisit the Charney Report, the EPA Endangerment Finding, and organizational statements regarding the influence of human carbon dioxide emissions on temperatures. However, in the highly politically charged atmosphere of Washington, which includes many scientific organizations, it is doubtful major change will take place quickly.
A great deal of government money went to support research on global warming, climate change, and, in particular, schemes to reduce the effects of global warming even if the causes were not known. Much of this money, and the vested interests of many organizations, are difficult to trace, and tracing them will take time. No doubt, the organizations that received much of this money will object to change.
Also, it is important to avoid dogma in the opposite direction, such as claiming that CO2 has no influence on temperatures. The IPCC ignored the influence of El Niños, their frequency, and intensity. An important question needs to be examined: are El Niños influenced by increased amounts of carbon dioxide? See links under Funding Issues and Article #1.
ICCC-12: The Twelfth International Conference on Climate Change ended on March 24. There were many interesting presentations. The videos, etc. of the conference are due to be available next week. TWTW will reserve comment on the presentations until they are available. Andy May wrote a solid summation of the conference, which was posted on Watts Up With That. See links under Challenging the Orthodoxy – ICCC-12.
SEPP’S APRIL FOOLS AWARD: THE JACKSON
SEPP is conducting its annual vote for the recipient of the coveted trophy, The Jackson, a lump of coal. Readers are asked to nominate and vote for who they think is most deserving, following these criteria:
· The nominee has advanced, or proposes to advance, significant expansion of governmental power, regulation, or control over the public or significant sections of the general economy.
· The nominee does so by declaring such measures are necessary to protect public health, welfare, or the environment.
· The nominee declares that physical science supports such measures.
· The physical science supporting the measures is flimsy at best, and possibly non-existent.
The five past recipients, Lisa Jackson, Barack Obama, John Kerry, Ernest Moniz, and John Holdren, are not eligible. Generally, the committee that makes the selection prefers a candidate with a national or international presence. The voting will close on July 30. Please send your nominee and a brief reason why the person is qualified for the honor to Ken@SEPP.org. Thank you. The award will be presented at the annual meeting of the Doctors for Disaster Preparedness in August.
Number of the Week: Over 20% per year. Mark Mills, who correctly tracked the shale revolution in oil, writes that productivity in shale oil – output per shale drilling rig – has been rising by more than 20% a year. Advanced computer software and data analysis are increasing productivity. See Article # 2.
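The arithmetic behind the Number of the Week: at a constant annual growth rate g, output doubles every ln 2 / ln(1 + g) years. A minimal check (the growth rates come from the article; the function name is ours):

```python
import math

def doubling_time(annual_growth):
    """Years for output to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth)

print(round(doubling_time(0.20), 2))  # ~3.8 years at exactly 20%/yr
print(round(doubling_time(0.22), 2))  # ~3.5 years, matching "every 3 1/2 years"
```

So rig productivity doubling “every 3½ years,” as quoted in Article #2, corresponds to a growth rate of roughly 22% a year – consistent with “more than 20%.”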
1. Trump’s Next Step on Climate Change
Reconsider the EPA’s labeling of carbon dioxide as a pollutant, based on now-outdated science.
By Paul Tice, WSJ, Mar 28, 2017 https://www.wsj.com/articles/trumps-next-step-on-climate-change-1490740870?cx_campaign=poptart&mod=cx_poptart#cxrecs_s
SUMMARY: The executive-in-residence at New York University’s Stern School of Business and a former Wall Street energy research analyst writes:
“The executive orders on climate change President Trump signed this week represent a step in the right direction for U.S. energy policy and, importantly, deliver on Mr. Trump’s campaign promise to roll back burdensome regulations affecting American companies. But it will take more than the stroke of a pen to make lasting progress and reverse the momentum of the climate-change movement.”
After a brief discussion of executive orders issued, the writer states:
“Because they don’t attack the climate-change regulatory problem at its root, Mr. Trump’s orders will not provide enough clarity to U.S. energy companies—particularly electric utilities and coal-mining companies—for their long-term business forecasting or short-term capital investment and head-count planning.
“To accomplish that, the Trump administration, led by EPA Administrator Scott Pruitt, needs to target the EPA’s 2009 “endangerment finding,” which labeled carbon dioxide as a pollutant. That foundational ruling provided the legal underpinnings for all of the EPA’s follow-on carbon regulations, including the CPP.
“It also provided the rationale for the previous administration’s anti-fossil-fuel agenda and its various climate-change initiatives and programs, which spanned more than a dozen federal agencies and cost the American taxpayer roughly $20 billion to $25 billion a year during Mr. Obama’s presidency.
“The endangerment finding was the product of a rush to judgment. Much of the scientific data upon which it was predicated—chiefly, the 2007 Fourth Assessment Report of the U.N.’s Intergovernmental Panel on Climate Change—was already dated by the time of its publication and arguably not properly peer-reviewed as federal law requires.”
After a summary of IPCC changes the writer states:
“An updated EPA endangerment finding based on an objective review of the latest available scientific data is warranted, along with a more sober discussion of the threat posed by carbon dioxide and other greenhouse gases to the “public health and welfare of current and future generations,” in the words of the original endangerment finding.
“As long as the 2009 finding remains on the books, it will provide legal ammunition for environmentalists, academics and state government officials seeking to sue the administration for any actions related to climate change, including this week’s executive orders.
“Issuing a new endangerment finding would be a bold move requiring thorough work, but the Trump EPA would be well within its legal rights to undertake such an updated review process. In Massachusetts v. EPA (2007), the Supreme Court ruled that the Clean Air Act gives the EPA the authority, but not the obligation, to regulate carbon dioxide and other greenhouse gases. The EPA needs to “ground its reasons for action or inaction” with “reasoned judgment” and scientific analysis.
“Addressing the 2009 endangerment finding head-on would show that Mr. Trump is serious about challenging climate-change orthodoxy. Thus far he has sent a mixed message, as demonstrated by this week’s ambivalence on CPP (reworking rather than repealing) and his administration’s silence on U.S. participation in the U.N.’s 2015 Paris Agreement.
“Simply standing down on regulatory enforcement, cutting government funding for climate-change research and stopping data collection for the next four years will not suffice. Ignoring the EPA’s 2009 endangerment finding would mean that it is only a matter of time before another liberal-minded occupant of the White House reasserts this regulatory power, bringing the country and the domestic energy sector back where Mr. Obama left them.”
2. Saudi Arabia Puts U.S. Energy Producers to a Test—and They Ace It
It’s entirely feasible for America to become a far bigger oil exporter, even one of the biggest.
By Mark P. Mills, WSJ, Mar 27, 2017
SUMMARY: The senior fellow at the Manhattan Institute writes:
“Only a few years ago America’s policy makers were wringing their hands about ‘peak oil’ and dependence on imported fuels. Now headlines feature the return of oil gluts. What happened? Saudi Arabia undertook a ‘stress test’ of America’s oil-and-gas industry that produced unintended consequences.
“We’re witnessing the first signs of a new normal in oil markets. Call it Shale 2.0, characterized by a potent combination: eager and liquid capital markets funding hundreds of experienced (now-lean) small to midsize companies that can respond to modest upticks in price with a velocity unseen in oil markets in eons—all using shale technology that is shockingly better than before and poised to keep improving.
“This year sees the U.S. not only filling storage tanks to the brim but also exporting more than a million barrels of crude oil a day. Exports are at the highest level in American history, twice the previous crude export peak in 1958. The U.S. is exporting more oil than five of the Organization of the Petroleum Exporting Countries’ 13 members.
“The stress test that brought this about began two years ago, when Saudi Arabia decided it would try to tame American shale oil and gas production. The technology of hydraulic fracturing, which began to emerge barely over a decade ago, led to the fastest and largest increase in hydrocarbon production in history.
“Oil prices started to collapse in 2014 because American shale businesses oversupplied markets. The Saudis responded by increasing production, which drove prices even lower. Their theory was that this would wreak havoc on small and midsize petroleum upstarts in states from Texas and Oklahoma to Pennsylvania and North Dakota.
“The fall from the $120-a-barrel stratosphere to under $30 did take a toll on producers everywhere. Businesses reduced investments and staffing, and many went bankrupt. It also deprived OPEC member states—and Russia, it bears noting—of hundreds of billions of dollars in revenues, forcing them to tap sovereign-wealth funds and cut domestic budgets.
“Something else happened. Little noticed outside the petroleum cognoscenti, shale technologies kept getting better. The productivity—output per shale drilling rig—has been rising by more than 20% a year. That means every 3½ years the average rig produces twice as much oil or gas. No other energy technology of any kind is improving at that rate. Put another way, the cost to produce shale oil keeps falling.
“As a result, with an assist from the recent modest increase in oil prices, shale investors and drillers are returning. Bad as that is for OPEC, the really frightening prospect is that software tools and techniques will now start to invade the shale domain, one of the least-computerized industrial sectors. ‘The cloud’ will be just as much of an economic accelerant for shale as it has been for other complex and distributed industries.
“Established tech companies such as Microsoft, IBM, Teradata and Splunk see the opportunity. The digital oilfield is also the animating logic of the huge merger of oil services giant Baker Hughes with General Electric’s ‘industrial internet’ and oil-and-gas business. Even more portentous, a new ecosystem of tech startups is chasing the prize of unlocking value in petabytes of untapped shale data.
“Venture capitalists like to talk about ‘unmet needs’ in ‘big markets.’ Oil is the world’s biggest market in a traded commodity, and America’s shale market went from near zero to $150 billion in a decade, largely without help from software.
“For the Saudis and other oil oligarchs, the worrisome feature of Shale 2.0 is that software enhances the most remarkable feature of shale production: velocity. The thousands of small to midsize shale operators and investors make rapid individual decisions, each involving a tiny fraction of capital per decision compared with the supermajors. This fluid, chaotic, very American entrepreneurial environment operates in private markets, largely on private land, and can expand or pull back with a volume and velocity unseen in oil markets in a century.
“Of course the U.S. still imports oil (for now), but net imports have declined by half. America is now the world’s biggest natural-gas producer and has become a net exporter. Other places can gush hydrocarbons into markets. But they’re all slow-moving, in some cases monopolistic, leviathans. As Ed Morse, Citi’s head of global commodities, recently observed, OPEC ‘has lost its clout.’”