The Week That Was: June 6, 2015 – Brought to You by www.SEPP.org
By Ken Haapala, President, Science and Environmental Policy Project
Climate and Health – USGCRP: As discussed in prior TWTWs (April 18, May 16, and May 31), the US Global Change Research Program (USGCRP) released a draft for public review of its upcoming Climate & Health Assessment. The entire document has significant issues: it is based on forecasts from climate models that have not been validated, it ignores the importance of public health measures in controlling infectious diseases, and it estimates deaths from extreme weather events, namely heat, that cannot be supported by mortality tables. These last findings are contradicted by a far more comprehensive study published in The Lancet, which shows that cold weather, not heat, kills about 20 times more people than hot weather. (TWTW May 31, 2015)
The chapter, Food Safety, Nutrition, and Distribution, is contrary to empirical evidence. It states:
“There are two overarching means by which increasing carbon dioxide (CO2) and climate change alter safety, nutrition, and distribution of food (see Figure ES6). The first is associated with rising global temperatures and the subsequent changes in weather patterns and extreme climate events. Current and anticipated changes in climate and the physical environment have consequences for contamination, spoilage, and the disruption of food distribution. The second is that higher concentrations of CO2 stimulate carbohydrate production and plant growth, but lower the levels of protein and essential minerals in a number of widely consumed crops, including wheat, rice and potatoes, with potentially negative implications for human nutrition.”
Since temperatures have not been rising for over a decade, projections of increasing temperatures from increasing carbon dioxide (CO2) are not valid. There is no logical reason to accept the assertion that models which failed to predict the pause, or plateau, in temperatures are capable of projecting future temperatures.
The claim that increased CO2 may decrease the nutritional value of foods may have some minimal merit, but only if other issues are ignored, which the USGCRP does. The 19th century discovery of biological nitrogen fixation, whereby bacteria convert atmospheric nitrogen into ammonium that plants can use to build edible food, was a great breakthrough for agriculture. The subsequent Haber process for artificial fertilizer production (early 20th century) was another great breakthrough for agriculture.
Critics of these processes claim that the nutrient density is reduced in foods produced by application of artificial fertilizer. But the processes have greatly increased overall food production and the nutrients available for humanity.
As to carbon dioxide, Climate Change Reconsidered II: Biological Impacts, by the Nongovernmental International Panel on Climate Change (NIPCC), cites thousands of studies, in the laboratory and in the field, that demonstrate that enhanced atmospheric carbon dioxide is a boon to the environment and humanity. The thirteen agencies of the USGCRP, which include the Department of Agriculture, choose to ignore this enormously important research. To term the government-funded USGCRP report as biased may be too moderate. See links under: Challenging the Orthodoxy – NIPCC and Defending the Orthodoxy.
Quote of the Week: “In God we trust, all others bring data” Motto of the Apollo Mission Evaluation Room engineers who supported Flight Operations http://www.therightclimatestuff.com/
Number of the Week: 500% in five years.
A Dud? For years, Science magazine has refused to publish articles from those who question the climate establishment, which claims that human emissions of greenhouse gases, namely CO2, are the primary influence on global warming/climate change. This view is the position of the current Administration.
One of the major problems with this view is that it cannot explain the current plateau, or pause, in warming, which began near the beginning of this century (using atmospheric data); using surface data, the pause has lasted about 18 years. Thomas Karl, the director of the National Oceanic and Atmospheric Administration’s National Climatic Data Center (NOAA-NCDC), and some of his colleagues have taken the issue on. They adjusted existing sea surface temperatures to give the appearance of a stronger warming trend over the past 15 years. Sea surface data are collected from several sources, including ships and, later, specially designed buoys. The data from buoys are considered more accurate than the data from ships, particularly data from the early part of the record.
The paper was published by Science magazine. To give journalists time to prepare a media push, the magazine sent out pre-publication notices embargoed (to be held privately) until June 4. The media push proceeded with certain journalists, but the effort also gave some of those skeptical of the views of the climate establishment an opportunity to prepare rebuttals.
As Ross McKitrick, co-author of the work that exposed the hockey-stick, stated, the Karl team increased the more recent sea surface temperatures by: 1) adding 0.12 degrees C to readings collected by buoys, ostensibly to make them comparable to readings collected by ships. As the authors note, buoy readings represent a rising fraction of observations over recent decades, so this boosts the apparent warming trend; 2) giving buoy data extra weight in the computations; and 3) adjusting post-1941 data collected from ships, in particular applying a large cooling adjustment to readings over 1998-2000.
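The arithmetic behind point 1 can be sketched with a few lines of Python. This is not the Karl et al. code, and the numbers other than the 0.12 degree C offset are invented for illustration; the point is only that adding a constant offset to one instrument type, whose share of observations grows over time, produces an apparent trend even when neither instrument records any warming.

```python
# Hypothetical illustration: a constant +0.12 C offset applied to buoy
# readings, whose share of observations rises over time, creates an
# apparent warming trend in the blended record even though both the
# ship and buoy series are perfectly flat.

def ols_slope(y):
    """Least-squares slope of y against time steps 0..n-1 (C per step)."""
    n = len(y)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(y) / n
    num = sum((x - mx) * (v - my) for x, v in zip(xs, y))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = 15                       # roughly the 15-year window at issue
ship = [0.0] * years             # flat ship record (no warming)
buoy = [0.0] * years             # flat buoy record (no warming)
# Assumed: buoy share of observations grows from 10% to 90%.
share = [0.1 + 0.8 * t / (years - 1) for t in range(years)]

def blended(buoy_offset):
    """Observation-share-weighted mix of ship and offset buoy readings."""
    return [s * (buoy[t] + buoy_offset) + (1 - s) * ship[t]
            for t, s in enumerate(share)]

print(ols_slope(blended(0.0)))   # no offset: slope is zero
print(ols_slope(blended(0.12)))  # +0.12 C offset: a positive slope appears
```

The same mechanism works in reverse: had the offset been subtracted from ship readings instead, the blended trend would not change this way, which is part of why critics ask why the superior data were adjusted toward the inferior.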
There is no logical reason for adjusting what most consider to be superior data to bring it in line with what most consider to be inferior data, and Karl et al. gave none. A team from the Cato Institute commented on this questionable adjustment and noted that the adjusted warming trend was still not statistically significant. Comparing the Karl et al. temperatures with those from other sources, particularly satellite temperatures, the Cato team (Michaels, Lindzen, & Knappenberger) also noted:
“If the Karl et al., result were in fact robust, it could only mean that the disparity between surface and mid-tropospheric temperatures is even larger than previously noted.
“Getting the vertical distribution of temperature wrong invalidates virtually every forecast of sensible weather made by a climate model, as much of that weather (including rainfall) is determined in large part by the vertical structure of the atmosphere.
“Instead, it would seem more logical to seriously question the Karl et al. result in light of the fact that, compared to those bulk temperatures, it is an outlier, showing a recent warming trend that is not in line with these other global records.”
Science magazine refused to publish Fred Singer’s rebuttal to its fawning review of Merchants of Doubt. Singer noted an amusing dilemma created by this re-worked data: what about the dozens of recent articles published by members of the Climate Establishment that try to explain away the missing heat, such as claiming it is hiding in the Southern Oceans (James Hansen)? The article has created a conflict in climate science among advocates who claim it is a settled science.
See Article # 1 and links under Defending the Orthodoxy – No Pause and Challenging the Orthodoxy – No Lull in Pause.
Atmospheric Temperatures v. Models: In his testimony before the US House Committee on Natural Resources on the draft guidance for greenhouse gas emissions by the President’s Council on Environmental Quality, John Christy of the University of Alabama in Huntsville presented a simple graph that most members of Congress should be able to understand. The graph clearly demonstrates the divergence between forecasts from global climate models and actual atmospheric temperatures. Of course, some misrepresent the importance of the graph with frivolous claims such as that people live on the surface, not in the atmosphere. However, the surface record includes human activity, such as building cities, irrigation, etc., that distorts the record. Generally, atmospheric data do not include such activities, and they measure the part of the atmosphere in which the greenhouse effect takes place. See links under Challenging the Orthodoxy.
RICO: Senator Sheldon Whitehouse of Rhode Island, who is part of the Congressional Witch Hunt to identify those skeptics who received funding from sources other than government (the Administration), has written that skeptics should be subject to investigation under the Racketeer Influenced and Corrupt Organizations Act, or RICO. The act was designed to fight organized crime.
For evidence, Mr. Whitehouse cites a report by Drexel University professor Robert Brulle, a 2013 paper published in Climatic Change. The web site of Climatic Change states it is an “Interdisciplinary, International Journal Devoted to the Description, Causes and Implications of Climatic Change.” The co-editors are M. Oppenheimer and G. Yohe, stalwart supporters of the UN Intergovernmental Panel on Climate Change (IPCC). It is precisely the highly questionable reports of the IPCC to which many skeptics object.
Given the enormous amount of US taxpayer money spent on climate change (over $35 billion since 1993 with no official progress in understanding the natural influences on climate change), accusations are not enough. Mr. Whitehouse needs to explain why fossil fuel interests are capable of thwarting his goal of preventing what he calls “carbon pollution” – emissions of life-giving CO2. A more appropriate direction for a RICO-type investigation would be the IPCC and those who fund it. See links under Suppressing Scientific Inquiry – The Witch Hunt, Suppressing Scientific Inquiry – The Witch Hunt – Push-Back, and http://www.springer.com/earth+sciences+and+geography/atmospheric+sciences/journal/10584
Social Cost of Carbon? Robert Murphy of the Institute for Energy Research (IER) gives an excellent summary of the recent work on the Social Cost of Carbon by Robert Pindyck, a professor of economics and finance at MIT. Pindyck is not a global warming skeptic, but he objects to the Integrated Assessment Models (IAMs) commonly used. The models are “so flexible that the researcher can get out any desired answer.”
“Another huge problem that most people would be shocked to learn is that there is neither theory nor data to back up the way these computer models relate a specified temperature increase to a predicted amount of damage. As Pindyck explains it:
“’One of the most important parts of an IAM is the damage function, i.e., the relationship between an increase in temperature and GDP (or the growth rate of GDP). When assessing [the climate’s sensitivity to emissions], we can at least draw on the underlying physical science and argue coherently about the relevant probability distributions. But when it comes to the damage function, we know virtually nothing – there is no theory and no data that we can draw from. As a result, developers of IAMs simply make up arbitrary functional forms and corresponding parameter values. [Pindyck 2015, bold added, footnotes removed.]
“Thus we see that all of the fancy computer models—including the three that the Obama Administration Working Group selected to estimate the “social cost of carbon”—rest on quicksand. Most policymakers, let alone the general public, have no idea how flimsy and arbitrary is the foundation upon which these computer simulations stand. This is what leads Pindyck to write: “I will argue that the use of IAMs to estimate the SCC [social cost of carbon] or evaluate alternative policies is in some ways dishonest, in that it creates a veneer of scientific legitimacy that is misleading.” Later in his paper Pindyck further writes that “the developers and users of IAMs have tended to oversell their validity, and have failed to be clear about their inadequacies.” Because of this overselling of the power of these models, Pindyck believes “[t]he result is that policy makers who rely on the projections of IAMs are being misled.”
In short, the IAMs are speculative models built upon speculative global climate models. Murphy concludes:
“Far from being optimally calibrated using an analysis of marginal costs and benefits—the way most economists describe it to the public—Pindyck openly admits that if the government wants to justify aggressive action against greenhouse gas emissions, it’s going to rely on a small group of experts simply making guesses about what should be done, in order to reduce the probability of vaguely defined catastrophes that even the experts admit probably won’t happen if governments do nothing. The case for aggressive government intervention keeps getting weaker and weaker, and yet the rhetoric against “deniers” continues to ratchet upward.” See links under Questioning the Orthodoxy.
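Pindyck’s point about arbitrary damage functions can be made concrete with a small sketch. Both functional forms and all coefficients below are hypothetical (the quadratic is loosely patterned on the DICE-style shape often cited in this literature); the purpose is only to show that two equally unsupported choices give wildly different damage estimates at the same warming.

```python
import math

# Hypothetical illustration of Pindyck's argument: with no theory or
# data constraining the "damage function" mapping warming to GDP loss,
# the modeler's choice of functional form largely determines the answer.

def quadratic_damage(dT, a=0.0023):
    """Fraction of GDP lost: a * dT^2 (a DICE-like quadratic; 'a' is assumed)."""
    return a * dT ** 2

def exponential_damage(dT, k=0.05):
    """Fraction of GDP lost: 1 - exp(-k * dT^2) (steeper at high dT; 'k' is assumed)."""
    return 1 - math.exp(-k * dT ** 2)

for dT in (2.0, 4.0, 6.0):
    print(dT, quadratic_damage(dT), exponential_damage(dT))
# At 6 C of warming the two "answers" diverge enormously:
# roughly 8% of GDP lost under the quadratic vs. roughly 83% under
# the exponential -- from the same temperature input.
```

Neither curve is more empirically grounded than the other, which is exactly Pindyck’s complaint: the social cost of carbon produced downstream inherits whichever shape the modeler happened to pick.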
Apollo Program: One of the latest ideas from international politicians planning for the meeting in Paris of the Conference of Parties of the UN Framework Convention on Climate Change (UNFCCC) is an Apollo program for renewable energy, with massive commitments in funding. The motto of the engineers and scientists at the Apollo Mission Evaluation Room was: “In God we trust, all others bring data.” Given the dislike that the IPCC and its parent, the UNFCCC, have exhibited toward data that are not compatible with their ideology, one can only imagine what the motto of this group would be. See links under: On to Paris!
Number of the Week: 500%. Energy researcher Mark Mills estimates the efficiency of capital expenditures on various types of energy production. He measures it in terms of energy output per unit of capital cost for the energy-producing hardware, using EIA data. According to his estimates, over the past five years efficiency gains for shale rigs are 500% per dollar spent on capital improvements; for solar cells, about 200%; and for wind turbines, about 125%. The US government heavily subsidizes solar cells and wind turbines, but not shale rigs. Efficiency gains do not measure two basic problems of solar and wind: their lack of reliability and consistency. See link under: Oil and Natural Gas – the Future or the Past?
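To make the metric concrete: a 500% efficiency gain means the hardware now delivers six times the energy output per capital dollar that it did five years earlier. The sketch below uses made-up output figures purely to illustrate the arithmetic; only the 500%/200%/125% gains come from Mills’s estimates.

```python
# Hypothetical illustration of Mills's metric: energy output per unit
# of capital cost, and the percentage gain over a period.  The input
# figures are invented for the example.

def gain_pct(output_per_dollar_start, output_per_dollar_end):
    """Percentage improvement in energy output per capital dollar."""
    return 100 * (output_per_dollar_end / output_per_dollar_start - 1)

# A rig that goes from 1 to 6 units of output per capex dollar has
# gained 500%, matching the shale-rig figure.
print(gain_pct(1.0, 6.0))  # 500.0
print(gain_pct(1.0, 3.0))  # 200.0 (the solar-cell figure implies a tripling)
```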