The Week That Was: 2019-01-26,Brought to You by www.SEPP.org
By Ken Haapala, President, Science and Environmental Policy Project
Observations or Theory? Last week’s TWTW discussed the weather engine, a process illustrated in a graph in the Kiehl and Trenberth paper on the “Earth’s Annual Global Mean Energy Budget.” Energy from the sun drives evaporation and evapotranspiration; the resulting water vapor rises in the atmosphere, then condenses into liquid water (or ice) in the upper troposphere, giving off latent heat centered at about 9 to 11 km (30,000 to 36,000 feet). This was the apparent source of the tropical “hot spot” used by climate modelers and the UN Intergovernmental Panel on Climate Change (IPCC) and its followers. It provided the argument in the 1979 Charney Report that an increase in water vapor would dramatically amplify the greenhouse effect of carbon dioxide (CO2).
The 2018 McKitrick and Christy paper demonstrated that 60 years of weather balloon data show no significant warming taking place over the tropics at pressures between 300 and 200 millibars (mb) (roughly 9 to 11 km). Without this component, the dire greenhouse gas warming predicted in government-supported publications is not occurring. The issue is not that the Kiehl and Trenberth model incorrectly describes the process, but that the process is not intensifying as estimated by the global climate modelers and as asserted in the Charney Report and by the IPCC.
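The correspondence between the 300 to 200 mb pressure levels and the quoted altitudes can be checked with the standard hypsometric relation. The sketch below uses an assumed column-mean temperature; the constants are textbook values, and the exact altitudes depend on the temperature profile, so the result is only approximate.

```python
# Hedged sketch: check that 300-200 hPa corresponds to roughly 9-12 km
# using the hypsometric equation. T_mean is an assumed column-mean
# temperature; values are illustrative, not taken from the papers cited.
import math

def pressure_to_altitude_m(p_hpa, p0_hpa=1013.25, t_mean_k=255.0):
    """Approximate geopotential height for a pressure level.

    H = (R_d * T_mean / g) * ln(p0 / p)
    R_d: gas constant for dry air; g: gravity; T_mean: assumed
    column-mean temperature.
    """
    R_d = 287.05   # J/(kg K), specific gas constant for dry air
    g = 9.80665    # m/s^2, standard gravity
    return (R_d * t_mean_k / g) * math.log(p0_hpa / p_hpa)

for p in (300.0, 200.0):
    print(f"{p:.0f} hPa ~ {pressure_to_altitude_m(p) / 1000:.1f} km")
```

With these assumed values, 300 hPa comes out near 9 km and 200 hPa somewhat above 11 km, consistent with the rough 9 to 11 km range quoted in the text.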
The absence of a “hot spot” creates a major issue among those examining climate science. Which dominates in science – observations or theory? Put differently, one may accept that a model describes a natural process statically, with no other changes; but that does not mean the model describes the process dynamically, as other changes occur. If the model does not describe the process dynamically, it has no predictive skill. The predictive skill of a model is determined by verification and validation. Those questioning the predictive skill of a model are not questioning “the science” of the model’s description, but the model’s predictive ability.
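The notion of predictive skill can be made concrete with a toy calculation. All numbers below are invented for illustration: a model is commonly scored against held-out observations relative to a naive baseline such as climatology, and a score at or below zero means the model predicts no better than the baseline.

```python
# Minimal sketch of a predictive skill score (all numbers invented for
# illustration): compare a model's errors against a naive baseline
# (here, a constant climatological value) on held-out observations.

def skill_score(observed, predicted, baseline):
    """1 - MSE(model)/MSE(baseline); > 0 means the model beats the baseline."""
    mse = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return 1.0 - mse(observed, predicted) / mse(observed, baseline)

obs         = [0.10, 0.12, 0.09, 0.11, 0.13]  # hypothetical observed trends
model       = [0.28, 0.30, 0.27, 0.29, 0.31]  # hypothetical model projections
climatology = [0.11] * len(obs)               # naive constant baseline

print(f"skill vs climatology: {skill_score(obs, model, climatology):+.2f}")
```

In this invented example the model systematically overshoots the observations, so its skill score is strongly negative: describing the process is not the same as predicting it.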
In No Tricks Zone, Pierre Gosselin posted a film by Dutch filmmaker Marijn Poels. About 1 hour 9 minutes into the film, Poels interviews renowned theoretical physicist and mathematician Freeman Dyson, who addressed this issue. Dyson asserted that the models are wrong if they disagree with observations:
“Nature should be the deciding voice who is right and who is wrong.”
As John Christy and his team at University of Alabama in Huntsville (UAH) have repeatedly asserted, the models overestimate the warming of the atmosphere by 2.5 to 3 times. There is no reason to accept long-term forecasts / projections / predictions from such models. If anyone is “anti-science,” it is those who accept theory over observations.
Among other comments, not necessarily in order and not exact quotes, Dyson states that:
· Unfortunately, the global warming issue has become too political with strong political dogmas on both sides.
· There are two types of information: observations and theories. Generally, you can believe the observations, observations of greening of the planet are clear; but, the theories of climate are confused.
· Climate models are a very good tool for understanding climate, but a very bad tool for predicting climate.
The next TWTW will begin exploring what may be a better model for predicting the influence of increasing CO2 on climate than the current, commonly accepted climate models. Note: reader Christopher Game correctly objected to the use of the additional term, “heat engine,” to describe the “weather engine” process. Although it is frequently used in the literature, TWTW will avoid using it. See links under Challenging the Orthodoxy and Defending the Orthodoxy.
A Karl Disclaimer? Last week’s TWTW discussed a new paper on ocean heat content by Cheng, L.J. et al., including 3 US co-authors. The discussion included severe criticisms by Roy Spencer, Judith Curry, and David Whitehouse. Although it was highlighted, TWTW did not discuss an interesting note accompanying the paper:
“We appreciate Tim BOYER from NOAA/NCEI to provide the in-situ ocean observations from NOAA/NCEI that were used in this analysis.”
As linked but not discussed in TWTW, the Department of Commerce engaged MITRE, an independent not-for-profit entity, to objectively assess the processes used to develop and publish the so-called Karl Study, “Possible Artifacts of Data Biases in the Recent Global Surface Warming Hiatus,” published in Science Magazine on June 26, 2015. Karl retired from government service in 2016. Although released to the public last week, the report was finished in July 2018. The report is titled: “Assessment of National Oceanic and Atmospheric Administration Scientific Integrity Policies and Procedures: As Applied to the 2015 Dr. Thomas Karl, et al., Science Paper: ‘Possible Artifacts of Data Biases in the Recent Global Surface Warming Hiatus.’”
The MITRE report found no wrongdoing by the authors of the Karl Study, which claimed there was no “global warming pause” between 1998 and 2015. Of course, the surface data contradict the atmospheric data, which are far more comprehensive and show a pause.
“However, the MITRE Committee determined that the 2013 NOAA guidelines were ambiguous and not clear on when the more stringent agency IQA [Information Quality Act] review requirements should apply. In 2016, NOAA updated the Framework for Internal Review and Approval to reference the OMB exemption. While this new language is an improvement over the 2013 NOAA guidance, it should be written more clearly and presented more prominently in the guidance.”
“Because the NOAA officials knew in advance of publication that the paper would be influential, impactful, and controversial, the authors of the Karl Study should have included a disclaimer in the paper to indicate that the views expressed in the paper represented the opinions of the authors and, as indicated with the guideline suggested in the NOAA Framework for Internal Review and Approval, that it ‘did not necessarily reflect the views of NOAA or the Department of Commerce.’”
John Bates is the retired NOAA employee whose concerns prompted the investigation. Similar to Bates’s objections, TWTW had significant problems with the reconstructions of surface temperature trends embodied in the study. Of special concern are the poor quality and coverage of the land surface data used prior to 1930, extending back to 1850; and the use of sea surface temperature databases that were compiled using vastly different measurement techniques – buckets, ship intakes, and Argo free-floating buoys. As stated, MITRE found no wrongdoing, and the confidence intervals used were consistent with IPCC standards.
Apparently, IPCC standards of not using atmospheric temperature data are accepted in the US government even though the US entities, such as the US Global Change Research Program (USGCRP), have the Congressional mandate to consider both human and natural causes of climate change. The IPCC has no requirement to consider natural causes of climate change.
The comment in the Cheng et al. paper “We appreciate Tim Boyer from NOAA/NCEI to provide the in-situ ocean observations from NOAA/NCEI that were used in this analysis” prompts questions. Were the data the same as in the Karl Study? Were the Chinese co-authors informed of the MITRE report and its recommendations, such as that the data and any associated paper “represented the opinions of the authors” and “did not necessarily reflect the views of NOAA or the Department of Commerce”? There may be a new disclaimer needed when government employees publish “independent” papers. See links under Seeking a Common Ground.
Another Look: Ostracized by the global warming tribe, Judith Curry is taking another look at the global warming that occurred prior to 1950. On Climate Etc. she covers “some fine research on multi-decadal to millennial scale internal variability by Xie et al., Kravtsov, Huybers et al., Meehl et al. and others. And it all gets ignored by the circular reasoning of formal attribution studies.” Then she reflects:
“In order to have any confidence in the IPCC and NCA [National Climate Assessment] attribution statements, much greater effort is needed to understand the role multi-decadal to millennial scales of internal climate variability.
“Much more effort is needed to understand not only the early 20th century warming, but also the ‘grand hiatus’ from 1945-1975. Attempting to attribute these features to aerosol (stratospheric or pollution) forcing haven’t gotten us very far. The approach taken by Xie’s group is providing important insights.
“Once we do satisfactorily explain these 20th century features, then we need to tackle the 19th century — overall warming, with global sea level rise initiating ~1860, and NH [Northern Hemisphere] glacier melt initiating ~1850. And then we need to tackle the last 800 years – the Little Ice Age and the ‘recovery’. (See my previous post 400 years(?) of global warming). The mainstream attribution folk are finally waking up to the importance of multidecadal ocean oscillations — we have barely scratched the surface re understanding century to millennial scale oscillations, as highlighted in the recent Gebbie and Huybers paper discussed on Ocean Heat Content Surprises.
“There are too many climate scientists that expect global surface temperature, sea ice, glacier mass loss and sea level to follow the ‘forcing’ on fairly short time scales. This is not how the climate system works, as was eloquently shown by Gebbie and Huybers. The Arctic in particular responds very strongly to multidecadal and longer internal variability, and also to solar forcing.
“Until all this is sorted out, we do not have a strong basis for attributing anything close to ~100% of the warming since 1950 to humans, or for making credible projections of 21st century climate change.” See links under Challenging the Orthodoxy.
By 2020? The U.S. Energy Information Administration, specializing in independent statistics and energy analysis, released its annual energy outlook for 2019. It contains assertions that would have been considered impossible a few years ago. These include:
“In the Reference case, U.S. crude oil production continues to set annual records through the mid-2020s and remains greater than 14.0 million barrels per day (b/d) through 2040.
“The United States becomes a net energy exporter by 2020. In the Reference case, the United States becomes a net exporter of petroleum liquids in 2020 as U.S. crude oil production increases and domestic consumption of petroleum products decreases. The United States continues to be a net exporter of natural gas and coal (including coal coke) through 2050.
“U.S. net exports of natural gas continue to grow as liquefied natural gas becomes an increasingly significant export. In the Reference case, U.S. liquefied natural gas (LNG) exports and pipeline exports to Canada and to Mexico increase until 2030 and then flatten through 2050 as relatively low, stable natural gas prices make U.S. natural gas competitive in North American and global markets.”
Forty years ago, supported by “state-of-the-art” computer models, most “experts” in Washington believed the nation would run out of oil and natural gas within a few years. Federal energy policy was based on this belief. Fortunately, independent oil and gas producers began to challenge it. The remarkable turnaround demonstrates the folly of governments basing policy on models that are not rigorously tested. See links under Energy Issues – US.
Amplifications and Corrections: Reader Clyde Spencer alertly noted a blunder in last week’s TWTW. It stated: “Ice melting in a water glass is another example of a phase change: one that releases energy.” The correct statement is: Ice melting in a water glass without changing the temperature of the water is another example of a phase change: one that absorbs energy from the surrounding atmosphere, if it is warmer. TWTW thanks those who take the time to correct such errors.
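The energy bookkeeping behind that correction can be checked with textbook constants. The sketch below uses the standard latent heat of fusion of ice and specific heat of liquid water; the 50 g ice cube is an invented example for scale.

```python
# Back-of-envelope check of the phase-change correction: melting ice
# absorbs latent heat with no change in temperature. Constants are
# standard textbook values; the 50 g ice cube is an invented example.
L_FUSION = 334_000.0   # J/kg, latent heat of fusion of ice
C_WATER = 4186.0       # J/(kg K), specific heat of liquid water

ice_mass_kg = 0.05     # a 50 g ice cube
q_melt = ice_mass_kg * L_FUSION
print(f"energy absorbed to melt 50 g of ice: {q_melt / 1000:.1f} kJ")

# For scale: the same energy would warm 1 kg of liquid water by
delta_t = q_melt / (1.0 * C_WATER)
print(f"equivalent warming of 1 kg of water: {delta_t:.1f} K")
```

The point of the correction survives the arithmetic: a modest 50 g of melting ice absorbs roughly 16.7 kJ, enough to cool a kilogram of water by about 4 K, all without the ice itself changing temperature.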
Number of the Week: 250 Million MW short. According to Jo Nova, on January 25 the Energy Minister for the State of Victoria, Australia, Lily D’Ambrosio, announced “there would ‘absolutely’ be no blackouts this morning … the rolling blackouts started 90 minutes later.” “More than 200,000 Victorian households had their power cut off …in a bid to protect the state’s energy system from shutting down.” Of course, the politicians responsible for replacing power plants using fossil fuels with wind and solar demanded the public “be reasonable.” The blackout included the state capital, Melbourne, where historic records show it has been as hot or hotter about 50 times since 1855.
This event is another example of how unreliable and unpredictable wind and solar are in generating electricity. Many politicians are so infected with green propaganda that they have lost all sense of reason. Should they be called “green zombies?” See links under Energy Issues – Australia.
1. What Utilities Can Do to Strengthen the Grid
Bolstering power networks against extreme events can require billions of dollars but some utilities are taking smaller measures
By Erin Ailworth, WSJ, Jan 22, 2019
[SEPP Comment: Misses the most obvious preventive measure: aggressive fire-fighting with removal of forest litter and new growth vegetation, “fine fuels”, to create defensible spaces.]
SUMMARY: The author discusses the following: 1) stronger utility poles, made of concrete, steel, or fiberglass, which are less fire-prone; 2) insulating power lines; 3) burying power lines, which is expensive and often impossible; 4) adding sensors and reclosers to alert the utility of disruptions; 5) tighter weather monitoring; and 6) use of wildfire cameras.
2. Vision Zero, a ‘Road Diet’ Fad, Is Proving to Be Deadly
Emergency vehicles get stuck on streets that have been narrowed to promote walking and bicycling.
By Christopher D. LeGras, WSJ, Jan 18, 2019
SUMMARY: Beginning with an accident in which first responders had difficulty reaching the person, the attorney continues:
“Los Angeles, like cities nationwide, is transforming its streets. In July 2017 the city installed a “road diet” on a 0.8-mile stretch of Venice Boulevard in Mar Vista, reducing four lanes to two and adding bike lanes separated from traffic by parking buffers. The project is part of Mayor Eric Garcetti’s Vision Zero initiative, which aims to eliminate traffic fatalities in the city by 2025. Launched in 2015, Vision Zero is the most radical transformation of how people move through Los Angeles since the dawn of the freeway era 75 years ago.
“By almost any metric it’s been a disaster. Pedestrian deaths have nearly doubled, from 74 in 2015 to 135 in 2017, the last year for which data are available. After years of improvement, Los Angeles again has the world’s worst traffic, according to the transportation research firm Inrix. Miles of vehicles idling in gridlock have reduced air quality to 1980s levels.
“The international Vision Zero movement began in the 1990s in Sweden, where it apparently worked well. The Swedish government claims a 50% reduction in traffic deaths since 2000. Hoping to achieve similar gains, U.S. mayors from New York City to North Pole, Alaska, have adopted Vision Zero. Projects range from multibillion-dollar light-rail lines to retiming traffic lights for slower traffic. Road diets are key.
“In neighborhoods across New York City, residents, community boards and local businesses have done battle with city officials over “traffic calming” measures imposed by city hall. Lane reductions, bike lanes, new medians and other innovations designed to reduce vehicle speeds make it difficult for bulky ambulances and fire trucks to respond quickly to emergencies. And while pedestrian deaths have plummeted in the Big Apple under Vision Zero, deaths of bicyclists, motorcyclists and people in vehicles have ticked up.
“Around the country, officials have implemented projects on short notice, over local objections and without consulting first responders. Howard Holt, a fire captain in Oakland, Calif., said he found out about a road diet in front of his station when he arrived for a shift one morning. “I wasn’t sure if I was supposed to drive in the new green lanes,” he said recently. “Turns out they’re bike lanes.” He calls the city bureaucracy “The Wall.”
“During the 2017 La Tuna Fire, the biggest in Los Angeles in half a century, a road diet on Foothill Boulevard in the Sunland-Tujunga neighborhood bottlenecked evacuations. After the fire a neighborhood association voted to go off the road diet. The city ignored the request and instead added another one to La Tuna Canyon Road.
“The story isn’t confined to big cities. In Waverly, Iowa (pop. 9,837), Fire Chief Dennis Happel and Bremer County Sheriff Dan Pickett say the city has ignored their concerns over a road diet plan. In Fairbanks, Alaska, Fire Battalion Chief Brian Davis says the city installed traffic controls to mitigate the impact of new bike lanes in front of his fire house. In January the average high temperature in Fairbanks is zero Fahrenheit—much too cold to ride a bike.
“It’s noble to want to make America’s streets as safe as they can be. But government officials shouldn’t impose projects on communities that don’t work, inconvenience residents, hurt businesses and impede emergency responders in the process.”