Weekly Climate and Energy News Roundup #333

Brought to You by www.SEPP.org, The Science and Environmental Policy Project

By Ken Haapala, President

Biased: TWTW has been accused of being biased. It is biased against speculative ideas being used to justify far-reaching government policy, particularly if the primary support for these ideas is complex mathematical models that have not been validated. Politicians and the public are often overwhelmed by such models, even though the models may contain significant omissions and logical errors. Government policies based on speculative thinking can be harmful to the economy and to humans.

Over the past two weeks, TWTW discussed significant problems with the reports of the UN Intergovernmental Panel on Climate Change (IPCC) and its followers, such as the US Global Change Research Program (USGCRP). Physicist Richard Lindzen brought up two: the climate system is unrealistically oversimplified, and the global climate models fail to address critical issues regarding clouds and water vapor. Water vapor is by far the dominant greenhouse gas.

As linked in the last TWTW, scientists Jock Allison and Thomas Sheahen (Sheahen is Chairman of SEPP) discuss how the IPCC’s failure to properly address water vapor results in overestimating the warming influence of trace greenhouse gases such as methane and nitrous oxide. Further, they assert that carbon dioxide (CO2) has a limited influence on temperatures, as demonstrated in many laboratory tests showing that the relationship between CO2 concentrations and temperatures is logarithmic. Increasing concentrations of CO2 after the beginning of the industrial period therefore have little influence on temperatures, contrary to what the IPCC claims.
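
As a rough illustration of what a logarithmic relationship implies, the sketch below uses the widely cited simplified forcing expression ΔF ≈ 5.35 ln(C/C0) W/m² (Myhre et al., 1998). The expression and the concentration steps are illustrative assumptions, not figures taken from Allison and Sheahen’s paper, and the formula is calibrated for concentrations near modern levels; the point is only that each successive increment of CO2 adds less than the one before.

```python
import math

def delta_forcing(c_new_ppm, c_old_ppm):
    """Change in CO2 radiative forcing (W/m^2) using the simplified
    expression dF = 5.35 * ln(C/C0) (Myhre et al., 1998)."""
    return 5.35 * math.log(c_new_ppm / c_old_ppm)

# Successive 20 ppm steps: each step adds less forcing than the one before,
# even though the cumulative effect keeps growing.
for start in (20, 40, 60, 80):
    print(f"{start:>3} -> {start + 20:>3} ppm: +{delta_forcing(start + 20, start):.2f} W/m^2")

# The same diminishing-returns behavior near modern concentrations.
print(f"280 -> 400 ppm: +{delta_forcing(400, 280):.2f} W/m^2")
print(f"400 -> 520 ppm: +{delta_forcing(520, 400):.2f} W/m^2")
```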

Global climate modelers have attempted to get around the laboratory results by invoking various devices, such as the assertion that the limited influence of CO2 will be greatly amplified by an increase in water vapor, a claim made in the 1979 Charney Report. However, such amplified warming cannot be identified in the physical record of atmospheric temperature trends. The net result is that all but one of the models used by the IPCC greatly overestimate greenhouse gas warming of the atmosphere. The exception is the model developed by the Institute of Numerical Mathematics in Moscow. See links under Challenging the Orthodoxy and the past two TWTWs.

****************

Erroneous Governmental Policies: The EPA and the US National Highway Traffic Safety Administration (NHTSA) requested comments on freezing Corporate Average Fuel Economy (CAFE) and tailpipe carbon dioxide emission standards for passenger cars and light trucks at the levels proposed for model year (MY) 2020. The freeze would cover model years 2021 through 2026. CAFE standards were first implemented during the Carter Administration, when many in Washington were convinced the world was about to run out of oil and the US out of natural gas. Online comments to EPA and NHTSA are limited to 5,000 characters.

This was an opportunity to comment on two government policies that are not supported by current physical evidence. Ken Haapala responded (in part):

“CAFE standards were conceived when US scientists and policy makers were operating under two erroneous concepts, both based on speculation supported by faulty mathematical models. These concepts instilled two fears, both of which are being shown to be false: 1) the world is about to run out of oil and natural gas, and 2) greenhouse gases, especially carbon dioxide, are causing dangerous global warming.

“Speculation is an important part of advancing science. Without speculation science becomes stagnant. But speculation must be tested against all available relevant evidence (data) before it is accepted.

“Similarly, mathematical models are valuable scientific tools, but they must be constantly tested against all available, relevant data (evidence) for their use to be justified.

“Policies on CAFE standards that previously made sense under past concepts and fears no longer make sense because the underlying rationale no longer applies. We have data (hard evidence) that contradict the fear that the world is about to run out of oil and natural gas, and data (hard evidence) that contradict the fear that carbon dioxide-caused global warming will be dangerous. The evidence in each case contradicts the mathematical models supporting these fears.”

After a brief discussion of errors in oil and natural gas modeling, the comments continue with:

“Modern automobiles emit little in their exhaust except water vapor and carbon dioxide, gases essential for life on the planet as we know it. In the US, criteria pollutants are no longer an issue. Further, modern technology in hydraulic fracturing and directional drilling permits exploration and development of oil and gas resources long thought to exist but considered impossible to extract. These abundant resources include the vast US quantities in dense shale and deep-water reserves in the Gulf of Mexico and elsewhere. There is no foreseeable danger of ‘running out’ of oil and gas.

“Following my predecessor, I have spent over 10 years exploring the fears that greenhouse gases, mainly carbon dioxide, are causing dangerous global warming. These fears are as misplaced as the fears that the world is about to run out of oil. Since the 1860s, scientists have speculated as to the extent to which atmospheric gases cause the planet to be warmer (at night) than it would be otherwise. It wasn’t until the 1920s that research laboratories began systematic tests of the effectiveness of atmospheric gases in absorbing radiant energy, both incoming (from the sun) and outgoing (from the earth to space in the form of infrared energy). By far, water vapor is the dominant greenhouse gas, accounting for the earth being some 60ºF warmer than it would be otherwise. Water vapor is present in the atmosphere in varying concentrations, ranging from 0.01% (polar regions) to 4% (tropical regions) of total atmospheric gases.

“By comparison, carbon dioxide is a bit player whose influence is quickly exhausted. Its concentration in the atmosphere is about 0.04%. Carbon dioxide quickly loses its effectiveness as a greenhouse gas. The effect is logarithmic: the first 20 parts per million (0.002% of the atmosphere) have an appreciable effect on temperatures, the next 20 ppm considerably less (though the total effect still increases), and so on. Even at pre-industrial levels (about 0.028% of the atmosphere), the effect of increasing carbon dioxide on temperatures is tiny.

“Strangely, the government entities reporting on the influences of greenhouse gases ignore water vapor and emphasize the bit player carbon dioxide. These entities rely on speculation in 1979 and 1981 reports published by the National Academy of Sciences. These reports speculated that the modest warming from carbon dioxide shown in laboratory experiments will be greatly amplified by water vapor or by a broadening of the energy absorption range of carbon dioxide. At the time there were no comprehensive measurements of atmospheric temperature trends to test this speculation.

“Today, we have almost 40 years of comprehensive atmospheric data showing that the rising temperature trends are very modest, and the components of the speculated amplification are not occurring. Nature is contradicting the US global climate models. Further, there has been no statistically significant rise in atmospheric temperatures for almost 20 years.

“There is no logical reason to ‘tighten’ CAFE standards because such ‘tightening’ will be costly and without any economic or health benefit. Using a predicted future warming from carbon dioxide to justify intensification of the CAFE standards is like claiming an intensification is necessary because the world is about to run out of oil and natural gas. Keep the MY 2020 standards until it becomes obvious they are not necessary.”

All the comments can be viewed at https://www.regulations.gov/docketBrowser?rpp=25&so=DESC&sb=commentDueDate&po=0&dct=PS&D=EPA-HQ-OAR-2018-0283

****************

Quote of the Week: “A philosopher should be a man willing to listen to every suggestion but determined to judge for himself. He should not be biased by appearances, have no favourite hypothesis, be of no school and in doctrine have no master …. Truth should be his primary object. If these qualities be added to industry, he may indeed hope to walk within the Veil of the temple of nature.” – Michael Faraday [H/t George Hacken]

Number of the Week: €424 million in ten years ($485 million at current conversion rates)

****************

Bureaucratic Science, Linear No Threshold Model: The lack of data supporting the Linear No Threshold (LNT) model, commonly used by regulators such as the EPA, is becoming a major issue. Toxicologists have long used a dose-response model. Establishing a toxic level for any substance requires experimentation, which bureaucratic officials try to avoid. The LNT model was developed by Hermann Muller, who received the 1946 Nobel Prize in Physiology or Medicine for his work on the biological effects of radiation. Muller asserted that even a small amount of radiation can be harmful, causing lasting mutations and cancer. This assertion is apparently contradicted by experience, which shows that small amounts of radiation may be beneficial and that people living in parts of the world with relatively high natural radiation, such as areas of Iran, show no ill effects.
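
For readers unfamiliar with the terminology, the minimal sketch below contrasts the two competing dose-response assumptions in their simplest functional form. The slope and threshold are arbitrary illustrative values, not numbers from Calabrese’s paper or any regulation; a hormesis model would go further and assign a benefit (negative added risk) at low doses.

```python
def lnt_risk(dose, slope=0.01):
    """Linear no-threshold: every increment of dose adds proportional risk,
    no matter how small the dose (illustrative slope, arbitrary units)."""
    return slope * dose

def threshold_risk(dose, threshold=100.0, slope=0.01):
    """Threshold dose-response: no added risk until the dose exceeds a
    threshold (illustrative threshold, arbitrary units)."""
    return slope * max(0.0, dose - threshold)

for dose in (1, 10, 100, 1000):
    print(f"dose {dose:>4}: LNT {lnt_risk(dose):6.2f}   threshold {threshold_risk(dose):6.2f}")
```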

Patrick Michaels reports on a new paper by toxicologist Edward Calabrese, who has been searching for the data supporting Muller’s early work on fruit flies. Muller claimed X-rays caused inheritable mutations in these insects and cited the work to support his claims for the LNT model of radiation. Calabrese reports: “He [Muller] made this gene mutation claim/interpretation in an article that discussed his findings but failed to include any data.” Calabrese’s entire conclusion bears quoting:

“This historical analysis suggests that Muller deliberately avoided peer-review on his most significant findings because he was extremely troubled by the insightful and serious criticism of Altenburg [Muller’s colleague], which suggested he had not produced gene mutations as he claimed. Nonetheless, Muller manipulated this situation (i.e., publishing a discussion within Science with no data, publishing a poorly written non-peer reviewed conference proceedings with no methods and materials, and no references) due to both the widespread euphoria over his claim of gene mutation and confidence that Altenburg would not publicly challenge him. This situation permitted Muller to achieve his goal to be the first to produce gene mutations while buying him time to later try to experimentally address Altenburg’s criticisms, and a possible way to avoid discovery of his questionable actions.”

Regulators such as the EPA have applied the LNT model to stifle criticism of their regulations. One example pertains to asbestos, a generic term for several naturally occurring silicate minerals with different physical properties. Thanks to sloppy definitions, useful forms such as chrysotile, used for building insulation and still mined in Canada, have “been linked” to causing cancer without hard, direct evidence. No doubt Calabrese will be severely attacked for reporting that the LNT model is another hollow model with no supporting hard evidence. See links under Challenging the Orthodoxy.

****************

More Subsidies Please: Energy expert Donn Dears provides some interesting calculations demonstrating the difference in energy efficiency between battery-powered electric vehicles (BEVs) and vehicles with internal combustion engines (ICEs). He finds that electric vehicles have better energy efficiency and should gradually replace gasoline vehicles if his assumptions bear out and if both types offer comparable value to consumers. Comparable value requires that battery prices and charging times come down.
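
A back-of-the-envelope sketch of the kind of comparison involved is shown below. The efficiency figures are round-number assumptions for illustration only; they are not Dears’s inputs or results, and real values vary widely by vehicle, fuel, and electricity source.

```python
# Assumed round-number efficiencies (illustrative only).
REFINING_AND_DISTRIBUTION = 0.85   # assumed well-to-tank efficiency for gasoline
ICE_TANK_TO_WHEEL = 0.25           # assumed engine + drivetrain efficiency

GENERATION = 0.45                  # assumed average thermal power-plant efficiency
TRANSMISSION = 0.95                # assumed grid transmission/distribution efficiency
CHARGING = 0.90                    # assumed charger + battery round-trip efficiency
BEV_BATTERY_TO_WHEEL = 0.85        # assumed motor + drivetrain efficiency

ice_well_to_wheel = REFINING_AND_DISTRIBUTION * ICE_TANK_TO_WHEEL
bev_well_to_wheel = GENERATION * TRANSMISSION * CHARGING * BEV_BATTERY_TO_WHEEL

print(f"ICE well-to-wheel efficiency: {ice_well_to_wheel:.0%}")  # ~21% with these assumptions
print(f"BEV well-to-wheel efficiency: {bev_well_to_wheel:.0%}")  # ~33% with these assumptions
```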

Interestingly, this week the CEO of General Motors called for a National Zero Emission Vehicle (NZEV) program. The rationale went through the usual claims: new jobs, not needed in this economy; zero emissions, of questionable value; conserving energy, of questionable value; etc. What was not mentioned is the current federal government subsidy program for electric vehicles, in the form of up to $7,500 in tax credits for purchasers. The credit phases out after 200,000 vehicles are sold by each manufacturer. Is General Motors close to hitting the cap? Also, the GM CEO’s action indicates that GM may not be able to deliver an electric vehicle of comparable value to a vehicle with an internal combustion engine. See links under Subsidies and Mandates Forever and Alternative, Green (“Clean”) Vehicles.

****************

New York State Litigation: The Attorney General of the State of New York has filed a complaint in New York State Court against Exxon, a corporation headquartered in Texas. TWTW has not carefully reviewed the entire document. Reports indicate that it is significantly different from prior litigation by New York City and other municipalities. Indeed, the opening paragraph indicates that it is different. It states:

“This case seeks redress for a longstanding fraudulent scheme by Exxon, one of the world’s largest oil and gas companies, to deceive investors and the investment community, including equity research analysts and underwriters of debt securities (together, “investors”), concerning the company’s management of the risks posed to its business by climate change regulation. Exxon provided false and misleading assurances that it is effectively managing the economic risks posed to its business by the increasingly stringent policies and regulations that it expects governments to adopt to address climate change. Instead of managing those risks in the manner it represented to investors, Exxon employed internal practices that were inconsistent with its representations, were undisclosed to investors, and exposed the company to greater risk from climate change regulation than investors were led to believe.”

So, Exxon reviews risks from several different perspectives before it reports its assessment of those risks to investors. Is it not the responsibility of management to give an honest assessment of risks? According to the complaint, management must instead account for every conceivable assessment, including the possibility that the New York State government may be run by fanatics who have no concept of evidence, law, or risk. Are we to believe that politicians never say things they know to be false, misleading, or contradicted by what some of their colleagues might do in the future? See links under Litigation Issues.

****************

PAGES2k: Steve McIntyre continues his exhaustive research finding serious errors in a “Global 2,000 Year Multiproxy Database” known as PAGES2k. The effort by multiple authors is an attempt to reinstate Mr. Mann’s “hockey-stick” graph, which shows little temperature variation until the industrial revolution and the associated increase in CO2.

McIntyre’s current analysis is of the North American network, which consists entirely of tree rings. These have significant problems, particularly the use of Mr. Mann’s stripbark bristlecone pines and other proxies that can yield a false hockey-stick shape from little more than noise. Originally, the pines were studied to show increasing growth from CO2 enrichment.
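
A minimal sketch of one way a hockey-stick shape can emerge from data containing no climate signal at all: generate persistent noise series, screen them for correlation with a rising “temperature” target over a calibration window, and average the survivors. This is a generic illustration of ex post screening bias with arbitrary assumed parameters; it is not McIntyre’s analysis nor the actual PAGES2k procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
n_series, length, calib = 1000, 1000, 100   # "proxies", "years", calibration window

# Persistent noise "proxies" (random walks) containing no climate signal at all.
proxies = rng.normal(size=(n_series, length)).cumsum(axis=1)

# Rising "target" (a stand-in for instrumental temperature) over the calibration window.
target = np.linspace(0.0, 1.0, calib)

# Screen: keep only the series that happen to correlate with the target recently.
corr = np.array([np.corrcoef(p[-calib:], target)[0, 1] for p in proxies])
survivors = proxies[corr > 0.75]

# Average the survivors: the pre-calibration "shaft" averages out to roughly flat,
# while the screened-in "blade" rises -- a hockey stick built from noise.
reconstruction = survivors.mean(axis=0)
print(f"kept {len(survivors)} of {n_series} noise series")
print(f"shaft mean: {reconstruction[:-calib].mean():.2f}   "
      f"blade mean: {reconstruction[-calib:].mean():.2f}")
```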

Tim Ball has reported the existence of roughly 300 years of records kept by the Hudson’s Bay Company, which show the climate changing in much of North America. Yet these records are conveniently ignored by PAGES2k in favor of tree-ring data. See links under Climategate Continued.

****************

Number of the Week: €424 million in ten years ($485 million at current conversion rates). According to reports, an independent auditor has found: “The European Commission has spent more than €424 million (£375m) over the last decade on carbon capture and storage (CCS) but failed to commercially deploy the technology.” See link under Carbon Schemes.

****************

ARTICLES:

1. Peak Embarrassment in War on Oil

New York’s AG claims that Exxon Mobil has been lying to itself.

Editorial, WSJ, Oct 25, 2018

https://www.wsj.com/articles/peak-embarrassment-in-war-on-oil-1540509533?mod=hp_opin_pos1

Summary: The editorial states:

“Before resigning this year amid allegations of sexual abuse, former New York Attorney General Eric Schneiderman spent nearly three years trying to harpoon his great white political whale— Exxon Mobil. His hunt failed to uncover malfeasance, but the AG’s office is suing Exxon anyway in a case that should be laughed out of court.

 

“Acting Attorney General Barbara Underwood alleged in a civil suit this week that Exxon defrauded shareholders, including those in the state workers’ pension fund, by failing to incorporate the projected costs of future climate regulation in its planning and investment decisions. The lawsuit says Exxon essentially kept two sets of books—one for public disclosures and another for internal purposes.

 

“Mr. Schneiderman initiated the roving investigation of Exxon’s business practices in November 2015. Exxon has since produced millions of pages of documents, but none have corroborated the political conspiracy theory that the oil and gas giant publicly downplayed the risks of climate change while preparing for them internally. No matter. The state AG’s office is now floating an alternative theory that is even more far-fetched.

 

“Lo, the AG says Exxon’s public disclosures projected a ‘proxy cost’ of climate regulation of $80 per ton of carbon in 2040 in developed countries and between $20 to $40 per ton in developing countries. Yet Exxon allegedly applied internally a “much lower price per ton to a small percentage of its GHG emissions, based on then-current regulations.” In other words, the AG claims Exxon was telling the truth to the public but lying to itself.

 

“But as Exxon explained in a July motion challenging an AG subpoena, the two cost projections are used for distinct purposes. The ‘proxy costs’ are used to forecast global energy demand while ‘greenhouse gas costs’ projections are used internally to make particular investment decisions. Exxon has proprietary reasons for not publicly disclosing these internal estimates. And it must be accurate in cost projections if it wants its enormous and multiyear projects to earn a profit.

 

“Each cost ‘is employed differently in Cash Flows,’ Exxon added. ‘While Proxy Costs are indirectly reflected in line items associated with a commodity price, GHG Costs are incorporated, where appropriate, in various project economic metrics, including, but not limited to, operating expenses.’”

 

The editorial describes Exxon’s extensive efforts to comply, then continues with:

 

“The reality is that nobody knows the future cost of carbon, and it will hinge as much on politics as on the evolving science and facts of climate change. President Trump sharply reduced the regulatory cost of carbon in the U.S. by rescinding Barack Obama’s Clean Power Rule, fuel-economy (Cafe) standards and methane regulations.

 

“Liberals claim oil will become obsolete as electric cars replace vehicles that run on fossil fuels. But these are the same people who said in 2006 that cellulosic ethanol would soon be an economic alternative to fossil fuels.

 

“The International Energy Agency reported this year that more oil investment is needed to keep up with increasing global demand: “Each year the world needs to replace 3 mb/d of supply lost from mature fields while also meeting robust demand growth. That is the equivalent of replacing one North Sea each year.”

 

“Ms. Underwood is charging Exxon under New York’s notorious Martin Act, which doesn’t require evidence of intent to prove fraud in civil cases. She may be hoping that Exxon agrees to settle and pay a fine so she can declare victory. Yet in this case there’s not even evidence of fraudulent conduct, much less intent. The only party guilty of misrepresentation in this lawsuit is the New York AG.”

2. Calculating life on Mars

By S. Fred Singer, American Thinker, Sep 16, 2018

https://www.americanthinker.com/blog/2018/09/calculating_life_on_mars.html

SEPP Chairman emeritus writes:

“The New York Times recently (June 21, 2018) speculated about a Summer Solstice mystery: “Does the Earth’s Tilt [of its spin axis, currently 23.5 degrees] Hold the Secret to Life?” It encouraged me to write about a different kind of speculation, linking the super-rotation of the Earth’s core to the possibility of life on Mars.

 

“It should be understood that the chain of events described below involves much uncertainty. Nevertheless, each link in the chain seems plausible – even if I cannot describe it quantitatively. So here goes:

 

“1. Super-rotation of the core

Seismic data, taken over a period of several years (Zhang, et al., Science 2005), suggest that the (innermost, solid iron) core is rotating slightly faster than the rest of the Earth, at 0.3-0.5 degrees/yr. We don’t know if this super-rotation is constant or varies over time. The analysts did not suggest a cause, hence relatively little attention has been devoted to the phenomenon. Most scientists I talked to had never heard of it.

 

“2. Possible explanation of super-rotation

If I invert the question and ask, ‘Why is the Earth rotating slower than its core?,’ then the answer becomes clear to me, and I can even calculate and estimate its magnitude. The slowing down may be caused by tidal friction, produced by the Moon. This also means that the super-rotation has been going on for billions of years.

 

“3. Production of Earth’s magnetic field

We may assume that the differential rotation at the inner-core boundary with the (still liquid) outer core “winds up” the magnetic lines of force and thus produces and strengthens the geomagnetic field. I have calculated that the geomagnetic field has a decay time of only about 20,000 years, based on the likely conductivity of the iron core. Clearly, some kind of energy source is required to maintain the magnetic field. I suggest that the source is kinetic energy of rotation.

 

“(The time variation and reversal of the geomagnetic field has been worked out by Paul H. Roberts and Gary A. Glatzmaier on the basis of magneto-hydrodynamic circulation within the liquid part of the core. The inner core is solid and is thought to be growing in size.)

 

“4. The geomagnetic field establishes the magnetosphere

This is the standard explanation for the existence of the magnetosphere, the outermost layer of the Earth’s atmosphere, consisting mostly of ions of atomic hydrogen, protons, magnetically trapped and spiraling around the lines of force, produced as the Earth neutral exosphere is dissociated and ionized by solar UV radiation.

 

“The magnetosphere extends from ~300 km to about 10 Earth radii (Earth’s radius is ~6,400 km, 4,000 miles) and is really an extension of the Earth’s ionosphere, which occupies the region of ~80-300 km.

 

“5. The magnetosphere shields the Earth’s lower atmosphere from the direct impact of the “solar wind,” consisting mainly of high-speed protons and some heavier nuclei. The solar wind would help to ionize the atmosphere and also “sweep away” the outer portions – thus speeding up the “escape” of this outermost atmosphere, the exosphere, as it is labeled, where the density is so low that the mean free path between collisions becomes long enough that one can ignore collisions. The concept of temperature loses significance.

 

“6. We may assume that similar processes happened on Mars. I believe that its core was liquefied by tidal friction, but it has cooled and is no longer liquid. Mars no longer has a general magnetic field like the Earth. Its magnetosphere has now disappeared, but its shielding effect may have lasted long enough, I believe, to maintain an ocean on Mars’s surface for some time.

 

“7. Lundin and others have measured the removal of the Martian upper atmosphere by the sweeping action of the solar wind. The crucial question is this: did the Mars atmosphere and surface ocean exist long enough to permit the creation of life forms – as it did on Earth?

 

“I can calculate a ‘survival time’ for the ocean using available physical theory. However, this calculation is complicated by the greenhouse effect and possible freezing over of the ocean surface, which would stop its evaporation. In addition, covering up the ocean, or the remaining puddles of water, affects survival. But I don’t know how long it takes for living forms to come into being; I assume that this interval is fairly short.

 

“8. We can calculate all we want, but the answer will be available eventually if we discover life forms, either krypto-life or paleo-life, just below the Martian surface.

 

“Life may have been produced several times during the early history of Mars but then wiped out by solar UV radiation or solar wind, or by the ionization produced by cosmic rays – unless protected by a magnetic field – as on Earth – or by a protective covering of dust or soil. That is to say life may exist only below the Martian surface.

 

“Conclusion

There are many links in this chain of events, many of them quite speculative and hard for me to quantify. But the sequence of events seems plausible. I summarize the main positive results:

 

“1. An explanation of the observed super-rotation in terms of tidal friction, produced by a (captured) moon. The Mars moon has now disappeared, leaving behind two fragments, Phobos and Deimos.

 

“2. A theoretical construct that accounts for the maintenance of the geomagnetic field, with kinetic energy of Earth’s rotation as the energy source.

 

“3. A conceptual calculation of the survival time of an ocean on Earth, with the protection of the magnetosphere – and on Mars, where the magnetosphere survives only as long as the Mars moon and a general magnetic field exist.”
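
As a rough cross-check on the ~20,000-year decay time Singer mentions in point 3 above, the sketch below computes the standard ohmic (free-decay) timescale for a magnetic field in a conducting sphere. The conductivity and length scale are assumed textbook-style values, not numbers taken from Singer’s article; the result only shows that his figure is of a plausible order of magnitude.

```python
import math

# Free (ohmic) decay time of a magnetic field in a conducting sphere:
# tau ~ mu0 * sigma * L^2, reduced by roughly pi^2 for the slowest decay mode.
MU0 = 4 * math.pi * 1e-7     # vacuum permeability, H/m
SIGMA = 1.0e6                # assumed electrical conductivity of core iron, S/m
L = 1.22e6                   # assumed length scale ~ inner-core radius, m
SECONDS_PER_YEAR = 3.156e7

tau_upper = MU0 * SIGMA * L**2 / SECONDS_PER_YEAR   # crude upper estimate, ~6e4 years
tau_lowest_mode = tau_upper / math.pi**2            # slowest decay mode, ~6e3 years
print(f"free-decay time: roughly {tau_lowest_mode:,.0f} to {tau_upper:,.0f} years")
```

With these assumed values the estimate brackets Singer’s ~20,000-year figure, consistent with his point that some energy source is needed to sustain the field over geologic time.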
