AAAS: “Let’s hold them accountable”

By David Middleton – Re-Blogged From http://www.WattsUpWithThat.com

This morning, I received another email from the American Association for the Advancement of Science…


We cannot overstate this: Under the current administration, the future of scientific inquiry and discovery in the U.S. is in serious jeopardy.

You can do something important right now to protect our progress and our planet: become an AAAS member.

Organizations that have propelled us forward — NIH, NOAA, and the EPA, just to name a few — are facing major funding cuts. President Trump has begun the process of pulling the United States out of the Paris Climate Accord, putting us in the company of only two other nations to reject this planet-saving agreement.

As scientists, engineers, teachers, students, and science advocates, we must join forces and oppose this administration’s dire proposals for science. We must continue to educate and keep pressure on elected officials to make evidence- and research-based decisions that protect our planet and help all of humankind to progress.

Join today, and you’ll get a free “I Am a Force for Science” water bottle as a thank you.

Thanks in advance for your membership,

Michael Savelli
Chief Information & Engagement Officer
American Association for the Advancement of Science

*Offer valid from May 31, 2017 to June 30, 2017, for new individual members only. There is a limit of one water bottle per membership order. Please allow up to four weeks for domestic delivery and up to five weeks for international delivery. The AAAS water bottle is provided as is without any guarantees or warranty and cannot be exchanged or returned. In association with the product, AAAS makes no warranties of any kind, either express or implied, including but not limited to warranties of merchantability or fitness for a particular purpose.

I agree!  “Let’s hold them accountable”!

Let’s hold them accountable for the epic failure of their climate models


Hansen’s 1988 model and GISTEMP, 5-yr running mean. https://wattsupwiththat.com/2017/04/17/the-good-the-bad-and-the-null-hypothesis/


IPCC First Assessment Report (FAR). Model vs. HadCRUT4. https://wattsupwiththat.com/2017/04/17/the-good-the-bad-and-the-null-hypothesis/


IPCC Third Assessment Report (TAR) model vs. HadCRUT4.


“Climate models versus climate reality.”  https://judithcurry.com/2015/12/17/climate-models-versus-climate-reality/


“95% of Climate Models Agree: The Observations Must be Wrong.”  http://www.drroyspencer.com/2014/02/95-of-climate-models-agree-the-observations-must-be-wrong/

The climate models have never demonstrated any predictive skill.

And the models aren’t getting better. Even when the model runs are started in 2006, the observed temperatures consistently track at or below the low end of the 5-95% range. Observed temperatures only approach the model mean (P50) in 2006, 2015 and 2016.



Figure 21.  Climate Lab Book. Comparing CMIP5 & observations.

The ensemble consists of 138 model runs using a range of representative concentration pathways (RCPs), from a worst-case scenario (RCP 8.5), often referred to as “business as usual,” to varying grades of mitigation scenarios (RCP 2.6, 4.5 and 6.0).


Figure 22. Figure 21 with individual model runs displayed.

SOURCE

When we drill wells, we run probability distributions to estimate the oil and gas reserves we will add if the well is successful.  The model inputs consist of a range of estimates of reservoir thickness, area and petrophysical characteristics.  The model output consists of a probability distribution from P10 to P90.

  • P10 = Maximum Case.  There is a 10% probability that the well will produce at least this much oil and/or gas.
  • P50 = Median Case.  There is a 50% probability that the well will produce at least this much oil and/or gas.  Probable reserves are >P50.
  • P90 = Minimum Case.  There is a 90% probability that the well will produce at least this much oil and/or gas.  Proved reserves are P90.

Over time, a drilling program should track near P50.  If your drilling results track close to P10 or P90, your model input is seriously flawed.
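
To make the P10/P50/P90 bookkeeping concrete, here is a minimal Monte Carlo sketch of the reserves workflow described above. The input distributions and parameter values are hypothetical, chosen only to illustrate how the percentiles fall out of the simulation; they are not from any actual prospect.

```python
# A minimal Monte Carlo sketch of the probabilistic reserves workflow
# described above. All input distributions and values are hypothetical,
# chosen only to illustrate how P10/P50/P90 fall out of the simulation.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo trials

# Hypothetical input distributions for a single prospect
area_acres = rng.triangular(300, 500, 900, n)                # productive area
net_pay_ft = rng.triangular(20, 40, 80, n)                   # reservoir thickness
recovery_bbl_per_acre_ft = rng.triangular(50, 120, 250, n)   # petrophysics proxy

# Recoverable oil for each trial (barrels)
reserves = area_acres * net_pay_ft * recovery_bbl_per_acre_ft

# Oil-patch convention: P10 is the high case ("at least this much" with a 10%
# probability), P90 the low case, so P10 maps to the 90th percentile of the
# simulated distribution and P90 to the 10th.
p10 = np.percentile(reserves, 90)
p50 = np.percentile(reserves, 50)
p90 = np.percentile(reserves, 10)

print(f"P90 (minimum case): {p90 / 1e6:.1f} MMbbl")
print(f"P50 (median case):  {p50 / 1e6:.1f} MMbbl")
print(f"P10 (maximum case): {p10 / 1e6:.1f} MMbbl")
```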

If the CMIP5 model ensemble had predictive skill, the observations should track around P50, with half of the runs predicting more warming than is actually observed and half predicting less. During the predictive run of the models, HadCRUT4.5 has not *tracked* anywhere near P50…



Figure 23. Figure 21 zoomed in on model run period with probability distributions annotated.

I “eyeballed” the instrumental observations to estimate where they fall within the model probability distribution over the predictive run.

Prediction run, approximate distribution by year:

  • 2006: P60 (60% of the models predicted a warmer temperature)
  • 2007: P75
  • 2008: P95
  • 2009: P80
  • 2010: P70
  • 2011-2013: >P95
  • 2014: P90
  • 2015-2016: P55

Note that during the 1998-99 El Niño, the observations spiked above P05 (less than 5% of the models predicted this). During the 2015-16 El Niño, HadCRUT only spiked to P55.  El Niño events are not P50 conditions. Strong El Niño and La Niña events should spike toward the P05 and P95 boundaries.
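
For readers who want to reproduce this kind of “eyeball” estimate numerically, here is a small sketch of the percentile bookkeeping. The ensemble values and the observed anomaly below are synthetic placeholders; the real comparison would use the 138 CMIP5 runs and HadCRUT4.5.

```python
# A sketch of the percentile bookkeeping used above. For a given year, the
# "P" rank of the observation is taken as the fraction of ensemble members
# that predicted an anomaly at least as warm as the observed one, so P60
# means 60% of the model runs were warmer than the observation.
# The ensemble and the observed anomaly below are synthetic placeholders.
import numpy as np

def ensemble_percentile(model_anomalies, observed):
    """Percentage of model runs at least as warm as the observation."""
    model_anomalies = np.asarray(model_anomalies, dtype=float)
    return 100.0 * np.mean(model_anomalies >= observed)

# Hypothetical ensemble of 138 model anomalies for one year (deg C)
ensemble = np.random.default_rng(0).normal(loc=0.75, scale=0.12, size=138)
observed = 0.55  # made-up observed anomaly for the same year

print(f"Observation sits at roughly P{ensemble_percentile(ensemble, observed):.0f}")
```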

The temperature observations are clearly tracking much closer to the strong mitigation scenarios than to RCP 8.5, the bogus “business as usual” scenario.

The red hachured trapezoid indicates that HadCRUT4.5 will continue to track between P50 and P100. This is indicative of a miserable failure of the models and a pretty good clue that the models need to be adjusted downward.

In any other field of science CAGW would be a long-discarded falsified hypothesis.

[From: The Good, the Bad and the Null Hypothesis]

Let’s hold them accountable for their unwillingness to accept a realistic climate sensitivity

Relentlessly shrinking climate sensitivity estimates

Remember how all the news stories keep telling us the evidence is growing and getting stronger than ever “against the skeptics”?

David Stockwell has done a beautiful graph of the climate sensitivity estimates from recent climate research, which Steven McIntyre discussed in detail.

The trend looks pretty clear. Reality is gradually going to force itself on the erroneous models.


Indications are that around 2020-2030 climate sensitivity will hit zero. ;-)

[…]

JoNova

“Let’s hold them accountable” for fraudulently clinging to high climate sensitivities, when all of the recent observation-based evidence indicates that it is quite low.

Since 2011, at least 14 studies published in the peer-reviewed scientific literature provide strong evidence that the equilibrium climate sensitivity (ECS)—how much the earth’s average surface temperature will rise under a doubling of the atmospheric carbon dioxide concentration—lies near the low end of the IPCC estimates (Figure 5). This recent research includes investigations of the earth’s thermal response to changes in climate forcings that have taken place over the past century, millennium, and over glacial periods.


Figure 5. Equilibrium climate sensitivity (ECS) estimates from new research beginning in 2011 (colored), compared with the assessed range given in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the collection of climate models used in the IPCC AR5. The “likely” (greater than a 66% likelihood of occurrence) range in the IPCC Assessment is indicated by the gray bar. The arrows indicate the 5 to 95 percent confidence bounds for each estimate along with the best estimate (median of each probability density function; or the mean of multiple estimates; colored vertical line). Ring et al. (2012) present four estimates of the climate sensitivity and the red box encompasses those estimates. The right-hand side of the IPCC AR5 range is actually the 90% upper bound (the IPCC does not actually state the value for the upper 95 percent confidence bound of their estimate). Spencer and Braswell (2013) produce a single ECS value best-matched to ocean heat content observations and internal radiative forcing. https://judithcurry.com/2015/12/17/climate-models-versus-climate-reality/

Several of these research findings were published subsequent to the 2013 release of the IPCC’s Fifth Assessment Report (AR5), and thus were not included in that Assessment. Others were considered in the IPCC AR5, and still others were ignored. And while the IPCC AR5 did reflect some influence of these new low ECS estimates—by expanding its “likely” range of ECS estimates downward to include 1.5°C (the low end was 2.0°C in the 2007 IPCC Fourth Assessment Report) and omitting a “best estimate” value (which had previously been given as 3.0°C in the 2007 report)—it still doggedly held on to its high end “likely” estimate of 4.5°C. This was a disservice to the latest science, but was a necessary step to preserve the IPCC’s reliance on climate projections made by models with an ECS averaging 3.2°C and ranging from 2.1°C to 4.7°C—the same models recently evaluated by Christy and in our AGU presentation. Had the IPCC fully embraced an ECS near 2.0°C—that which the recent literature suggests—it would have had to throw out much of the rest of the report.

Climate models versus climate reality
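
As a rough illustration of what these ECS numbers imply, the equilibrium warming for a CO2 change is commonly approximated as the sensitivity multiplied by the number of doublings of concentration. The sketch below uses that standard simplification with round concentration values; it is a back-of-envelope illustration, not a result from any particular study.

```python
# Back-of-envelope illustration of what an ECS number implies: equilibrium
# warming is commonly approximated as ECS * log2(C / C0), i.e. one "ECS"
# of warming per doubling of CO2 concentration. The concentration values
# below are round illustrative numbers, not measurements.
import math

def equilibrium_warming(ecs_c_per_doubling, c0_ppm, c_ppm):
    """Equilibrium temperature change (deg C) for a CO2 change from c0 to c."""
    return ecs_c_per_doubling * math.log2(c_ppm / c0_ppm)

c0 = 280.0  # nominal pre-industrial CO2, ppm
for ecs in (1.5, 2.0, 3.2, 4.5):
    at_400 = equilibrium_warming(ecs, c0, 400.0)   # roughly present-day CO2
    at_560 = equilibrium_warming(ecs, c0, 560.0)   # a full doubling
    print(f"ECS {ecs:.1f} C: {at_400:.1f} C at 400 ppm, {at_560:.1f} C at 560 ppm")
```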

Let’s hold them accountable for fraudulently describing RCP 8.5 as a “business as usual” scenario

(1) AN INTRODUCTION TO SCENARIOS ABOUT OUR FUTURE

In AR5 four Representative Concentration Pathways (RCPs) describe scenarios for future emissions, concentrations, and land-use, ending with radiative forcing levels of 2.6, 4.5, 6.0, and 8.5 W/m2 by 2100. Strong mitigation policies result in a low forcing level (RCP2.6). Two medium stabilization scenarios lead to intermediate outcomes: (RCP4.5, RCP6.0).


[Figure: IPCC’s AR5: the four RCPs]

RCP8.5 gets the most attention. It assumes the fastest population growth (a doubling of Earth’s population to 12 billion), the lowest rate of technology development, slow GDP growth, a massive increase in world poverty, plus high energy use and emissions. For more about the RCPs see “The representative concentration pathways: an overview” by Detlef P. van Vuuren et al., Climatic Change, Nov 2011.

RCP8.5 assumes a nightmarish world even before climate impacts, resulting from substantial changes to long-standing trends. It provides AR5 with an essential worst case scenario necessary for conservative planning.

Unfortunately scientists often inaccurately describe RCP8.5 as the baseline scenario — a future without policy action: “a relatively conservative business as usual case with low income, high population and high energy demand due to only modest improvements in energy intensity” from “RCP 8.5: A scenario of comparatively high greenhouse gas emissions” by Keywan Riahi et al. in Climatic Change, November 2011. This is a material misrepresentation of RCP8.5. Scientists then use RCP8.5 to construct horrific visions of the future. They seldom mention its unlikely assumptions.

A closer look at scenario RCP8.5

RCP 8.5 clearly has little or no basis in reality, much less represents a “business as usual” scenario.

Let’s hold them accountable for their willful disregard of economics

President Trump’s “secret weapon” is the discount rate…

How Climate Rules Might Fade Away

Obama used an arcane number to craft his regulations. Trump could use it to undo them.

by Matthew Philips, Mark Drajem, and Jennifer A. Dlouhy
December 15, 2016, 3:30 AM CST

In February 2009, a month after Barack Obama took office, two academics sat across from each other in the White House mess hall. Over a club sandwich, Michael Greenstone, a White House economist, and Cass Sunstein, Obama’s top regulatory officer, decided that the executive branch needed to figure out how to estimate the economic damage from climate change. With the recession in full swing, they were rightly skeptical about the chances that Congress would pass a nationwide cap-and-trade bill. Greenstone and Sunstein knew they needed a Plan B: a way to regulate carbon emissions without going through Congress.

Over the next year, a team of economists, scientists, and lawyers from across the federal government convened to come up with a dollar amount for the economic cost of carbon emissions. Whatever value they hit upon would be used to determine the scope of regulations aimed at reducing the damage from climate change. The bigger the estimate, the more costly the rules meant to address it could be. After a year of modeling different scenarios, the team came up with a central estimate of $21 per metric ton, which is to say that by their calculations, every ton of carbon emitted into the atmosphere imposed $21 of economic cost. It has since been raised to around $40 a ton.

This calculation, known as the Social Cost of Carbon (SCC), serves as the linchpin for much of the climate-related rules imposed by the White House over the past eight years. From capping the carbon emissions of power plants to cutting down on the amount of electricity used by the digital clock on a microwave, the SCC has given the Obama administration the legal justification to argue that the benefits these rules provide to society outweigh the costs they impose on industry.

It turns out that the same calculation used to justify so much of Obama’s climate agenda could be used by President-elect Donald Trump to undo a significant portion of it. As Trump nominates people who favor fossil fuels and oppose climate regulation to top positions in his cabinet, including Oklahoma Attorney General Scott Pruitt to head the Environmental Protection Agency and former Texas Governor Rick Perry to lead the Department of Energy, it seems clear that one of his primary objectives will be to dismantle much of Obama’s climate and clean energy legacy. He already appears to be focusing on the SCC.

[…]

The SCC models rely on a “discount rate” to state the harm from global warming in today’s dollars. The higher the discount rate, the lower the estimate of harm. That’s because the costs incurred by burning carbon lie mostly in the distant future, while the benefits (heat, electricity, etc.) are enjoyed today. A high discount rate shrinks the estimates of future costs but doesn’t affect present-day benefits. The team put together by Greenstone and Sunstein used a discount rate of 3 percent to come up with its central estimate of $21 a ton for damage inflicted by carbon. But changing that discount just slightly produces big swings in the overall cost of carbon, turning a number that’s pushing broad changes in everything from appliances to coal leasing decisions into one that would have little or no impact on policy.

According to a 2013 government update on the SCC, by applying a discount rate of 5 percent, the cost of carbon in 2020 comes out to $12 a ton; using a 2.5 percent rate, it’s $65. A 7 percent discount rate, which has been used by the EPA for other regulatory analysis, could actually lead to a negative carbon cost, which would seem to imply that carbon emissions are beneficial. “Once you start to dig into how the numbers are constructed, I cannot fathom how anyone could think it has any basis in reality,” says Daniel Simmons, vice president for policy at the American Energy Alliance and a member of the Trump transition team focusing on the Energy Department. “Depending on what the discount rate is, you go from a large number to a negative number, with some very reasonable assumptions.”

[…]

Bloomberg

This is worth repeating:

A 7 percent discount rate, which has been used by the EPA for other regulatory analysis, could actually lead to a negative carbon cost, which would seem to imply that carbon emissions are beneficial.

One of the most common ways of estimating the value of oil and gas revenue and reserves is called “PV10.”

PV10 is the current value of approximated oil and gas revenues in the future, minus anticipated expenses, discounted using a yearly discount rate of 10%. Used primarily in reference to the energy industry, PV10 is helpful in estimating the present value of a corporation’s proven oil and gas reserves.

Read more: PV10 Definition | Investopedia: http://www.investopedia.com/terms/p/pv10.asp#ixzz4bQb2uKyw

We generally use a 10% discount rate when deciding how to allocate current capital.

A 3% discount rate, as used in the SCC calculation, essentially assumes that the time-value of money is insignificant.  I suppose that since it’s OPM (other people’s money), the government doesn’t view the time-value of money as a particularly relevant thing.
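
Here is a minimal sketch of the discounting arithmetic that drives both the SCC sensitivity and the PV10 convention. The $100 damage figure and the revenue stream are hypothetical numbers chosen only to show how strongly the result depends on the discount rate.

```python
# A sketch of the discounting arithmetic behind both the SCC estimates and
# the PV10 convention: a cost (or revenue) t years out is worth
# value / (1 + r)**t today. The $100 damage figure and the revenue stream
# below are hypothetical, chosen only to show how strongly the answer
# depends on the discount rate.

def present_value(amount, years, rate):
    """Discount a single future amount back to today at an annual rate."""
    return amount / (1.0 + rate) ** years

# $100 of climate damage incurred 100 years from now
for rate in (0.025, 0.03, 0.05, 0.07, 0.10):
    pv = present_value(100.0, 100, rate)
    print(f"discount rate {rate:5.1%}: present value of $100 in 100 years = ${pv:7.2f}")

# PV10-style valuation of a hypothetical 10-year net revenue stream ($MM/yr)
net_revenue = [12.0, 10.0, 8.5, 7.0, 6.0, 5.0, 4.0, 3.5, 3.0, 2.5]
pv10 = sum(present_value(cf, t + 1, 0.10) for t, cf in enumerate(net_revenue))
print(f"PV10 of the revenue stream: ${pv10:.1f} MM")
```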

Discounting Away the Social Cost of Carbon: The Fast Lane to Undoing Obama’s Climate Regulations

Let’s hold them accountable for the billions of taxpayer dollars wasted on greenschist

Why Do Federal Subsidies Make Renewable Energy So Costly?

James Conca, Contributor
I write about nuclear, energy and the environment

On a total dollar basis, wind has received the greatest amount of federal subsidies. Solar is second. Wind and solar together get more than all other energy sources combined.

However, based on production (subsidies per kWh of electricity produced), solar energy has gotten over ten times the subsidies of all other energy sources combined, including wind (see figure).


Figure Caption: Subsidies for various energy sources normalized to total energy produced by each source for the years 2010, 2013, 2016 and projected for 2019. Data Source: University of Texas

According to the Energy Information Administration (EIA) and the University of Texas, from 2010 through 2013, federal renewable energy subsidies increased by 54%, from $8.6 billion to $13.2 billion, despite the fact that total federal energy subsidies declined by 23%, from $38 billion to $29 billion.

[…]

Forbes
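
The normalization behind the “subsidies per unit of energy produced” comparison in the figure above is simple division: total subsidy dollars over the energy each source actually delivered. The sketch below shows the arithmetic with placeholder numbers; they are not the EIA or University of Texas figures cited in the article.

```python
# The arithmetic behind "subsidies per unit of energy produced": total
# subsidy dollars divided by the energy each source delivered. The numbers
# below are placeholders for illustration only; they are NOT the EIA /
# University of Texas figures cited in the article.

sources = {
    # source: (federal subsidy, $ millions; energy produced, trillion Btu)
    "solar":       (5_000,    300),
    "wind":        (6_000,  1_600),
    "natural gas": (2_500, 24_000),
    "nuclear":     (1_700,  8_300),
}

for name, (subsidy_millions, energy_tbtu) in sources.items():
    # 1 trillion Btu = 1e6 mmBtu and subsidies are given in $ millions,
    # so the 1e6 factors cancel and $/mmBtu = $MM / TBtu.
    dollars_per_mmbtu = subsidy_millions / energy_tbtu
    print(f"{name:12s}: ${dollars_per_mmbtu:6.2f} per mmBtu")
```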

If the reduction of greenhouse gas emissions truly is a primary objective, the only energy source which merits subsidization is nuclear power:

Nuclear power absolutely is the leader of the pack at reducing so-called “greenhouse” gas emissions:


Figure 1. Nuclear and gas kick @$$, wind breaks even and solar is a loser. http://www.realclearenergy.org/charticles/2014/08/01/solar_and_wind_more_expensive_than_realized_107939.html

If reducing greenhouse gas emissions is important, nuclear power is the obvious answer.  If reducing greenhouse gas emissions at a reasonable cost is important, natural gas is the obvious answer.  If treading water is important, wind is the obvious answer.  If failure is important, solar is the obvious answer.  So, Mr. Dominguez is generally correct.

“The renewables industry has been playing by competitive market rules that have helped to produce good prices,” Amy Francetic, an Invenergy senior vice president, said in an interview. “This is picking winners and losers in a way that’s troubling.”

Really?  Ms. Francetic, *government* always picks “winners and losers in a way that’s troubling.”

As far as the renewables industry “playing by competitive market rules that have helped to produce good prices”…



Figure 2. Ms. Francetic, Data is laughing at you.

The most recent U.S. Energy Information Administration report on energy subsidies reveals the following:

Solar and wind power are insignificant sources of energy.


Figure 3a. U.S. Energy production by source 2010 & 2013 (trillion Btu), U.S. Energy Information Administration. (Corrected for error in Geothermal Btu shortly after publication.)


Figure 3b. U.S. primary energy production 1981-2015 (million tonnes of oil equivalent), BP 2016 Statistical Review of World Energy.

Solar and wind power receive massive Federal subsidies.


Figure 4. Federal subsidies by energy source 2010 and 2013 (million 2013 US dollars), U.S. Energy Information Administration.

The solar and wind subsidies are truly massive in $/Btu.


Figure 5. Subsidies per unit of energy by source ($/mmBtu), U.S. Energy Information Administration. (Corrected for error in Geothermal Btu shortly after publication.)

So… By all means…

LET’S HOLD THEM ACCOUNTABLE!!!

