Economic Cost Of The Social Cost Of Carbon

By Willis Eschenbach – Re-Blogged From Skating Under The Ice

The unscientific enterprise called the Social Cost of Carbon (SCC) is a thinly disguised political attempt to justify some kind of a “carbon tax”. Of course calling it a “carbon tax” or the “social cost of carbon” is doublespeak, or perhaps triplespeak. It is doublespeak because the issue is carbon dioxide, not carbon. What they are talking about taxing is not carbon but CO2. (In passing, the irony of a carbon-based life form studying the “social cost of carbon” is worth noting …)

It is triplespeak because in the real world what this so-called “carbon tax” means is a tax on energy, since the world runs on carbon-based fossil fuel energy and will for the foreseeable future.

This energy tax has been imposed in different jurisdictions in a variety of forms—a direct carbon tax, a “cap-and-trade” system, a “renewable mandate”, they come in many disguises but they are all taxes on energy, propped up by the politically driven “Social Cost of Carbon”.

I’ve written before about how taxes on energy are among the most regressive taxes known. Increasing fuel prices hurt the poor more than anyone, because the poor spend a larger fraction of their income on energy. Gasoline prices to drive to work don’t matter to the wealthy, but they can be make-or-break for the poor.

However, in addition to harming the poor, there is a deeper reason that such a tax on energy is a very bad idea. When you tax energy, you are taxing an input to wealth production. Taxing any of the inputs to wealth production is destructive. Instead of the inputs, you want to tax the outputs of wealth production. Let me lay out the several reasons why.

I’ve discussed in the past that there are three and only three ways to create real wealth. By real wealth I mean the actual stuff that we use—houses and food and cars and clothing and nails and fish. Real wealth. Here are the three ways to create wealth:

First, you can manufacture wealth—you can build a shirt factory, manufacture a new medicine, or sew clothing in your living room and sell it on the web.

Next, you can grow wealth—you can cultivate an apple tree, keep a home garden, or plant a thousand acres of corn.

Finally, you can extract wealth—you can drill for oil, dig for gold, or fish for trout in a mountain stream.

Everything else is services. Important services to be sure, life-and-death services in some cases … but services nonetheless.

To understand this distinction between services on the one hand and wealth generating activities on the other hand, let me use an example I’ve given before. Suppose there are two couples on a tropical island. One person fishes, one has a garden, one gathers native medicines and building materials from the forest, one builds huts and makes clothes from local fiber. They could go on for a long time that way, because they are creating real wealth. They have food and clothing and housing, the things that we need to survive and thrive.

Next, suppose on a nearby tropical island there are two other couples. On that island one person is a barber, one is a doctor, one is a journalist, and one is a musician. Noble occupations all, but services all … those folks will have nothing to eat, nothing to wear, nothing to keep the rain off. None of those occupations create any real wealth at all, while all the activities on the first island do create real wealth.

This means that if we want our country to be wealthy we need to do everything we can to encourage manufacturing, agriculture, and extraction. And that brings me back to the subject of this essay, the energy tax masquerading as a “carbon tax” and crudely propped up by the laughable “Social Cost of Carbon”.

Let’s set aside for the moment the question of whether a given tax is used wisely or not. And let’s also set aside the consideration of WHAT we tax. Instead, let’s look at the effects of WHERE in the economic cycle we apply our tax.

Each of the three ways to earn wealth has both inputs and outputs. For example, I’ve worked a lot as a commercial fisherman, in an extractive industry. The inputs to this way of generating wealth are a boat and motor, nets, diesel, a captain, and some deckhands. The output is yummy fish.

Similarly, the inputs to manufacture are things like raw materials and labor and energy and machinery. The outputs are manufactured goods.

In the third and final way to create wealth, inputs to agriculture are things like water and seeds and fertilizer and tractors and diesel and farmers and field workers. The outputs are fruits and vegetables and fiber and oils and all the rest of the things we grow.

So let me pose you a theoretical question. Assuming that we need to tax the wealth generating process … is it better to tax the inputs to the process, or to tax the outputs of the process?

The answer is perhaps clearest in the field of agriculture, where the question becomes:

Should we tax the seed corn, or should we tax the resulting corn crop?

The first and most obvious reason that we should tax the corn crop is that taxing the seed corn makes it more expensive, and thus discourages people from planting. We don’t ever want to do that. Discouraging the generation of wealth weakens the economy. We want to encourage the generation of wealth.

The second reason not to tax the seed corn is that agriculture, like all ways to generate wealth, has a multiplier effect. Every single corn seed will likely turn into a plant yielding hundreds of corn seeds. Taxing the seed corn means a farmer can buy less seed … and a reduction of one seed can reduce the eventual crop by a hundredfold. This damages the economy in a second and distinct way.
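The multiplier argument can be made concrete with a toy sketch. The seed price, tax rate, yield, and budget below are made-up illustrative numbers, not figures from the essay:

```python
# Toy model of the seed-corn multiplier. All numbers are illustrative
# assumptions: $1 per seed, a 25% input tax, ~300 kernels per plant.
def kernels_harvested(budget, price_per_seed, kernels_per_plant=300):
    """Seeds the farmer can afford, times the yield of each plant."""
    seeds = int(budget / price_per_seed)
    return seeds * kernels_per_plant

budget = 1000  # dollars available for seed in the spring

untaxed = kernels_harvested(budget, price_per_seed=1.00)
taxed = kernels_harvested(budget, price_per_seed=1.25)  # 25% tax on the input

print(untaxed)  # 300000
print(taxed)    # 240000
```

Here the state collects $200 of tax (800 seeds at $0.25 each), but the harvest shrinks by 60,000 kernels; the loss is amplified by the hundredfold yield of the crop itself.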

Finally, there is a third, separate hidden damage from taxing the seed corn instead of the corn crop. Having grown up on a remote cattle ranch, I know that farmers are broke in the spring and generally only have cash when the crop comes in. The same is true of most wealth generating activities. Money is scarce at the start of the process and ample at the end. This means that extracting the dollars by taxing the inputs to the wealth generating activities puts a much greater strain on the individual wealth generators, the farmers and the fishermen, than does extracting the same dollars from the outputs of the process.

From these three separate kinds of damage it is clear that taxing the inputs to wealth generating activities is generally a mistake.

And this brings me back again to the question of taxing energy. The problem is that energy in the form of fossil fuels is an input to all forms of large-scale wealth generation. This means that driving the cost of energy up for any reason, or in any manner, imposes a greatly magnified cost on the economy through at least the three separate and distinct mechanisms I listed above.

And this is the reason that I am utterly opposed to any kind of tax on energy, whether it is a so-called “carbon tax”, a “renewable energy mandate”, or any other measure to increase energy costs. We have businesses fleeing California for neighboring states in part because our laws REQUIRE that we pay astronomical costs for electricity from expensive green power sources.

When I was a kid, my schoolbooks were clear that cheap electricity was the savior of the poor housewife and the poor farmer. And growing up on a remote cattle ranch where we generated our own electricity, I could see as a kid that it was absolutely true. Having ample cheap electricity transforms a family, a farm, a town, or a society.
(Image: “clinton-energy-tax” cartoon, art courtesy Dave Stephens)

But now, based on the crazy war on CO2, people are doing everything that they can to drive the cost of energy up. Obama’s Energy Secretary famously said he wanted US gasoline (petrol) prices to go up to $8 a gallon, as in Europe. Obama himself said that his electricity policy would necessarily cause electricity prices to “skyrocket”. And this nonsense goes all the way back to the Clinton administration’s proposed energy tax.

Let me recap. In addition to energy cost increases hitting the poor harder than anyone, taxing or increasing prices of any of the inputs to wealth generation also damages the economy in three separate ways.

First, taxing or increasing the price of the seed corn discourages planting.

Second, taxing or increasing the price of the seed corn has a very large effect because of the multiplicative power of wealth generation. Since each corn seed can become a plant that produces hundreds of kernels of corn, anything affecting the seeds has a disproportionately large effect on the eventual production.

Third, taxing or increasing the price of the seed corn hits the producers when they have the least money to pay the tax.

Now, consider the role of energy in this process. For all three wealth-generating activities, energy is an input. And this in turn means that any increase in energy prices reduces wealth generation by more, sometimes much more, than the price increase would suggest.


Let me move to a final topic, the size of the claimed Social Cost of Carbon. Estimates range from a “negative cost”, or what ordinary humans would call a “Social Benefit of Carbon”, through net zero cost to a cost of fifteen hundred dollars per tonne. Let me take eighty dollars a tonne as a representative price for the following calculations.

In 2016, humans emitted on the order of ten gigatonnes (10 × 10^9 tonnes) of carbon in the form of CO2. At eighty bucks a tonne, that works out to about $0.8 trillion per year. Since the global GDP (the value of all goods and services) is about eighty trillion dollars per year, supporters of a carbon tax have pointed out that if we taxed all emissions, the tax would come to only one percent of GDP. They say that this is a small price to pay.
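Those round numbers are easy to check. A minimal sketch in Python, using only the figures quoted in the paragraph above:

```python
# Reproducing the back-of-envelope carbon-tax arithmetic.
emissions_tonnes = 10e9    # ten gigatonnes of carbon per year
scc_per_tonne = 80         # the representative $80/tonne figure
global_gdp = 80e12         # roughly eighty trillion dollars per year

tax = emissions_tonnes * scc_per_tonne   # dollars per year
share_of_gdp = tax / global_gdp

print(tax / 1e12)      # 0.8  (about $0.8 trillion per year)
print(share_of_gdp)    # 0.01 (one percent of GDP)
```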

But this is a simple view that ignores several important things.

First, the critical metric is not GDP. It is GDP growth. GDP growth averages something around 3% per year. This continued growth is critical both to provide for the needs of an increasing population and to lift the global population further out of poverty. A drag of one percentage point on the economy reduces growth by a third.

Next, the carbon tax itself would be somewhere around 1% of GDP or less … but that doesn’t allow for the multiplier effect of taxing energy. Because energy is an input to all forms of wealth generation, for all the reasons discussed above the cost to the economy of taxing an input to all wealth generation is much larger than just the size of the tax itself.

Finally, consider the magic of compound interest and the “rule of seventy”. At three percent growth per year, the “rule of seventy” says that the economy will double in size in 70 / 3 ≈ 23 years. But if we foolishly impose a carbon tax and it drags economic growth down by only a single percentage point, at 2% growth it will take 70 / 2 = 35 years for the economy to double in size. And since all of these CO2 fears are a long way out, fifty or a hundred years, over time the small drag of a carbon tax on the economy will loom large.
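The rule of seventy is just a shorthand for the exact doubling-time formula ln(2)/ln(1+r). A short Python sketch comparing the two, plus the fifty-year gap (GDP indexed to 1.0 today, purely illustrative):

```python
import math

# The "rule of seventy" vs. the exact doubling time ln(2)/ln(1+r).
for growth in (0.03, 0.02):
    rule_of_70 = 70 / (growth * 100)
    exact = math.log(2) / math.log(1 + growth)
    print(f"{growth:.0%}: ~{rule_of_70:.0f} years (rule of 70), "
          f"{exact:.1f} years (exact)")

# Over a fifty-year horizon, one percentage point of drag compounds
# into a large gap between the two economies.
ratio = 1.03 ** 50 / 1.02 ** 50
print(round(ratio, 2))  # 1.63: the faster-growing economy ends up ~63% larger
```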

All of this leads us to a simple conclusion. Even if you wish to fight the eeeeevil scourge of CO2, increasing the cost of fossil fuels is the wrong way to go about it. The associated present and future damage from increasing energy costs, both to the poor and to the economy, far outweighs any possible future benefits fifty years from now.

Now me, I see no reason to fight CO2. I don’t think CO2 is the secret temperature control knob of the climate. No persisting complex natural system is that simply controlled.

But if you do want to fight CO2, DON’T RAISE THE COST OF ENERGY. If you raise energy costs in any manner you are fighting CO2 on the backs of the poor housewife and the poor farmer, the very people  you are claiming you are helping. And it’s not just the poor you are hurting. If you raise energy costs you are doing untold damage to the economy itself.

There are other options. Go for greater energy efficiency if you wish; that will reduce emissions without increasing energy costs. Get more production out of each gallon. Or support a shift from coal to natural gas, which does both: it reduces energy costs AND emissions of CO2. Or, for a third-world solution, fog nets in Peru provide water for hillside dwellers without requiring energy to pump water up the hills. And as always, the mantra of reduce, reuse, and recycle, combined with general energy conservation, can cut emissions without raising the cost of energy.

In all of this useless and futile fight against CO2, I can only implore everyone to follow the Hippocratic injunction: “First, do no harm”. And that means no carbon tax in any form, no “renewable mandate”, and no “cap-and-trade”, because they all raise the cost of energy. A carbon tax, backed by the anti-scientific political cover story called the “Social Cost of Carbon”, will do immense harm to both the economy and the poor, and in some parts of the world it is already doing so. The carbon tax and the “Social Cost of Carbon” cause uncalculated damage; they should be avoided completely.

