Weekly Climate and Energy News Roundup #281

The Week That Was: August 19, 2017 Brought to You by www.SEPP.org

By Ken Haapala, President, Science and Environmental Policy Project


Quote of the Week. Physics has a history of synthesizing many phenomena into a few theories – Richard Feynman


Number of the Week: $4 Trillion


Blackwaters: Blackwater rivers and bogs belie the claims that ocean carbonization, foolishly called “ocean acidification”, will eliminate life. Blackwater rivers are common to the Amazon and the Southeast US, and found in Europe, Africa, Australia, Indonesia, and elsewhere. A blackwater river is a slow-moving current running through forested or highly vegetated swamps or wetlands. Decaying vegetation, particularly leaves, releases tannins into the water, turning comparatively transparent, acidic water into water darkly stained, resembling tea or black coffee.

Continue reading

Four Reasons Why “Gold Has Entered A New Bull Market”

By Mark O’Byrne – Re-Blogged From http://www.Gold-Eagle.com

– 4 reasons why “gold has entered a new bull market” – Schroders
– Market complacency is key to gold bull market say Schroders
– Investors are currently pricing in the most benign risk environment in history as seen in the VIX
– History shows gold has the potential to perform very well in periods of stock market weakness (see chart)
– You should buy insurance when insurers don’t believe that the “risk event” will happen
– Very high Chinese gold demand, negative global interest rates and a weak dollar should push gold higher

Continue reading

Consensus Of Convenience

[If you’re willing to wade through a lot of technical presentation, there is much valuable information here. -Bob]

By George White – Re-Blogged From http://www.WattsUpWithThat.com

Climate science is the most controversial science of the modern era. One reason the controversy has been so persistent is that those who accept the IPCC as the arbiter of climate science fail to recognize that a controversy even exists. Their rationalization is that because the IPCC’s conclusions are presented as the result of a scientific consensus, the threshold for overturning them is so high it can’t be met, especially by anyone whose peer-reviewed work isn’t published in a mainstream climate-science journal. Their universal reaction when presented with contraindicative evidence is that there’s no way it can be true, therefore it deserves no consideration and whoever brought it up can be ignored, while the catch-22 makes it almost impossible to get contraindicative evidence into any mainstream journal.

This prejudice is not limited to those with only a shallow understanding of the science; it is widespread among those who think they understand and even quite prevalent among notable scientists in the field. Anyone who has ever engaged in discussion with someone who has accepted the consensus conclusions has likely observed this bias, often accompanied by demeaning language delivered with self-righteous indignation that you would dare question the ‘settled science’ of the consensus.

The Fix

Correcting broken science that’s been settled by a consensus is made more difficult by its support from recursive logic where the errors justify themselves by defining what the consensus believes. The best way forward is to establish a new consensus. This means not just falsifying beliefs that support the status quo, but more importantly, replacing those beliefs with something more definitively settled.

Since politics has taken sides, climate science has become driven by the rules of politics rather than the rules of science. Taking a page from how a political consensus arises, the two sides must first understand and acknowledge what they have in common before they can address where they differ.

Alarmists and deniers alike believe that CO2 is a greenhouse gas, that greenhouse gases contribute to making the surface warmer than it would be otherwise, that man is putting CO2 into the atmosphere and that the climate changes. The denier label used by alarmists applies to anyone who doesn’t accept everything the consensus believes, with the implication that truths supported by real science are also being denied. Surely, anyone who believes that CO2 isn’t a greenhouse gas, that man isn’t putting CO2 into the atmosphere, that GHGs don’t contribute to surface warmth, that the climate isn’t changing or that the laws of physics don’t apply would be in denial, but few skeptics are that uninformed.

Most skeptics would agree that if there were significant anthropogenic warming, we should take steps to prepare for any consequences. This means applying rational risk management, where all influences of increased CO2 and a warming climate must be considered. Increased atmospheric CO2 means more raw material for photosynthesis, which, at the base of the food chain, is the sustaining foundation for nearly all life on Earth. Greenhouse operators routinely raise CO2 concentrations well above ambient because it’s good for the plants and does no harm to people. Warmer temperatures also have benefits. If you ask anyone who’s not a winter sports enthusiast what their favorite season is, it will probably not be winter. If you have sufficient food and water, you can survive indefinitely in the warmest outdoor temperatures found on the planet. This isn’t true in the coldest places, where at a minimum you also need clothes, fire, fuel and shelter.

While the differences between the sides seem irreconcilable, there’s only one factor they disagree about, and it is the basis for all other differences. While the disagreement itself remains unresolved, narrowing the scope makes it easier to address. The controversy is about the size of the incremental effect atmospheric CO2 has on the surface temperature, which is a function of the size of the incremental effect solar energy has. This parameter is referred to as the climate sensitivity factor. What makes it so controversial is that the consensus accepts a sensitivity presumed by the IPCC, while the possible range theorized, calculated and measured by skeptics has little to no overlap with the range accepted by the consensus. The differences are so large that only one side can be right and the other must be irreconcilably wrong, which makes compromise impossible, perpetuating the controversy.

The IPCC’s sensitivity has never been validated by first-principles physics or direct measurements. Its most widely touted support comes from models, but it seems that as degrees of freedom are added to curve-fit the past, the predictions of the future get alarmingly worse. Its support from measurements comes from extrapolating trends in manipulated data, where the adjustments are poorly documented and the fudge factors always push results in one direction. This introduces even less certain unknowns: how much of the trend is a component of natural variability, how much is due to adjustments and how much is due to CO2. This seems counterproductive, since the climate sensitivity should be relatively easy to predict using the settled laws of physics and even easier to measure with satellite observations, so what’s the point of the obfuscation introduced by unnecessary levels of indirection, additional unknowns and imaginary complexity?

Quantifying the Relationships

To quantify the sensitivity, we must start from a baseline that everyone can agree upon. This would be the analysis for a body like the Moon which has no atmosphere and that can be trivially modeled as an ideal black body. While not rocket science, an analysis similar to this was done prior to exploring the Moon in order to establish the required operational limits for lunar hardware. The Moon is a good place to start since it receives the same amount of solar energy as Earth and its inorganic composition is the same. Unless the Moon’s degenerate climate system can be accurately modeled, there’s no chance that a more complex system like the Earth can ever be understood.

To derive the sensitivity of the Moon, construct a behavioral model by formalizing the requirements of Conservation Of Energy as equation 1).

1) Pi(t) = Po(t) + ∂E(t)/∂t

Consider the virtual surface of matter in equilibrium with the Sun, which for the Moon is the same as its solid surface. Pi(t) is the instantaneous solar power absorbed by this surface, Po(t) is the instantaneous power emitted by it and E(t) is the solar energy stored by it. If Po(t) is instantaneously greater than Pi(t), ∂E(t)/∂t is negative and E(t) decreases until Po(t) becomes equal to Pi(t). If Po(t) is less than Pi(t), ∂E(t)/∂t is positive and E(t) increases until again Po(t) is equal to Pi(t). This equation quantifies more than just an ideal black body. COE dictates that it must be satisfied by the macroscopic behavior of any thermodynamic system that lacks an internal source of power, since changes in E(t) affect Po(t) enough to offset ∂E(t)/∂t. What differs between modeled systems is the nature of the matter in equilibrium with its energy source, the complexity of E(t) and the specific relationship between E(t) and Po(t).

An astute observer will recognize that if an amount of time, τ, is defined such that all of E is emitted at the rate Po, the result becomes Pi = E/τ + ∂E/∂t. This has the same form as the differential equation describing the charging and discharging of a capacitor, another COE-derived model of a physical system whose solutions are very well known, where τ plays the role of the RC time constant.
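To make the relaxation behavior of equation 1) concrete, here is a minimal Python sketch (an illustration added for this roundup, not the author’s code), using a hypothetical value for the heat capacity constant κ; the point is simply that E rises or falls until Po equals Pi.

```python
# Minimal sketch of equation 1) for an airless black body.
# The kappa value is hypothetical; it only sets the relaxation time scale.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2 per K^4

def step_energy(E, P_in, kappa, emissivity=1.0, dt=60.0):
    """One explicit Euler step of Pi = Po + dE/dt.

    T = kappa * E (equation 2) and Po = emissivity * SIGMA * T^4 (equation 3),
    so dE/dt = P_in - Po and E relaxes until Po equals P_in.
    """
    T = kappa * E
    P_out = emissivity * SIGMA * T**4
    return E + (P_in - P_out) * dt

kappa = 1.0e-6   # hypothetical K per J/m^2, standing in for heat capacity
E = 2.0e8        # arbitrary initial stored energy, J/m^2 (T = 200 K)
for _ in range(200_000):
    E = step_energy(E, P_in=300.0, kappa=kappa)
print(kappa * E)  # ~270 K: the temperature where Po = Pi = 300 W/m^2
```

This is exactly the capacitor analogy at work: the stored energy decays toward the input with a time constant set by how much energy the surface stores per degree.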

For an ideal black body like the Moon, E(t) is the net solar energy stored by the top layer of its surface. From this, we can establish the precise relationship between E(t) and Po(t) by first establishing the relationship between the temperature, T(t) and E(t) as shown by equation 2).

2) T(t) = κE(t)

The temperature of matter and the energy stored by it are linearly dependent on each other through a proportionality constant, κ, which is a function of the heat capacity and equivalent mass of the matter in direct equilibrium with the Sun. Next, equation 3) quantifies the relationship between T(t) and Po(t).

3) Po(t) = εσT(t)^4

This is just the Stefan-Boltzmann Law, where σ is the Stefan-Boltzmann constant, equal to about 5.67E-8 W/m^2 per K^4, and for the Moon, the emissivity of the surface, ε, is approximately equal to 1.

Pi(t) can be expressed as a function of Solar energy, Psun(t), and the albedo, α, as shown in equation 4).

4) Pi(t) = Psun(t)(1 – α)

Going forward, all of the variables will be considered implicit functions of time. The model now has 4 equations and 8 variables: Psun, Pi, Po, T, E, α, κ and ε. Psun is known for all points in time and space across the Moon’s surface. The albedo α and heat capacity κ are mostly constant across the surface and ε is almost exactly 1. To the extent that Psun, α, κ and ε are known, we can reduce the problem to 4 equations and 4 unknowns, Pi, T, Po and E, whose time varying values can be calculated for any point on the surface by solving a simple differential equation applied to an equal area gridded representation whose accuracy is limited only by the accuracy of α, κ and ε per cell. Any model that conforms to equations 1) through 4) will be referred to as a Physical Model.

Quantifying the Sensitivity

Starting from a Physical Model, the Moon’s sensitivity can be easily calculated. The ∂E/∂t term is what the IPCC calls ‘forcing’, which is the instantaneous difference between Pi and Po at TOA and/or TOT. For the Moon, TOT and TOA are coincident with the solid surface, which defines the virtual surface in direct equilibrium with the Sun. The IPCC defines forcing this way so that an increase in Pi owing to a decrease in albedo or an increase in solar output can be made equivalent to a decrease in Po from a decrease in power passing through the transparent spectrum of the atmosphere that would arise from increased GHG concentrations. This definition is ambiguous since Pi is independent of E, while Po is highly dependent on E; thus a change in Pi is not equivalent to the same change in Po, since both change E, while only Po changes in response to changes in E, which initiates further changes in E and Po. The only proper characterization of forcing is a change in Pi, and this is what will be used here.

While ∂E/∂t is the instantaneous difference between Pi and Po and conforms to the IPCC definition of forcing, the IPCC representation of the sensitivity assumes that ∂T/∂t is linearly proportional to ∂E/∂t, or at least approximately so. This is incorrect because of the T^4 relationship between T and Po. The approximately linear assumption is valid over a small temperature range around the average, but is definitely not valid over the range of all possible temperatures.

To calculate the Long Term Equilibrium sensitivity, we must consider that in the steady state, the temporal average of Pi is equal to the temporal average of Po, thus the integral over time of ∂E/∂t will be zero. Given that in LTE Pi is equal to Po, and the Moon certainly is in an LTE steady state, we can write the LTE balance equation as,

5) Pi = Po = εσT^4

To calculate the LTE sensitivity, simply differentiate and invert the above equation which gives us,

6) ∂T/∂Pi = ∂T/∂Po = 1/(4εσT^3)

This derivation does make an assumption: that ∂T/∂Pi = ∂T/∂Po, since we’re really calculating ∂T/∂Po. For the Moon this is true, but for a planet with a semi-transparent atmosphere between the energy source and the surface in equilibrium with it, they aren’t equal, for the same reason that the IPCC’s metric of forcing is ambiguous. Nonetheless, what makes them different can be quantified and the quantification can be tested. But for the Moon, which will serve as the baseline, it doesn’t matter.

Define the average temperature of the Moon as the equivalent temperature of a black body where each square meter of surface is emitting the same amount of power, such that when summed across all square meters, it adds up to the actual emissions. Normalizing to an average rate per m^2 is a meaningful metric since all Joules are equivalent and the average of incoming and outgoing rates of Joules is meaningful for quantifying the effects one has on the other; moreover, a rate of energy per m^2 can be trivially interchanged with an equivalent temperature. This same kind of average is widely applied to the Earth’s surface when calculating its average temperature from satellite data, where the resulting surface emissions are converted to an equivalent temperature using the Stefan-Boltzmann Law.

If the average temperature of the Moon were 255K, equation 6) tells us that ∂T/∂Pi is about 0.3C per W/m^2. If it were 288K, like the Earth, the sensitivity would be about 0.18C per W/m^2. Notice that owing to the 1/T^3 dependence of the sensitivity on temperature, the sensitivity falls off rapidly, as the cube of the temperature, as the temperature increases. The average albedo of the Moon is about 0.12, leading to an average Pi and Po of about 300 W/m^2, corresponding to an equivalent average temperature of about 270K and an average sensitivity of about 0.22C per W/m^2.
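These numbers are easy to verify. The short Python check below (added here, not from the article) evaluates equation 6) at the quoted temperatures; note that the strict black body value at 255K computes to about 0.27C per W/m^2, which the article rounds to about 0.3.

```python
# Equation 6): LTE sensitivity dT/dPi = 1/(4*eps*SIGMA*T^3) for eps = 1.
SIGMA = 5.67e-8  # W/m^2 per K^4

def lte_sensitivity(T, emissivity=1.0):
    return 1.0 / (4.0 * emissivity * SIGMA * T**3)

for T in (255.0, 270.0, 288.0):
    print(T, round(lte_sensitivity(T), 2))  # ~0.27, ~0.22, ~0.18 C per W/m^2

# Equivalent average temperature of the Moon from its average emissions:
print(round((300.0 / SIGMA) ** 0.25))       # ~270 K for Pi = Po = 300 W/m^2
```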

As far as the Moon is concerned, this analysis is based on nothing but first-principles physics, and the undeniable, deterministic average sensitivity that results is about 0.22C per W/m^2. This is based on indisputable science; moreover, the predictions of lunar temperatures using models like this have been well validated by measurements.

The 270K average temperature of the Moon would be the Earth’s average temperature if there were no GHGs, since this also means no liquid water, ice or clouds, resulting in an Earth albedo of 0.12, just like the Moon. This contradicts the often repeated claim that GHGs increase the temperature of Earth from 255K to 288K, or about 33C, where 255K is the equivalent temperature of the 240 W/m^2 average power arriving at the planet after reflection. This is only half the story, and it’s equally important to understand that water also cools the planet by about 15K, owing to the albedo of clouds and ice, which can’t be separated from the warming effect of water vapor, making the net warming of the Earth from all effects about 18C and not 33C. Water vapor accounts for about 2/3 of the 33 degrees of warming, leaving about 11C arising from all other GHGs and clouds. The other GHGs have no corresponding cooling effect, thus the net warming due to water is about 7C (33*2/3 – 15), while the net warming from all other sources combined is about 11C, of which only a fraction arises from CO2 alone.

Making It More Complex

Differences arise as the system gets more complex. At a level of complexity representative of the Earth’s climate system, the consensus asserts that the sensitivity increases all the way up to 0.8C per W/m^2, which is nearly 4 times the sensitivity of a comparable system without GHGs. Skeptics maintain that the sensitivity doesn’t change by anywhere near that much and remains close to where it started without GHGs, and if anything, net negative feedback might make it even smaller.

Let’s consider the complexity incrementally, starting with the length of the day. For longer-period rotations, the same point on the surface is exposed to the heat of the Sun and the cold of deep space for much longer periods of time. As the rotational speed increases, the difference between the minimum and maximum temperature decreases, but given the same amount of total incident power, the average emissions and equivalent average temperature will remain exactly the same. At very slow rotation rates, the dark side can emit all of the energy it ever absorbed from the Sun, and the surface emissions will approach those corresponding to its internal temperature, which does affect the result.

The sensitivity we care about concerns how the LTE averages change. The average emissions and corresponding average temperature are locked to an invariant amount of incident solar energy, while the rotation rate has only a small effect on the average sensitivity, related to the T^-3 relationship between temperature and sensitivity. Longer days and nights mean that local sensitivities will span a wider range owing to a wider temperature range. Since higher temperatures require a larger portion of the total energy budget, as the rotation rate slows, the average sensitivity decreases. To normalize this to Earth, consider a Moon with a 24 hour day, where this effect is relatively small.
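As an illustration of the averaging argument (a sketch added for this roundup, with invented temperatures), the two-cell model below holds the average emissions fixed and compares a uniform, fast-rotating surface against a wide day/night spread: the equivalent temperature stays at 270K while the arithmetic mean temperature drops.

```python
# Two-cell sketch: same average emissions, different temperature spreads.
SIGMA = 5.67e-8

def equivalent_T(mean_emissions):
    """Black body temperature equivalent to an average emission rate."""
    return (mean_emissions / SIGMA) ** 0.25

P_avg = 300.0  # W/m^2, fixed by the incident solar power

fast = [269.7, 269.7]  # fast rotator: both cells near the equivalent temperature
hot = 310.0            # slow rotator: hypothetical hot-side temperature
cold = ((2 * P_avg - SIGMA * hot**4) / SIGMA) ** 0.25  # ~192 K keeps the average at 300

for cells in (fast, [hot, cold]):
    mean_P = sum(SIGMA * T**4 for T in cells) / len(cells)
    print(round(mean_P),                      # 300 W/m^2 in both cases
          round(equivalent_T(mean_P)),        # 270 K in both cases
          round(sum(cells) / len(cells)))     # 270 K vs ~251 K arithmetic mean
```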

The next complication is to add an atmosphere. Start with an Earth-like atmosphere of N2, O2 and Ar, except without water or other GHGs. On the Moon, gravity is less, so it will take more atmosphere to achieve Earth-like atmospheric pressures. To normalize this, consider a Moon the size of the Earth and with Earth-like gravity.

The net effect of an atmosphere devoid of GHGs and clouds will also be to reduce the difference between high and low extremes, but not by much, since dry air can’t hold and transfer much heat; nor will there be much of a difference between ∂T/∂Pi and ∂T/∂Po. Since O2, N2 and Ar are mostly transparent to both incoming visible light and outgoing LWIR radiation, this atmosphere has little impact on the temperature, the energy balance or the sensitivity of the surface temperature to forcing.

At this point, we have a Physical Model representative of an Earth-like planet with an Earth-like atmosphere, except that it contains no GHGs, clouds, liquid or solid water; the average temperature is 270K and the average sensitivity is 0.22C per W/m^2. It’s safe to say that up until this point in the analysis, the Physical Model is based on nothing but well settled physics. There’s still an ocean and a small percentage of the atmosphere to account for, comprised mostly of water and trace gases like CO2, CH4 and O3.

The Fun Starts Here

The consensus contends that the Earth’s climate system is far too complex to be represented by something as deterministic as a Physical Model, even though this model works perfectly well for an Earth-like planet missing only water and a few trace gases. They arm-wave complexities like GHGs, clouds, coupling between the land, oceans and atmosphere, model predictions, latent heat, thermals, non-linearities, chaos, feedback and interactions among these factors as making the climate too complex to model in such a trivial way; moreover, what about Venus? Each of these issues will be examined by itself to see what effect it might have on the surface temperature, planet emissions and the sensitivity as quantified by the Physical Model, including how this model explains Venus.

Greenhouse Gases

When GHGs other than water vapor are added to the Physical Model, the effect on the surface temperature can be readily quantified. If some fraction of the energy emitted by the surface is captured by GHG molecules, some fraction of what was absorbed by those molecules is ultimately returned to the surface, making it warmer, while the remaining fraction is ultimately emitted into space, manifesting the energy balance. This is relatively easy to add to the model equations as a decrease in the effective emissivity of a surface at some temperature relative to the emissions of the planet. If Ps is the surface emissions corresponding to T, Fa is the fraction of Ps that’s captured by GHGs and Fr is the fraction of the captured power returned to the surface, we can express this in equations 7) and 8).

7) Ps = εxσT^4

8) Po = (1 – Fa)Ps + FaPs(1 – Fr)

 

The first term in equation 8) is the power passing through the atmosphere that’s not intercepted by GHGs, and the second term is the fraction of what was captured and ultimately emitted into space. Solving equation 8) for Po/Ps, we get equation 9),

9) Po/Ps = 1 – FaFr

Now, we can combine equation 9) with equation 7) to rewrite equation 3) as equation 3a).

3a) Po = (1 – FaFr)εxσT^4

Here, εx is the emissivity of the surface itself, which, like that of the surface of the Moon without GHGs, is also approximately 1, while (1 – FaFr) is the effective emissivity contributed by the semi-transparent atmosphere. This can be double checked by calculating Psi, the power incident to the surface, and by recognizing that Psi – Ps is equal to ∂E/∂t and Pi – Po.

 

10) Psi = Pi + PsFaFr

11) Psi – Ps = Pi – Po

Solving 11) for Psi and substituting into 10), we get equation 12); solving for Po results in 13), which after substituting 7) for Ps is yet another way to arrive at equation 3a).

12) Ps – Po = PsFaFr

13) Po = (1 – FaFr)Ps

The result is that adding GHGs modifies the effective emissivity of the planet from 1, for an ideal black body surface, to a smaller value as the atmosphere absorbs some fraction of surface emissions, making the planet’s emissions, relative to its surface temperature, appear gray from space. The effective emissivity of this gray body emitter, ε’, is given exactly by equation 3a) as ε’ = (1 – FaFr)εx.
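As a quick sketch (added here, not the author’s code), equation 3a) can be evaluated with the Fa and Fr values quoted later in the article; combined with the cloud factor derived in the next section, the resulting gray body emissivity balances a 240 W/m^2 input near the observed average surface temperature.

```python
# Equation 3a): GHG absorption acts as a gray body emissivity 1 - Fa*Fr.
SIGMA = 5.67e-8

def effective_emissivity(Fa, Fr, eps_x=1.0):
    return (1.0 - Fa * Fr) * eps_x

def balance_T(P_in, eps_eff):
    """Invert Po = eps' * SIGMA * T^4 with Po = Pi in the steady state."""
    return (P_in / (eps_eff * SIGMA)) ** 0.25

eps_ghg = effective_emissivity(Fa=0.58, Fr=0.51)  # ~0.70, values quoted below
print(round(eps_ghg, 2))
# Folding in the ~0.85 cloud factor derived in the next section:
print(round(balance_T(240.0, 0.85 * eps_ghg)))    # ~290 K, near the observed 288 K
```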

Clouds

Clouds are the most enigmatic of the complications, but nonetheless can easily fit within the Physical Model. The way to model clouds is to characterize them by the fraction of the surface covered by them, apply the Physical Model with values of α, κ and ε specific to average clear and average cloudy skies, and then weight the results based on the specific proportions of each.

Consider the Pi term, where if ρ is the fraction of the surface covered by clouds, αc is the average albedo of cloudy skies and αs is the average albedo of clear skies, α can be calculated as equation 14).

14) α = ραc + (1 – ρ)αs

Now, consider the Po term, which can be similarly calculated as equation 15) where Ps and Pc are the emissions of the surface and clouds at their average temperatures, εs is the equivalent emissivity characterizing the clear atmosphere and εc is the equivalent emissivity characterizing clouds.

15) Po = ρεsεcPc + ρ(1 – εc)εsPs + (1 – ρ)εsPs

The first term is the power emitted by clouds, the second term is the surface power passing through clouds and the last term is the power emitted by the surface and passing through the clear sky. GHGs can be accounted for by identifying the value of εs corresponding to the average absorption characteristics between the surface and space and between clouds and space. By considering Pc as some fraction of Ps, called Fx, equation 15) can be rearranged to calculate Po/Ps, which is the same as the ε’ derived from equation 3a). The result is equation 16).

16) ε’ = Po/Ps = ρεsεcFx + ρεs(1 – εc) + (1 – ρ)εs

 

The variables εc, Fx and ρ can all be extracted from the ISCCP cloud data, as can αc and αs; moreover, the data supports a very linear relationship between Pc and Ps. The average value of ρ is 0.66, the average value of αc is 0.37 and αs is 0.16, resulting in a value for α of about 0.30, which is exactly the accepted value. The average value of εc is about 0.72 and Fx is measured to be about 0.68. Considering εs to be 1, the effective ε’ is calculated to be about 0.85.

From line by line simulations of a standard atmosphere, the fraction of surface and cloud emissions absorbed by GHGs, Fa, is about 0.58; the value of Fr as constrained by geometry is 0.5 and is measured to be about 0.51. From equation 13), the equivalent εs becomes 0.71. The new ε’ becomes 0.85 * 0.71 = 0.60, which is well within the margin of error for the expected value of Po/Ps of 240/395 = 0.61 and even closer to the measured value from the ISCCP data of 238/396 = 0.60. When the same analysis is performed one hemisphere at a time, or even on individual slices of latitude, the predicted ratios of Po/Ps match the measurements once the net transfer of energy from the equator to the poles and between hemispheres is properly accounted for.
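This arithmetic is easy to reproduce. The sketch below (added for this roundup) plugs the quoted ISCCP averages into equations 14) and 16):

```python
# Consistency check of equations 14) and 16) using the averages quoted above.
rho = 0.66                       # cloud fraction
alpha_c, alpha_s = 0.37, 0.16    # cloudy / clear sky albedo
eps_c = 0.72                     # cloud emissivity
Fx = 0.68                        # ratio of cloud emissions to surface emissions
eps_s = 0.71                     # 1 - Fa*Fr from equation 13)

# Equation 14): composite albedo
alpha = rho * alpha_c + (1 - rho) * alpha_s
print(round(alpha, 2))                 # ~0.30, the accepted value

# Equation 16) with eps_s factored out, then folded back in
cloud_factor = rho * eps_c * Fx + rho * (1 - eps_c) + (1 - rho)
print(round(cloud_factor, 2))          # ~0.85
print(round(cloud_factor * eps_s, 2))  # ~0.60, vs the measured 238/396 = 0.60
```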

At this point, we have a Physical Model accounting for GHGs and clouds which accurately predicts the ratio between the black body surface emissions at the average surface temperature and the average emissions of the planet, spanning the entire range of temperatures found on the surface.

The applicability of the Physical Model to the Earth’s climate system is a hypothesis derived from first principles, which still must be tested. The first test, predicting the ratio of the planet’s emissions to surface emissions, got the right answer, but this is a simple test, and while questioning the method is to deny physical laws, surely some will question the coefficients that led to this result. While the coefficients aren’t constant, they do vary around a mean, and it’s the mean value that’s relevant to the LTE sensitivity. A more powerful testable prediction is that of the planet’s emissions as a function of surface temperature. The LTE relationship predicted by equation 3) is that if Po is the emissions of the planet and T is the surface temperature, the relationship between them is that of a gray body whose temperature is T and whose emissivity is ε’, calculated to be about 0.61. The results of this test will be presented a little later, along with justification for the coefficients used for the first test.

Complex Coupling

In the context of equation 1), complex couplings are modeled as individual storage pools of E that exchange energy among themselves. We’re only concerned with the LTE sensitivity, so by definition, the net exchange of energy among all pools contributing to the temperature must be zero; otherwise, parts of the system will either heat up or cool down without bound. LTE is defined when the average ∂E/∂t is zero, thus the rate of change for the sum of its components must also be zero.

Not all pools of E necessarily contribute to the surface temperature. For example, some amount of E is consumed by photosynthesis and more is consumed to perform the work of weather. If we quantify E as two pools, one storing the energy that contributes to the surface temperature, Es, and the other storing the energy in all other pools, Eo, we can rewrite equations 1) and 2) as,

1) Pi = Po + ∂Es/∂t + ∂Eo/∂t

1a) ∂E/∂t = ∂Es/∂t + ∂Eo/∂t

2a) T = κ(E – Eo)

If Eo is a small percentage of E, an equivalent κ’ can be calculated such that κ’E = κ(E – Eo), and the Physical Model is still representative of the system as a whole, while the value of κ’ will not deviate much from its theoretical value. Measurements from the ISCCP data suggest that an average of about 1.8 +/- 0.5 W/m^2 of the 240 W/m^2 of average incident solar energy is not contributing to heating the planet, nor must it be emitted for the planet to be in a thermodynamic steady state.

Thus far, GHGs, clouds and the coupling between the surface, oceans and atmosphere can all be accommodated by the Physical Model, simply by adjusting α, κ and ε. There can be no question that the Physical Model is capable of modeling the Earth’s climate and that, per equation 6), the upper bound on the sensitivity is less than the 0.4C per W/m^2 lower bound suggested by the IPCC. The rest of this discussion will address why the objections to this model are invalid, demonstrate tests whose results support predictions of the Physical Model and show other tests that falsify a high sensitivity.

Models

The results of climate models are frequently cited as supporting an ‘emergent’ high sensitivity; however, these models tend to include errors and assumptions that favor a high sensitivity, and many even dial in a presumed sensitivity indirectly. The underlying issue is that the GCMs used for climate modeling have a very large number of coefficients whose values are unknown, so they are set based on ‘educated’ guesses, and it’s this that leads to bias as objectivity is replaced with subjectivity.

In order to match the past, simulated-annealing-like algorithms are applied to vary these coefficients around their expected means until the past is best matched. If there are any errors in the presumed mean values, or any fundamental algorithmic flaws, the effects of these errors accumulate, making predictions of both the future and the further past worse. This modeling failure is clearly demonstrated by the physics-defying predictions so commonly made by these models.

Consider a sine wave with a gradually increasing period. If the model used to represent it is a fixed-period sine wave and the period of the model is matched to the average period of a few observed cycles, the model will deviate from what’s being modeled both before and after the range over which it was calibrated. If the measurements span less than a full period, both a long-period sine wave and a linear trend can fit the data, but when looking for a linear trend, the long-period sine wave becomes invisible. Consider seasonal variability, which is nearly perfectly sinusoidal. If you measure the average linear trend from June to July and extrapolate, the model will definitely fail in the past and the future, and the further out in time you go, the worse it will get.

Notice that only sinusoidal and exponential functions of E work as solutions for equation 1), since only sinusoids and exponentials have derivatives of the same form as themselves, given that Po is a function of E. Note that the theoretical and actual variability in Pi can be expressed as the sum of sinusoids and exponentials, and that this leads to the linear property of superposition when behavior is modeled in the energy in, energy out domain, rather than in the energy in, temperature out domain preferred by the IPCC.
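The trend-fitting failure is simple to demonstrate. This self-contained Python sketch (illustrative only, added here) fits a line to one rising stretch of a long-period sinusoid and then extrapolates:

```python
# Fit a linear trend to a short window of a long-period sinusoid, then extrapolate.
import math

period = 120.0
window = [(t, math.sin(2 * math.pi * t / period)) for t in range(30)]

# Ordinary least-squares line through the calibration window (t = 0..29).
n = len(window)
sx = sum(t for t, _ in window)
sy = sum(y for _, y in window)
sxx = sum(t * t for t, _ in window)
sxy = sum(t * y for t, y in window)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

for t in (15, 60, 120):  # inside, then well beyond, the calibration window
    actual = math.sin(2 * math.pi * t / period)
    trend = slope * t + intercept
    print(t, round(actual, 2), round(trend, 2))
# By t = 120 the sinusoid has returned to 0 while the linear model keeps climbing.
```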

The way to make GCMs more accurate is to ensure that the macroscopic behavior of the system being modeled conforms to the constraints of the Physical Model. Clearly this is not being done, otherwise the modeled sensitivity would be closer to 0.22C per W/m^2 and nowhere near the 0.8C per W/m^2 presumed by the consensus and supported by the erroneous models.

Non-Radiant Energy

Adding non-radiant energy transports to the mix adds yet another level of obfuscation. This arises from Trenberth’s energy balance, which includes latent heat and thermals transporting energy into the atmosphere along with the 390 W/m^2 of radiant energy arising from an ideal black body surface at 288K. Trenberth returns the non-radiant energy to the surface as part of the ‘back radiation’ term, but its inclusion gets in the way of understanding how the energy balance relates to the sensitivity, especially since most of this energy is returned not as radiation, but as air and water carrying it back to the surface.

The reason is that neither latent heat, thermals nor any other energy transported by matter into the atmosphere has any effect on the surface temperature, input flux or emissions of the planet beyond the effect they are already having on these variables, and whatever effects they have are bundled into the equivalent values of α, κ and ε. The controversy is about the sensitivity, which is the relationship between changes in Pi and changes in T. The Physical Model, ascribed with equivalent values of α, κ and ε, dictates exactly what the sensitivity must be. Since Pi, Po and T are all measurable, validating that the net results of these non-radiative transports are already accounted for by the relative relationships of measurable variables, and that these relationships conform to the Physical Model, is very testable, and the results are very repeatable.

Chaos and Non-Linearities

Chaos and non-linearities are a common complication used to dismiss the requirement that the macroscopic climate system behavior must obey the macroscopic laws of physics. Chaos is primarily an attribute of the path the climate system takes from one equilibrium state to another, also called weather, which of course is not the climate. Relative to the LTE response of the system and its corresponding LTE sensitivity, chaos averages out, since the new equilibrium state itself is invariant and driven by the incident energy and its conservation. Even quasi-stable states like those associated with ENSO cycles and other natural variability average out relative to the LTE state.

Chaos may result in overshooting the desired equilibrium, in which case the system will eventually migrate back to where it wants to be, but what’s more likely is that the system never reaches its new steady state equilibrium because some factor will change what that new steady state will be. Consider seasonal variability, where the days start getting shorter or longer before the surface reaches the maximum or minimum temperature it could achieve if the day length stayed consistently long or short.

Non-linearities are another red herring, and the most significant non-linearity in the system as modeled by the IPCC is the relationship between emissions and temperature. By keeping the analysis in the energy domain and converting to equivalent temperatures at the end, the non-linearities all but disappear.

Feedback

Large positive feedback is used to justify how 1 W/m^2 of forcing can be amplified into the 4.3 W/m^2 of incremental surface emissions required to sustain a surface temperature 0.8C higher than the current average of 288K. This is ridiculous considering that the 240 W/m^2 of accumulated forcing (Pi) currently results in 390 W/m^2 of radiant emissions from the surface (Ps), so each W/m^2 of input results in only about 1.6 W/m^2 of surface emissions. This means that the last W/m^2 of forcing from the Sun resulted in about 1.6 W/m^2 of surface emissions; the idea that the next one would result in 4.3 W/m^2 is so absurd it defies all possible logic. This represents such an obviously fatal flaw in consensus climate science that either the claimed sensitivity was never subject to peer review or the veracity of climate science peer review is nil, either of which deprecates the entire body of climate science publishing.
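Both numbers in this paragraph follow directly from the Stefan-Boltzmann Law, as this short check (added here) shows:

```python
# The emission increase implied by a 0.8 C warming vs the observed gain.
SIGMA = 5.67e-8

def bb_emissions(T):
    return SIGMA * T**4

print(round(bb_emissions(288.8) - bb_emissions(288.0), 1))  # ~4.3 W/m^2 from 1 W/m^2 of forcing
print(round(390.0 / 240.0, 1))                              # ~1.6 W/m^2 of emissions per W/m^2 of input
```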

The feedback-related errors were first made by Hansen, reinforced by Schlesinger, have been cast in stone since AR1 and more recently have been echoed by Roe. Bode developed an analysis technique for linear feedback amplifiers, and this analysis was improperly applied to quantify climate system feedback. Bode’s model has two non-negotiable preconditions that were not met by the application of his analysis to the climate. These are specified in the first couple of paragraphs of the book referenced by both Hansen and Schlesinger as the theoretical foundation for climate feedback. First is the assumption of strict linearity. This means that if the input changes by 1 and the output changes by 2, then if the input changes by 2, the output must change by 4. By using a delta Pi as the input to the model and a delta T as the output, this linearity constraint was violated, since power and temperature are not linearly related; power is related to T^4. Second is the requirement for an implicit source of Joules to power the gain. This can’t be the Sun, as solar energy is already accounted for as the forcing input to the model and can’t be counted twice.

To grasp the implications of nonlinearity, consider an audio amplifier with a gain of 100. If 1 V goes in and 100 V comes out before the amplifier starts to clip, increasing the input to 2V will not change the output value and the gain, which was 100 for inputs from 0V to 1V is reduced to 50 at 2V of input. Bode’s analysis requires the gain, which climate science calls the sensitivity, to be constant and independent of the input forcing. Once an amplifier goes non linear and starts to clip, Bode’s analysis no longer applies.

Bode defines forcing as the stimulus and defines sensitivity as the change in the dimensionless gain consequent to a change in some other parameter, which is also a dimensionless ratio. What climate science calls forcing is an over-generalization of the concept, and what it calls sensitivity is actually the incremental gain; moreover, the ability to use Bode’s analysis was voided by choosing a non-linear metric of gain. For the linear systems modeled by Bode, the incremental gain is always equal to the absolute gain, as this is the basic requirement that defines linearity. The consensus makes the false claim that the incremental gain can be many times larger than the absolute gain, which is a non sequitur relative to the analysis used. Furthermore, given the T^-3 dependence of the sensitivity on temperature, the sensitivity quantified as a temperature change per W/m^2 of forcing must decrease as T increases, while the consensus quantification of the sensitivity requires the exact opposite.

At the measured value of 1.6 W/m^2 of surface emissions per W/m^2 of accumulated solar forcing, the extra 0.6 W/m^2 above and beyond the initial W/m^2 of forcing is all that can be attributed to what climate science refers to as feedback. The hypothesis of a high sensitivity requires 3.3 W/m^2 of feedback to arise from only 1 W/m^2 of forcing. This is 330% of the forcing, and any system whose positive feedback exceeds 100% of the input will be unconditionally unstable, yet the climate system is certainly stable and always recovers after catastrophic natural events that can do far more damage to the Earth and its ecosystems than man could ever do in millions of years of trying. Even the lower limit claimed by the IPCC of 0.4C per W/m^2 requires more than 100% positive feedback, falsifying the entire range they assert.

An irony is that consensus climate science relies on an oversimplified feedback model that makes explicit assumptions that don’t apply to the climate system in order to support the hypothesis of a high sensitivity arising from large positive feedback, yet their biggest complaint about the applicability of the Physical Model is that the climate is too complicated to be represented with such a simple and undeniably deterministic model.

Venus

Venus is something else that climate alarmists like to bring up. However, if you consider Venus in the context of the Physical Model, the proper surface in direct equilibrium with the Sun is not the solid surface of the planet, but a virtual surface high up in its clouds. Unlike Earth, where the lapse rate is negative from the surface in equilibrium with the Sun up into the atmosphere, the Venusian lapse rate is positive from its surface in equilibrium with the Sun down to the solid surface below. Even if the Venusian atmosphere were 90 atm of N2, the surface would still be about as hot as it is now.

Venus is a case of runaway clouds, not runaway GHGs as often claimed. The thermodynamics of Earth’s clouds are tightly coupled to that of its surface through evaporation and precipitation, thus cloud temperatures are a direct function of the surface temperature below and not of the Sun. While the water in clouds does absorb some solar energy, owing to the tight coupling between clouds and the oceans, the LTE effect is the same as if the oceans had absorbed that energy directly. This isn’t the case for Venus, where the thermodynamics of its clouds are independent of its surface, enabling the clouds to arrive at a steady state with incoming energy by themselves.

Even for Earth, the surface in direct equilibrium with the Sun is not the solid surface, as it is for the Moon, but a virtual surface comprised of the top of the oceans and the bits of land that poke through. Most of the solid surface is beneath the oceans, and its nearly 0C temperature is a function of the temperature/density profile of the ocean above. The dense CO2 atmosphere of Venus, whose mass is comparable to the mass of Earth’s oceans, acts more like Earth’s oceans than Earth’s atmosphere; thus Venusian cloud tops above a CO2 ocean are a good analogy for the surface of Earth and will be at about the same average temperature and atmospheric pressure.

Testing Predictions

The Physical Model makes predictions about how Pi, Po and the surface temperature will behave relative to each other. The first test was a prediction of the ratio between surface emissions and planet emissions based on measurable physical parameters, and this calculation was nearly exact. The values of αc, αs, ρ and εc in equations 14) and 16) were extracted as the average values reported or derived from the ISCCP cloud data set provided by GISS, while εs arose from line by line simulations.

Figures 1, 2, 3 and 4 illustrate the origins of αc, αs, ρ, and εc, where the dotted line in each plot represents the measured LTE average value for that parameter. Those values were rounded to 2 significant digits for the purpose of checking the predictions of equations 14) and 16). Clicking on a figure should bring up a full resolution version.

[Figures 1 and 2]

[Figures 3 and 4]

The absolute accuracy of ISCCP surface temperatures suffers from a 2001 change to a new generation of polar orbiters, combined with discontinuous polar orbiter coverage, which the algorithms depended on for consistent cross-satellite calibration. This can be seen most dramatically in Figure 5, which is a plot of the global monthly average surface temperature derived from the gridded temperatures reported in the ISCCP data. While this makes the data useless for establishing trends, it doesn’t materially affect the use of this data for establishing the average coefficients related to the sensitivity.

[Figures 5 and 6]

Figure 5 demonstrates something even more interesting: the two hemispheres don’t exactly cancel, and the peak-to-peak variability in the global monthly average is about 5C. The Northern hemisphere has significantly more seasonal p-p temperature variability than the Southern hemisphere, owing to a larger fraction of land, resulting in a global sum whose minimum and maximum are 180 degrees out of phase with what you would expect from the seasonal position of perihelion. To the extent that the consensus assumes the effects of perihelion average out across the planet, the 5C p-p seasonal variability in the planet’s average temperature represents the minimum amount of natural variability to expect given the same amount of incident energy. In about 10K years, when perihelion is aligned with the Northern hemisphere summer, the p-p differences between hemispheres will become much larger, which is a likely trigger for the next ice age. The asymmetric response of the hemispheres is something that consensus climate science has not wrapped its collective head around, largely because the anomaly analysis it depends on smooths out seasonal variability, obfuscating the importance of understanding how and why this variability arises, how quickly the planet responds to seasonal forcing and how the asymmetry contributes to the ebb and flow of ice ages.

While Pi is trivially calculated as reflectance applied to solar energy, both of which are relatively accurately known, Po is trickier to arrive at. Satellites only measure LWIR emissions in 1 or 2 narrow bands in the transparent regions of the emission spectrum, and in an even narrower band whose magnitude indicates how much water vapor absorption is taking place. These narrow band emissions are converted to a surface temperature by applying a radiative model to a varying temperature until the emissions leaving the radiative model in the bands measured by the satellite are matched, and then the results are aligned to surface measurements. Equation 15) was used to calculate Po based on reported surface temperatures, cloud temperatures and cloud emissivity applied to a reverse engineered radiative model to determine how much power leaves the top of the atmosphere across all bands. This is done for both cloudy and clear skies across each equal area grid cell, and the total emissions are a sum weighted by the fraction of clouds modified by the cloud emissivity. To cross check this calculation, ∂E(t)/∂t can be calculated as the difference between Pi and the derived Po. If the long term average of this is close to zero, then COE is not violated by the calculated Po. Figure 6 shows this, and indeed the average ∂E(t)/∂t is approximately zero within the accuracy of the data. The 1.8 W/m^2 difference could be a small data error, but seems to be the solar power that’s not actually heating the surface but powering photosynthesis and driving the weather, and that need not be emitted for balance to arise. Note that ∂E/∂t per hemisphere is about 200 W/m^2 p-p and that the ratio between the global ∂E/∂t and the global ∂T/∂t implies a transient sensitivity of only about 0.12C per W/m^2.

Figure 7 shows another way to validate the predictions, as a scatter plot of the relative relationship between monthly averages of Pi and Po for constant latitude slices. Each little dot is the average for 1 month of data and the larger dots are the per slice averages across 3 decades of measurements. The magenta line represents Pi == Po. Where the two curves intersect defines the steady state, which at 239 W/m^2 is well within the margin of error of the accepted value. Note that the tilt in the measured relationships represents the net transfer of energy from tropical latitudes on the right to polar latitudes on the left.

[Figures 7 and 8]

The next test is of the prediction that the relationship between the average temperature of the surface and the planet’s emissions should correspond to a gray body emitter whose equivalent emissivity is about 0.61, which was the predicted and measured ratio between the planet’s emissions and those of the surface.

Figure 8 shows the relationship between the surface temperature and both Pi and Po, again for constant latitude slices of the planet. Constant latitude slices provide visibility into the sensitivity, as the most significant difference between adjacent slices is Pi, where a change in Pi is forcing per the IPCC definition. The change in the surface temperature of adjacent slices divided by the change in Pi quantifies the sensitivity of that slice per the IPCC definition. The slope of the measured relationship around the steady state is the short green line. The larger green line is a curve of the Stefan-Boltzmann Law predicting the complete relationship between temperature and emissions based on the measured and calculated equivalent emissivity of 0.61. The monthly average relationship between Po and the surface temperature is measured to be almost exactly what was predicted by the Physical Model. The magenta line is the prediction of the relationship between Pi and the surface temperature based on the requirement that the surface is approximately an ideal black body emitter, and again, the prediction is matched by the data almost exactly.

For reference, Figure 9 shows how little the effective emissivity, ε’, varies on a monthly basis, with a maximum deviation from nominal of only about +/- 3%. Figure 10 shows how the fraction of the power absorbed by the atmosphere and returned to the surface also varies in a relatively small range around 0.51. In fact, the monthly averages for all of the coefficients used to calculate the sensitivity with equation 16) vary over relatively narrow ranges.

[Figures 9 and 10]

The hypothesized high sensitivity also makes predictions. The stated nominal sensitivity is 0.8C per W/m^2 of forcing, and if the surface temperature increases by 0.8C, from 288K to 288.8K, 390.1 W/m^2 of surface emissions increases to 394.4 W/m^2, a 4.3 W/m^2 increase that must arise from only 1 W/m^2 of forcing. Since the data shows that each W/m^2 of forcing from the Sun increases the surface emissions by only about 1.6 W/m^2, most of the 3.3 W/m^2 of feedback required by the consensus has no identifiable origin, which falsifies the possibility of a sensitivity as high as claimed. The only possible origin is the presumed internal power supply that Hansen and Schlesinger incorrectly introduced into the quantification of climate feedback.

Joules are Joules and are interchangeable with each other. If the next W/m^2 of forcing will increase the surface emissions by 4.3 W/m^2, each of the accumulated 239 W/m^2 of solar forcing must be increasing the surface emissions by the same amount. If the claimed sensitivity were true, the surface would be emitting 1028 W/m^2, which corresponds to an average surface temperature of 367K, about 94C and close to the boiling point of water. Clearly it is not, once again falsifying a high sensitivity.
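The reductio can be checked step by step (a sketch added here):

```python
# If each of the 239 W/m^2 of accumulated forcing produced 4.3 W/m^2 of
# surface emissions, back out the implied surface temperature.
SIGMA = 5.67e-8

Ps_required = 239.0 * 4.3
print(round(Ps_required))            # ~1028 W/m^2
T = (Ps_required / SIGMA) ** 0.25
print(round(T), round(T - 273.15))   # ~367 K, ~94 C
```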

Conclusion

Each of the many complexities cited to defuse a simple analysis based on the immutable laws of physics has been shown to be equivalent to variability in the α, κ and ε coefficients quantifying the Physical Model. Another complaint is that the many complexities interact with each other. To the extent they do, and since each by itself is equivalent to changes in α, κ and ε, any interactions can be similarly represented as equivalent changes to α, κ and ε. It’s equally important to remember that unlike GCMs, this model has no degrees of freedom to tweak its behavior other than the values of α, κ and ε, all of which can be measured, and that no possible combination of coefficients within a factor of 2 of the measured values will result in a sensitivity anywhere close to what’s claimed by the consensus. The only possible way for any Physical Model to support the high sensitivity claimed by the IPCC is to violate Conservation Of Energy and/or the Stefan-Boltzmann Law, which is clearly impossible.

Predictions made by the Physical Model have been confirmed with repeatable measurements while the predictions arising from a high sensitivity consistently fail. In any other field of science, this is unambiguous proof that the model whose predictions are consistently confirmed is far closer to reality than a model whose predictions consistently fail, yet the ‘consensus’ only accepts the failing model. This is because the IPCC, which has become the arbiter of what is and what is not climate science, needs the broken model to supply its moral grounds for a massive redistribution of wealth under the guise of climate reparations. It’s an insult to all of science that the scientific method has been superseded by a demonstrably false narrative used to support an otherwise unsupportable agenda and this must not be allowed to continue.

Here’s a challenge to those who still accept the flawed science supporting the IPCC’s transparently repressive agenda. First, make a good faith effort to understand how the Physical Model is relevant, rather than just dismiss it out of hand. If you need more convincing after that, try to derive the sensitivity claimed by the IPCC using nothing but the laws of physics. Alternatively, try to falsify any prediction made by the Physical Model, again, relying only on the settled laws of physics. Another thing to try is to come up with a better explanation for the data, especially the measured relationships between Pi, Po and the surface temperature, all of which are repeatably deterministic and conform to the Physical Model. If you have access to a GCM, see if its outputs conform to the Physical Model and once you understand why they don’t, you will no doubt have uncovered serious errors in the GCM.

If the high sensitivity claimed by the IPCC can be falsified, it must be rejected. If the broadly testable Physical Model produces the measured results and can’t be falsified, it must be accepted. Falsifying a high sensitivity is definitive and unless and until something like the Physical Model is accepted by a new consensus, climate science will remain controversial since no amount of alarmist rhetoric can change the laws of physics or supplant the scientific method.

CONTINUE READING –>

Cause and Timings of El Nino Events, 1850-Present

[See bottom for a description of an El Nino. -Bob]

By Burl C Henry

ABSTRACT:

A hypothesis is presented that the cause of an El Nino event is a reduction in the amount of dimming Sulfur Dioxide (SO2) aerosol emissions into the troposphere.

For the modern era, 1850 – present, the SO2 removal episodes are coincident with either business recessions, or environmentally-driven reductions in net global SO2 aerosol emissions.

DISCUSSION:

One of the enduring questions regarding Earth’s climate has been what causes the onset of an El Nino event, given their generally adverse impact upon the Earth’s weather. They are believed to have occurred for thousands of years, but until now, the reason for their appearance has been the subject of much debate.

Continue reading

Gold Juniors’ Q2’17 Fundamentals

By Adam Hamilton – Re-Blogged From http://www.Gold-Eagle.com

The junior gold miners’ stocks have spent months grinding sideways near lows, sapping confidence and breeding widespread bearishness.  The entire precious-metals sector has been left for dead, eclipsed by the dazzling Trumphoria stock-market rally.  But traders need to keep their eyes on the fundamental ball so herd sentiment doesn’t mislead them.  The juniors recently reported Q2 earnings, and enjoyed strong results.

Four times a year publicly-traded companies release treasure troves of valuable information in the form of quarterly reports.  Companies trading in the States are required to file 10-Qs with the US Securities and Exchange Commission by 45 calendar days after quarter-ends.  Canadian companies have similar requirements.  In other countries with half-year reporting, some companies still partially report quarterly.

Continue reading

Gold Price – Crossing The Rubicon

By Alasdair Macleod – Re-Blogged From http://www.Gold-Eagle.com

Gold is challenging the $1300 level for the third time this year. If it breaks upwards out of this consolidation phase convincingly, it could be an important event, signalling a dollar that will continue to weaken.

The factors driving the dollar lower are several and disparate. The US economy is sluggish relative to the rest of the world, the rise of Asia from which America is excluded is unstoppable, geopolitics are shifting away from US global dominance, and the end is in sight for monopolistic payment for oil in US dollars.

These subjects have been covered in some detail in my recent articles, which will be referred to for further clarification where appropriate. This article summarises these trends and explains why the consequences appear certain to drive gold, priced in dollars, much higher.

Continue reading

The Great Carbon Scam

By Ira Goldstein – Re-Blogged From Toronto Sun

One thing you learn on the climate change beat is that the best journalism is done overseas.

In Canada, too many in the media, not knowing the issues, are empty vessels waiting to be filled by Trudeau government propaganda, which they uncritically regurgitate to their audiences.

By contrast, in the UK, one of many examples of serious reporting is a new radio documentary by the BBC’s environment correspondent, Matt McGrath.

In the documentary, titled “Carbon Counting,” McGrath reveals how many nations that signed the Paris accord are inaccurately reporting and/or hiding their greenhouse gas emissions from the United Nations.

Continue reading

Climate Science Double-Speak

By Kip Hansen – Re-Blogged From http://www.WattsUpWithThat.com

A quick note for the amusement of the bored but curious.

While in search of something else, I ran across this enlightening page from the folks at UCAR/NCAR [The University Corporation for Atmospheric Research/The National Center for Atmospheric Research — see pdf here for more information]:

What is the average global temperature now?

Continue reading

History Is About To Repeat Itself Again…And It Might Get Ugly

By Shannara Johnson – Re-Blogged From http://www.Gold-Eagle.com

Grant Williams believes that the 76 million retiring Baby Boomers will trigger a major pension crisis. He should know, because he’s been studying financial history and telltale crisis patterns for nearly two decades.

“With that potentially bad situation we could face,” the seasoned asset manager and co-founder of Real Vision TV said in a recent Metal Masters interview, “holding physical metal, somewhere safe, somewhere outside the banking system, is just a sensible precaution to take.”

His outlook has changed drastically since he started his first job trading Japanese markets in 1986: “What I walked into at that time was one of the greatest bull market bubbles the world had ever seen, in the Japanese equity market and real estate market.”

Continue reading

Sleeper Issues Poised To Rattle Markets

By Clint Siegner – Re-Blogged From http://www.Silver-Phoenix500.com

Investors have been well-trained in complacency. They have spent the past few years watching markets shrug off momentous geopolitical events – each more quickly than the last. Brexit’s impact faded within days. Trump’s election faded within hours.

Stocks traded at all-time highs this summer and volatility made all-time lows. That is the set-up as we head into the fall…

Almost nobody seems nervous. In this age of central planning and highly artificial markets, it is hard to tell when this period of strange market serenity will end. But vigilant investors should have a few ideas. The next few months are going to challenge the status quo.

Continue reading

Stock Market is Priced to Sell

By Vitaliy Katsenelson – Re-Blogged From Investment Management Associates

If you feel that you have to own stocks no matter the cost; if you tell yourself, “Stocks are expensive, but I am a long-term investor,” — there’s help for you yet.

First, let’s scan the global economic landscape. The health of the European Union has not improved, and Brexit only increased the possibility of other nations’ “exits” as the structural issues that render this union dysfunctional go unfixed.

Meanwhile, Japan’s population isn’t getting any younger — in fact, it’s the oldest in the world. Japan is also the world’s most-indebted developed nation (though, in all fairness, other countries are desperately trying to take that title away from it). Despite the growing debt, Japanese five-year government bonds are “paying” an interest rate of negative 0.10%. Imagine what will happen to the government’s budget when Japan has to start actually paying to borrow money commensurate with its debtor profile.
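To see the scale of that problem, a back-of-envelope sketch — the debt figure is a rough placeholder, not from the article:

    # What Japan pays -- or is paid -- to borrow at today's -0.10%,
    # versus a hypothetical 2% rate closer to its debtor profile.
    debt_trillion_yen = 1_000        # ~JPY 1 quadrillion of debt (assumption)
    for rate in (-0.001, 0.02):
        cost = debt_trillion_yen * rate
        print(f"rate {rate:+.2%}: annual interest {cost:+,.1f} trillion yen")

At -0.10% the government is effectively paid about 1 trillion yen a year to borrow; at 2%, the same debt would cost roughly 20 trillion yen a year.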

Continue reading

The Fed Just Admitted It No Longer Has A Clue What’s Going On

By Graham Summers – Re-Blogged From http://www.Gold-Eagle.com

The Fed July FOMC minutes that were released last week were nothing short of extraordinary. However, to fully appreciate just what the Fed admitted, we first need a little background.

From November 2016 until June 2017, the Fed was pushing a hawkish agenda. The running mantra at this time was that the Fed would raise rates 3-4 times in 2017. As the year progressed, the Fed also began talking about shrinking its balance sheet.

The Fed’s justification for these moves was that inflation was rising and the economy was strong enough to tolerate these moves. As a result the Fed hiked rates twice, first in March and then again in June 2017.

Continue reading

Cryptocurrencies: Modern Day Alchemy

By Michael Pento – Re-Blogged From Pento Portfolio Strategies

Cryptocurrencies make good currencies, but fail miserably when trying to achieve the status of money.

Cryptocurrencies are both created and held electronically inside a virtual wallet. These digital currencies use encryption techniques to regulate the generation of new units and to verify the transfer of funds. Cryptocurrencies operate independently of governments and are decentralized.

The most popular cryptocurrency now is Bitcoin. Bitcoin has risen in popularity because, unlike government-backed fiat currencies, it has a finite number of coins–21 million, 15.5 million of which are currently in circulation–and user transactions remain anonymous. Thus, the argument goes, it is superior to the fiat currency system and a viable replacement for precious metals because of the limited supply, anonymity, and independence from central bank authority.
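The 21 million cap isn’t an arbitrary registry entry; it falls out of the protocol’s halving schedule, as this sketch shows (the real protocol rounds rewards down to whole satoshis, so the true cap sits slightly below 21 million):

    # Bitcoin's block reward starts at 50 BTC and halves every 210,000 blocks.
    reward = 50.0
    blocks_per_halving = 210_000
    total = 0.0
    while reward >= 1e-8:            # 1 satoshi, the smallest unit
        total += reward * blocks_per_halving
        reward /= 2
    print(f"asymptotic supply: {total:,.0f} BTC")   # ~21,000,000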

Continue reading

The Truth About Bundesbank Repatriation Of Gold From US

By Mark O’Byrne – Re-Blogged From http://www.Gold-Eagle.com

– Bundesbank has completed a transfer of gold worth €24B from France and U.S.
– Germany has completed domestic gold storage plan 3 years ahead of schedule
– In the €7.7 million plan, 54,000 gold bars were shipped and audited
– In 2012 German court called for inspection of Germany’s foreign gold holdings
– Decision to repatriate from Paris and New York was ‘to build trust and confidence domestically’
– 1,236t or 37% of German holdings remain in New York Fed facility
– Bundesbank wants to hold gold bullion
– U.S. government declines to audit gold reserves … doesn’t want world to realise gold’s importance in the global monetary system

Continue reading

Buffett Sees Market Crash Coming

By Mark O’Byrne – Re-Blogged From http://www.Gold-Eagle.com

The Sage of Omaha’s adage is “it’s far better to buy a wonderful company at a fair price than a fair company at a wonderful price.”

But for Warren Buffett the current environment doesn’t appear to be offering up any wonderful companies at fair valuations. The situation is so bad that the cash stockpile of Berkshire Hathaway has more than doubled in the last four years, from under $40 billion to $100 billion. The famed investor is known for his investment approach of pouncing on companies when they run…

Continue reading

Extracting Rare-Earth Elements From Coal Soon Could be Economical in US

From Penn State – Re-Blogged From Science Daily

The U.S. could soon decrease its dependence on importing valuable rare-earth elements that are widely used in many industries, according to a team of Penn State and U.S. Department of Energy researchers who found a cost-effective and environmentally friendly way to extract these metals from coal byproducts.

Rare-earth elements are a set of seventeen metals — such as scandium, yttrium, lanthanum and cerium — necessary to produce high-tech equipment used in health care, transportation, electronics and numerous other industries. They support more than $329 billion of economic output in North America, according to the American Chemistry Council, and the United States Geological Survey expects worldwide demand for REEs to grow more than 5 percent annually through 2020. China produces more than 85 percent of the world’s rare-earth elements, and the U.S. produces the second most at just over 6 percent, according to the USGS.

Continue reading

Dangerous Volcanoes

By Larry Kummer – From the Fabius Maximus website

Summary: While we obsess about climate change and debate if we live in the Anthropocene, we prepare poorly or not at all for natural forces like volcanoes that can level cities. This is folly we can no longer afford. Experts recommend a simple first step to better protect ourselves. Let’s start listening, or nature will teach us an expensive lesson.

“We don’t even plan for the past.”
Steven Mosher (of Berkeley Earth), a comment posted at Climate Etc.

Continue reading

Top Electric Grid Regulator Will Make Keeping Coal Plants Online One Of His Main Goals

By Michael Bastasch – Re-Blogged From Daily Caller

Federal Energy Regulatory Commission (FERC) Chairman Neil Chatterjee said he would look for ways to “properly compensate” coal plants for providing reliable electricity during his time as a top energy regulator.

“These are essential to national security. And to that end, I believe baseload power should be recognized as an essential part of the fuel mix,” Chatterjee said in a video interview FERC officials posted online Monday.

“I believe that generation, including our existing coal and nuclear fleet, need to be properly compensated to recognize the value they provide to the system,” Chatterjee said.

Continue reading

Good For The Greenland Ice Sheet, Bad For The Corn Belt

By David Archibald – Re-Blogged From http://www.WattsUpWithThat.com

One thing that climate rationalists and warmers can agree on is that we all would like to have a healthy Greenland Ice Sheet. The good news on that front is that the ice sheet has put on 500 Gt this year as per this diagram provided by the Danish Meteorological Institute:

Figure 1: Total daily contribution to the surface mass balance of the Greenland Ice Sheet
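For scale, a standard conversion not given in the article — roughly 360 Gt of ice corresponds to 1 mm of global mean sea level:

    # ~360 Gt of ice ~ 1 mm of global mean sea level (standard conversion).
    gain_gt = 500
    print(f"{gain_gt} Gt ~ {gain_gt / 360:.1f} mm of sea-level equivalent")

A 500 Gt surface gain is therefore on the order of 1.4 mm of sea-level equivalent retained on the ice sheet.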

Continue reading

Hackable, Messy Politics

By John Rubino – Re-Blogged From Dollar Collapse

It’s already widely understood that the electronic voting machines used by a growing number of US states are easy to hack. But just how easy may not be clear. Consider this from today’s Wall Street Journal:

Hacker Cracks Voting Machine in Less Than 2 Hours

LAS VEGAS—A touch-screen voting machine used in a 2014 election in Virginia was hacked in about 100 minutes by exploiting a Windows XP flaw that was more than a decade old as part of a demonstration on security vulnerabilities in election technology.

Continue reading

Paul Krugman Shows Why the Climate Campaign Failed

By Larry Kummer – From the Fabius Maximus website

Summary: Like all of Krugman’s work, we can learn much from his latest column about climate change. See this annotated version to understand how he shows why 30 years of climate crusading has produced so little policy action in the US.

“The Axis of Climate Evil” – “Bad faith may destroy civilization”

Paul Krugman’s op-ed in the New York Times, 11 August 2017.

Continue reading

Money Keeps Pouring In

By John Rubino – Re-Blogged From Dollar Collapse

Someday, stock, bond and real estate valuations will matter again. And the mechanism by which this return to sanity is achieved will probably be the torrent of money now flowing in from people who, for various reasons, don’t care about (or understand) the prices they’re paying.

Millennials, for instance, seem to have reached the “beginners’ mistakes” phase of their financial lives. They’re major buyers of recreational vehicles – see The Perfect Crash Indicator Is Flashing Red – and are now opening stock brokerage accounts at a startling pace:

Continue reading

Report Showed Climate Data Was Unfit For Purpose

By Dr. Tim Ball – Re-Blogged From http://www.WattsUpWithThat.com

IPCC Knew from Start Climate Data Inadequate

[This station, on top of pavement, fails the NWS standard for minimal accuracy & precision. Only around 8% of all official stations in the US meet these standards, meaning the data is not fit for the purpose of measuring climate. See http://www.SurfaceStations.org. -Bob]

Climate monitoring weather station at the University of Arizona, Tucson, measuring temperature in the university parking lot in front of the Atmospheric Science Building. The station has since been closed. 2007 photo by Warren Meyer.

Continue reading

Government Debt Isn’t Actually Debt (??)

By John Rubino – Re-Blogged From Dollar Collapse

The failure of fiat currency and fractional reserve banking to produce a government-managed utopia is generating very few mea culpas, but lots of rationalizations.

Strangest of all these rationalizations might be the notion that government debt is not really a liability, but an asset. Where personal and business loans are bad if taken to excess, government borrowing is not just good on any scale, but necessary to a healthy economy. Here’s an excerpt from a particularly assertive version of this argument:

Continue reading

‘Deep’ Subprime Car Loans Hit Crisis-Era Milestone

From Bloomberg – Re-Blogged From Newsmax

Amid all the reflection on the 10-year anniversary of the start of the subprime loan crisis, here’s a throwback that investors could probably do without.

There’s a section of the auto-loan market — known in industry parlance as deep subprime — where delinquency rates have ticked up to levels last seen in 2007, according to data compiled by credit reporting bureau Equifax.

Continue reading

Global Warming & Crop Yields

By Eric Worrall – Re-Blogged From http://www.WattsUpWithThat.com

A PNAS study claims that crop yields will fall by up to 7% for each degree Celsius of global warming, assuming no CO2 fertilisation and no adaptation measures.

Climate change will cut crop yields: study

August 15, 2017

Climate change will have a negative effect on key crops such as wheat, rice, and maize, according to a major scientific report out Tuesday that reviewed 70 prior studies on global warming and agriculture.
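To put the headline number in perspective, here is a minimal sketch of how the “up to 7% per degree” figure compounds — assuming the loss compounds multiplicatively, which the study itself may not:

    # Compounding the study's "up to 7% per degree C" yield figure.
    loss_per_degC = 0.07
    for warming in (1, 2, 3, 4):
        remaining = (1 - loss_per_degC) ** warming
        print(f"{warming} C warming -> up to ~{(1 - remaining) * 100:.1f}% yield loss")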

Continue reading

Mythical Subsidies

By Michel de Rougemont – Re-Blogged From http://www.WattsUpWithThat.com

Following the decarbonisation goals set forth in the Paris climate agreement of December 2015, appeals are made to suppress energy subsidies linked to the use of fossil fuels, and to increase consumption taxes massively as an incentive to burn less of them.

According to “experts” at the International Monetary Fund (IMF), in a report published last December[1], energy benefited from subsidies amounting to $4,900 billion in 2013, or 6.5% of global GDP. For its part, the International Energy Agency (IEA)[2] estimates them at $532 billion for that same year. The not-so-small difference (the IMF figure is roughly nine times the IEA’s) stems from considerations given to so-called negative externalities associated with the use of energy.
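A quick check of the two figures quoted above shows where the gap lies:

    imf_estimate = 4_900   # $ billion, IMF (includes "negative externalities")
    iea_estimate = 532     # $ billion, IEA (direct subsidies only)
    print(f"ratio: {imf_estimate / iea_estimate:.1f}x")                  # ~9.2x
    print(f"implied externalities: ${imf_estimate - iea_estimate:,}B")   # $4,368B

In other words, nearly ninety percent of the IMF’s “subsidy” total is imputed externality cost rather than money actually paid out.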

Continue reading

Dr. Judith Curry Explains The Reality Of Bad Climate Science And Bad Politics

By Larry Hamlin – Re-Blogged From http://www.WattsUpWithThat.com

Dr. Judith Curry gave an interview, published on YouTube on August 9, 2017, in which she clearly lays out the many flaws and failures of “consensus” climate science and how this highly politicized scheme tremendously misleads policy makers regarding the need for government-directed climate actions.

Regarding the role that human greenhouse gas emissions play in driving the earth’s climate Dr. Curry concludes that:

“On balance, I don’t see any particular dangers from greenhouse warming. {Humans do} influence climate to some extent, what we do with land-use changes and what we put into the atmosphere. But I don’t think it’s a large enough impact to dominate over natural climate variability.”

Continue reading

Cyber Hack Feared in USS John S. McCain Crash

By Cathy Burke – Re-Blogged From Newsmax

Red flags are being raised over fears that the latest at-sea collision involving the USS John S. McCain could have been caused by a cyberattack on the Navy’s electronic guidance systems, McClatchy reported.

The Pacific collision – the fourth involving a Seventh Fleet warship this year – occurred near the Strait of Malacca, a busy 1.7-mile-wide waterway connecting the Indian Ocean and South China Sea that accounts for roughly 25 percent of global shipping.

“When you are going through the Strait of Malacca, you can’t tell me that a Navy destroyer doesn’t have a full navigation team going with full lookouts on every wing and extra people on radar,” Jeff Stutzman, an ex-information warfare specialist in the Navy who works at Wapack Labs, told McClatchy.

“There’s something more than just human error going on because there would have been a lot of humans to be checks and balances.”

Chief of Naval Operations Adm. John Richardson did not rule out a cyberattack as a factor in the collision, from which 10 American sailors remain missing.

Continue reading

Me, My Boy and Warren Buffett

By Vitaliy Katsenelson – Re-Blogged From Investment Management Associates

I have a confession to make. I want my company to someday be called Katsenelson & Kids. That doesn’t have to be its official name, but I want to work with my kids. I want my kids to be value investors. I know I am supposed to want them to be doctors or nuclear physicists. I don’t. Maybe if you go Freud on me, you’ll tell me this is my way of not wanting to let them go.

But I don’t want to push them into investing unless they absolutely love it. I want them to be happy. So far, none of my three kids — especially my 15-month-old, Mia Sarah — has shown any interest in following in my footsteps.

Six or eight times a year, I am invited to give a talk on value investing to undergraduate and graduate students at the University of Colorado Denver or Denver University. I really enjoy these talks. They are always structured in a Q&A format (I thereby pass the burden of class preparation on to the students). As part of their homework, they have to read my articles and come to the class with questions.

Continue reading

Gold Miners’ Q2’17 Fundamentals

By Adam Hamilton – Re-Blogged From http://www.Gold-Eagle.com

The gold miners’ stocks have spent months adrift, cast off in the long shadow of the Trumphoria stock-market rally. This vexing consolidation has left a wasteland of popular bearishness. But once a quarter, earnings season arrives, and bright fundamental sunlight dispels the obscuring sentiment fogs. The major gold miners’ just-reported Q2’17 results prove this sector remains fundamentally strong and super-undervalued.

Four times a year publicly-traded companies release treasure troves of valuable information in the form of quarterly reports. Companies trading in the States are required to file 10-Qs with the US Securities and Exchange Commission by 45 calendar days after quarter-ends. Canadian companies have similar requirements. In other countries with half-year reporting, some companies still partially report quarterly.

Continue reading

The Fiscal Benefits Of Free Trade

By Alasdair Macleod – Re-Blogged From http://www.Silver-Phoenix500.com

Western governments have an overriding problem, and that is they have reached or exceeded the bounds of taxation, at a time when legally mandated welfare costs are accelerating. Treasury departments in all the welfare nations are acutely aware of this problem, to which there’s no apparent solution. The economic recovery, so consistently forecast since the great financial crisis, has hardly materialised and has added to the problem.

There is, if treasury economists could only understand it, a solution in free trade.

One of the UK’s leading economists and Brexiteers, Patrick Minford, produced an interesting paper which brought up this subject. It got little coverage in the press, and even that was extremely negative. His “Trading on the Future” was the only economic modelling exercise that showed significant benefits for Britain from free trade.

Continue reading

Bond Bear Bubbleheads

By Brady Willet – Re-Blogged From http://www.Silver-Phoenix500.com

Conventional wisdom holds that with central banks beginning to throw their experimental policies into reverse, the strings holding the asset-price boom together are slowly being cut. No disagreement here. But while the divergence between fundamentals and asset prices suggests things like equities are in or near bubble territory, the bond market is not so much a ‘bubble’ as simply a rigged game. Some would disagree…

Former Fed Chairman Alan Greenspan recently offered his opinion about market ‘bubbles’ (or the very subject matter he spent his tenure at the Fed proclaiming could never be identified):

“By any measure, real long-term interest rates are much too low and therefore unsustainable. When they move higher they are likely to move reasonably fast. We are experiencing a bubble, not in stock prices but in bond prices. This is not discounted in the marketplace.”  Bloomberg

Continue reading

Trump’s DOJ Ends Obama’s Highly Controversial ‘Operation Choke Point’ Program

By Julio Rosas – Re-Blogged From Independent Journal Review

“Operation Choke Point” — a program started by the Department of Justice (DOJ) during the Obama administration — will no longer continue under Attorney General Jeff Sessions.

Republicans have claimed the goal of “Operation Choke Point” was to pressure the banks to cut off legitimate companies from using their services. In particular, critics said the program was used to specifically target gun sellers.

The DOJ announced the discontinuation of the program Wednesday in a letter to House Judiciary Chairman Bob Goodlatte. In the letter, Assistant Attorney General Stephen Boyd called “Operation Choke Point” a “misguided initiative conducted during the previous administration”:

We share your view that law-abiding businesses should not be targeted simply for operating in an industry that a particular administration might disfavor. Enforcement decisions should always be made based on facts and the applicable law.

“All of the Department’s bank investigations conducted as part of Operation Choke Point are now over, the initiative is no longer in effect, and it will not be undertaken again,” Boyd added.

Boyd also stated that, because of the subpoenas, some criminal activity had been discovered, adding “the Department continues to pursue those ancillary investigations, none relates to or seeks to deter lawful conduct.”

Continue reading

Anatomy of a Stock Market Crash

By David Haggith – Re-Blogged From The Great Recession Blog

The 1929 stock market crash became the benchmark to which all other market crashes have been compared. The following graphs of the crash of 1929 and the Great Depression that followed, the dot-com crash, and the stock market crash during the Great Recession show several interesting similarities in the anatomy of the world’s greatest financial train wrecks. They also show some surprises that run against the way many people think of these most infamous of crashes.

Graphing the 1929 Stock Market Crash

The stock market roared through the 1920s. Building construction, retail, and automobile sales advanced from record to record … but debt also climbed as a way to finance all of that. This crescendoed in 1929, when the stock market experienced two particularly exuberant rallies about a month apart (one in June and one in August, with a plateau between).

Then retail, housing and automobile sales started to fall apart.

Sound familiar?

Continue reading

Iran’s President Tacitly Admits Iran is Cheating on Nuclear Agreement

By David Haggith – Re-Blogged From The Great Recession Blog

Countering recent US sanctions and President Trump’s talk of ending the “bad” nuclear agreement with Iran, Iran’s president threatened to restart its nuclear program. If his threat is true as stated, he unwittingly admitted something highly supportive of Trump’s position:

Mr. Rouhani said that a reconstituted nuclear program would be “far more advanced,” a veiled threat that the country could start enriching uranium up to the level of 20 percent…. “Iran will definitely revert to a far more advanced situation than it had before the negotiations, not in a matter of weeks or months but in a matter of days or hours.” (New York Times)

Continue reading

On the Ground in Paraguay

By Mark Svoboda – Re-Blogged From International Man

Over the course of the last several months, we’ve followed the journeys of Mark Svoboda as he’s traveled from Singapore to Tanzania, Malaysia to Colombia. Today Mark stops off in Paraguay, where he and his wife traveled to start their residency process…

Paraguay – the Heart of America

In April of 2012, my wife and I traveled to Paraguay to start our residency process in the so-called “heart of America.” Our hope is to eventually receive citizenship in the country without actively residing there. Since I suspect many International Man readers are, like me, interested in 1) obtaining Paraguayan residency in hopes of eventually receiving citizenship, and 2) buying some of that cheap productive land, I thought it was high time I reported on the country.

Continue reading

Is It Time to Escape to Your Personal Alamo?

By Nick Giambruno – Re-Blogged From International Man

Doug Casey, Jeff Thomas, and Nick Giambruno recently discussed a topic they all think about often—pulling the trigger and leaving your home country to sit out an economic or political crisis.

Nick Giambruno: It seems like each week there’s a new attack or mass shooting. Racial tensions are on the rise. Europe is experiencing a migrant crisis that’s tearing the continent apart.

There’s no doubt the world has become a crazier place in the past couple of years. Unfortunately, I think it’s only going to get worse.

At what point do you decide that conditions at home are likely to worsen and set up an escape route with the intention of moving to another country?

Continue reading

Fed’s Dudley Drops Bombshell: Low Inflation “Actually Might Be a Good Thing”

By Wolf Richter – Re-Blogged From Wolf Street

QE unwind in September, “another rate hike later this year.”

The media have been talking themselves into a lather about how the less-than-2% inflation would force the Fed to stop hiking rates. But William Dudley, president of the New York Fed and one of the most influential voices on the policy-setting Federal Open Market Committee (FOMC), just dropped a stunning bombshell about low inflation – why it might be low and how that “actually might be a good thing.”

The kickoff for unwinding QE appears to be in the can. There’s unanimous support for it on the FOMC. It appears to be scheduled for the September meeting. The market has digested the coming “balance sheet normalization.” Stocks have risen and long-term yields have fallen, and financial conditions have eased further, which is the opposite of what the Fed wants to accomplish; it wants to tighten financial conditions. So it will keep tightening its policy until financial conditions actually tighten.

Continue reading

Spain’s New Big Bubble Begins to Wobble

By Don Quijones – Re-Blogged From WOLF STREET.

Since hitting rock bottom in 2013, Spain has been one of the biggest engines of economic growth in Europe, expanding at around 3% per year. But according to a report by the Bank of Spain, most of the factors behind this growth — such as cheaper global oil prices, the ECB’s expansionary monetary policy, and the subsequent decline in value of the Euro — are externally driven and transitory in nature.

This is particularly true for arguably the biggest driver of Spain’s economic recovery, its unprecedented tourism boom, which some local economists are finally beginning to call a bubble.

In large part the boom/bubble is a result of the recent surge in geopolitical risks affecting rival tourist destinations like Turkey, Egypt, Tunisia and, in smaller measure, France, which helped boost the number of foreign visitors to Spain in 2016 to a historic record of 75.3 million people — an 11.8% increase on 2015.

Continue reading

Drilling Set to Begin in British Shale

By Daniel Graeber – Re-Blogged From https://www.upi.com

Cuadrilla Resources says its drilling rig is on site and ready to tap into a natural gas basin in Lancashire.

Drilling is set to begin in a British shale natural gas basin, where Cuadrilla Resources said there is no precedent in the country.

Continue reading

Green Lunacy: Fossil Fuel Mandates, to Stabilise Mandated Renewable Electricity Supplies

By Eric Worrall – Re-Blogged From http://www.WattsUpWithThat.com

h/t JoNova – A government green idea so stupid even Tesla is worried: the South Australian Government, the world’s renewable crash-test dummy, plans to implement fossil fuel mandates to halt the loss of baseload capacity caused by its own renewable mandates.

Tesla, energy companies concerned prices won’t drop under South Australian Government’s plan

By political reporter Nick Harmsen

Battery giant Tesla has joined power generators, retailers, major energy users and experts in voicing concerns about a central component of the South Australian Government’s $550 million energy plan.

Continue reading

A Conversation With Gerald Celente

By Mike Gleason – Re-Blogged From http://www.Gold-Eagle.com

Mike Gleason: It is my privilege now to welcome in Gerald Celente, publisher of the renowned Trends Journal. Mr. Celente is a well-known trends forecaster and highly sought-after guest on news programs throughout the world and has been forecasting some of the biggest and most important trends before they happen for more than 30 years now. It’s always great to have him on with us.

Mr. Celente, thanks so much for the time today, and we appreciate you joining us.

Gerald Celente (Trends Journal): Thanks for having me on, Mr. Gleason.

Mike Gleason: Well, I want to start out talking about the first half of the year of Donald Trump’s presidency. Trump had an ambitious agenda to get the economy going but hasn’t been able to push any significant legislation through this Congress. How do you see that playing out from here, and what bearing does all this have on the dollar, Gerald, because the greenback has been taking it on the chin here recently?

Continue reading

Support for Hard Brexit in the UK Hardens

By Don Quijones – Re-Blogged From WOLF STREET.

“Significant economic damage” is a “price worth paying.” But businesses are not so sure.

Europhiles hoping that time might heal or at least narrow the rift separating the UK and the EU after last year’s Brexit vote are likely to be sorely disappointed by the findings of a new poll jointly conducted by Oxford University and London School of Economics.

The survey reveals that there is more support for harder Brexit options because Leavers and a substantial number of Remainers back them. The survey’s findings bolster the case for the hard-Brexit-or-nothing position favored (at least publicly) by British Prime Minister Theresa May. The alternative — a so-called “soft” Brexit — would imply having to accept full freedom of movement for all EU citizens in return for some form of privileged access to the single market. Given that regaining control of UK borders was one of the key issues that swung the referendum in Brexit’s favor, such a proposition was always unlikely to sway a majority of British voters.

Continue reading

Here’s How to Avoid Climate Panics

By Dennis Avery – Re-Blogged From TownHall

Americans have suffered needless climate-related panic for the past 40 years—not realizing that, since 1850, our newspapers have given us a climate scare about every 25 years. And none of them was valid.

Fortunately, climate science is now good enough to predict the key abrupt climate cycles that Mother Nature visits upon earth through the millennia. After the cold of the Maunder Sunspot Minimum around 1715, for example, earth’s temperature warmed 0.3 degrees in less than 25 years. Then, toward the end of that century, temperatures dropped equally swiftly into the cold of the Dalton Minimum. These abrupt shifts occurred over decades rather than centuries. Some shifts have been favorable, an equal number were unfavorable – and none involved carbon dioxide.

Continue reading

A Study on the Design Possibilities Enabled by Rope-Less, Non-Vertical Elevators Project

From the Council on Tall Buildings and Urban Habitat

Project Started: September 2016
Anticipated Project Completion: September 2018
Funding Sponsor: thyssenkrupp AG
Principal Investigators: Dario Trabucco, CTBUH and Antony Wood, CTBUH
The CTBUH research project, “A Study on the Design Possibilities Enabled by Rope-less, Non-vertical Elevators,” has received $264,000 in funding from thyssenkrupp AG to embark on an ambitious 24-month comprehensive study. The research will investigate how technological innovation in elevators, specifically rope-less non-vertical cabins, could impact the design outcomes of tall buildings and cities. The study seeks to remove the evolutionary bottleneck created by exclusively vertical elevator systems, as conventional systems to date have limited the height and influenced the shape of skyscrapers, which have historically been designed as a vertical repetition of floors.

To support this research, a Steering Committee and various Expert Panels were formed, which will be crucial for generating and evaluating the final results of this research.

Continue reading

New Research Suggests Memory Loss in Alzheimer’s Patients May be Reversible

By Karla Lant – Re-Blogged From Futurism

New research from a team at MIT indicates that the symptoms of Alzheimer’s disease (AD) affecting patients’ memories may be reversible. AD causes memory loss by setting up genetic “blockades,” formed when the enzyme HDAC2 condenses the genes in the brain responsible for memory. Eventually, those genes become useless; unexpressed, they are unable to support the formation of new memories or the retrieval of existing ones.

Clearly, blocking HDAC2 in the brain is an obvious fix; however, this has to date been impossible, because all prior attempts have negatively affected the internal organs, which require other enzymes in the histone deacetylase (HDAC) family for normal function. Researchers at MIT have now found something they hope might be the answer: LED lights, which they use to prevent HDAC2 alone from binding with Sp3, its partner in crime in forming the genetic blockades (and Alzheimer’s).

Continue reading

Nuclear Fusion / Fission Hybrid?

By Eric Worrall – Re-Blogged From http://www.WattsUpWithThat.com

Building an energy-producing nuclear fusion reactor remains elusive, but some companies are reconsidering an old idea: combining nuclear fusion with nuclear fission in a single reactor to overcome the disadvantages of both.

Fusion-fission hybrids: nuclear shortcut or pipe dream?

While nuclear fusion’s key milestones remain elusive, could fusion-fission hybrid reactors represent the best of both worlds? Start-up Apollo Fusion aims to make this complex concept a commercial reality, but formidable obstacles remain.
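The appeal of the hybrid is easiest to see in one number: a subcritical fission blanket multiplies the energy of each fusion-born neutron without any risk of a self-sustaining chain reaction. A minimal sketch with illustrative values, not from the article:

    # Asymptotic neutron multiplication in a subcritical blanket:
    # M = 1 / (1 - k_eff), valid for k_eff < 1 (no chain reaction possible).
    def blanket_gain(k_eff):
        return 1.0 / (1.0 - k_eff)

    for k in (0.90, 0.95, 0.98):
        print(f"k_eff = {k:.2f} -> neutron multiplication ~{blanket_gain(k):.0f}x")

At k_eff = 0.95, each fusion neutron is amplified roughly twentyfold, which is why a modest fusion source could, in principle, drive a commercially interesting fission blanket.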

Continue reading