New Insight Into What Weakens Antarctic Ice Shelves

By Anthony Watts – Re-Blogged From http://www.WattsUpWithThat.com

New research describes for the first time the role that warm, dry winds play in influencing the behaviour of Antarctic ice shelves. Presenting this week at a European conference, scientists from British Antarctic Survey (BAS) explain how spring and summer winds, known as föhn winds, are prevalent on the Larsen C Ice Shelf, West Antarctica, and create melt pools. The Larsen C Ice Shelf is of particular interest to scientists because of the collapse of Larsen A in 1995 and Larsen B in 2002.

The researchers observed that the föhn winds, which blow around 65% of the time during the spring and summer period, extend further south and are more frequent than previously thought, and are likely to be a contributing factor in weakening ice shelves before a collapse. The results are presented this week (Tuesday 25 April) at the European Geosciences Union General Assembly (EGU) in Vienna.

In 1995 and 2002, the Larsen A and B ice shelves collapsed, depositing an area of ice the size of Shropshire into the Weddell Sea. Whilst ice shelf collapse doesn’t directly contribute to sea level rise, the glaciers which fed into the ice shelves accelerated, leading to the loss of land ice and, in turn, indirect sea level rise. The processes responsible for the collapse of these ice shelves were much debated; it is now thought that crevasses on the ice shelf were widened and deepened by water draining into the cracks. Föhn winds are thought to be responsible for melting the ice shelf surface and supplying that water.

The findings describe when and where the warm, dry winds occur over the Larsen C Ice Shelf, the largest remaining ice shelf on the Antarctic Peninsula (roughly the size of Wales). Föhn winds were measured from near-surface weather stations and regional climate model data over a five-year period; they were observed all year round, but were most frequent in spring.

Jenny Turton, PhD student and lead scientist on the project at British Antarctic Survey (BAS) and the University of Leeds, says:

“What’s new and surprising from this study is that föhn winds occur around 65% of the time during the spring and summer. And we didn’t know how much they influence the creation of melt pools and therefore are likely to weaken the ice shelf. Whilst a high number occur in spring, the combined warming over a number of days leads to much more surface melting than was experienced during days without föhn winds. This is important, as melting during summer and re-freezing during winter weakens the ice surface, and makes it more at risk of melting again the following season.

“We know the ice shelf often melts a little during summer, however we have found that when föhn events occur as early as September (three months earlier than the start of the summer melt season), the ice shelf surface is melting. Now that we know how prevalent and spatially extensive these winds are, we can look further into the effect they are having on the ice shelf.”

###

CONTINUE READING –>

Message For Janet Yellen: “Be Terrified”

By Tyler Durden – Re-Blogged From Zero Hedge

Billionaire investor Paul Tudor Jones has a message for Janet Yellen and investors: Be very afraid.

Echoing a number of recent high-profile managers’ warnings…

Guggenheim Partners’ Scott Minerd said he expected a “significant correction” this summer or early fall, citing as potential triggers President Donald Trump’s struggle to enact policies, including a tax overhaul, as well as geopolitical risks.

Philip Yang, a macro manager who has run Willowbridge Associates since 1988, sees a stock plunge of between 20 and 40 percent, according to people familiar with his thinking, citing events like a severe slowdown in China or a greater-than-expected rise in inflation that could lead to bigger rate hikes.

Continue reading

Stock Markets Sit Blithely on a Powerful Time Bomb

By Wolf Richter – Re-Blogged From Wolf Street

No one knows the full magnitude, but it’s huge.

How big is margin debt really, and how much of a threat is it to the stock market and to “financial stability,” as central banks like to call their concerns about crashes? Turns out, no one really knows.

What we do know: Margin debt, as reported monthly by the New York Stock Exchange, spiked to another record high of $528 billion. But that figure is only part of the total outstanding margin debt – money investors borrow from their brokers, pledging their portfolios as collateral.

An example of unreported margin debt: Robo-advisory Wealthfront, a so-called fintech startup overseeing nearly $6 billion, announced that it would offer its clients loans against their portfolios.

Continue reading

The Housing Bubble Is Back

By John Rubino – Re-Blogged From Dollar Collapse

Last week I ran into a friend whom I’d been worrying about. He’s a real estate appraiser and his work had been drying up as interest rates rose and homeowners stopped refinancing their mortgages.

But now he’s back to being happily swamped because instead of refinancing, everyone is buying — often, he says, for above the asking price.

A couple of days later my wife and I were at a slide show put on by friends just back from New Zealand. They’d heard that a neighbor was thinking about selling his house and on an impulse made him an offer. He accepted, and our friends became instant homeowners.

Continue reading

Current Solar Activity Resembles Dalton Minimum…Weakest Months 75–100 Of Any Recorded Cycle!

By Frank Bosse & Prof. Fritz Vahrenholt – Re-Blogged From No Tricks Zone

Our source of energy continued to be especially quiet last month. The mean sunspot number (SSN) was 17.7 and the sun was completely blank for 16 days.

It is important to recall once again that the SSN is not simply the count of observed sunspots; rather, it is the number of individual spots added to ten times the number of observed sunspot groups (active regions). When one single spot is observed in an active region, this yields an SSN of 11.
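As a quick check of that convention, here is a minimal sketch (my illustration, assuming the standard Wolf-number form R = k(10g + s) with the scale factor k = 1):

```python
# Wolf (relative) sunspot number: R = k * (10*g + s), where g is the number
# of sunspot groups, s the number of individual spots, and k an
# observer-dependent scale factor (taken as 1 here).
def sunspot_number(groups: int, spots: int, k: float = 1.0) -> float:
    return k * (10 * groups + spots)

print(sunspot_number(groups=1, spots=1))  # one spot in one group -> 11.0
```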

The mean SSN for all cycles recorded so far, up to month 100 into the cycle, is 48.6, which means that the current cycle has seen solar activity only about 36% of the mean. It’s a weak cycle.

Fig. 1: The current solar cycle (SC) 24 is shown in red and is compared to the mean of the previous 23 cycles (blue) and the similar SC 5 (black).

Continue reading

Trade Negotiations Sow Seeds of Doubt for U.S. Agriculture

Re-Blogged From Stratfor

Grant Wood’s 1930 painting “American Gothic” is quintessential Americana. The austere depiction of a farmer and his land evokes the agrarian core that has long underpinned the United States’ geopolitical strength. Today, the U.S. agricultural system is still central to the country’s success, though it looks much different now than it did in Wood’s time.

Small family farms have given way to massive industrial operations, and the agricultural sector as a whole has become far more globalized. In fact, despite its reputation as the “breadbasket of the world,” the U.S. agricultural sector depends as much on other countries as they depend on it. The United States exports more than 20 percent of its agricultural production by volume, and export revenues account for about 20 percent of net farm income. As productivity improves each year with help from technological advancements, moreover, it will outpace domestic demand, leaving exports to sustain the U.S. agricultural sector. But the extent to which they can depends in large part on the future of international trade deals such as NAFTA.

U.S. agriculture lobbies have long advocated trade agreements to keep the country competitive with other major producers. Since the mid-20th century, the General Agreement on Tariffs and Trade and subsequent deals have sent the value of agricultural exports from the United States soaring. NAFTA gave U.S. agriculture another boost when it took effect in 1994. The deal opened a massive market in Mexico to corn producers in the American Midwest, while also providing American consumers with a wealth of fruits and vegetables from their southern neighbor. It was hardly surprising, then, that the U.S. agricultural sector rallied behind the Trans-Pacific Partnership agreement before it fell apart. Nor is it surprising that the United States’ farming industry is wary of the current administration’s plans to renegotiate NAFTA.

A Vulnerable Sector

Of all the sectors that have benefited from NAFTA, the agricultural industry is the most vulnerable to changes in its terms. Though agricultural exports represent only about 8 percent of the United States’ exports to Mexico, they would account for roughly 42 percent of the total increase in tariffs on U.S. exports in the unlikely event that NAFTA were revoked. (Meat and sugar would be among the hardest hit exports from the United States, while sugar and vegetables would be two of the imports most affected by the reintroduction of tariffs.)

Mexico is already working to expand its trade ties with South American countries such as Brazil and Argentina to offset the repercussions that a NAFTA overhaul would have on its agricultural imports and exports. And the U.S. agricultural industry will likewise have to deal with the fallout of a revised trade deal. Farm states — especially those, such as Texas, that do more trade with nearby Mexico — would feel the effects of the changes most acutely. After all, much like manufacturing, agricultural processing depends on intricate cross-border ties. Cattle born in Mexico, for instance, are often raised and slaughtered in Texas for export back across the border.

Rehashing a Familiar Problem

This won’t be the first time a disagreement in the NAFTA bloc has targeted the United States’ agricultural sector. In 2009, Mexico imposed tariffs on specific facets of the U.S. agricultural industry during a dispute about trucking. By targeting individual congressional districts, including parts of Oregon and California that rely on agricultural and processed food exports, Mexico City pressured Washington into complying with the trucking ruling.

But agricultural communities are losing their political clout; in 2006, researchers at Montana State University found that farming is the primary economic activity in just 40 of the United States’ 435 congressional districts. The agricultural lobby has atrophied enough, in fact, that the last farm bill — a hefty piece of legislation proposed every five years to fund the country’s agricultural activities — failed in 2013, before Congress eventually passed a diluted version. Spending on lobbying activities for agribusiness, too, has declined in recent years, though it is still higher than it was at the start of the 21st century.

An Uncertain Future

Considering their waning influence in U.S. politics, the country’s agricultural producers were particularly nervous as President Donald Trump took aim at their two most important trade partners, China and Mexico, during his campaign for the presidency. Several agribusinesses and farming concerns joined together to write a letter to the president after his inauguration in January outlining the importance of open trade to the agricultural sector. If Trump’s efforts to correct the United States’ trade imbalances prompted countries such as China or Mexico to curtail their imports of U.S. agricultural products, for instance, the American farming sector would suffer disastrous consequences. So far, though, the worst-case scenario appears unlikely. Geographic constraints, coupled with 20 years of progressive integration in numerous industries, including agriculture, will limit the Trump administration’s options to deliver on its promises to amend or discard international trade deals.

That’s good news for U.S. agriculture. Domestic demand alone would fall far short of supporting production in several agricultural sectors. Without China and Mexico, the pork, beef, grain and dairy industries would have no clear alternative market to turn to that could match those countries’ demand, and excess supply could cause prices to plummet. But until the negotiations over NAFTA are complete, the U.S. agricultural sector’s future will be tinged with uncertainty.

CONTINUE READING –>

Weekly Climate and Energy News Roundup #267

By Ken Haapala, President, The Science and Environmental Policy Project

Brought to You by www.SEPP.org

Joint Petition to Reconsider: Although not discussed in prior TWTWs, SEPP joined the Competitive Enterprise Institute (CEI) in filing a joint petition to the EPA to reconsider its 2009 finding that greenhouse gases, especially carbon dioxide, endanger public health and welfare. The petition was filed on February 17, 2017, and slightly revised on February 23.

Such actions fall under the “right to petition” stated in the 1st Amendment of the Constitution. The petition has added weight because both CEI and SEPP originally objected to the endangerment finding. The filing has been in the news, but TWTW has mentioned it only in passing. The legal issues were handled by CEI. The chance of success is not high, but the action is important.

By necessity, the petition is short and concise. It focuses on the strongest empirical science available in January, but not available in 2009, that contradicts the assertion that CO2 endangers public health and welfare. The testimony of John Christy on February 2, 2016, was chosen. [Christy’s written testimony to the House Science, Space, and Technology Committee on March 29, 2017, was even stronger evidence, but did not yet exist.]

Continue reading

What Color Is Southern Europe’s Parachute?

Re-Blogged From Stratfor

Uneven Recoveries

In Spain and Portugal, unemployment figures decreased more quickly last year than anywhere else in Europe. In February, the unemployment rate fell year-on-year from 20.5 percent to 18 percent in Spain, while Portugal’s rate decreased from 12.2 percent to 10 percent. A drop in official unemployment rates can be misleading: people who have given up looking for work are not included in certain joblessness calculations. But in Spain and Portugal, the drop in unemployment was accompanied by an actual increase in the number of people working. In Spain, 16.9 million people had jobs in 2014; by 2016 that number had increased to 18.5 million. In Portugal, job numbers rose from 4.4 million to 4.6 million between 2013 and 2016.

Eurozone Unemployment Chart

Nonetheless, the improvement in both countries largely reflects an increase in non-permanent jobs. Around 90 percent of new contracts in Spain and 80 percent in Portugal are for temporary jobs. Among EU states, according to Eurostat, only Poland has more workers under temporary contracts than Spain and Portugal. In addition, the demographic profile of workers in those countries under temporary contracts is anomalous: More than half of temporary workers in countries like Germany or Sweden are younger than 29. In Spain and Portugal, on the contrary, around a third of temporary workers are 30–39 years old. According to a report by the European Parliament, only one in five workers under temporary contracts in Spain and Portugal actually transition to permanent contracts.

EU Part-Time Employment Chart

To some extent, the way the Spanish and Portuguese employment markets operate accounts for the labor precariousness. Despite recent reforms, labor laws in both countries are still relatively rigid, and the cost of dismissing workers remains high. This leads many companies to opt for temporary contracts when hiring staff. Sometimes employers offer temporary contracts to reduce costs, but the practice also gives them flexibility to downsize during times of economic uncertainty.

Economic structures also play a significant role in the type of jobs created. For example, the services and agriculture sectors rely more heavily on temporary jobs than does industry. In services and agriculture, jobs tend to be more seasonal, with particularly high rates of temporary and off-the-books positions. In Southern European countries, tourism (and associated service industries like hotels and restaurants) provides a bigger source of employment than in most of their northern peers. Tourism accounts for roughly 12 percent of all employment in Spain, the second highest proportion in the European Union after Malta. Spain and Portugal have experienced a recent tourism boom, driven partially by rising political and security concerns in competing destinations in the Mediterranean. A significant agricultural sector in both countries also adds to their temporary job totals. Agriculture represents around 7 percent of employment in Portugal and 4 percent in Spain, compared with around 2 percent in countries like Germany or Sweden. These economic structures mean that job precariousness is not an entirely new phenomenon in Spain or Portugal.

Meanwhile, the effects of the European economic crisis have led to shrinking salaries. In Spain, the average wage fell by about 3.5 percent between 2011 and 2014. (It has recovered slightly since then.) In Portugal, the average wage has been stagnant since 2012. Recent reforms in labor legislation, which had the goal of making these economies more competitive, account for a portion of the earnings decrease. As members of the eurozone, Spain and Portugal cannot use currency devaluation to regain competitiveness during crises. As a result, they chose to take measures to reduce labor costs (a process commonly known as “internal devaluation”), including reducing the role of collective bargaining in salary negotiations and making it easier and cheaper for employers to dismiss workers. In Spain, hourly labor costs increased by 28 percent between 2004 and 2011 but remained flat between 2011 and 2015. In Portugal, labor costs per hour had increased by almost 18 percent between 2004 and 2012, but dropped by 0.7 percent between 2012 and 2015.

Both countries are grappling with high rates of youth unemployment, which during the peak of the crisis in Spain exceeded 50 percent of the active young population (people under 24 who are looking for a job, excluding students) and reached about 40 percent in Portugal. To a certain extent, emigration and financial support from family members mitigated the effect of high unemployment rates among young workers.

Long-term joblessness presents perhaps a greater problem. Roughly half of those unemployed in Spain and Portugal have been out of work for more than a year. In general, the longer that people are out of the workforce, the harder it is for them to find a job. Motivation to keep looking for work also drops off. This suggests that even if the economic recovery consolidates in these countries, the pace of the decline in unemployment rates could slow in coming years.

Misleading Improvements

In another southern economy, Greece, structural factors and the economic crisis have likewise fueled high unemployment. But even before the crisis, the highest share of employment came from sectors with more temporary positions: tourism, retail and agriculture. Meanwhile, labor productivity rates in the portion of the Greek industrial sector with labor- and resource-intensive activities were often below the EU average.

EU Temporary Employment Chart

Greece’s unemployment rate has shrunk over the past five years, but so has the labor force (that is, the number of people either employed or looking for a job) — a result of multiple factors, including retirement, emigration and the fact that unemployed people who quit looking for a job are not considered part of the workforce. In the fourth quarter of 2016, according to the Hellenic Statistics Office, 3.64 million people had jobs in Greece, virtually the same as a year earlier. As in Portugal and Spain, unemployment in Greece fluctuates seasonally: employment rates rise in summer and decrease in winter. While the total number of people with jobs in Greece rose from 2015 to 2016, the figure is still below what it was in 2012. Moreover, official statistics tend to hide the fact that hundreds of thousands of Greek workers have accepted jobs that offer either low pay or only part-time or temporary status. According to Eurostat, the percentage of involuntary part-time jobs — those held by workers who would rather have full-time employment — in Greece rose from 45 percent to 72 percent over the past decade.

As in Spain and Portugal, the Greek government sought to limit wage growth during the height of the economic crisis with policies such as those replacing collective bargaining with company-based collective agreements. According to data compiled by the Organization for Economic Co-operation and Development, the average wage in Greece fell by 20 percent between 2009 and 2015. Meanwhile, labor costs per hour have dropped by 15 percent since 2008. Greece is the only eurozone country where the minimum wage is lower today than it was a decade ago. But broader and deeper reforms to make the Greek economy more competitive, including modernizing the country’s education system, moving toward deregulation in some areas, and accelerating the privatization process, have been only partially introduced. Tax hikes and spending cuts introduced during Greece’s three bailout programs have cut domestic consumption and turned a recession into a depression.

Slow Rebounds

Italy offers another interesting case: unemployment never reached the levels it did in Greece, Spain or Portugal, but joblessness has resisted the efforts by several governments to decrease it. Since 2012, the unemployment rate has consistently hovered above 11 percent. (It was 6 percent a decade ago.) Unlike Spain or Portugal, Italy did not respond to the crisis with a unified package of comprehensive labor reforms, but instead implemented a series of smaller reforms, most notably during the governments led by Mario Monti in 2012 and Matteo Renzi in 2014. These reforms sought to weaken protections against dismissal for permanent workers and to increase protection for the unemployed.

In spite of the reforms, the pace of job creation has not fulfilled government promises. According to Italy’s statistics office, 22.8 million Italians had a job in the fourth quarter of 2016 — a modest increase from the 22.4 million registered four years earlier. At the same time, the proportion of the country’s working-age population with jobs grew from 56.3 percent to 57.4 percent. However, many people have been unable to find the kind of job they want. According to Eurostat, the number of Italians working part-time grew by almost 10 percent between 2002 and 2015, while the average increase for the European Union during that period was 4 percent. Of Italians working part-time, six in 10 would rather have a full-time position. That ratio in 2007 was less than 4 in 10.

Political issues certainly play a role in the trend, as Italian governments tend to be fragile and subject to pressure from multiple sources, including trade unions and local and regional economic and political interests, making reforms difficult to introduce. Italy also remains beset with pronounced geographic contrasts, as unemployment rates remain considerably higher in its relatively less-developed and largely agricultural south than in its industry-heavy and more prosperous north. Italy’s weak economic growth does not support robust job creation, another factor keeping unemployment numbers high. Since Italy’s economy emerged from recession in 2014, it has not exceeded 1 percent annual growth.

A Potential Threat to Long-Term Growth

Job insecurity is not exclusive to the southern members of the eurozone: Countries in Central and Eastern Europe like Poland and Bulgaria also have high rates of temporary employment or jobs with low salaries. And a high number of jobs in wealthier countries like the United Kingdom and Austria come with employment conditions that undermine labor rights, such as “zero hour contracts” that offer no guaranteed minimum hours of work.

But unemployment rates rose faster in Southern Europe, where the crisis hit harder. High unemployment and insufficient economic growth in that region exposed the fragility of the banking sectors in several countries, raised questions about the sustainability of their public and private debts, and created a fertile ground for the emergence of anti-system political parties that could threaten the survival of the eurozone.

The creation of temporary and precarious forms of employment is a normal phenomenon during the early stages of an economic recovery. Over time, however, such jobs could drag down an economy, for example by limiting the room for growth in domestic demand. In addition, rising income inequality feeds growing social and political tensions. While unemployment rates are dropping across the board, issues such as job insecurity, low pay, long-term unemployment, and few opportunities for training or career advancement could weigh down Southern Europe’s incipient economic recovery.

CONTINUE READING –>

Wall Street And Bear Scat

By Mark J Lundeen – Re-Blogged From http://www.Gold-Eagle.com

As seen in the Bear’s Eye View (BEV) chart below, the last all-time high for the Dow Jones Index happened on March 1st. Since then, however, it’s been slowly deflating. The post-election run-up in the Dow Jones (enclosed in the red circle) was an excellent advance; one of the best in the post-March 2009 market. So we shouldn’t begrudge the bulls should they now take a rest before their next upward assault on the stock market, which I’m sure they are planning. However, some plans never get past their conception stage.

I like this BEV chart for the Dow Jones.  It displays each advance and percentage decline of the Dow from its 09 March 2009 bottom (6,547) to its last all-time high of 01 March 2017 (21,115.55).  The typical correction was a little over 5%, with only four double digit declines (none greater than 17%) since March 2009.
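For readers unfamiliar with the construction, here is a minimal sketch of how a BEV series can be computed (my illustration, not Lundeen’s code): each close is expressed as its percentage below the highest close seen so far, so every new all-time high plots at zero.

```python
# Bear's Eye View: percentage of each close below the running all-time high.
def bev(closes):
    series, running_max = [], float("-inf")
    for close in closes:
        running_max = max(running_max, close)  # a new all-time high plots at 0
        series.append((close / running_max - 1.0) * 100.0)
    return series

print(bev([100.0, 110.0, 99.0, 121.0]))  # [0.0, 0.0, -10.0, 0.0]
```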

Continue reading

The Dangers of Temporary Employment

Re-Blogged From Stratfor

About six in ten jobs in the European Union today are full-time permanent positions. But jobs offered under part-time and temporary contracts account for an increasing share of total employment. In 2003, well before Europe’s economic crisis, 15 percent of workers in the European Union were employed under part-time contracts. By 2015, that had risen to 19 percent. During the same period, temporary contracts rose from 9 percent of total employment to 11 percent. Temporary jobs offer less security than even part-time permanent ones. They often come with lower salaries and fewer training and career advancement opportunities, making it harder for workers to access credit, plan their consumption decisions or qualify for unemployment benefits.

Continue reading

What If The Fed Lowers Instead Of Hiking Interest Rates!

By Gijsbert Groenewegen – Re-Blogged From http://www.Silver-Phoenix500.com

The Fed Will Have To Admit Defeat

If it’s not in May, it will very likely be in June – when the Fed will finally be forced to be honest, admit that it is wrong on the economy, and drop its tighter monetary policy, with the consequence that precious metals assets will skyrocket. A U-turn by the Fed, changing its policy direction from tightening to easing, will create a massive market shock. That day is rapidly approaching.

As Michael Belkin says, “financial markets are a case of the blind leading the blind. The Fed keeps talking economic strength and has broadcast a series of interest rate hikes and even balance sheet reductions; those consensus trades have virtually all market participants incorrectly positioned.” In my view, this is because almost everybody has been drugged for the last 8 years and thus lost perspective of what is real and what isn’t.

Continue reading

Median Household Wealth Has Declined by 40 Percent Since 2007

By John Mauldin – Re-Blogged From http://www.newsmax.com

Nominal US household wealth is at an all-time high. But my friend Marc Faber (publisher of the Gloom Boom & Doom Report) says that’s mostly an illusion.

Below, Marc looks at the relationship between asset prices and US household wealth, and the effect of that relationship on the economy.

It seems the wealth of the top 0.1% has vastly improved in recent decades (and the top 10% haven’t done at all badly). But “the median household’s or asset owner’s wealth has declined by close to 40% in real terms (adjusted by the CPI) from its peak in 2007.”

[Chart: Marc Faber – Median Household Wealth Has Declined by 40 Percent Since 2007]

Continue reading

Americans Spending More on Taxes Than Food, Clothing, and Housing Combined

By Kimberlee Kaye – Re-Blogged From BlabberBuzz

31% of the nation’s earnings are going to fill the tax coffers

April 23rd of this year was Tax Freedom Day, or “the day when the nation as a whole has earned enough money to pay its total tax bill for the year”.

A whopping 31% of the nation’s earnings are confiscated by the government for federal and state taxes for a total of $5.1 trillion. Amazingly, that’s still not enough to pay off state and national deficits.

Continue reading

Expect No More Than 3 Percent Returns in the Next 20 Years

By John Mauldin – Re-Blogged From http://www.newsmax.com

We didn’t have all of this big data or computing power when I started in the investment publishing industry in 1982. But we do today.

This data not only shows us the present state of the stock market, but also tells us what that means in terms of probable returns over the coming 7, 10, and 12 years and what it means in terms of relative risks.

Below, I picked out three telling charts that reveal how pricey the stock market is and what that tells us about probable forward returns.

Median P/E (Price-to-Earnings Ratio) Shows That We Are At The Second Most Overvalued Point Since 1964

Continue reading

The Meaning and Utility of Averages as it Applies to Climate

By Clyde Spencer – Re-Blogged From http://www.WattsUpWithThat.com

Introduction

I recently had a guest editorial published here on the topic of data error and precision. If you missed it, I suggest that you read it before continuing with this article. This will then make more sense. And, I won’t feel the need to go back over the fundamentals. What follows is, in part, prompted by some of the comments to the original article. This is a discussion of how the reported, average global temperatures should be interpreted.

Averages

Averages can serve several purposes. A common one is to increase accuracy and precision of the determination of some fixed property, such as a physical dimension. This is accomplished by confining all the random error to the process of measurement. Under appropriate circumstances, such as determining the diameter of a ball bearing with a micrometer, multiple readings can provide a more precise average diameter. This is because the random errors in reading the micrometer will cancel out and the precision is provided by the Standard Error of the Mean, which is inversely related to the square root of the number of measurements.
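As an illustration of that relationship, a minimal sketch (hypothetical readings, not data from the article):

```python
# Repeated micrometer readings of one ball bearing: the standard error of
# the mean shrinks as 1/sqrt(n), so averaging improves precision.
import math
import statistics

readings_mm = [5.02, 4.99, 5.01, 5.00, 4.98, 5.03]  # hypothetical values
mean = statistics.mean(readings_mm)
sem = statistics.stdev(readings_mm) / math.sqrt(len(readings_mm))
print(f"mean = {mean:.3f} mm, standard error of the mean = {sem:.3f} mm")
```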

Continue reading

Cyber Attack Fears As MULTIPLE CITIES HIT With Power Grid Failures

By Mac Slavo – Re-Blogged From http://www.libertyheadlines.com

The U.S. power grid appears to have been hit with multiple power outages affecting San Francisco, New York and Los Angeles.

Officials report that business, traffic and day-to-day life have come to a standstill in San Francisco, reportedly the worst hit of the three major cities currently experiencing outages.

Power companies in all three regions have yet to elaborate on the cause, though a fire at a substation was the original reason given by San Francisco officials.

A series of subsequent power outages in Los Angeles, San Francisco, and New York City left commuters stranded and traffic backed up on Friday morning. Although the outages occurred around the same time, there is as yet no evidence that they were connected by anything more than coincidence.

Continue reading

A ‘Red Team/Blue Team’ Exercise Would Strengthen Climate Science

By Steven Koonin (originally published in the Wall Street Journal, sent to WUWT by the author)

Re-Blogged From http://www.WattsUpWithThat.com

Put the ‘consensus’ to a test, and improve public understanding, through an open, adversarial process.

The March for Science will draw many thousands in support of evidence-based policy making and against the politicization of science. A concrete step toward those worthy goals would be to convene a “Red Team/Blue Team” process for climate science, one of the most important and contentious issues of our age.

The national-security community pioneered the “Red Team” methodology to test assumptions and analyses, identify risks, and reduce—or at least understand—uncertainties. The process is now considered a best practice in high-consequence situations such as intelligence assessments, spacecraft design and major industrial operations. It is very different from, and more rigorous than, traditional peer review, which is usually confidential and always adjudicated, rather than public and moderated.

Continue reading

Why All Muslims Benefit from Terrorism

[This video is controversial, to say the least. -Bob]

By David Wood  – Re-Blogged From Freedom Outpost

While many Muslims are just as horrified by terrorism as the rest of us are, all Muslims nevertheless benefit from terrorism. This is because both peaceful and violent Muslims tend to share two important goals: (1) the conversion of non-Muslims to Islam, and (2) the silencing of critics of Islam. Since terrorism helps achieve these goals, all Muslims benefit from terrorism.

How Did the Roadmap for the Green Juggernaut Fare Over 30 years?

By Neil Lock – Re-Blogged From http://www.WattsUpWithThat.com

Thirty years ago, in April 1987, a new United Nations report was published. It came from the recently established World Commission on Environment and Development, and its title was Our Common Future. It was 300 pages long; and its preparation, which took two and a half years, had involved 23 commissioners and 70 or so experts and support staff. In addition, they solicited inputs from people and organizations, in many different countries, who had concerns about environmental and development issues. You can find the full text of the report at [[1]].

Today, most people seem unaware of this report. That’s a pity. For this is the document that set in motion the green political juggernaut that has had such a huge, adverse effect on the lives of all good people in the Western world. The 30th anniversary is, I think, a good time to look back at, and to re-evaluate, this report. Not only in its own terms, such as asking how significant the issues it raised have proven to be, and how well these issues have been dealt with in the meantime. But also from a broader perspective, asking how well the process, both scientific and political, has measured up to the reasonable expectations of the people who have been subjected to its consequences.

Continue reading

The Great Western Economic Depression

By Jeff Nielson – Re-Blogged From http://www.Gold-Eagle.com

Western economies are “recovering”. How do we know this? We are told this, over and over and over again by our governments. Then this assertion is repeated thousands of times more by the dutiful parrots of the corporate media.

The problem is that in the real world there is not a shred of evidence to support this assertion. In the U.S., ridiculous official lies were created claiming the creation of 15 million new jobs. In reality, there are three million fewer Americans with jobs today than at the official end of the “recession”.

These imaginary jobs are invented by assorted statistical frauds, with the primary deceit being so-called “seasonal adjustments”. To be legitimate, all seasonal adjustments must net to zero at the end of each year. Instead, in the U.S.A., the biggest job creator in the nation every year is the calendar.

Continue reading

Is The Economy At The Cusp Of The Next Recession…Or Maybe Worse? (Part II)

By Burt Coons (AKA the Plunger) – Re-Blogged From http://www.Silver-Phoenix500.com

Part II takes a look at the macroeconomic backdrop for the trade of the year. Spoiler alert: it’s not a pretty picture, but don’t think doom and gloom; instead embrace crisis and opportunity! With our understanding of the history of oil, we now focus on the macro backdrop for our Big Trade.

“When the tide goes out you find out who has been swimming naked” – Warren Buffett

“This time around everything gets revealed in the next recession” – Plunger

In the next recession those leaning the wrong way… the levered players, will be forced to heave out their non-productive assets at fire sale prices. Commodity producers with entrenched costs will have to increase production as lower prices beget even lower prices since insufficient cash flows can only be recovered through higher volume production.

Continue reading

Weekly Climate and Energy News Roundup #266

By Ken Haapala, President, The Science and Environmental Policy Project

Brought to You by www.SEPP.org

Bounding the Fear: Last week’s TWTW discussed a presentation by Hal Doiron of The Right Climate Stuff Team (TRCS). TRCS is a group of retired and highly experienced engineers and scientists from the Apollo, Skylab, Space Shuttle and International Space Station eras who have volunteered their time and effort to conduct an objective, independent assessment of carbon dioxide (CO2)-caused global warming, to establish the reality of the actual threat and separate it from unnecessary alarm. They have applied the techniques they learned for space missions to this task. A rough engineering analogy is: How can they be confident that an astronaut will not cook or freeze in a space station or a space suit?

As a young engineer, Doiron approached the modeling of the lunar lander by bounding the risks. Similarly, he approached the problem of what would happen, in the worst case, with a doubling of carbon dioxide (CO2) by establishing an upper bound. The team created a simple, rigorous earth surface model using the principle of conservation of energy. He shows how the model is validated using 165 years of atmospheric greenhouse gas data and HadCRUT surface temperature data.

Continue reading

Five Reasons Blog Posts are of Higher Scientific Quality Than Journal Articles

By Daniel Lakens – Re-Blogged From The 20% Statistician

A blog on statistics, methods, and open science. Understanding 20% of statistics will improve 80% of your inferences.

The Dutch toilet cleaner ‘WC-EEND’ (literally: ‘Toilet Duck’) aired a famous commercial in 1989 that had the slogan ‘We from WC-EEND advise… WC-EEND’. It is now a common saying in The Netherlands whenever someone gives an opinion that is clearly aligned with their self-interest. In this blog, I will examine the hypothesis that blogs are, on average, of higher quality than journal articles. Below, I present 5 arguments in favor of this hypothesis.  [EDIT: I’m an experimental psychologist. Mileage of what you’ll read below may vary in other disciplines].

1. Blogs have Open Data, Code, and Materials

When you want to evaluate scientific claims, you need access to the raw data, the code, and the materials. Most journals do not (yet) require authors to make their data publicly available (whenever possible).

Continue reading

Fed Will Cause a 2008 Redux

By Michael Pento – Re-Blogged From http://www.PentoPort.com

Truth is a rare commodity on Wall Street. You have to sift through tons of dirt to find the golden ore. For example, main stream analysis of the Fed’s current monetary policy claims that it will be able to normalize interest rates with impunity. That assertion could not be further from the truth.

The fact is the Fed has been tightening monetary policy since December of 2013, when it began to taper the asset purchase program known as Quantitative Easing. This is because the flow of bond purchases is much more important than the stock of assets held on the Fed’s balance sheet. The Fed Chairman at the time, Ben Bernanke, started to reduce the amount of bond purchases by $10 billion per month, taking the amount of QE from $85 billion to zero by the end of October 2014.

The end of QE meant the Fed would no longer be pushing up MBS and Treasury bond prices (sending yields lower) with its $85 billion per month worth of bids, and that the primary dealers would no longer be flooded with new money supply in the form of excess bank reserves. In other words, the Fed started the economy down the slow path towards deflation.

Continue reading

The Lure of Free Energy

By John Popovich – Re-Blogged From http://www.WattsUpWithThat.com

The U.S. government tried to get private industry to process nuclear fuel but had a difficult time finding takers. Union Carbide made an offer that required government guarantees and big upfront cash. Maybe Union Carbide knew something about nuclear fuel processing costs, since it was operating a government nuclear fuel processing plant in Tennessee which happened to be the biggest electricity user in the U.S. Other concerns about nuclear electricity costs include the fact that much of the nuclear fuel available today is a result of a scaling back in nuclear weapons by the U.S. and the U.S.S.R., and of course the processing waste and the plant closure cost.

Continue reading

Are Claimed Global Record-Temperatures Valid?

[This excellent article on the accuracy and precision of temperature data, if anything, understates the problem. Less than 10% of official NWS stations meet the required siting standards – the best stations have a precision of only +/- 1 degree Celsius! The worst are over +/- 5 degrees. -Bob]
By Clyde Spencer – Re-Blogged From http://www.WattsUpWithThat.com

The New York Times claims 2016 was the hottest year on record.

Introduction

The point of this article is that one should not ascribe more accuracy and precision to available global temperature data than is warranted, after examination of the limitations of the data set(s). One regularly sees news stories claiming that the recent year/month was the (first, or second, etc.) warmest in recorded history. This claim is reinforced with a stated temperature difference or anomaly that is some hundredths of a degree warmer than some reference, such as the previous year(s). I’d like to draw the reader’s attention to the following quote from Taylor (1982):

“The most important point about our two experts’ measurements is this: like most scientific measurements, they would both have been useless, if they had not included reliable statements of their uncertainties.”

Before going any further, it is important that the reader understand the difference between accuracy and precision. Accuracy is how close a measurement (or series of repeated measurements) is to the actual value, and precision is the resolution with which the measurement can be stated. Another way of looking at it is provided by the following graphic:

[Figure: illustration contrasting accuracy and precision]

The illustration implies that repeatability, or decreased variance, is a part of precision. It is, but more importantly, it is the ability to record, with greater certainty, where a measurement is located on the continuum of a measurement scale. Low accuracy is commonly the result of systematic errors; however, very low precision, which can result from random errors or inappropriate instrumentation, can contribute to individual measurements having low accuracy.

Accuracy

For the sake of the following discussion, I’ll ignore weather station siting problems potentially corrupting representative temperatures and introducing bias. However, see this link for a review of the problems. Similarly, I’ll ignore the issue of sampling protocol, which has been a major criticism of historical ocean pH measurements, but is no less of a problem for temperature measurements. Fundamentally, temperatures are spatially-biased to over-represent industrialized, urban areas in the mid-latitudes, yet claims are made for the entire globe.

There are two major issues with regard to the trustworthiness of current and historical temperature data. One is the accuracy of recorded temperatures over the useable temperature range, as described in Table 4.1 at the following link:

http://www.nws.noaa.gov/directives/sym/pd01013002curr.pdf

Section 4.1.3 at the above link states:

“4.1.3 General Instruments. The WMO suggests ordinary thermometers be able to measure with high certainty in the range of -20°F to 115°F, with maximum error less than 0.4°F…”

In general, modern temperature-measuring devices are required to provide a temperature accurate to about ±1.0° F (0.56° C) at their reference temperature, and to be in error by no more than ±2.0° F (1.1° C) over their operational range. Table 4.2 requires that the resolution (precision) be 0.1° F (0.06° C), with an accuracy of 0.4° F (0.2° C).

The US has one of the best weather monitoring programs in the world. However, the accuracy and precision should be viewed in the context of how global averages and historical temperatures are calculated from records, particularly those with less accuracy and precision. It is extremely difficult to assess the accuracy of historical temperature records; the original instruments are rarely available to check for calibration.

Precision

The second issue is the precision with which temperatures are recorded, and the resulting number of significant figures retained when calculations are performed, such as when deriving averages and anomalies. This is the most important part of this critique.

If a temperature is recorded to the nearest tenth (0.1) of a degree, the convention is that it has been rounded or estimated. That is, a temperature reported as 98.6° F could have been as low as 98.55 or as high as 98.64° F.

The general rule of thumb for addition/subtraction is that the sum should retain no more significant figures to the right of the decimal point than the least precise measurement has. When multiplying/dividing numbers, the conservative rule of thumb is that, at most, one additional significant figure may be retained in the product beyond what the multiplicand with the fewest significant figures contains; the rule usually followed, though, is to retain only as many significant figures as the least precise multiplicand has. [For an expanded explanation of the rules of significant figures and mathematical operations with them, go to this Purdue site.]

Unlike a case with exact integers, a reduction in the number of significant figures in even one of the measurements in a series increases uncertainty in an average. Intuitively, one should anticipate that degrading the precision of one or more measurements in a set should degrade the precision of the result of mathematical operations. As an example, assume that one wants the arithmetic mean of the numbers 50., 40.0, and 30.0, where the trailing zeros are the last significant figure. The sum of the three numbers is 120., with three significant figures. Dividing by the integer 3 (exact) yields 40.0, with an uncertainty in the next position of ±0.05 implied.

Now, what if we take into account the implicit uncertainty of all the measurements? For example, consider that, in the previously examined set, all the measurements have an implied uncertainty. The sum of 50. ±0.5 + 40.0 ±0.05 + 30.0 ±0.05 becomes 120. ±0.6. While not highly probable, it is possible that all of the errors could have the same sign. That means, the average could be as small as 39.80 (119.4/3), or as large as 40.20 (120.6/3). That is, 40.00 ±0.20; this number should be rounded down to 40.0 ±0.2. Comparing these results, with what was obtained previously, it can be seen that there is an increase in the uncertainty. The potential difference between the bounds of the mean value may increase as more data are averaged.
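As a quick check of that arithmetic, a minimal sketch using the same three numbers:

```python
# Worst-case propagation of the implied rounding uncertainty through a mean:
# all three errors could, in principle, share the same sign.
values = [(50.0, 0.5), (40.0, 0.05), (30.0, 0.05)]  # (measurement, implied ±)
total = sum(v for v, _ in values)        # 120.0
worst_case = sum(u for _, u in values)   # 0.6
n = len(values)
print(f"mean = {total / n:.2f} ± {worst_case / n:.2f}")  # 40.00 ± 0.20
```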

It is generally well known, especially amongst surveyors, that the precision of multiple, averaged measurements varies inversely with the square-root of the number of readings that are taken. Averaging tends to remove the random error in rounding when measuring a fixed value. However, the caveats here are that the measurements have to be taken with the same instrument, on the same fixed parameter, such as an angle turned with a transit. Furthermore, Smirnoff (1961) cautions, “… at a low order of precision no increase in accuracy will result from repeated measurements.” He expands on this with the remark, “…the prerequisite condition for improving the accuracy is that measurements must be of such an order of precision that there will be some variations in recorded values.” The implication here is that there is a limit to how much the precision can be increased. Thus, while the definition of the Standard Error of the Mean is the Standard Deviation of samples divided by the square-root of the number of samples, the process cannot be repeated indefinitely to obtain any precision desired!

While multiple observers may eliminate systematic error resulting from observer bias, the other requirements are less forgiving. Different instruments will have different accuracies and may introduce greater imprecision in averaged values.

Similarly, measuring different angles tells one nothing about the accuracy or precision of a particular angle of interest. Thus, measuring multiple temperatures, over a series of hours or days, tells one nothing about the uncertainty in temperature, at a given location, at a particular time, and can do nothing to eliminate rounding errors. A physical object has intrinsic properties such as density or specific heat. However, temperatures are ephemeral and one cannot return and measure the temperature again at some later time. Fundamentally, one only has one chance to determine the precise temperature at a site, at a particular time.

The NOAA Automated Surface Observing System (ASOS) has an unconventional way of handling ambient temperature data. The User’s Guide says the following in section 3.1.2:

“Once each minute the ACU calculates the 5-minute average ambient temperature and dew point temperature from the 1-minute average observations… These 5-minute averages are rounded to the nearest degree Fahrenheit, converted to the nearest 0.1 degree Celsius, and reported once each minute as the 5-minute average ambient and dew point temperatures…”

This automated procedure is performed with temperature sensors specified to have an RMS error of 0.9° F (0.5° C), a maximum error of ±1.8° F (±1.0° C), and a resolution of 0.1° F (0.06° C) in the most likely temperature ranges encountered in the continental USA. [See Table 1 in the User’s Guide.] One (1. ±0.5) degree Fahrenheit is equivalent to 0.6 ±0.3 degrees Celsius. Reporting the rounded Celsius temperature, as specified above in the quote, implies a precision of 0.1° C when only 0.6 ±0.3° C is justified, thus implying a precision 3 to 9-times greater than what it is. In any event, even using modern temperature data that are commonly available, reporting temperature anomalies with two or more significant figures to the right of the decimal point is not warranted!
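A minimal sketch of the reporting step just described (my illustration of the rounding, not NOAA code):

```python
# ASOS-style reporting: round the 5-minute average to the nearest whole
# degree F, then convert and report to the nearest 0.1 degree C - implying
# finer precision than the Fahrenheit rounding actually left in the number.
def asos_report_c(avg_f: float) -> float:
    rounded_f = round(avg_f)                    # nearest whole degree F
    return round((rounded_f - 32) * 5 / 9, 1)   # reported to 0.1 degree C

print(asos_report_c(71.4), asos_report_c(71.6))  # 21.7 vs. 22.2 degrees C
```

Two readings only 0.2° F apart end up reported 0.5° C apart, an artifact of rounding before conversion.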

Consequences

Where these issues become particularly important is when temperature data from different sources, which use different instrumentation with varying accuracy and precision, are used to consolidate or aggregate all available global temperatures. It also becomes an issue in comparing historical data with modern data, and particularly in computing anomalies.

A significant problem with historical data is that, typically, temperatures were only measured to the nearest degree (as with modern ASOS temperatures!). Hence, the historical data have low precision (and unknown accuracy), and the rule given above for subtraction comes into play when calculating what are called temperature anomalies. That is, data are averaged to determine a so-called temperature baseline, typically for a 30-year period. That baseline is subtracted from modern data to define an anomaly. A way around the subtraction issue is to calculate the best historical average available, and then define it as having as many significant figures as modern data. Then, there is no requirement to truncate or round modern data. One can then legitimately say what the modern anomalies are with respect to the defined baseline, although it will not be obvious whether the difference is statistically significant.

Unfortunately, one is just deluding oneself if one thinks anything can be said about how modern temperature readings compare to historical temperatures when the variations are to the right of the decimal point!
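A minimal sketch of the baseline-and-anomaly construction described above (hypothetical numbers, with the baseline declared exact):

```python
# Define a baseline from whole-degree historical data, declare it exact,
# then subtract it from a modern annual mean to get an anomaly.
historical_f = [57.0, 58.0, 57.0, 59.0, 58.0]     # hypothetical, nearest degree
baseline = sum(historical_f) / len(historical_f)  # 57.8, defined as exact
modern_annual_f = 58.61
anomaly = modern_annual_f - baseline
print(f"anomaly = {anomaly:+.2f} F relative to the defined baseline")
```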

Indicative of the problem is that data published by NASA show the same implied precision (±0.005° C) for the late-1800s as for modern anomaly data. The character of the data table, with entries of 1 to 3 digits with no decimal points, suggests that attention to significant figures received little consideration. Even more egregious is the representation of precision of ±0.0005° C for anomalies in a Wikipedia article wherein NASA is attributed as the source.

Ideally, one should have a continuous record of temperatures throughout a 24-hour period and integrate the area under the temperature/time graph to obtain a true, average daily temperature. However, one rarely has that kind of temperature record, especially for older data. Thus, we have to do the best we can with the data that we have, which is often a diurnal range. Taking a daily high and low temperature, and averaging them separately, gives one insight on how station temperatures change over time. Evidence indicates that the high and low temperatures are not changing in parallel over the last 100 years; until recently, the low temperatures were increasing faster than the highs. That means, even for long-term, well-maintained weather stations, we don’t have a true average of temperatures over time. At best, we have an average of the daily high and low temperatures. Averaging them creates an artifact that loses information.

When one computes an average for purposes of scientific analysis, conventionally, it is presented with a standard deviation, a measure of variability of the individual samples of the average. I have not seen any published standard deviations associated with annual global-temperature averages. However, utilizing Tchebysheff’s Theorem and the Empirical Rule (Mendenhall, 1975), we can come up with a conservative estimate of the standard deviation for global averages. That is, the range in global temperatures should be approximately four times the standard deviation (Range ≈ ±4s). For Summer desert temperatures reaching about 130° F and Winter Antarctic temperatures reaching -120° F, that gives Earth an annual range in temperature of at least 250° F; thus, an estimated standard deviation of about 31° F! Because deserts and the polar regions are so poorly monitored, it is likely that the range (and thus the standard deviation) is larger than my assumptions. One should intuitively suspect that since few of the global measurements are close to the average, the standard deviation for the average is high! Yet, global annual anomalies are commonly reported with significant figures to the right of the decimal point. Averaging the annual high temperatures separately from the annual lows would considerably reduce the estimated standard deviation, but it still would not justify the precision that is reported commonly. This estimated standard deviation is probably telling us more about the frequency distribution of temperatures than the precision with which the mean is known. It says that probably a little more than 2/3rds of the recorded surface temperatures are between -26. and +36.° F. Because the median of this range is 5.0° F, and the generally accepted mean global temperature is about 59° F, it suggests that there is a long tail on the distribution, biasing the estimate of the median to a lower temperature.
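A minimal sketch of that range-rule estimate (using the article’s numbers; Range ≈ ±4s means the full range spans roughly eight standard deviations):

```python
# Conservative standard-deviation estimate from the range of global
# surface temperatures, per Tchebysheff's Theorem / the Empirical Rule.
summer_desert_high_f = 130.0
antarctic_winter_low_f = -120.0
temp_range_f = summer_desert_high_f - antarctic_winter_low_f  # 250 degrees F
estimated_sd = temp_range_f / 8                               # ~31 degrees F
print(f"estimated standard deviation = {estimated_sd:.0f} degrees F")
```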

Summary

In summary, there are numerous data handling practices, which climatologists generally ignore, that seriously compromise the veracity of the claims of record average-temperatures, and are reflective of poor science. The statistical significance of temperature differences with 3 or even 2 significant figures to the right of the decimal point is highly questionable. One is not justified in using the approach of calculating the Standard Error of the Mean to improve precision, by removing random errors, because there is no fixed, single value that random errors cluster about. The global average is a hypothetical construct that doesn’t exist in Nature. Instead, temperatures are changing, creating variable, systematic-like errors. Real scientists are concerned about the magnitude and origin of the inevitable errors in their measurements.

CONTINUE READING –>

The Next Crisis Is The Mother Of All Counter-Party Risks (Part 2)

[This is a long article – part valuable information and part rant. -Bob]

By Gijsbert Groenewegen – Re-Blogged From http://www.Gold-Eagle.com

In Part I, I explained the counter-party risk that is all around us and that will come to the fore in the next financial crisis. In this second part, I reflect on the rescue operations of the Fed following the 2008/2009 recession, and on the subsequent QE and ZIRP policies that have led to diminishing returns and that will ultimately weaken the US dollar: the biggest counter-party risk of all counter-party risks.

Addendum 8 – CDS, Credit Default Swaps. Ultimately, it should be considered that when we encounter these systemic events, they will impact the underlying currency. For example, when pension underfunding gets so problematic that the government has to print more money to meet and rescue the obligations, the counter-party risk will be reflected in the devaluation of the currency, or the loss of purchasing power: the quantity of goods that you can buy with the same amount of nominal money will tumble.

Continue reading

Gold-Stock-Bull Upside Targets

By Adam Hamilton – Re-Blogged From http://www.Gold-Eagle.com

The get-no-respect gold-stock sector is in a strong young bull market. Past gold-stock bulls have grown to utterly-massive proportions before giving up their ghosts, greatly multiplying the wealth of contrarian investors and speculators. Today’s gold-stock bull is very likely to grow vastly larger before fully running its course. Fundamental gold-stock-bull upside targets reveal that the lion’s share of gains is still to come.

A little over a year ago in January 2016, a monstrous gold-stock bear finally climaxed. The gold miners’ stocks fell to fundamentally-absurd 13.5-year secular lows as measured by their leading index, the HUI NYSE Arca Gold BUGS Index. Out of those dark depths of despair, a new gold-stock bull was stealthily born. And it soon started flexing its muscle, rocketing 182.2% higher in just 6.5 months by early August!

Continue reading

New Infrared-Emitting Device Could Allow Energy Harvesting From Waste Heat

By Anthony Watts – Re-Blogged From http://www.WattsUpWithThat.com

Researchers create first MEMS metamaterial device that displays infrared patterns that can be quickly changed

THE OPTICAL SOCIETY

WASHINGTON — A new reconfigurable device that emits patterns of thermal infrared light in a fully controllable manner could one day make it possible to collect waste heat at infrared wavelengths and turn it into usable energy.

The new technology could be used to improve thermophotovoltaics, a type of solar cell that uses infrared light, or heat, rather than the visible light absorbed by traditional solar cells. Scientists have been working to create thermophotovoltaics that are practical enough to harvest the heat energy found in hot areas, such as around furnaces and kilns used by the glass industry. They could also be used to turn heat coming from vehicle engines into energy to charge a car battery, for example.

Continue reading

Titanic Parallel To The Federal Reserve

By GE Christenson – Re-Blogged From http://www.Gold-Eagle.com

Thinking about the 105th anniversary of the sinking of the Titanic, the Titanic-sized debt in the world, and the role of central bankers…

The RMS Titanic departed Southampton, England at noon on April 10, 1912 and struck an iceberg in the North Atlantic just before midnight on April 14. She sank less than three hours later. Her maiden voyage lasted about 110 hours.

The Federal Reserve, the central bank of the United States, was created by Congress late in 1913. Her “maiden voyage” devaluing the dollar has lasted over 103 years.

Continue reading

A Bankruptcy of Nuclear Proportions

Re-Blogged From Stratfor

In any given year, thousands of companies file for Chapter 11 bankruptcy in the United States. Rarely, however, does one of these filings reverberate beyond the boardroom and into the realm of geopolitics. Those that do — Lehman Brothers in 2008, or the “Big Three” U.S. automakers in 2008-10 — usually involve hundreds of billions of dollars. But the next big geopolitically relevant bankruptcy may be on the horizon, and the amount of money involved is tiny next to the collapses of the past decade.

Analysis

On March 29, Westinghouse Electric Co., a subsidiary of Japanese conglomerate Toshiba, filed for bankruptcy. The U.S.-based nuclear power company has been building two state-of-the-art nuclear power plants in Georgia and South Carolina, but it has been plagued by delays and cost overruns. The filing sent Toshiba scrambling to cut its losses by March 31, the end of Japan’s fiscal year. The Japanese conglomerate ended up writing down over $6 billion on its nuclear reactor business. But Toshiba’s troubles don’t end there; the firm is also working to sell off a portion of its chip manufacturing holdings.

The U.S. government is worried about what the sale of Westinghouse could mean for the future of traditional nuclear power in the United States and for nuclear power in China, which is keen to learn the secrets of a Western firm such as Westinghouse. The Japanese government, meanwhile, is wary of how Beijing could benefit in the long term, should a Chinese firm acquire Toshiba’s semiconductor unit.

A Setback for the Nuclear Renaissance

Even though the current and previous U.S. administrations have supported nuclear energy — and the first new reactor in the United States in two decades started last October — the future of traditional American nuclear power is not bright. High capital costs, climbing operating costs, sustained low natural gas prices and unfavorable electricity markets all limit its expansion. And with the failure of Westinghouse — one of the two major nuclear power firms in the country (the other, GE, is also scaling back its plans) — the picture looks even bleaker.

Westinghouse’s plants in Georgia and South Carolina are supposed to feature its new AP1000 pressurized water reactors, which were designed to be both safer and easier to build. The projects, however, have been hamstrung by setbacks and cost overruns totaling some $3 billion for each project. Westinghouse’s bankruptcy filing now puts them in limbo. Though there’s still a chance the projects will be completed, it’s hard to envision Westinghouse, even if it is sold, fulfilling its one-time plan of building perhaps dozens of plants in the United States.

But all hope is not lost for growth in the U.S. nuclear sector. The difference is that growth, if it is to occur, may come not from traditional nuclear power plants, which are expensive and inflexible, but from a new technology: small modular reactors (SMRs). SMRs are reactors smaller than 300 megawatts that are, as the name suggests, built in a modular fashion. In theory, they can be manufactured offsite and then assembled where needed, significantly lowering initial capital costs, one of nuclear power’s biggest constraints. Installation can also be done as needed, avoiding potential underutilization of capacity and, again, large capital costs, enabling nuclear energy to serve markets that would otherwise be unreachable.

The U.S. Department of Energy has supported the development of SMRs in the past. Two companies, Babcock & Wilcox and NuScale Power, have received federal funding to develop SMRs in recent years. Babcock & Wilcox has since scaled back its operations, but NuScale is forging ahead. The company recently filed plans with the Nuclear Regulatory Commission to deploy SMRs at the Idaho National Laboratory.

SMRs are promising, but the first pilot plants won’t be operational until at least the mid-2020s. And as with any unproven technology, the costs and benefits aren’t yet known and won’t be for some time. Supporters have proposed public-private partnerships to aid in the commercialization of SMR technology. But given the uncertainty surrounding the U.S. federal budget and the administration’s specific plans for infrastructure, it remains to be seen whether SMR technology will be able to get off the ground. Traditional nuclear power plants would be helpful to bridge the gap, and that is where Westinghouse’s bankruptcy will be felt the most in the United States. SMRs may provide the clearest path to a future of nuclear power in the country, but it won’t be an easy one.

A Motivated Buyer

The shedding of Westinghouse is not the only part of Toshiba’s financial restructuring that has been causing waves. As Toshiba’s board approved Westinghouse’s filing for Chapter 11 bankruptcy, U.S. officials raised concerns about national security. Chinese corporate espionage has targeted Westinghouse in the past, and U.S. officials are worried that a Chinese firm could simply buy access to the secrets it has tried before to steal.

Japan has concerns as well, though they are centered not on Westinghouse but on the sale of Toshiba’s semiconductor unit. On March 30, Toshiba’s shareholders voted to split off its NAND flash memory unit. Apple, Amazon, Google and several other U.S. firms expressed interest in acquiring it, as did Asian bidders from South Korea. Toshiba said April 7 that it had narrowed the list of bidders down to 10. But the group still includes Taiwan’s Hon Hai Precision (otherwise known as Foxconn), with a bid of $27 billion, which could set the stage for a dispute down the road. Should a Chinese company — or even a Taiwanese company with extensive operations on the mainland — acquire the semiconductor business, it would undermine the competitiveness of Japan’s tech sector relative to China’s in the long run.

The timing couldn’t be much better for Beijing, which is making semiconductors the focal point of its overseas mergers and acquisitions strategy in much the same way it focused on oil and natural gas in the mid-2000s. On March 28, Tsinghua Unigroup, China’s largest chipmaker, finalized $22 billion in funding from the China Development Bank and the National Integrated Circuit Industry Investment Fund to build up the country’s semiconductor sector and push for global mergers and acquisitions. Tsinghua Unigroup is serious about growth; in January, it announced plans to build a $30 billion fabrication plant in Nanjing. Such growth would pose an existential threat to the semiconductor industries of Japan and South Korea, and the sale of Toshiba’s semiconductor business to a Chinese company would only make such a scenario more likely.

None of these potential concerns about the fallout from Toshiba’s corruption, mismanagement and financial problems is surprising. The United States has always had an interest in the sale of nuclear-related technology, and Japan’s tech sector has long been one of its most important and most competitive industries. But the struggles of Toshiba and the demise of Westinghouse are a rare instance in which a corporate breakdown has important geopolitical consequences.

CONTINUE READING –>

Coal’s Unexpected Ally: Natural Gas

By David Middleton – Re-Blogged From http://www.WattsUpWithThat.com

I concluded my previous coal post, The Resurgence of the American Coal Industry, with the following:

The U.S. coal industry is doing exactly what the oil & gas industry did from 2014-2016.  In the face of oversupply relative to demand and a collapsing commodity price, the industry is making itself “leaner and meaner.”  Mr. Denning referred to natural gas as the “enemy” of coal.  That’s funny, I find oil & gas for a living and have never thought of coal or nuclear power as enemies.  Fair competition is good for business… And as an electricity consumer, I don’t like paying more than 10¢ per kWh for electricity.

Natural gas prices are unlikely to remain this low for very long.  $2.50/mmbtu is uneconomic in most of the shale plays and very uneconomic in the Gulf of Mexico, except on a cost-forward basis.   When natural gas production and consumption come back into balance, it will probably be at a price of $3.50 to $5.00/mmbtu.  Coal is very competitive with natural gas above $3.50/mmbtu.

Continue reading

G7 Joint Climate Statement “Scuttled”

By Eric Worrall – Re-Blogged From http://www.WattsUpWithThat.com

Politico claims that efforts to formulate a joint G7 statement on energy policy were abandoned because President Trump would not agree to guarantee that the USA would remain signed up to the Paris Climate Agreement.

Trump’s climate demands roil U.S. allies

Documents show the administration pushed other G-7 countries to embrace larger roles for nuclear power and fossil fuels. They refused.

Continue reading

The Mystery Behind Economic Growth

By Alasdair Macleod – Re-Blogged From http://www.Silver-Phoenix500.com

We learn, out of the blue, that “the Eurozone is performing well, but with opinions divided on the causes, doubts linger over whether it is a sustainable recovery” (Daily Telegraph, 19 April). We are also told that economic growth in the US is stalling, as evidenced by downward revisions by the Atlanta Fed, and the fact that the rate of increase in Loans and Leases by commercial banks is also stalling. The Bank of England was unable to forecast the strength of the UK economy in the wake of Brexit.

This article explains why this confusion occurs. It is clear the economics profession is ill-informed about the one thing it is paid to know about, and the commentary that trickles down to the ordinary person is accordingly incorrect. State-educated and paid-for economists always assume the private sector is the problem, when the real problem is the burden of the state, and the state’s futile attempts to manage the consequences of its own actions through the corruption of money.

Continue reading

2017 Annual World Forecast

[This is a very comprehensive, very long article. Please be ready. -Bob]

Re-Blogged From Stratfor

The convulsions to come in 2017 are the political manifestations of much deeper forces in play. In much of the developed world, the trend of aging demographics and declining productivity is layered with technological innovation and the labor displacement that comes with it. China’s economic slowdown and its ongoing evolution compound this dynamic. At the same time the world is trying to cope with reduced Chinese demand after decades of record growth, China is also slowly but surely moving its own economy up the value chain to produce and assemble many of the inputs it once imported, with the intent of increasingly selling to itself. All these forces combined will have a dramatic and enduring impact on the global economy and ultimately on the shape of the international system for decades to come.

Continue reading

Retailers Going Bankrupt at Staggering Rate

By F McGuire – Re-Blogged From http://www.newsmax.com

Retailers reportedly are filing for bankruptcy protection at a disturbing rate that’s flirting with recessionary levels.

Meanwhile, a steady stream of store closures continues to haunt the battered American retail industry.

“It’s only April, and nine retailers have already filed for bankruptcy since the start of the year — as many as all of last year,” Business Insider explained.

“2017 will be the year of retail bankruptcies,” Corali Lopez-Castro, a bankruptcy lawyer, told Business Insider. “Retailers are running out of cash, and the dominoes are starting to fall.”

More than 3,500 stores are expected to close over the next several months, BI reported.

Continue reading

California Doubles Down on Stupid

By Anthony Watts – Re-Blogged From http://www.WattsUpWithThat.com

From the LA Times and the “let’s double down on stupid” department

A cornerstone of California’s battle against climate change was upheld on Thursday by a state appeals court that ruled the cap-and-trade program does not constitute an unconstitutional tax, as some business groups had claimed.

The 2-1 decision from the 3rd District Court of Appeal in Sacramento does not eliminate all the legal and political questions that have dogged the program, which requires companies to buy permits to release greenhouse gases into the atmosphere.

Continue reading

Trump’s Biggest Enemy is the Fed

By Michael Pento – Re-Blogged From PentoPort

Right on the heels of Donald Trump’s stunning election victory, Democrats began to diligently work on undermining his presidency. That should surprise no one. It’s just par for the course in partisan D.C.

However, what appears to be downright striking is that the Keynesian elites may have found a new ally in their plan to derail the new President…the U.S. Federal Reserve.

First, it’s important to understand that the Fed is populated by a group of big-government tax and spend liberal academics who operate under the guise of an apolitical body. For the past eight years, they have diligently kept the monetary wheels well-greased to prop up the flat-lining economy.

However, since the election the Fed has done a complete about-face on rate hikes and is now in favor of a relatively aggressive increase in its Fed Funds Rate. And I use the term “relatively aggressive” with purpose, because the Fed raised interest rates only one time during the entire eight-year tenure of the Obama Presidency. Technically speaking, a second hike did occur in December, while Obama still had one full month left in office; but coincidentally, it only took place after the election of Donald Trump.

Keep in mind, a rate-hiking cycle is no small threat. The Federal Reserve has the tools to bring an economy to its knees, and has done so throughout its history by first creating asset bubbles and then bursting them along with the entire economy.

Remember, it was the Fed’s mishandling of its interest rate policy that both created and burst the 2008 real estate bubble. By slashing rates from 6.5 percent in January 2001 to 1 percent in June 2003, it created a massive credit bubble. Then it raised rates back up to 5.25 percent by June of 2006, which sent home prices, stock values and the economy cascading lower.

In the aftermath of the carnage in equity prices that ended in March of 2009, the Standard & Poor’s 500 stock index soared 220 percent on the coattails of the Federal Reserve’s money printing and Zero Interest Rate Policy. But during those eight years of the Obama Administration, the Fed barely uttered the words “asset bubble.” In fact, it argued that asset bubbles are impossible to detect until after they have burst.

But since the November election, the Fed’s henchmen have suddenly uncovered a myriad of asset bubbles, inflation scares and rapid-growth worries, and they are preparing markets for a hasty and expeditious rate-hike strategy. The Fed has even indicated, in the minutes of its latest FOMC meeting, that it intends to begin reducing its massive $4.5 trillion balance sheet by the end of this year. In other words, it is trying to raise the level of long-term interest rates.

In a recent interview, Boston Fed President Eric Rosengren suddenly noted that certain asset markets are “a little rich” and that commercial real-estate valuations are “pretty ebullient.” The Fed is anticipating as many as four rate hikes during 2017 with the intent to push stocks lower, saying that “rich asset prices are another reason for the central bank to tighten faster.” Piling on to this hawkish tone, San Francisco Fed President John Williams also told reporters that he “would not rule out more than three increases total for this year.”

The Fed is tasked with two mandates: full employment and stable prices. However, it has redefined stable prices by setting an inflation goal of 2%. Therefore, a surge in inflation or GDP growth should be the primary reason our Fed would be in a rush to change its monetary policy from dovish to hawkish.

Some people may argue that the Fed has reached its inflation target and that this is driving the rush to raise rates, as the year-over-year inflation increase is now 2.8%. The problem with that logic is that from April 2011 all the way through February 2012, the year-over-year rate of Consumer Price Inflation was higher than the 2.8% seen today. Yet the Fed did not feel compelled to raise rates even once. In fact, it was still in the middle of its bond-buying scheme known as Quantitative Easing.

Perhaps it isn’t inflation swaying the Fed to suddenly expedite its rate-hiking campaign, but instead a huge spike in GDP growth. But the facts prove this to be totally false as well. The economy grew at only 1.6% for all of 2016. That was a lower growth rate than in 2011, 2013, 2014 and 2015, and it only matched the level of 2012. Well then, maybe it is a sudden surge in GDP growth for Q1 2017 that is unnerving the Fed? Again, this can’t be supported by the data. The Atlanta Fed’s own GDP model shows the economy growing at just a 1.2 percent annualized rate in the first three months of this year.

If it’s not booming growth, it’s not runaway inflation, and it’s not the sudden appearance of asset bubbles… then what has caused the Fed to finally get going on interest rate hikes?

The Fed is composed of a group of Keynesian liberals who have suddenly found religion on monetary policy because they are no longer trying to accommodate a Democrat in the White House. It appears Mr. Trump was correct during his campaign against Hillary Clinton when he accused the Fed of being political with its ultra-low monetary policy. Now that a nemesis of the Fed has become President… the battle has begun.

CONTINUE READING –>

Weekly Climate and Energy News Roundup #265

By Ken Haapala, President – Brought to You by www.SEPP.org

The Science and Environmental Policy Project

Upper Bound for 21st Century Warming: The Right Climate Stuff Team (TRCS) is a group of retired and highly experienced engineers and scientists from the Apollo, Skylab, Space Shuttle and International Space Station eras. They have volunteered their time and effort to conduct an objective, independent assessment of carbon dioxide (CO2)-caused global warming, to gauge the reality of the threat and separate it from unnecessary alarm. They have applied the techniques they learned for space missions to this task. A rough engineering analogy: how does one become confident that an astronaut will neither cook nor freeze in a space station or a space suit?

To do this, the TRCS created an energy flow model (energy balance or energy conservation model) that accurately correlates with empirical global surface temperature data, using HadCRUT surface temperature data since 1850. HadCRUT is a dataset of monthly instrumental temperature records formed by combining the sea surface temperature records compiled by the Hadley Centre of the UK Met Office and the land surface air temperature records compiled by the Climatic Research Unit (CRU) of the University of East Anglia. (Note: atmospheric temperature data from satellites only dates to 1979.)
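For readers unfamiliar with the term, an energy-balance model in its simplest form merely sets the sunlight a planet absorbs equal to the heat it radiates away. The sketch below is the generic zero-dimensional textbook version, not the TRCS team’s model, and its parameter values are stock illustrative figures:

    # Generic zero-dimensional energy-balance sketch (not the TRCS model).
    # At equilibrium, absorbed solar flux equals emitted thermal flux:
    #     (S0 / 4) * (1 - albedo) = epsilon * sigma * T**4
    S0 = 1361.0       # solar constant, W/m^2
    albedo = 0.30     # planetary albedo (standard illustrative value)
    epsilon = 0.61    # effective emissivity, a crude stand-in for the greenhouse effect
    sigma = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

    T = ((S0 / 4.0) * (1.0 - albedo) / (epsilon * sigma)) ** 0.25
    print(f"equilibrium surface temperature: {T:.1f} K "
          f"({(T - 273.15) * 9.0 / 5.0 + 32.0:.0f} F)")

With these stock values the balance lands near 288 K (about 59° F), close to the commonly cited mean surface temperature. An energy-conservation model of this general kind, tuned to reproduce the HadCRUT record, is what allows the TRCS team to place an upper bound on CO2-caused warming.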

Continue reading

Bottom Drops Out of US Hurricanes in Past Decade

By Anthony Watts – Re-Blogged From http://www.WattsUpWithThat.com

Inconvenient data for those who still insist climate change is making hurricanes more frequent is displayed in these two slides from Dr. Philip Klotzbach. As noted by Dr. Roger Pielke Jr., the bottom dropped out of US hurricanes over the last 10 years.

CommonDreams.org quoted Al Gore back in 2005:

… the science is extremely clear now, that warmer oceans make the average hurricane stronger, not only makes the winds stronger, but dramatically increases the moisture from the oceans evaporating into the storm – thus magnifying its destructive power – makes the duration, as well as the intensity of the hurricane, stronger.

Last year we had a lot of hurricanes. Last year, Japan set an all-time record for typhoons: ten, the previous record was seven. Last year the science textbooks had to be re-written. They said, “It’s impossible to have a hurricane in the south Atlantic.” We had the first one last year, in Brazil. We had an all-time record last year for tornadoes in the United States, 1,717 – largely because hurricanes spawned tornadoes.

Continue reading

Colombia: The Space Between War and Peace

Continue reading

Economy Contracting But Expect Higher Stock Prices

By Chris Vermeulen – Re-Blogged From http://www.Gold-Eagle.com

The United States is the world’s largest and most diversified economy! It is currently suffering through a protracted period of slow growth, which has held down job creation and labor-market participation. The Pew Research Center reported, in late 2015, that a mere 19% of Americans trust the government either always or most of the time.

The Fed must print more money in order to keep the party going.

The bottom line is that this current bull market has been driven mostly by corporations buying back their own shares over the years. Individual investors have increasingly been moving out of equity mutual funds and into equity ETFs.

Continue reading

Resurgence of the American Coal Industry

By David Middleton – Re-Blogged From http://www.WattsUpWithThat.com

America’s Biggest Coal Miner Is Joining the Comeback Under Trump

by Tim Loh
April 3, 2017, 4:28 PM CDT (updated April 4, 2017, 1:14 PM CDT)

  • Peabody is following rival Arch Coal out of bankruptcy
  • The coal miner has for decades served as industry’s bellwether

Peabody Energy Corp., America’s largest coal miner, is back.

After almost a year in bankruptcy, the St. Louis-based giant began trading again on the New York Stock Exchange on Tuesday. Its return to Wall Street comes as the entire U.S. coal sector is staging a comeback amid growing interest from investors.

Continue reading

Making Sense of the Attack in St. Petersburg

Re-Blogged From Stratfor

Russia is no stranger to terrorist attacks. But for the past four years, the country (beyond its restive North Caucasus region) has been free of the kinds of large assaults that have periodically rocked Europe and the United States. Then on Monday, an explosion ripped through a subway train in St. Petersburg, killing 11 people and injuring nearly 50 more. (A second device was reportedly found and dismantled at a nearby metro station.) Russia’s Investigative Committee quickly declared the incident a terrorist attack, and media outlets across the country have proposed different theories to explain who staged the attack and why. Though some scenarios are more plausible than others, each comes with its own set of consequences for the Kremlin and the country.

Pension Crisis In US And Globally Is Unavoidable

By Lance Roberts – Re-Blogged From http://www.Silver-Phoenix500.com

There is a really big crisis coming.

Think about it this way: after 8 years and a 230% stock market advance, the pension funds of Dallas, Chicago, and Houston are in severe trouble.

But it isn’t just these municipalities that are in trouble; so are most of the public and private pensions still operating in the country today.

Continue reading

What’s Left To Drive The Recovery? Not Much

By John Rubino – Re-Blogged From Dollar Collapse

US growth, such as it is, has lately been driven by a handful of hot sectors. Car sales have set records, high-end real estate is generally way up, and federal spending – based on last year’s jump in the national debt – is booming.

But now the private sector part of that equation is shifting into low gear. Cars in particular:

Economy Will Miss That New-Car Smell

(Wall Street Journal) – The annual pace of light-vehicle sales fell to a seasonally adjusted 17.2 million in the first quarter from 18 million. That the decline has come despite generous incentives from car companies and still-low gasoline prices suggests that sales are past their peak.

Continue reading

Soaring Global Debt Sets Stage For “Unprecedented Private Deleveraging”

By John Rubino – Re-Blogged From Dollar Collapse

The UK’s Telegraph just published an analysis of global debt that pretty much sums up the coming crisis. Here’s an excerpt with a couple of the more hair-raising charts:

Global Debt Explodes At ‘Eye-Watering’ Pace To Hit £170 Trillion

Global debt has climbed at an “eye-watering” pace over the past decade, soaring to a fresh high of £170 trillion last year, according to the Institute of International Finance (IIF).

The IIF said total debt levels, including household, government and corporate debt, climbed by more than $70 trillion over the last 10 years to a record high of $215 trillion (£173 trillion) in 2016 – or the equivalent of 325pc of global gross domestic product (GDP).

Continue reading

Asia’s Dilemma: China’s Butter, or America’s Guns?

By Rodger Baker – Re-Blogged From Stratfor

Flying into Singapore’s Changi Airport, one is struck by the fleet of ships lined up off shore, the tendrils of a global trade network squeezing through the narrow Malacca Strait. Singapore is the hub, the connector between the Indian Ocean, South China Sea and Pacific. Since the late 1970s, with little exception, trade has amounted to some 300 percent of Singapore’s total gross domestic product, with exports making up between 150 and 230 percent of GDP. Singapore is the product of global trade, and the thriving multiethnic city-state can trace its trade role back centuries.

The Decline And Fall Of The Euro Union

By Alasdair Macleod – Re-Blogged From http://www.Gold-Eagle.com

This article identifies the headwinds faced by the EU in the wake of Brexit. Without the UK, not only does the EU lose much of its importance on the world stage, but the Commission’s budget is left with an enormous hole. That is the decline. The fall is well under way, with capital flight significantly worse than generally realised, as a proper understanding of TARGET2 imbalances shows. Not only is the ECB running out of options, but without major support from Germany, France and Italy, Brussels itself faces a financial crisis. In a highly unusual move, Jamie Dimon of JP Morgan, in a letter to his shareholders this week, backtracked on his earlier pre-Brexit threat to move jobs from London, declaring that the problem is Europe itself.

Continue reading