Oxygen Mystery on Mars

By NASA/Goddard Space Flight Center – Re-Blogged From Eureka Alert

For the first time in the history of space exploration, scientists have measured the seasonal changes in the gases that fill the air directly above the surface of Gale Crater on Mars. As a result, they noticed something baffling: oxygen, the gas many Earth creatures use to breathe, behaves in a way that so far scientists cannot explain through any known chemical processes.


This is a sunset at the Viking Lander 1 site, 1976. Credit: NASA/JPL

Continue reading

A Climate Modeller Spills the Beans

Does The Climate-Science Industry Purposely Ignore A Simple Aspect of Strong El Niño Events That Causes Long-Term Global Warming?

By Bob Tisdale – Re-Blogged From WUWT

It was a little more than 10 years ago that I published my first blog posts on the obvious upward steps in the sea surface temperatures of a large portion of the global oceans…upward steps that are caused by El Niño events…upward steps that lead to sunlight-fueled, naturally occurring global warming.

There is a very simple explanation for those El Niño-caused upward shifts, which also show up in the sea surface temperature data for a much larger portion of the global oceans than I first presented a decade ago. The upward steps are blatantly obvious in the satellite-era sea surface temperature data (beginning November 1981) for the South Atlantic, Indian, and West Pacific Oceans, shown in Figure 1, which together cover about 52% of the surface of the global oceans.

Figure 1

Continue reading

The Chill of Solar Minimum

Sept. 27, 2018: The sun is entering one of the deepest Solar Minima of the Space Age. Sunspots have been absent for most of 2018, and the sun’s ultraviolet output has sharply dropped. New research shows that Earth’s upper atmosphere is responding.

“We see a cooling trend,” says Martin Mlynczak of NASA’s Langley Research Center. “High above Earth’s surface, near the edge of space, our atmosphere is losing heat energy. If current trends continue, it could soon set a Space Age record for cold.”


Above: The TIMED satellite monitoring the temperature of the upper atmosphere

Continue reading

Spiders and Space Weather

Re-Blogged From Space Weather

Did you know that spiders can fly? Biologists call it “ballooning.” Spiders spin a strand of silk, it juts into the air, and off they go. Airborne arachnids have been found as high as 4 km off the ground. Originally, researchers thought spiders were riding currents of air, but there’s a problem with that idea. Spiders often take flight when the air is calm, and large spiders fly even when air currents are insufficient to support their weight. It’s a mystery.

Above: Just before ballooning, spiders adopt a posture shown here called “tiptoeing.”

Continue reading

New Atmospheric Theory Explains Radiosonde Data Without CO2

By Andy May – Re-Blogged From http://www.WattsUpWithThat.com


In 2014, Dr. Michael Connolly and Dr. Ronan Connolly posted three important, non-peer reviewed papers on atmospheric physics on their web site. These papers can be accessed online here. The papers are difficult to understand as they cover several fields of study and are very long. But, they present new data and a novel interpretation of energy flow in the atmosphere that should be seriously considered in the climate change debate.

By studying weather balloon data from around the world, Connolly and Connolly show that the temperature profile of the atmosphere up to the stratosphere can be completely explained using the gas laws, humidity, and phase changes. The atmospheric temperature profile does not appear to be influenced, to any measurable extent, by infrared-active gases like carbon dioxide.

Figure 1, source Connolly and Connolly.

Continue reading

Weekly Climate and Energy News Roundup #257

By Ken Haapala, Pres., Science & Environmental Policy Project http://www.SEPP.org

Climate Change Understanding: Two separate guests to Watts Up With That discuss major issues in climate change modeling and the processes used by the UN Intergovernmental Panel on Climate Change (IPCC). Some of these issues have been present since the inception of the IPCC and carry over to other government-funded reports such as those by the United States Global Change Research Program (USGCRP). The issues may create substantial systematic errors in the reports, some of which continue after decades of research.
Tim Ball, a student of climate change pioneer H.H. Lamb, discusses the problem of developing a refined hypothesis (theory) that exceeds our capability to gather the data needed to substantiate it. Lamb established the Climatic Research Unit (CRU) at the University of East Anglia as a center for collecting climate data. As Ball states, Lamb wrote:
“…it was clear that the first and greatest need was to establish the facts of the past record of the natural climate in times before any side effects of human activities could well be important.”
This goal has been thwarted by those who succeeded Lamb and used CRU as a means of reinforcing the belief that human activities dominate climate change, rather than merely contribute to it.
Very simply, we do not have the surface data and the processes to separate natural variation from human activities. The processes used lump natural variations together with human activities, particularly in surface temperature data. The processes used by the IPCC and USGCRP are faulty and need to be changed or ignored. These entities are not engaged in empirical science to understand climate change. Instead, they are engaged in processes “to prove” that climate change is caused by humans, even if that requires ignoring history. Often the lack of rigorous research is disguised by vague language. Ball’s lengthy post is well worth reading. See links under Challenging the Orthodoxy.

Quote of the Week.
“When people learn no tools of judgment and merely follow their hopes, the seeds of political manipulation are sown.” Stephen Jay Gould [H/t Tim Ball]

Number of the Week: $23.7 Billion

Climate Change Modeling: As with Tim Ball, retired mathematician Mike Jonas discusses the issues involved in developing hypotheses and models that go beyond the capabilities of meaningful measurement. Jonas emphasizes the problems of using expanded weather models as global climate models. The weather models are chaotic and thus produce different results with each model run.
As Fred Singer stated in a report by the Nongovernmental International Panel on Climate Change (NIPCC, 2010), based on an investigation of Japan’s climate model, at least ten separate runs are needed in the hope of obtaining a central tendency for each individual model. Yet the IPCC presents only one model run for each of the many models it uses in its reports. There is no logical reason to assume that any particular run represents the central tendency of that model.
Jonas discusses how the US National Center for Atmospheric Research (NCAR) performed 40 runs from 1920 to 2100 with its Community Earth System Model (CESM), each with a slight change in initial conditions: the global atmospheric temperature was changed by less than one-trillionth of one degree for each run. NCAR/UCAR’s publication AtmosNews states that the results, called the CESM Large Ensemble, give an “astounding” diversity of climate projections. In a little over a year, the Ensemble data were used in “100 peer-reviewed scientific journal articles.” One wonders how many of these articles questioned the utility of using such highly unstable models for estimating future climate. Jonas stated: “This NCAR report shows unequivocally that the climate models in their current form can never predict future climate.”
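The sensitivity described above can be sketched with a toy chaotic system. This is not the CESM itself, just the classic logistic map standing in for any chaotic model, with a perturbation of one part in a trillion mirroring the “less than one-trillionth of one degree” change in the NCAR runs:

```python
# Toy illustration (NOT the CESM): the logistic map is a classic chaotic
# system, so two runs differing by one part in a trillion soon diverge.
def logistic_run(x0, steps, r=4.0):
    """Iterate x -> r*x*(1-x) and return the trajectory."""
    x, traj = x0, []
    for _ in range(steps):
        x = r * x * (1.0 - x)
        traj.append(x)
    return traj

base = logistic_run(0.4, 60)
perturbed = logistic_run(0.4 + 1e-12, 60)  # ~one-trillionth perturbation

# Within a few dozen steps the two runs bear no resemblance to each other,
# which is why no single run can be taken as a model's central tendency.
divergence = max(abs(a - b) for a, b in zip(base[40:], perturbed[40:]))

# A small ensemble: 40 runs with tiny perturbations, averaged at each step.
ensemble = [logistic_run(0.4 + i * 1e-12, 60) for i in range(40)]
ensemble_mean = [sum(run[t] for run in ensemble) / 40 for t in range(60)]
```

Averaging many perturbed runs is the standard way to estimate a chaotic model’s central tendency, which is the rationale behind a 40-member ensemble rather than a single run.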
As H.H. Lamb demonstrated, the earth’s climate changes. However, compared with nature, climate models are highly unstable, yet EPA and other government organizations claim they represent nature.
In this post and an earlier one, Jonas discusses the problems involved in “tuning,” that is, adjusting the models to produce appropriate results. A major problem, not particularly discussed, is adjusting the models to surface data while atmospheric data are largely ignored. Yet, as stated in the 1979 Charney Report published by the National Academy of Sciences, the effects of greenhouse gases occur in the atmosphere, with the secondary effect at the surface.
Both the surface and atmospheric temperature data are affected by natural occurrences such as volcanoes and El Niños. However, surface data are affected by many additional human influences, such as the Urban Heat Island Effect, irrigation, location of instruments near airports, etc. In general, for decades the models have been “tuned” to the wrong data. As John Christy’s testimony to Congress demonstrated, except for the Russian model from the Institute of Numerical Mathematics, using a single run, the IPCC models greatly overestimate the warming of the atmosphere, where the effect of greenhouse gases occurs. See links under Challenging the Orthodoxy – NIPCC, Challenging the Orthodoxy, Defending the Orthodoxy, and Models v. Observations.

Bias in Publications: Patrick Michaels and Paul “Chip” Knappenberger discuss bias in publications, including the magazines Nature and Science. However, their discussion does not sufficiently underline the extreme bias in these publications.
In March 1990, Science published a paper by Roy Spencer and John Christy describing a method for using data collected from NOAA polar orbiting weather satellites to comprehensively calculate atmospheric temperatures for virtually the entire globe, except for the extreme poles. These data cover about 97 to 98 percent of the globe, including oceans, deserts, mountain ranges, jungles, etc. where there are few surface instruments. Initially, certain small errors in calculation were discovered, including orbital decay. These were acknowledged and corrected. This is how science advances.
However, these magazines largely ignore studies using comprehensive satellite data in favor of studies using sparse surface data. Further, the missing data are often infilled, that is, calculated from other data, which may introduce systematic biases. For an extreme example, Nature published Mr. Mann’s hockey-stick, which used a statistical technique with an internal bias to create the impression that late 20th-century warming was unprecedented compared with the prior 1,000 years, at least in the Northern Hemisphere. This article has not yet been retracted. See links under Challenging the Orthodoxy.

TWTW Guilty of Bias: TWTW emphasizes reporting studies and articles that contradict or question establishment climate and environmental science, because it does not have the staff to adequately explore all such articles.
When it comes to evidence of CO2-caused global warming / climate change, TWTW has focused on the lack of evidence, on weaknesses in the evidence that CO2 is causing warming, and on evidence of other causes. In part, this is justifiable because the UN and the US have enormous budgets dedicated to emphasizing CO2 as the cause, without properly considering natural variation. The US spends about $2.5 billion a year on “climate science.”
This problem is amplified by UN and US entities using the results of global climate models as proof of cause. These models are poorly suited to understanding the influences of greenhouse gases. They are “tuned,” that is, adjusted to surface temperature measurements, when the greenhouse-gas effect occurs in the atmosphere. The fact that these models are not properly tested, that is, verified and validated, indicates the severity of the problem. The disconnect between the research efforts and the political demand to control greenhouse gas emissions is severe.
The NIPCC reports contain a more balanced view, but are largely ignored by the climate establishment and many US scientific organizations – though not by the Chinese Academy of Sciences. See links under Challenging the Orthodoxy – NIPCC.

Political Stability: Retired EPA scientist Alan Carlin asks a question about the US. “What will happen to the climate alarmist cause in two, four, or eight years?” Carlin considers the US political system more stable than the European systems, thus significant change is slow to take hold.
The difference in timeliness has been observed by others as well. Usually, in parliamentary systems, the leader of the executive branch is also the leader of the legislative branch, so there is more cohesiveness in implementing political programs. The downside is actions such as the UK Climate Change Act of 2008, which followed the 2007 IPCC Fourth Assessment Report (AR4). The Act requires that the UK become a low-carbon economy, cutting CO2 emissions by 80% compared with a 1990 baseline. The public is suffering from the misallocation of resources due to programs attempting to meet the requirements of this Act.
In the US, political power is far more diffuse, less concentrated. Often, the executive and legislative branches are controlled or led by different political parties. Also, state and regional interests are important. Thus, party discipline is weaker in the legislative branch, as compared with European parliaments. President Obama found this out when his version of the Climate Change Act, the American Clean Energy and Security Act of 2009, failed to pass the Senate, even though both the Senate and the House were controlled by his political party. What will happen over the next four years is very difficult to predict. See link under Questioning the Orthodoxy.

Snappy: Amusingly, recent newspaper reports labeled the author of TWTW a “swamp alligator.” He is guilty of touring and observing wildlife in southern swamps, such as the Okefenokee National Wildlife Refuge, some with a pH as low as 4. In the mid-20th century, the Okefenokee was the setting for a noted comic strip that appealed to adults, Pogo. One of the main characters was Albert the Alligator. But, that name is too docile for TWTW, so let’s call him Snappy. See links under Below the Bottom Line.

Number of the Week: $23.7 Billion. Lisa Linowes, who follows such matters, wrote in Master Resource that the US Joint Committee (House and Senate) on Taxation (JCT) now estimates the total cost of the wind production tax credit in the years 2016–2020 to be $23.7 billion. The $23.7 billion is for deployment of an electricity generating system that is not reliable. Wind production is not dispatchable, meaning it cannot be turned on when needed within a known time period. Wind power places a burden on reliable forms of electricity generation. Further, wind power is not needed for electricity in the US, except in isolated conditions. See link under Alternative, Green (“Clean”) Solar and Wind.

1. Trump Dams the Regulatory Flood
His executive order should change the bureaucratic incentives.
Editorial, WSJ, Jan 30, 2017
SUMMARY: After recognizing the contradictions in early actions the editorial states: “President Trump signed an executive order adopting a ‘two-for-one’ regulatory budget that will help accelerate growth and innovation.
The Obama years were a boom era for rule-making, but the truth is that obsolete and onerous rules have been accumulating for decades. In a working paper for George Mason’s Mercatus Center, Bentley Coffey, Patrick McLaughlin and Pietro Peretto estimate that the economy would be about 25% larger if the level of U.S. regulation had stayed constant since 1980. That’s now more than $4 trillion a year, or $13,000 per person.
The Trump order aims to prevent such waste by requiring the agencies to repeal two old rules for every new one they publish. This is in some sense a gimmick, since some regulations are far more significant, costly or distorting of investment choices than others. But the text of the order suggests that for every dollar of new cost imposed on the private economy, each agency will have to find two dollars of burden to relieve.”
“The permanent bureaucracy lives to justify its own existence, regardless of which party holds the White House, and rules inevitably beget more rules. Mr. Trump’s order starts to change the institutional incentives.
“Under a two-for-one policy, each individual department will need to scrutinize its own books in search of offsets and rules needing modernization, which will make deregulation as high a priority as rule-making. The Environmental Protection Agency can’t poach savings uncovered at, say, the Fish and Wildlife Service. This could lead to more realistic cost-benefit tests, focus the bureaucracy on trade-offs and strengthen regulatory accountability.”
“One key appointment to watch will be Mr. Trump’s choice to run the White House Office of Information and Regulatory Affairs (OIRA), who will be crucial to ensuring that the rollback will work in practice. The White House has failed to appoint a regulatory task force to supervise regulation until the OIRA job is filled, and this means that some agencies will now try to expand their writ while no one is watching. Democrats will try to delay approving a nominee for as long as possible.
“Meanwhile, the House this week will begin the job of repealing some of President Obama’s worst regulations. Republicans plan to rescind Mr. Obama’s midnight rules under the Congressional Review Act (CRA) that gives Congress an up-or-down vote on new rules. The House and Senate will vote on joint disapproval resolutions, which need only a majority before they are sent to the President.
“On the chopping block is one EPA regulation related to streams that is estimated to threaten up to one-third of the remaining jobs in the coal industry. Another target is a Bureau of Land Management rule designed to undermine oil and gas fracking on federal land. A third is a Securities and Exchange Commission rule that forces U.S. companies to report payments to foreign governments, which can mean disclosing proprietary information that competitors can use against them.
“The CRA is an exceptionally powerful reform tool, as our Kimberley Strassel reported last week. Amid the rush to pump out ever more rules, the Obama Administration may have failed to comply with many CRA mandates. The more the Trump Administration works with Congress to codify reform, the more durable the economic progress will be.
“…If Mr. Trump can break up the Washington central planning that is again misallocating resources, the resulting job creation and new investment would be a great legacy.”

2. What Kind of a Judge Is Neil Gorsuch?
He carefully follows the law, and writes as engagingly as Scalia, without the abrasiveness.
By David B. Rivkin Jr. and Andrew M. Grossman, WSJ, Jan 31, 2017
SUMMARY: After lengthy praise of the Judge’s literary style, the authors write:
“Judge Gorsuch’s textualism extends to the Constitution, quite emphatically: ‘That document,’ he wrote, ‘isn’t some inkblot on which litigants may project their hopes and dreams for a new and perfected tort law, but a carefully drafted text judges are charged with applying according to its original public meaning.’ Looking to the ‘original public meaning’ of the Fourth Amendment, for example, Judge Gorsuch has rejected the government’s view that a search warrant could be applied across jurisdictional lines. He also disputed its claim that police officers may ignore ‘No Trespassing’ signs to invade a homeowner’s property without a warrant.
“What about the Constitution’s separation of powers, intended to safeguard liberty? Judge Gorsuch has been at the vanguard of applying originalism to the questions raised by today’s Leviathan state, which is increasingly controlled by unaccountable executive agencies. These questions loom large after the rash of executive actions by President Obama, and now the whiplash reversals by the Trump administration.
“The deference that judges now must give to agencies’ interpretations of the law, he wrote in an opinion last year, permits the executive ‘to swallow huge amounts of core judicial and legislative power and concentrate federal power in a way that seems more than a little difficult to square with the Constitution of the framers’ design.’
“Judge Gorsuch added: ‘Maybe the time has come to face the behemoth.’ His addition to the Supreme Court would give the justices a better chance than ever to do precisely that.”


New Santer et al. Paper on Satellites vs. Models: Even Cherry Picking Ends with Model Failure

By Roy W. Spencer, Ph. D. – Re-Blogged From http://www.WattsUpWithThat.com

(the following is mostly based upon information provided by Dr. John Christy)

Dr. John Christy’s congressional testimonies on 8 Dec 2015 and 2 Feb 2016, in which he stated that climate models over-forecast climate warming by a factor of 2.5 to 3, apparently struck a nerve in Climate Consensus land.

In a recently published paper in J. Climate entitled Comparing Tropospheric Warming in Climate Models and Satellite Data, Santer et al. use a combination of lesser-known satellite datasets and neglect of radiosonde data to reduce the model bias to only 1.7 times too much warming.

Wow. Stop the presses.

Continue reading

Strange New Climate Change Spin: The Hottest Year Ever Inside a Global Warming ‘Pause’?

Re-Blogged From The Stream

There are two stories floating around about the state of the earth’s atmosphere. Both are believed true by government-funded scientists and the environmentally minded. The situation is curious because the stories don’t mesh. Yet, as I said, both are believed. Worse, neither is true.

Story number one is that this year will be the hottest ever. And number two is that the reason it is not hot is that “natural variation” has masked or stalled man-caused global warming.

Which is it? Either it’s hotter than ever or it isn’t. If it is, then (it is implied) man-caused global warming has not “paused.” If it isn’t, if man-caused global warming has “paused,” then it is not growing hotter.

Continue reading