The Week That Was: November 2, 2019, Brought to You by www.SEPP.org
By Ken Haapala, President, Science and Environmental Policy Project (SEPP)
Appropriate Models: Mathematics is the language of science, but that does not mean that mathematical models correctly describe physical phenomena, or that a mathematical process suitable for analyzing the physical evidence (data) of one phenomenon is suitable for other phenomena. A model may or may not describe the phenomenon under study. That is one reason Richard Feynman insisted that hypotheses (guesses) must be tested against all relevant data. Experimental data are preferred, because other possible influences are controlled to the extent possible, but observational data may be necessary.
For example, Euclidean geometry generally fit the concepts of its day, until Newtonian mechanics was needed to better describe the orbits of the planets in classical physics. But Newtonian mathematics, which included calculus, did not describe Brownian motion or the movement of electrons around an atom. Eventually, quantum mechanics was developed to better explain such motion, and it requires far more specialized forms of mathematics than Euclidean geometry. (Describing Brownian motion is a matter of statistics: vast numbers of molecules bumping into macroscopic particles such as dust, knocking them around in random directions at random speeds. Relativity also required specialized mathematics.)
A similar problem appears to be arising in the study of climate. Mathematical models used to study weather are improving over short time periods, say one to seven days, but they fall apart in about 10 days. The wished-for goal of the European Centre for Medium-Range Weather Forecasts (ECMWF) is to predict extreme weather events 30 days out, but success is doubtful. Mathematical chaos develops in models of coupled, nonlinear dynamic systems, of which the atmosphere is an example.
It is becoming apparent that the mathematical models developed for forecasting weather may not be suitable for forecasting future climate, which requires long-range forecasting capability. On his blog, meteorologist Cliff Mass, who is not a climate sceptic, describes the difficulties of forecasting beyond two weeks, and sometimes beyond one week. As he states:
Such poor forecasts even a month out are not unusual. UW graduate student Nick Weber and I evaluated the skill of the main U.S. long-term forecasting model (the CFSv2) and found that skill is typically lost after roughly 2 weeks (see below and published in the peer-reviewed literature). This figure shows the forecast error (root mean square error) at 500 hPa—about 18,000 ft, a good level to view atmospheric predictability. The situation is the same over Washington [State], the western U.S., the continental U.S. or global. Skill is rapidly lost the second week out.
While meteorologists struggle to produce improved forecast skill past two weeks, we have gained a great deal of skill at the shorter time ranges, particularly for days 3-8.
So why is our skill improving rapidly for the shorter periods, but not the longer ones?
Because the forecasting problem is very different at the different temporal scales.
For the short periods, forecasting is an initial value problem. We start with a good description of the 3D atmosphere and our models simulate how things evolve. Because of weather satellites and other new data sources, our initial description of the atmosphere has gotten MUCH better. And our models are much better: higher resolution, much better description of key physical processes, and more. That is why a plot of the skill of the 1-10-day forecasts of the European Center has improved greatly over the past decades (see below [a graph is given])
But small errors in the initial description of the atmosphere and deficiencies in our models inevitably lead to growing errors, and by 2 weeks such errors swamp the forecast. The forecasts are not much better than simply using the average conditions (or climatology).
[Boldface in the original, Mass illustrates the problems with two graphs that are omitted in the above.]
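The error growth Mass describes can be illustrated with the classic Lorenz-63 system, a toy model of atmospheric convection. The sketch below is a minimal forward-Euler integration (an illustrative assumption, not any operational forecast model): twin runs differing by one part in a million in a single initial coordinate stay close at first, then diverge until the "forecast" is no better than picking a random state on the attractor.

```python
import numpy as np

def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 convection model."""
    x, y, z = state
    deriv = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    return state + dt * deriv

def run(state, n_steps):
    """Integrate and return the full trajectory as an (n_steps+1, 3) array."""
    traj = [state]
    for _ in range(n_steps):
        state = lorenz_step(state)
        traj.append(state)
    return np.array(traj)

n = 6000  # 30 model-time units at dt = 0.005
control = run(np.array([1.0, 1.0, 1.0]), n)
perturbed = run(np.array([1.0, 1.0, 1.0 + 1e-6]), n)  # tiny "observation error"

# Root-mean-square difference between the twin forecasts at each step
err = np.sqrt(np.mean((control - perturbed) ** 2, axis=1))
print(f"early error: {err[200]:.2e}, late error: {np.mean(err[-1000:]):.2f}")
```

The early error stays minuscule while the late error saturates at the size of the attractor itself, which is the same qualitative behavior that swamps real weather forecasts after about two weeks.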
According to NCAR-UCAR, the Japan Meteorological Agency has the best set of reanalysis data used for initiating daily forecasts:
“Spanning 1958-present, JRA-55 is the longest third-generation reanalysis that uses the full observing system (in contrast, products like ERA-20C and NOAA 20CR assimilate a very limited set of observations while NCEP R1 uses an antiquated model and assimilation scheme).
· Longest-running full observing system reanalysis with 4DVar
· Incorporation of several new observational datasets, new radiation scheme, and variational bias correction results in many improvements over JRA-25
· Two companion datasets are available that allow users to address the impact of data assimilation: JRA-55C using conventional observations only and JRA55-AMIP using no data assimilation
· As with most reanalyses, diagnostic variables including precipitation and evaporation should be used with extreme caution
· Dry bias in upper and middle troposphere and in regions of deep convection
· Time-varying warm bias in the upper troposphere”
Again, we see there is no ideal dataset. But JRA-55 gives a long-term record of weather that becomes a record of climate change. Yet, while it is important to weather modelers, it is ignored by climate modelers.
When pressed on why climate models diverge so widely, a typical excuse from climate modelers is “different initial conditions.” Why the initial conditions vary so widely is not explained. See links under Seeking a Common Ground and https://climatedataguide.ucar.edu/climate-data/jra-55
Escaping Model-Land: A recent paper by economists defines the term:
“Model-land is a hypothetical world in which mathematical simulations are evaluated against other mathematical simulations, mathematical models against other (or the same) mathematical model, everything is well-posed and models (and their imperfections) are known perfectly.”
The abstract of the paper states:
“Both mathematical modelling and simulation methods in general have contributed greatly to understanding, insight and forecasting in many fields including macroeconomics. Nevertheless, we must remain careful to distinguish model-land and model-land quantities from the real world. Decisions taken in the real world are more robust when informed by our best estimate of real-world quantities, than when “optimal” model-land quantities obtained from imperfect simulations are employed. The authors present a short guide to some of the temptations and pitfalls of model-land, some directions towards the exit, and two ways to escape.”
The paper further states:
“Model-land is a hypothetical world in which our simulations are perfect, an attractive fairytale state of mind in which optimising a simulation invariably reflects desirable pathways in the real world. Decision-support in model-land implies taking the output of model simulations at face value (perhaps using some form of statistical post-processing to account for blatant inconsistencies), and then interpreting frequencies in model-land to represent probabilities in the real-world. Elegant though these systems may be, something is lost in the move back to reality; very low probability events and model-inconceivable “Big Surprises” are much too frequent in applied meteorology, geology, and economics. We have found remarkably similar challenges to good model-based decision support in energy demand, fluid dynamics, hurricane formation, lifeboat operations, nuclear stewardship, weather forecasting, climate calculators, and sustainable governance of reindeer hunting.”
This is followed by an imaginary “map” of Model-Land, stating the only way out is via the black hole in the middle.
It is appropriate that economists write such a parody. For years, macro-economic modelers have dominated discussions of economic policy, invoking the name of Keynes in Keynesian analysis, though they go far beyond anything he wrote. Perhaps the biggest travesty was the balanced-budget multiplier: the claim that an equal increase in government spending and tax revenues will expand prosperity. Taken to its logical conclusion, this concept would predict that the Soviet Union was the most powerful economic system ever conceived – until it imploded.
As Judith Curry discovered when she began to question climate models, leaving model-land can be a painful experience. What Christopher Booker described as groupthink is very much enforced. Curry’s overview is instructive. See links under Challenging the Orthodoxy and https://www.thegwpf.org/content/uploads/2018/02/Groupthink.pdf
Cloudiness: As Richard Lindzen and others have discussed, one of the major deficiencies in the reports of the UN Intergovernmental Panel on Climate Change (IPCC), and in the global climate models it uses, is the inability to treat clouds properly. An international program, the International Satellite Cloud Climatology Project (ISCCP), was established to address this issue. Its website states:
“ISCCP was established in 1982 as part of the World Climate Research Programme (WCRP) to collect and analyze satellite radiance measurements to infer the global distribution of clouds, their properties, and their diurnal, seasonal, and interannual variations. Data collection began on 1 July 1983 and is currently planned to continue through 30 June 2010. The resulting datasets and analysis products are being used to improve understanding and modeling of the role of clouds in climate, with the primary focus being the elucidation of the effects of clouds on the radiation balance. These data can also [be] used to support many other cloud studies, including understanding of the hydrological cycle.
“Data are collected from the suite of weather satellites operated by several nations and processed by several groups in government agencies, laboratories, and universities. For each operational satellite, a Satellite Processing Center (SPC) collects the raw satellite data and sends it to the Global Processing Center (GPC). The Correlative Data Center (CDC) coordinates the delivery of other satellite and conventional weather data to the GPC. The Satellite Calibration Center (SCC) normalizes the calibration of the geostationary satellites with respect to a polar orbiter satellite standard. All ISCCP data products are archived at the ISCCP Central Archive (ICA) and at NASA Langley Research Center (LARC).”
The Russian Academy of Sciences published a paper by O. M. Pokrovsky of the Russian State Hydrometeorological University. The Abstract states:
“The results of an analysis of climatic series of global and regional cloudiness for 1983–2009 are presented. Data were obtained in the framework of the international satellite project ISCCP. The technology of statistical time series analysis including smoothing algorithm and wavelet analysis is described. Both methods are intended for the analysis of non-stationary series. The results of the analysis show that both global and regional cloudiness show a decrease of 2–6%. The greatest decrease is observed in the tropics and over the oceans. Over land, the decrease is minimal. The correlation coefficient between the global cloud series on the one hand and the global air and ocean surface temperature series on the other hand reaches values (–0.84) — (–0.86). The coefficient of determination that characterizes the accuracy of the regression for the prediction of global temperature changes based on data on changes in the lower cloud, in this case is 0.316.”
On her website, Jo Nova posted two graphs from the Russian study. One graph shows a sharp decrease in the percentage of global cloud cover, from almost 68% in 1986 to less than 64% in 1999–2000. One may recall that in the 1998–99 time frame there was a spike in satellite temperatures. The second graph gives a scattergram of the relationship between surface-air temperatures and percentage cloudiness, with a line drawn by regression analysis.
“The regression linear approximation model suggests that a 1% increase in global cloud cover corresponds to a global decrease in temperature of about 0.07ºC and vice versa.”
The relationship in the graph is important but not compelling. There may be many independent variables other than cloudiness explaining changes in temperatures. A coefficient of determination (the correlation coefficient squared) is valuable, but the data are very scattered. For example, the relationship is not as compelling as that between the solar-activity proxy of carbon-14 and the climate proxy of oxygen-18 shown in the 2008 report of the Nongovernmental International Panel on Climate Change (NIPCC), page 14. In that graph both variables move significantly, yet together.
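The statistics involved can be made concrete with a small sketch on synthetic data. The numbers below are illustrative assumptions, not the ISCCP series: a cloud-cover range and slope loosely echoing those reported, plus noise. For a simple linear regression, the coefficient of determination equals the squared correlation coefficient, and even a respectable correlation can leave much of the scatter unexplained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins (NOT the ISCCP data): cloud-cover percentage and a
# temperature anomaly with a weak negative dependence plus noise.
cloud = rng.uniform(63.0, 68.0, 200)
temp = -0.07 * cloud + rng.normal(0.0, 0.1, 200)

# Ordinary least-squares fit: slope in degC per percent cloud cover
slope, intercept = np.polyfit(cloud, temp, 1)

# Correlation coefficient r, and coefficient of determination from residuals
r = np.corrcoef(cloud, temp)[0, 1]
pred = slope * cloud + intercept
ss_res = np.sum((temp - pred) ** 2)
ss_tot = np.sum((temp - temp.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

print(f"slope={slope:.3f} degC per % cloud, r={r:.2f}, r^2={r2:.2f}")
```

For one predictor the identity r² = r × r holds exactly, which is why a correlation of −0.84 to −0.86 and a much smaller reported coefficient of determination for a particular regression signal that other variables, or noise, carry much of the variance.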
What makes the relationship in the Russian report more compelling is that it is reinforced by other studies as discussed by Kenneth Richard in No Tricks Zone. Richard brings up five other studies showing changing cloudiness or changing solar intensity and surface temperatures: Herman et al., 2013; Goode and Palle, 2007; Loeb et al., 2018; Hofer et al., 2017 and Simpkins, 2017.
The Russian paper is another example why the IPCC and its followers have little basis for expressing high certainty in their products and the models on which their assertions are based. See links under Challenging the Orthodoxy.
Scientocracy: Climate model critic Norman Rogers reviews a book called Scientocracy, a play on the word aristocracy. The scientocracy depends on the honest reputations of past investigators of the natural world. However, the scientocracy serves bureaucrats and will fake science and evidence to serve their goals. Specific examples are given in the book's eleven essays, which cover different parts of science. What happens when the credibility of science is damaged or destroyed appears to be of concern to the scientocracy. See links under Health, Energy, and Climate.
Wind Blown Travesty? The International Energy Agency (IEA) is an autonomous organization established in 1974 under the Organisation for Economic Co-operation and Development (OECD) to address the 1973 oil crisis caused by the Arab oil embargo. Member states include most of North America, western Europe, and the developed states of Asia. China and Russia are not members. TWTW frequently cites studies by the IEA.
On October 25, IEA issued Offshore Wind Outlook 2019: World Energy Outlook Special Report. This report ranks with the special reports issued by the IPCC. It is difficult to separate science from science fiction because the part of the report available for review is largely free of physical evidence supporting critical details such as storage and corrosion from saltwater spray.
The Conclusions are:
“• Offshore wind is set to be a $1 trillion industry over the next two decades, but the promise of growth hinges on government policies and industry strategies
• Policy makers need to provide long-term visibility for supply chains to be efficient, and need to manage maritime planning and onshore grid development
• Offshore wind can become one of the most competitive sources of electricity if market conditions are right and technology cost reductions materialize.
• Offshore wind contributes to electricity security and makes energy transitions more affordable. Hydrogen and further innovations, such as floating turbines, expand opportunities
• The IEA will continue to focus on ‘all fuels and all technologies’ to provide the world’s best energy data, independent & rigorous analysis & real-world solutions”
If the reports of actual experience from Germany and the United Kingdom with wind generation in the North Sea are considered, there will be a major problem of over-generation part of the time and no generation part of the time. The California Duck Curve may apply. On his blog, Paul Homewood summarizes the promotion of this report well:
“What the IEA do say, and which the Mail does mention further down, is that offshore wind capacity will increase 15-fold in the next two decades.
“Given that it only accounts for 0.3% of global generation, this would increase its share to a still tiny 4.5%. And that assumes that total demand does not increase, which is highly unlikely.
“Somehow I don’t see it saving the world!”
Human imagination regarding reports such as this is boundless. See links under Communicating Better to the Public – Make things up; Alternative, Green (“Clean”) Solar and Wind; and Energy Issues – US.
California Fires: The California fires driven by winds from the east continue but appear to be subsiding in Northern California, above the San Francisco Bay area. In the northern areas the winds are called Diablo winds; in the south, Santa Ana winds. One of the confusing issues is that these warm, dry winds appear to be driven by cold, dry air from Canada moving into the Great Basin – the area between the Sierra Nevada and the Rockies, between Oregon and Arizona, centered near the Great Salt Lake. The Great Basin does not drain into the sea.
In brief, the cold, dry air from Canada pushes the warm, dry air over the mountains, causing the warm, dry air to flow down canyons on the other side, resulting in dangerous fire conditions. If anything, carbon dioxide-caused global warming will result in fewer incidences of cold, dry air moving into the Great Basin, thus lessening the threat of these fires. See links under Changing Weather and California Dreaming.
Number of the Week: 30 to 40°F (17 to 22°C) below average. According to meteorologist Chris Martz, on October 29 cold air from western Canada caused temperatures to drop some 30 to 40°F below average in the Rockies and the Great Basin. This cold air caused high winds in California.
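The parenthetical conversion works because a temperature *departure* scales by the factor 5/9 alone; the 32-degree offset applies only to absolute temperatures, not to anomalies. A minimal check:

```python
def anomaly_f_to_c(delta_f):
    """Convert a temperature departure (not an absolute reading) from F to C."""
    return delta_f * 5.0 / 9.0

print(anomaly_f_to_c(30.0))  # ~16.7, rounds to 17
print(anomaly_f_to_c(40.0))  # ~22.2, rounds to 22
```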
Senseless in Seattle
Students debate Seattle’s plan to ‘rehumanize’ K-12 mathematics with talk of ‘privilege’ and ‘oppression.’
Editorial, WSJ, Oct 29, 2019
TWTW Summary: A few of the responses to the Wall Street Journal question:
Editor’s note: This Future View is about Seattle’s plan to teach math with an ethnic-studies and social-justice twist.
‘Privilege’ and ‘Oppression’ Don’t Add Up
Coverage of the Seattle school district’s push to inject lessons of privilege, oppression and ethnic studies into K-12 math classes reads like the satirical writings of the Onion. Classroom questions such as “What is my mathematical identity?” and “Who gets to say if an answer is right?” make the curriculum into a joke.
The beauty of mathematics lies in its impartiality. Calculations can be misconstrued, but the numbers themselves are unbiased, representing factual, objective answers. Seattle’s curricular framework includes the question, “Can you advocate against oppressive mathematical practices?” But “oppressor” and “oppressed” share the same math—that’s the whole point.
Before this era of social-justice education, students may have read George Orwell’s dystopian novel “1984.” Winston Smith, the novel’s protagonist, is taught and ultimately convinced that “2 + 2 = 5.” Seattle educators, like the fictional regime in Orwell’s novel, hope to “complicate” whether an answer is right or wrong. How progressive!
— Aaron Schnoor, Campbell University, trust and wealth management
This Is Why We Need School Choice
If Seattle’s educators want to teach math with social-justice themes, that’s their decision. Education is best controlled at the local level. But parents should be given choices, too.
I’m willing to be persuaded that math should be taught differently. The remedial math courses at American universities are way too full for educators to say confidently they know how to teach math effectively in high school or earlier. Even if social-justice education doesn’t seem like the most promising solution, math should be taught in different ways at different schools—so long as parents and students can choose between schools to find the right fit.
School choice would help lower the stakes of curriculum decisions by expanding options. Seattle’s curricular framework is big news primarily because it threatens to trap a generation of children in social-justice math classes. There’s no reason for that. Seattle’s schools should teach math however they want, but they should also let parents enroll students wherever they want.
— Dominic Pino, George Mason University, economics
Students often complain about math class because they fail to see its practical applications. That’s a shame—math is used in nearly every discipline. Introducing a holistic learning program that uses math to tell stories of power, privilege and oppression will make math a more engaging subject, encouraging students to pursue it further.
Social justice should be incorporated into mathematics education at a young age, helping to attract and form the next generation of social scientists. Calculating changes in incarceration rates over the course of the war on drugs is a good example of the practical use of math for facilitating discussion. Exposing children to the difficult realities of the world at a formative age is imperative to ending the covert racism and discrimination that is built into society. Students need to learn to think critically and identify problems in the systems that surround them, even if that provokes a bit of controversy.
— Martin Ryan, Siena College, economics
Keep This Dogma Out
One of the central dogmas of social justice is that there are only two groups of people: the oppressors and the oppressed. Increasingly progressives sort people into these groups based on their race, sex, class or other identity groups. When incorporated into an educational curriculum, this ideology prods students to reduce other people to their identity groups—and ultimately to the binary of good and evil—rather than understand them as unique human beings.
The result is a dark, distorted view of our society that fosters unreasonable hostility and makes dialogue more difficult. It is troubling that our country’s youngest students, whose minds and views are most malleable, could be exposed to this ideology. Social-justice politics have no place in U.S. elementary and secondary schools, and especially not in math classes, which should be grounded in numerical reasoning and objective truth. The gloomy doctrine has already infected universities and, in many places, high schools. Following Seattle, will it now demand entry into primary schools, too?
— Jack Chapman, University of Texas at Austin, finance and government”
There were more responses reported.