The Week That Was: July 25, 2020
By Ken Haapala, President, SEPP
Quote of the Week: “When we are planning for posterity, we ought to remember that virtue is not hereditary.” —Thomas Paine (1776)
Number of the Week: 12 datasets of evidence
July Summary Part III; Models and Observations: Two weeks ago TWTW reviewed Richard Lindzen’s new paper summarizing what we know with reasonable certainty, what we suspect, and what we know is incorrect about climate change, the greenhouse effect, temperature trends, climate modeling, ocean chemistry, and sea level rise. Key parts included:
1) The climate system is never in equilibrium.
2) The core of the system consists of two turbulent fluids interacting with each other and unevenly heated by the sun, which results in transport of heat from the equator towards the poles (meridional) creating ocean cycles that may take 1,000 years to complete.
3) The two most important substances in the greenhouse effect are water vapor and clouds, which are not fully understood and are not stable.
4) A vital component of the atmosphere is water in its liquid, solid, and vapor phases and the changes in phases have immense dynamic consequences.
5) Doubling carbon dioxide (CO2) creates a 2% disturbance to the normal flow of energy into the system and out of the system, which is similar to the disturbance created by changes in clouds and other natural features.
6) Temperatures in the tropics have been extremely stable. It is the temperature difference between the tropics and the polar regions that is extremely important. Calculations such as global average temperature largely ignore this important difference.
Last week, TWTW used the work of William van Wijngaarden and William Happer (W & H) to summarize what we know with reasonable certainty, what we suspect, and what we know is incorrect about the greenhouse effect. Both gentlemen are experts in Atomic, Molecular, and Optical physics (AMO), which is far from simple physics but is necessary to understand how greenhouse gases interfere with (delay) the radiation of energy from the surface into space – how the earth loses its heat every day, mainly at night.
1) There is no general understanding of the greenhouse effect sufficient to develop elegant equations.
2) The optical depth or optical thickness of the atmosphere (transparency) changes as altitude changes. The depth is measured in terms of a natural logarithm and, in this instance, relates to distance a photon of a particular frequency can travel before it is absorbed by an appropriate molecule (one that absorbs and re-emits photons of that frequency).
3) Unlike other natural greenhouse gases, water vapor, the dominant greenhouse gas, is not well distributed in the atmosphere; its distribution is irregular. [SEPP Comment: Its variability during the daytime, the formation of clouds from H2O, etc., all combine to make it impossible to do theoretical computational “climate” dynamics with any value at all. Because H2O is known to be “all over the map,” the Charney Report recognized that a decent calculation was impossible. So, it went down the erroneous path of ignoring H2O, assumed a CO2 value, and then came back later with a “feedback” argument to try to account for H2O. It didn’t work then, does not work now, and will not work in the future.]
4) There is a logarithmic relationship between greenhouse gases and temperature.
5) “Saturation” means that adding more molecules causes little change in Earth’s radiation to space. The very narrow range in which methane (CH4) can absorb and emit photons is already saturated by water vapor (H2O), the dominant greenhouse gas, below the tropopause, where the atmosphere is thick. Thus, adding methane has little effect on temperatures because its influence is mostly where the atmosphere is thin and transparent.
6) Their (W & H) calculations show that a doubling of CO2 will increase temperatures by no more than 1.5 °C.
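The logarithmic relationship in point 4 and the modest doubling response in point 6 can be illustrated with a short calculation. Note the assumptions: the 5.35 W/m² forcing coefficient is the commonly cited Myhre et al. (1998) approximation, and the no-feedback sensitivity parameter of 0.3 K per W/m² is an illustrative round number; neither value is taken from W & H’s calculations.

```python
import math

# Illustrative constants -- NOT from W & H:
ALPHA = 5.35    # W/m^2, logarithmic forcing coefficient (Myhre et al. approximation)
LAMBDA = 0.3    # K per W/m^2, assumed no-feedback sensitivity parameter

def co2_forcing(c, c0=280.0):
    """Approximate radiative forcing (W/m^2) of CO2 at c ppm relative to c0 ppm."""
    return ALPHA * math.log(c / c0)

def warming(c, c0=280.0):
    """Illustrative no-feedback temperature response in kelvin."""
    return LAMBDA * co2_forcing(c, c0)

# Doubling CO2 from 280 to 560 ppm: forcing ~ 5.35 * ln(2) ~ 3.7 W/m^2
print(round(co2_forcing(560.0), 2))   # forcing for a doubling, W/m^2
print(round(warming(560.0), 2))       # ~1.1 K, below the <= 1.5 C cited above
```

Because the forcing grows with the logarithm of concentration, each successive increment of CO2 adds less warming than the one before it, which is the point of the “saturation” discussion above.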
Problems with Models: In September 2019, established Japanese climate modeler Mototaka Nakamura wrote a book, available on Kindle, which contains an English summary. Nakamura is the author of about 20 published papers on fluid dynamics, one of the complex subjects in climate change. Interestingly, Richard Lindzen was one of Nakamura’s thesis advisors at MIT. Nakamura mentions this in his discussion of ocean currents, namely the Thermohaline circulation. This circulation includes the Gulf Stream, which keeps Western Europe far warmer than it would be otherwise. [The late Bill Gray, who was a pioneer in forecasting hurricanes, was a strong advocate of the importance of the Thermohaline circulation.]
Based on his discussion, Nakamura is a stronger advocate of the Thermohaline circulation than Lindzen, particularly of the importance of the cold southward-flowing water on the bottom of the Atlantic. In discussing this phenomenon, Nakamura states that Professor Lindzen may disagree, asking, “How do you know?”
As presented in the September 28, 2019, TWTW, Australian reporter Tony Thomas, who has followed the climate issue for years, reviewed the book, emphasizing that the certainty claimed by the UN Intergovernmental Panel on Climate Change (IPCC) and its followers is hollow.
Among other important changing phenomena, the climate system is largely made up of two fluids in dynamic motion, the ocean and the atmosphere, and we simply do not know enough about fluid dynamics to make long-term predictions about the interactions of these fluids. According to Nakamura, climate models are useful tools for academic purposes but useless for prediction. As quoted by Thomas, Nakamura writes:
“These models completely lack some critically important climate processes and feedbacks and represent some other critically important climate processes and feedbacks in grossly distorted manners to the extent that makes these models totally useless for any meaningful climate prediction.
“I myself used to use climate simulation models for scientific studies, not for predictions, and learned about their problems and limitations in the process.”
Nakamura and his colleagues tried to repair the errors:
“…so, I know the workings of these models very well. For better or worse I have more or less lost interest in the climate science and am not thrilled to spend so much of my time and energy in this kind of writing beyond the point that satisfies my own sense of obligation to the US and Japanese taxpayers who financially supported my higher education and spontaneous and free research activity. So please expect this to be the only writing of this sort coming from me.
“I am confident that some honest and courageous, true climate scientists will continue to publicly point out the fraudulent claims made by the mainstream climate science community in English. I regret to say this, but I am also confident that docile and/or incompetent Japanese climate researchers will remain silent until the ’mainstream climate science community’ changes its tone, if ever.”
Thomas writes some of the gross model simplifications are:
- Ignorance about large and small-scale ocean dynamics.
- A complete lack of meaningful representations of aerosol changes that generate clouds.
- Lack of understanding of drivers of ice-albedo (reflectivity) feedbacks: “Without a reasonably accurate representation, it is impossible to make any meaningful predictions of climate variations and changes in the middle and high latitudes and thus the entire planet.”
- Inability to deal with water vapor elements.
- Arbitrary “tunings” (fudges) of key parameters that are not understood.
As Richard Lindzen has stated for years, the models fail to capture changes in clouds, including changing cloud area, and clouds are too small to be resolved at model grid scales.
Nakamura’s work reinforces what many, including Lindzen, have stated. But it is refreshing to see that a modeler who spent years trying to model the climate system recognizes how unsuccessful this 40-plus-year effort has been.
To the above, one can quote from the beginning of the English appendix of Nakamura’s book:
“Before pointing out a few of the serious flaws in climate simulation models, in defense of those climate researchers who use climate simulation models for various meaningful scientific projects, I want to emphasize here that climate simulation models are fine tools to study the climate system, so long as the users are aware of the limitations of the models and exercise caution in designing experiments and interpreting their output. In this sense, experiments to study the response of simplified climate systems, such as those generated by the ‘state-of-the-art’ climate simulation models, to major increases in atmospheric carbon dioxide or other greenhouse gases are also interesting and meaningful academic projects that are certainly worth pursuing. So long as the results of such projects are presented with disclaimers that unambiguously state the extent to which the results can be compared with the real world, I would not have any problem with such projects. The models just become useless pieces of junk or worse (worse, in a sense that they can produce gravely misleading output) only when they are used for climate forecasting.
“All climate simulation models have many details that become fatal flaws when they are used as climate forecasting tools, especially for mid- to long-term (several years and longer) climate variations and changes. These models completely lack some of critically important climate processes and feedbacks, and represent some other critically important climate processes and feedbacks in grossly distorted manners to the extent that makes these models totally useless for any meaningful climate prediction. It means that they are also completely useless for assessing the effects of the past atmospheric carbon dioxide increase on the climate. I myself used to use climate simulation models for scientific studies, not for predictions, and learned about their problems and limitations in the process. I, with help of some of my former colleagues, even modified some details of these models in attempts to improve them by making some of grossly simplified expressions of physical processes in the models less grossly simplified, based on physical theories. So, I know the internal workings of these models very well. I find it rather bewildering that so many climate researchers, many of whom are only ‘so-called climate researchers’ in my not-so-humble opinion, appear to firmly believe in the validity of using these models for climate forecasting. 
I have observed that many of those climate researchers who firmly believe in the global warming hypothesis view the climate system in a grotesquely simplified fashion: many of them view the climate system as a horizontally homogeneous (no variations in the north-south and east-west directions) or zonally homogeneous (no variations in the east-west direction) system whose dynamics are dominated by the radiative-chemical-convective processes, smooth vertical-north-south motions in the atmosphere, and stationary oceans, and completely neglect the geophysical fluid dynamics, an extremely important and strong factor in the maintenance of the climate and generation of climate variations and changes. So, in their view, those climate simulation models that have ostensible 3 D flows in the atmosphere and oceans may be more than good enough for making climate predictions. They are not good enough. Incidentally, I never liked the term, ‘model validation’, often used by most climate researchers to refer to the action of assessing the extent to which the model output resembles the reality. They should use a more honest term such as ‘model assessment’ rather than the disingenuous term, ‘model validation’, and evaluate the model performance in an objective and scientific manner rather than trying to construct narratives that justify the use of these models for climate predictions. [Boldface in original]
“The most obvious and egregious problem is the treatment of incoming solar energy — it is treated as a constant, that is, as a ‘never changing quantity’. It should not require an expert to explain how absurd this is if ‘climate forecasting’ is the aim of the model use. It has been only several decades since we acquired an ability to accurately monitor the incoming solar energy. In these several decades only, it has varied by 1 to 2 Watts per square meters. Is it reasonable to assume that it will not vary any more than that in the next hundred years or longer for forecasting purposes? I would say ‘No’.
“One can stop here and proclaim that we can never predict climate changes because of our inability to predict changes in the incoming solar energy. Nevertheless, for the sake of providing some useful pieces of information that can help countervail rampantly bold and absurd claims such as ‘We can correctly predict climate changes that are attributable only to increasing atmospheric carbon dioxide to assess the human impact on the climate’, I will describe two problematic aspects of climate simulation models below. I also hear somewhat less bold claims such as ‘These models can correctly predict at least the sense or direction of climate changes that are attributable only to increasing atmospheric carbon dioxide.’ I want to point out a simple fact that it is impossible to correctly predict even the sense or direction of the change of a system when the prediction tool lacks and/ or grossly distorts important nonlinear processes, feedbacks in particular, that are present in the actual system.” [Boldface added.]
The major problems in the climate models that Nakamura describes further are ocean flows (ocean circulation) and water in the atmosphere. See links under Challenging the Orthodoxy.
Testing Models: Repeatedly, John Christy of the Earth System Science Center at the University of Alabama in Huntsville (UAH) and others have shown that the models used by the UN Intergovernmental Panel on Climate Change (IPCC) grossly overestimate the warming of the atmosphere over the tropics, where the greenhouse effect occurs. The one exception is the model from the Institute of Numerical Mathematics of the Russian Academy of Sciences. A new generation of models, the Coupled Model Intercomparison Project version 6 (CMIP6), is now being released.
As demonstrated by the Paris Agreement, the goal of the UN Framework Convention on Climate Change (UNFCCC), the IPCC, and its followers is to reduce the carbon dioxide influence on surface temperatures. Before the CO2 influence on surface temperatures is reduced, the CO2 influence on atmospheric temperatures must be reduced. Thus, using trends from widely scattered surface instruments as a proxy for what is occurring in the atmosphere is a poor choice, because comprehensive atmospheric temperature measurements have been available since 1979, more than forty years ago.
In a forthcoming paper in Earth and Space Science, Ross McKitrick and John Christy compare the “historic” values calculated from 38 new CMIP6 models with datasets from three different types of observations.
“(1) Radiosonde (or sonde) data are measured by thermistors carried aloft by balloons at stations around the world which radio the information down to a ground station. Sondes report temperatures at many levels, and we use here annual averages at the standard pressure-levels: 1000 (if above the launch site), 850, 700, 500, 400, 300, 200, 150, 100, 70, 50, 30 and 20 hPa.”
“(2) Since late 1978, several polar-orbiting satellites carried some form of a microwave sensor to monitor atmospheric temperatures. These spacecraft would circle the globe roughly pole-to-pole making a complete orbit in about 100 minutes. They were (and are) sun-synchronous so the Earth would essentially rotate on its axis underneath as the spacecraft orbited pole to pole so that essentially the entire planet is observed in a single Earth-rotation (or day). The intensity of microwave emissions from atmospheric oxygen are directly proportional to temperature, thus allowing a conversion of these measurements to temperature. Since the emissions come from most of the atmosphere, they represent a deep layer-average temperature. For our purposes we shall focus on two deep layers, the lower troposphere (LT, surface to ~ 9 km) and the midtroposphere (MT, surface to ~ 15 km).” [Boldface added.]
“(3) The third category of these datasets are known as Reanalyses. In this category, a global weather model with many atmospheric layers ingests as much data as possible, from surface observations, sondes and satellites, to generate a global depiction of the surface and atmosphere that is made globally consistent through the model equations. We will access the temperature data from these datasets at 17 pressure levels from the surface to 10 hPa and will be able to calculate the deep-layer averages that match those of the satellite measurements.”
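To compare sonde and reanalysis data with the satellite products, the temperatures at individual pressure levels are collapsed into a deep-layer average using a weighting function. A minimal sketch of that step follows; the level temperatures and weights below are purely hypothetical placeholders, not the actual MSU/AMSU weighting functions used in the paper.

```python
# Sketch: collapsing pressure-level temperatures into a deep-layer average,
# the step that makes sonde/reanalysis data comparable with satellite layers.
# Values are hypothetical, for illustration only.
levels_hpa = [1000, 850, 700, 500, 400, 300, 200, 150, 100]
temps_k    = [288.0, 281.0, 272.0, 252.0, 241.0, 229.0, 220.0, 212.0, 205.0]
weights    = [0.15, 0.25, 0.25, 0.15, 0.10, 0.05, 0.03, 0.015, 0.005]  # sum to 1

def layer_average(temps, wts):
    """Weighted deep-layer mean temperature in kelvin."""
    assert abs(sum(wts) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(t * w for t, w in zip(temps, wts))

lt_mean = layer_average(temps_k, weights)
print(round(lt_mean, 1))  # weighted deep-layer mean, kelvin
```

The real weighting functions peak lower in the atmosphere for the lower-troposphere (LT) product than for the midtroposphere (MT) product, which is how the same microwave measurements yield two different deep-layer averages.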
The model runs came from the Lawrence Livermore National Laboratory archive. The time period covered was 1979 to 2014 for which data for both models and observations were complete.
“For this study we used the period 1979-2014 from the simulation set that represents 1850-2014 in which the models were provided with ‘historical’ forcings. These time-varying forcings are estimates of the amount of energy deviations that occurred in the real world and are applied to the models through time. These include variations in factors such as volcanic aerosols, solar input, dust and other aerosols, important gases like carbon dioxide, ozone and methane, land-surface brightness and so on. With all models applying the same forcing as believed to have occurred for the actual Earth, the direct comparison between models and observations is appropriate. The models and runs are identified in Table 2 [not presented here]. We also list the estimated Equilibrium Climate Sensitivity (ECS) values for the 31 models for which we were able to find values, usually through unpublished online documentation (sources available on request).”
As stated above, the climate is never in equilibrium, so the Equilibrium Climate Sensitivity is an idealized concept of how much the global average temperature of the earth will increase if carbon dioxide is doubled. As stated by Lindzen, above, global average temperature is an idealized concept that is not particularly important.
Global climate models are notorious for producing significantly different results on different runs of the same model. This is what produces the spaghetti-like mess when the model results are displayed in a graph. So, McKitrick and Christy developed 95% confidence intervals for all the model runs and averaged the observations from the observing systems for the lower troposphere (surface to about 9 km (30,000 feet)) and the middle troposphere (surface to about 15 km (49,000 feet)).
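The basic statistic being compared is a linear warming trend with a confidence interval. A minimal sketch of such a calculation is below; the anomaly series is synthetic, and the simple normal-approximation interval omits the serial-correlation adjustments a published trend analysis would apply.

```python
import math

# Sketch: least-squares warming trend with an approximate 95% confidence
# interval, the kind of statistic compared between model runs and
# observations. The anomaly series below is synthetic, for illustration only.
years = list(range(1979, 2015))                                   # 1979-2014
anoms = [0.01 * (y - 1979) + 0.05 * math.sin(y) for y in years]   # fake data

def trend_with_ci(x, y, z=1.96):
    """OLS slope (units per year) and a normal-approximation 95% CI."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    intercept = my - slope * mx
    resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)     # slope std. error
    return slope, (slope - z * se, slope + z * se)

slope, (lo, hi) = trend_with_ci(years, anoms)
print(f"trend: {slope:.4f} K/yr, 95% CI [{lo:.4f}, {hi:.4f}]")
```

When a model’s confidence interval for the trend lies entirely above the observed trend, the model and the observations are statistically inconsistent, which is the comparison discussed below.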
The authors conclude:
“The literature drawing attention to an upward bias in climate model warming responses in the tropical troposphere extends back at least 15 years now (Karl et al. 2006). Rather than being resolved the problem has become worse, since now every member of the CMIP6 generation of climate models exhibits an upward bias in the entire global troposphere as well as in the tropics. The models with lower ECS values have warming rates somewhat closer to observed but are still significantly biased upwards and do not overlap observations. Models with higher ECS values also have higher tropospheric warming rates and applying the emergent constraint concept implies that an ensemble of models with warming rates consistent with observations would likely have to have ECS values at or below the bottom of the CMIP6 range. Our findings mirror recent evidence from inspection of CMIP6 Equilibrium Climate Sensitivities (Vosen 2019) and paleoclimate simulations (Zhu et al. 2020) which also reveal a systematic warm bias in the latest generation of climate models.”
TWTW observes that the three different types of datasets from observations are grouped tightly, both globally and for the tropics. For most of the models, the mean of the satellite observations is below the lower confidence interval for that model. The more money that has been spent on climate science, the worse the models have become when compared with observations. The US models are among the worst, to be discussed in a later TWTW. As Nakamura has written, they have no predictive value. The UN IPCC and its followers have clearly departed from the scientific method into the world of wild speculation. See links under Challenging the Orthodoxy and Defending the Orthodoxy.
New Guy in Town: A new paper claims that the broadly accepted range given in the 1979 Charney Report for a doubling of CO2, 3 °C plus or minus 1.5 °C (or 1.5 °C to 4.5 °C), is too low, and, using questionable statistics, asserts that the 5 to 95% confidence interval for a doubling of CO2 should be 2 to 5.7 K (°C). TWTW agrees that the values in the Charney Report need to be changed; based on observations of the atmosphere, however, they should be lowered, not raised. The paper by McKitrick and Christy, with datasets ending in 2014, indicates the need for a lowering. Thus, it is obvious that the authors of the new paper ignored the physical data from the atmosphere.
The lead author of the new paper is from Climate Change Research Centre at the University of New South Wales (UNSW) and ARC Centre of Excellence for Climate Extremes, a consortium of five Australian universities and others. It is supported by the Australian Research Council. Apparently physical data is not important for conducting science in Australia.
Tracing articles advocating an increase in the Equilibrium Climate Sensitivity (ECS) leads to the World Climate Research Programme (WCRP), whose web site reads:
The World Climate Research Programme (WCRP) leads the way in addressing frontier scientific questions related to the coupled climate system — questions that are too large and too complex to be tackled by a single nation, agency, or scientific discipline. Through international science coordination and partnerships, WCRP contributes to advancing our understanding of the multi-scale dynamic interactions between natural and social systems that affect climate. WCRP engages productively through these partnerships to inform the development of policies and services and to promote science education. Most critically, WCRP-supported research provides the climate science that underpins the United Nations Framework Convention on Climate Change, including national commitments under the Paris Agreement of 2015, and contributes to the knowledge that supports the 2030 Agenda for Sustainable Development, the Sendai Framework for Disaster Risk Reduction, and multilateral environmental conventions. [Boldface added]
The three co-sponsors are the World Meteorological Organization (WMO); the Intergovernmental Oceanographic Commission of UNESCO; and the International Science Council, which was “created in 2018 as the result of a merger between the International Council for Science (ICSU) (previously a sponsor of WCRP) and the International Social Science Council (ISSC).”
The WCRP appears to be another UN effort to expand influence by using fear in the name of science. See links under Defending the Orthodoxy and https://www.wcrp-climate.org/about-wcrp/wcrp-overview
Vote for April Fools Award: The voting for SEPP’s April Fools Award will continue until July 31. Due to changes in schedules, there are no conferences held before then at which to announce the results. So, get your votes in now.
Number of the Week: 12 datasets of evidence. The McKitrick and Christy paper used 12 different datasets of evidence to establish that the new IPCC models, CMIP6, are exaggerating the warming of the atmosphere even more than the previous models, CMIP5, did.
By contrast, the new papers insisting that the influence of CO2 is greater than previously estimated use the concept of lines of evidence instead of current data. Lines of evidence are concepts developed by those trying to reconstruct past conditions or justify concepts that develop slowly. For example, the science of evolution uses several lines of evidence, such as fossil evidence, homologies (common ancestors), and distribution in time and space (as the earth changed). Time can become a major problem given the imperfect record of a changing earth.
Mr. Gore demonstrated a major problem with time in his famous film, in which he had time backwards. He showed CO2 increasing before the Antarctic ice cores showed warming. Actually, the ice cores showed warming before CO2 increased. Mr. Gore was wrong. See links under Defending the Orthodoxy.
Corporations Seek Tax-Credit Cash-Out in Next Coronavirus Relief Plan
Duke Energy, Ford poised to benefit if Congress lets firms accelerate accumulated tax breaks
By Richard Rubin, WSJ, July 20, 2020
TWTW Summary: The article is summarized in its beginning:
“Many large U.S. corporations are sitting on piles of tax credits they may not be able to use for years. They want Congress to let them have the money now.
“Duke Energy Corp., Ford Motor Co., Occidental Petroleum Corp. and others could benefit if Congress includes a tax credit cash-out proposal in its next economic-relief legislation. Such a move, which is among ideas being considered by lawmakers and the Trump administration, could improve corporate cash flow by tens of billions of dollars.
“Duke has been unable to use all the corporate-research and renewable-energy credits it accumulated because it has been using accelerated tax deductions for capital investments to lower its taxable income, said Dwight Jacobs, the company’s chief accounting officer. That bumped it up against tax-code rules that limit tax credits, leaving $1.8 billion in unused credits on Duke’s books. Under the proposal, the company could get that within months instead of years.”
TWTW Comment: Of course, those selling tax credits for wind and solar will embrace the idea of getting cash flow without producing anything.