By Kip Hansen – Re-Blogged From WUWT
In today’s digital and Internet-of-Things world, it is easy to transform information into images — graphs, charts and other visuals that are colorful and informative. Modern math and statistical software packages can do it all for you with a few clicks of your mouse or taps on the screen. These visuals can be very powerful in conveying your message to the public. That’s on the upside.
On the downside is the fact that these visuals can be very powerful in driving false or misleading messages into the public consciousness.
These all-too-easy-to-create visuals are “a Blessing and a Curse”.
We have seen this in the last few days with the scandal of the Twit-o-verse banging on about fires in the Amazon while promulgating photos that are more than a decade old, or even of some other place entirely. One result of the Twitter storm was a special meeting at the G7 about Brazil's fires, and shock that the U.S. President did not attend. In the case of the fires in Brazil, it turns out that the numbers have become more important, in the public mind, than the facts on the ground.
“Take a look at the Forest Fires Map at Global Forest Watch.
“Like the NY Times piece mentioned above, the map makes it look like whole countries are on fire. This is an artifact of the size of the dots marking fires. ZOOM IN ON BRAZIL. Zoom in until you can see the Federal District of Brasilia clearly. See how the fires clump together in agricultural areas — zoom in close enough and you can see that it is a series of burning local fields or pastures being recorded as fires.”
“Now, zoom out and find the Dominican Republic — on the island of Hispaniola — just to the west of Puerto Rico. Looks awful, doesn’t it? The whole island on fire. Zoom in and in and in and you will find that it is not yet prime cane harvest, only a few cane fields burning — all set intentionally as a necessary part of the cane harvest. Some may be rice paddies, where the rice stubble is burnt off after the harvest of the rice and rice straw.”
“My point here is that all these alarming stories require local knowledge and local ground-truthing….”
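The dot-size artifact described above can be quantified with a quick back-of-the-envelope sketch. All the numbers below are assumptions chosen for illustration; they are not taken from Global Forest Watch:

```python
import math

# Assumed, illustrative numbers -- not taken from Global Forest Watch.
map_scale_km_per_px = 5.0   # at a country-level zoom, one pixel spans ~5 km
dot_radius_px = 4           # a typical fire marker, a few pixels across
fire_area_km2 = 0.01        # an actual burning field of about 1 hectare

# Area the marker *appears* to cover, in real-world terms, at that zoom
dot_area_km2 = math.pi * (dot_radius_px * map_scale_km_per_px) ** 2

# How many times larger the dot is than the fire it represents
exaggeration = dot_area_km2 / fire_area_km2
print(f"dot covers ~{dot_area_km2:,.0f} km^2, about {exaggeration:,.0f}x the fire")
```

With these assumed numbers a single marker covers over a thousand square kilometres, a roughly hundred-thousand-fold exaggeration of a one-hectare field fire. That is why a zoomed-out map can make a whole country look ablaze.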
I recently saw an article in some news outlet (memory fails me…) that led me to chase up a graphic used — about the number of natural disasters. This is the graphic:
The image comes from a site called Our World In Data. They announce themselves as having “Research and data to make progress against the world’s largest problems. Scroll to all research. 2989 charts across 297 topics. All free: open access and open source.” The Our World In Data web site tells us that they are based at Oxford and are trusted in research and media and used in teaching.
Yet something does not seem right in that “Number of reported natural disasters” graphic. There is, however, a clue (and the graphic’s saving grace): the word “reported”. The second clue is the gray text at the bottom giving the source of the data as EMDAT.
Trust me, I checked up on them: first to see who Our World In Data is (with the results above), and then to find out who EMDAT is.
Our first question is: Who supplied the data? Answer: EMDAT or, more completely, Centre for Research on the Epidemiology of Disasters – CRED School of Public Health at the Université catholique de Louvain (and in their native tongue: Centre de Recherche sur L’Epidemiologie des Désastres Institut de Recherche Santé et Société). They research and maintain databases about Emergencies and Disasters around the world. They are respectable and respected.
Then I checked the data itself — I wrote to EMDAT:
[After introducing myself and under a copy of the bar graph of Reported Natural Disasters above]
“The data shown does not align well with my understanding of Global Natural Disasters, in that it shows a HUGE increase from 1970 to about 1998. My guess would be that 1970 to 1998 represents an increase in REPORTING and not in actual Natural Disasters.
Can you confirm this please — or correct me if I am wrong”.
I received a pleasant reply, albeit nearly a month later, as follows:
“Thank [you] for your e-mail. You are right, it is an increase in the reporting. I share your e-mail with [our] director, Prof. D. Guha-Sapir, who may want to add her input.”
(this reply is from the database manager at EMDAT)
The importance of checking the data becomes shockingly clear. EMDAT data and Our World In Data visuals are used — and reportedly trusted — by major media outlets, as shown in a graphic at Our World in Data: the New York Times, The Wall Street Journal, the Financial Times, CNN, PNAS, The Guardian, Science, the BBC, Nature, and more. That is the depth of their penetration into the media and journals. When these outlets use Our World in Data graphics, or re-use the data underlying the graphics, it may well be that the journalists do not check the data itself.
The result is that when a news organization Googles “natural disasters world” it gets the image below:
There we see it again: a huge rise in natural disasters from the 1970s to the turn of the century — and it is an entirely false impression. The Google search makes it look like the data comes from the World Health Organization — and who would doubt them? Of course, the actual data is correct, in its own way: those are the numbers of reported natural disasters, the figures before 1998 or so reflect spotty, incomplete reporting, and the rise is solely “an increase in the reporting.”
Once reporting infrastructure was set up properly by the late 1990s, we see the opposite — a decline in reported natural disasters.
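A reader who wants to check such a series can do so with a few lines of code. The yearly counts below are hypothetical placeholders, not actual EM-DAT figures; the point is only that a steep rise in *reported* events during the era of improving reporting, followed by a decline once reporting is complete, is the signature of a reporting artifact rather than a physical trend:

```python
# Hypothetical yearly counts of *reported* natural disasters.
# Placeholder numbers for illustration only -- NOT EM-DAT data.
reported = {1970: 80, 1980: 150, 1990: 280, 2000: 530, 2010: 430, 2018: 350}

def pct_change(series, start, end):
    """Percent change in reported counts between two years."""
    return 100.0 * (series[end] - series[start]) / series[start]

# The headline "rise" lives entirely in the era of improving reporting...
rise = pct_change(reported, 1970, 2000)
# ...while the fully-reported modern era shows a decline.
decline = pct_change(reported, 2000, 2018)
print(f"1970-2000: {rise:+.1f}%   2000-2018: {decline:+.1f}%")
```

Even this toy comparison makes the question obvious: did disasters really multiply sixfold in thirty years, or did the counting simply get better?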
Readers are invited to guess how many journalists took the time and made the effort to check data purported to come from the World Health Organization.
Just one more:
Geography of loss—a global look at the uneven toll of suicide by Meagan Weiland and Nirja Desai (Aug. 23, 2019). “This paper is part of Science’s special series on unraveling suicide.”
How much trust should we invest in the data on the graphic? Is it factual that suicide rates are plummeting in China and Greenland and India, and rising in the US and Argentina?
We needn’t look too far — starting with the graphic first:
“Mauritius: …is the only sub-Saharan country that consistently records and reports suicide rates.” (emphasis mine — kh)
“India: even though its overall suicide rate has decreased, India accounts for more than 35% of female suicide deaths worldwide.”
“Greenland: Although Greenland’s rate has declined dramatically, it remains the highest in the world”.
Now, does anyone think that countries like the Democratic Republic of the Congo, which currently has a raging Ebola epidemic, are carefully recording each death of their citizens with ICD-10 cause-of-death codes, and then reporting them to some international record-keeping organization? What about Venezuela, which is currently in total political and social disarray? Ridiculous ideas: of course they are not recording and reporting suicide deaths, because they cannot.
From the paper:
“Suicide is a worldwide problem, but its effects are uneven. Although suicide rates—all rates noted here are annual deaths per 100,000 people—are rising in some countries, including the United States, most countries are seeing declines, for reasons that include restrictions on access to lethal means and improved mental health care. According to the World Health Organization (WHO), most countries do not collect detailed data on suicide; data for many countries here were drawn from rates estimated by organizations such as WHO and the Institute for Health Metrics and Evaluation’s Global Burden of Disease project.”
Only the more modern countries, with functional national health organizations and modern hospitals backed by medical bureaucracies, can even hope to accurately record and report suicides. In many nations suicide carries a stigma, and coroners and other medical professionals have often erred on the side of compassion (for the families) and recorded suicides as “natural death” or “heart attack” — anything but suicide. As reported in the paper “Comparative Analysis of Suicide, Accidental, and Undetermined Cause of Death Classification”: “It is likely that suicide may be under reported due to both the social stigma associated with suicide as well as the reluctance of a medical examiner or coroner to make this classification if supporting data are uncertain (Timmermans, 2005).”
So, what of the suicide-rate map from SCIENCE magazine and the Weiland and Desai paper? It appears most countries should simply have been marked “Not Enough Data”. No mention is made of confounding factors such as “improved reporting” in the United States and other Western countries. Guesses are not an appropriate basis for guiding international policy decisions.
The effort put into the graphic for Science has been worse than wasted: it serves mainly to misinform readers about the rates of suicide in the various nations.
1. Pretty graphics and fancy images do not mean that information/data is correct or dependable. They may not convey a factual visual impression of the data.
2. The fact that a fancy image or graph comes attached to the name of a respectable organization is no reason to accept the data as it has been presented to you. If it is important to you — check it.
3. Pretty graphics can easily overcome or slip past readers’ critical thinking skills and thus misinform them.
4. For my money? The prettier the picture, the closer I look at the underlying data.