When the media get the math wrong – badly wrong

By Keith Devlin @profkeithdevlin

As I was looking around for a topic for this month’s post, I wondered idly what I had written about in Devlin’s Angle exactly a quarter century ago, in August 1996. (The column had begun in January of that year, as had the MAA’s website.) Given the many major changes in the world since then (for example, Google would not be founded for over two more years!), would my words then have any relevance today? They do. And how! 

As I read that old post, my media feeds were rife with articles and comments lambasting journalists for badly mangling a news item to the point where what was published massively misrepresented the facts. I’ll come to that snafu momentarily; but first, let me tell you about my August 1996 post.

Titled “Of Men, Mathematics, and Myths,” my post began by taking the news media to task for constantly referring to a notorious domestic terrorist as “a mathematician,” a characteristic of the individual having no relevance to his terrorist activities, but which journalists nevertheless found irresistible since it pulled on the old trope of the mathematician as a strange, possibly mentally disturbed, antisocial loner – an image no more typical of actual mathematicians than of any other profession.

Unfortunately, as I went on to note, mathematicians themselves have not always shied away from leaning on the “strange loner” stereotype when setting out to communicate mathematics to a general audience. A major example – “major” because his popular writing was so successful – was the mathematician Eric Temple Bell, whose 1937 book Men of Mathematics wove a gloriously romantic, but totally fictitious, tale of the great algebraist Evariste Galois. (See my earlier post, linked above, for the gory details.)

The “math genius” trope in action. Dr Kiesenhofer is to be congratulated for both her Ph.D. and her Olympic Gold Medal in cycle racing, but both result from a lot of hard work, not “genius”.

The trouble is, romanticizing or sensationalizing an individual or an event in order to attract an audience can have long-lasting, negative consequences. Perpetuating the myth of mathematicians as “strange loners” or suggesting (however implicitly) that pursuit of mathematics can lead to domestic terrorism (as described in my August 1996 post) can push impressionable young people away from the subject before they have time to learn enough about it to realize that the reality bears no relation to those fantasies.

So too can drawing on the even more ubiquitous “math genius” trope to describe anyone with sufficient mathematical ability to earn a doctorate (see image). Portraying mastery of mathematics as requiring “genius” (whatever that is) obscures the fact that success in math is, like success in most other things in life, “5 percent inspiration and 95 percent perspiration.” You just gotta keep working at it!

Unfortunately, as I remarked at the end of that 1996 post, once a “good fiction” is out there, the truth is rarely going to stop it.

So what of the misleading story dominating the news cycle when I was preparing my article? The topic was the latest data concerning the coronavirus pandemic; in particular, the results of a new study showing the increased danger posed by the virus’s Delta variant, which the US Centers for Disease Control and Prevention (CDC) viewed as sufficiently alarming to reverse their previous advice that fully vaccinated individuals could stop wearing masks when together with a group of likewise vaccinated people.

This July 29 tweet from the New York Times badly misrepresented the facts by misunderstanding the math. According to the data in the report, vaccinated people are in fact highly unlikely to spread the virus.

Unfortunately, even such highly regarded news outlets as the New York Times and the Washington Post (which broke the story) got the math badly wrong, which meant the stories they published were grossly misleading — and dangerously so since they gave the impression that the vaccines were nothing like as effective as previously claimed.

In fact, the new data emphasized just how effective the vaccines are. The confusion at the Times, the Post, and elsewhere came from misinterpreting, and hence misreporting, conditional probabilities.

“Ah, say no more!” experienced math teachers will say. “People – and courts of law for that matter – have a devil of a job getting conditional probabilities right.” Though they involve nothing more than simple counting, conditional probabilities can indeed be tricky. In this case, the journalists, including the editors who write the headlines and the social media ledes, may have believed they were accurately reporting what the study actually said. But for such an important story, about a quite literally “life-and-death” issue, they should have checked with folks who know about data science.

The Washington Post also misunderstood the mathematics in the (then unpublished) new CDC report. Their choice of data to highlight was an instance of the classic “Prosecutor’s Fallacy” — a frequently far too successful arithmetical sleight-of-hand.

Certainly, once the errors were pointed out, both newspapers were quick to provide corrected stories. But they screwed up their initial reporting with a Tour de Force of basic innumeracy.

In doing so, they appear to have been heavily influenced by what they thought were newsworthy “man bites dog” aspects of the report, though for the most part the real story was a mundane, though important, “dog bites man” issue.

One of the worst errors was to highlight, as the Post did, the fact that “vaccinated people made up three quarters of those infected in a massive Massachusetts covid-19 outbreak.” The three-quarters figure (74% to be precise) refers to the 469 cases studied, not the entire outbreak, and reflects the fact that Massachusetts has a high vaccination rate. It is essentially an irrelevant number, not significant to personal safety.

What is significant is the probability of a vaccinated person getting sick; that’s a very different number — notice how the two factors are switched round — and it is super low. In fact, only 4 of the vaccinated people were hospitalized (and none died). That was the figure the Post should have highlighted. The study actually confirmed that these vaccines are amazingly effective!
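
To see how strongly a high vaccination rate drives that three-quarters figure, here is a small back-of-the-envelope calculation in Python. The vaccination and attack rates below are invented purely for illustration, not the figures from the CDC study; the point is the arithmetic, not the inputs.

```python
# Illustrative only: these rates are invented to show the base-rate effect,
# not the actual figures from the CDC's Massachusetts study.

population = 100_000
vaccination_rate = 0.85           # hypothetical: 85% of the population vaccinated
attack_rate_unvaccinated = 0.05   # hypothetical: 5% of unvaccinated people get infected
attack_rate_vaccinated = 0.01     # hypothetical: 1% of vaccinated people get infected

vaccinated = population * vaccination_rate
unvaccinated = population - vaccinated

infected_vaccinated = vaccinated * attack_rate_vaccinated
infected_unvaccinated = unvaccinated * attack_rate_unvaccinated
total_infected = infected_vaccinated + infected_unvaccinated

# P(vaccinated | infected): the share of cases who happen to be vaccinated.
# With a high vaccination rate this can easily be a majority.
print(f"Share of cases who are vaccinated: {infected_vaccinated / total_infected:.0%}")

# P(infected | vaccinated): the risk to an individual vaccinated person.
# This is the number that matters for personal safety, and it stays low.
print(f"Risk of infection if vaccinated:   {attack_rate_vaccinated:.0%}")
print(f"Risk of infection if unvaccinated: {attack_rate_unvaccinated:.0%}")
```

With these made-up inputs, vaccinated people account for 53 percent of all cases, even though each vaccinated person is five times less likely to be infected than an unvaccinated one. The share of cases simply tracks the size of the vaccinated group; the individual risk does not.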

This error is an instance of what is often referred to as the “Prosecutor’s Fallacy,” where an unscrupulous prosecutor will try (all too often with success) to mislead a judge or jury by quoting the probability of finding a certain piece of evidence assuming the accused is guilty (usually a high number), whereas what is significant in terms of determining culpability is the probability that the accused is guilty given the evidence (often a very low number). Those two (conditional) probabilities are in fact connected mathematically by what is known as Bayes’ Formula, one important consequence of which is that when one probability is high, the other can be low, depending on the circumstances.
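
For readers who want to see the connection written out, Bayes’ Formula expresses one conditional probability in terms of the other and the underlying base rates. In the notation below, V stands for “vaccinated” and I for “infected” (my shorthand, not notation from the CDC report):

$$
P(V \mid I) \;=\; \frac{P(I \mid V)\,P(V)}{P(I)}.
$$

When the base rate P(V) is large, as it is in a heavily vaccinated state like Massachusetts, P(V | I) can be high even while the individual risk P(I | V) remains very low.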

The Prosecutor’s Fallacy is a classic, sleight-of-hand switcheroo that, when executed with sufficient panache, can easily confuse and mislead the less numerate among us — and on occasion even us more numerate types, if we let our numerical guard down. The usual approach is to quote not the often high probability of finding the evidence, assuming guilt, but the complementary, tiny probability of finding that evidence, assuming innocence. Judges and juries frequently take this as proving that the probability the accused is innocent is that tiny figure, and hence that the accused must be guilty, but it proves nothing of the kind. In this case, I don’t think the Post intended to mislead, but misled they most definitely were, as, it appears, were many of their readers.
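
A standard hypothetical (the numbers here are invented purely for illustration) shows how far apart the two probabilities can be. Suppose a forensic test matches an innocent person with probability one in a million, the pool of potential suspects is ten million people, and exactly one of them is guilty. We should then expect about ten innocent matches alongside the one guilty match, so for any particular person who matches,

$$
P(\text{innocent} \mid \text{match}) \;\approx\; \frac{10}{10+1} \;\approx\; 0.91,
$$

which is nowhere near the one-in-a-million figure the fallacy invites the court to take away.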

One of the first to draw attention to the errors was Ben Wakana, the deputy director of strategic communications and engagement at the White House, who jumped onto Twitter to try to correct the mistake. (See image.)

The deputy director of strategic communications and engagement at the White House took to Twitter to try to correct the reporting errors.

The often-sensationalist New York Post reported on Wakana’s activities in a fairly factual way, without going into any of the mathematics.

A more mathematically informed account of what the Post and the Times got wrong can be gleaned from the Twitter thread following the Post’s initial tweets.

Several of the early commentators in the thread clearly know their stuff when it comes to conditional probabilities. It’s definitely worth looking over the first 20 to 30 comments for some good insights. Follow the link.

In addition, the UK Daily Mail (not generally known for reporting accuracy) has published a readable summary of the CDC report’s findings. Read it with care. In doing and reporting mathematics, the order of words and phrases can be much more important than in daily life.

This tweet, from a teacher, presents a major lesson for the education world.

I note in particular that the Twitter thread linked above includes a comment from a teacher (see image), who lamented the lack of adequate education in the United States on handling and interpreting data. That is, to my mind, one of the most important lessons to be learned from this media snafu.

In battling a pandemic, mathematics provides our main eyes into the spread of the virus and can guide our actions. Scientists know how to use the math, but populations (and journalists) need to be able to understand the advice the scientists provide, and doing so requires some basic numeracy and data science skills.

In my Devlin’s Angle post of May 1 last year, I wrote about the way mathematics and data science (and graphs in particular) provide our main (and almost our only) way to follow and manage a pandemic.

The next day the New York Times tweeted a more accurate version of their disastrous initial post.

My Stanford colleague Jo Boaler has also been pushing for more data science education in schools through her influential youcubed program. As this recent media snafu indicates, such education is sorely needed in today’s society.

And that’s the main thought I want to leave you with. In fact, we need it so much, I think I’ll end by sensationalizing it (in true, all caps, Twitter fashion):

WE.

NEED.

DATA.

SCIENCE.

EDUCATION.