How Journalism Distorts Reality

Journalism provides us with important information about what’s going on in the world. But when you consider the incentives that journalists have, combine that with their usual lack of scientific training, and add in the constraints of the medium in which they work, serious distortions of reality can result. Many journalists produce excellent work. But others leave you less informed after reading their articles than before you began.

What causes journalistic distortion?

1. Equal time to each side. There are many issues on which a person can reasonably hold any of two or more positions. Then there are issues where one side is supported by nearly everyone with relevant expertise, and only a few fringe voices oppose that view. The trouble is that stories about such lopsided issues can create a false impression of balance, either because the journalist feels compelled to spend equal time discussing each viewpoint, or because the journalist is himself unaware of which side is more trustworthy. And a person with highly unrepresentative but highly quotable opinions may be quoted in the article more than is warranted. It may seem less biased to present both sides, but when one side is almost certainly right, an equal presentation may distort more than it informs.

2. Selective reporting. Since news organizations are in the business of selling the news (or, at least, driving traffic to their websites), they have a monetary incentive to produce news that people will be eager to read. Feel-good stories about a dog saving someone’s life can beat out information that might be more important or relevant to most people. What’s problematic from a reality-distortion perspective, though, is that the rate at which events occur and the rate at which they are reported are massively out of sync. For each story about someone coming home from work only to be murdered by an ex-boyfriend, we never hear the millions of tales of people coming home from work only to sit down and eat dinner. This is problematic because the human brain estimates how likely something is to occur by attempting to retrieve instances of that thing from memory. The more easily you can retrieve those instances, the more frequent you will tend to assume that thing is. If you’ve recently read about a few murder cases, you may have the impression that murder has become more common than it used to be, even if this is merely an artifact of journalists choosing (for whatever reason) to report on more murders. If you can easily think of an example of a shark attack, you may overestimate how often sharks kill people.

The vividness of the accounts we hear can also alter our perception. A vivid retelling not only increases the chance that we remember an account, but also tends to heighten our emotional response to it. If you’ve recently read an article that described a gruesome murder in horrid detail, you may subsequently be more afraid when walking alone on an empty street. Through this mechanism, news reading can cause people to have excessive fear of things that are unlikely to harm them, while failing to fear far more dangerous things that are rarely reported on. You’re much more likely to die in a car crash than to be killed by terrorism, yet in a world where terrorism is reported on constantly, you will likely fear terrorism more.

3. Mix-ups of correlation and causation. Just because X tends to occur together with Y doesn’t mean that X causes Y. It could be that Y causes X instead, or that both X and Y are caused by some third thing. Unfortunately, reporters frequently get this wrong (or at least fail to make the distinction clear to the reader), especially when reporting on scientific findings. Articles will insinuate that because the latest study found higher wine/broccoli/nicotine consumption to be associated with greater longevity/health/focus, wine/broccoli/nicotine must actually cause those benefits. A related problem that you’ll sometimes see (especially in articles about finance) is the implication that since Y came after X, Y must have been caused by X. It may be true that many stock market investors reacted negatively to a new bill that was just signed into law, but that doesn’t mean the bill explains why the stock market fell 1% today. There surely were many factors influencing the change in the market’s price, some tending to push it up, others tending to push it down. Even if it were true that the signing of the bill had a strong effect (which might be difficult to confirm), that event certainly cannot take all the credit for the change in the market’s price.
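To see how a third factor can manufacture a correlation, here is a minimal sketch in Python. The variable names (a “health-consciousness” factor driving both broccoli consumption and longevity) and the numbers are made up for illustration; they don’t come from any actual study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical confounder: overall health-consciousness.
health_consciousness = rng.normal(size=n)

# Broccoli consumption and longevity are both driven by the confounder;
# neither one causes the other anywhere in this simulation.
broccoli = 2.0 * health_consciousness + rng.normal(size=n)
longevity = 1.5 * health_consciousness + rng.normal(size=n)

r = np.corrcoef(broccoli, longevity)[0, 1]
print(f"correlation: {r:.2f}")  # strongly positive despite zero causal link
```

A headline written from data like this (“Broccoli eaters live longer”) would be literally true as a correlation and still completely misleading as a causal claim.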

[Editorial cartoon by Jim Borgman]

4. Use of low-quality studies. Just because a study “proves” something doesn’t make it so. In fact, most studies that are conducted are of poor quality for one reason or another. This could be due to a small number of participants, lack of a control group, lack of randomization, the wrong choice of statistical test, a flawed experimental protocol, poor choice of outcome measures, selective reporting of results, or a variety of other reasons. Unfortunately, journalists rarely make it clear whether a study was of high quality, being mainly interested in what the study claimed to have found. Even the reporting of high-quality studies can be problematic, if the journalist fails to mention other high-quality studies that found different results. Given all the things that can go wrong in the design and execution of a study, we should be hesitant to accept results, even from studies that look to be of high quality, until they have been replicated by a different research team.
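As a rough illustration of why small, unreplicated studies deserve skepticism, here is a hypothetical simulation. The study count and sample sizes are arbitrary choices, not drawn from any real data: even when the true effect is exactly zero, a predictable fraction of tiny studies will come out “statistically significant” by chance, and those are the ones most likely to get written up.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# 1,000 hypothetical tiny studies, each comparing 20 "treated" people to
# 20 controls, where the true effect is exactly zero.
false_positives = 0
for _ in range(1000):
    treatment = rng.normal(size=20)
    control = rng.normal(size=20)
    _, p_value = stats.ttest_ind(treatment, control)
    if p_value < 0.05:
        false_positives += 1

# Roughly 5% of these no-effect studies come out "significant" by chance
# alone; if only those get reported, readers see effects that aren't there.
print(f"{false_positives} of 1000 no-effect studies were 'significant'")
```

The point is not that statistics is broken, but that a single small positive result is weak evidence, and headlines rarely say so.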

5. Lack of understanding. Many journalists write about a wide range of subjects, and it is rare that they are true experts in the subject of a particular article. As non-experts writing about what are sometimes very complicated subjects, journalists risk misunderstanding the underlying material. This problem occurs especially often in articles about highly technical research. The issue is compounded by the fact that journalists are often working under tight deadlines, and so may lack the time to do extensive background research.

6. Selective use of the facts. Even within a single story, the problem of selective reporting can be substantial. Not all facts in a case are equally entertaining or fit the narrative equally well. There is some incentive to favor the facts that improve the story over drier but perhaps more important material. Of course, a political or other agenda on the part of the reporter can also determine which facts he chooses to report. Since liberals tend to read liberal news sources while conservatives read conservative ones, both groups may have their pre-existing views bolstered by selectively reported evidence.

7. Exaggeration of importance. News sells better if it sounds important, so news organizations have an incentive to make their news meet that criterion. One way to do this is to report on stories that actually matter to a lot of people, but sometimes it is better for the organization to simply make whatever it is reporting on sound more important than it is. The next big scientific breakthrough turns out to be completely forgotten a few months or years later (but who remembers?). One of the most common forms of exaggeration in journalism is constructing a trend from a few data points. If a handful of celebrities are eating a lot of coconut, or museums have recently become a little more popular among people in their twenties, that doesn’t mean there’s a new fad that the world should hear about.

Choose your news sources carefully, because the information you consume determines what you believe about the world. And as incredibly valuable as journalism is, it can distort reality.


  
