It seems you can’t open a newspaper or turn on the TV these days without getting a dose of medical information: the latest risk to health, the newest treatment breakthrough, etc.
Unfortunately, much of what we read and see is grossly misleading. That can cause needless worry or give needless reassurance. It can also lead us to adopt unsound health practices.
To guard against medical misinformation, keep in mind some basic facts about science, reporting and human nature…
Science versus journalism
Scientific knowledge progresses slowly and tentatively. Most new studies add only a small piece to a very large puzzle.
On the other hand, reporters get paid for being dramatic. Pressure to produce an exciting story leads them to make studies sound more conclusive than they actually are. And bad news is inherently more compelling than good news.
Example: Finnish researchers found a higher rate of lung cancer among smokers who took beta-carotene supplements. Because many earlier studies had suggested that beta-carotene helps prevent cancer, the finding made for some pretty frightening headlines.
But the study wasn’t all it was cracked up to be. The people in the study had been smoking heavily for an average of 36 years and they continued to smoke during the six years of the study. Their experience was not necessarily applicable to others.
The researchers themselves conceded that the slight increase in cancer they found could have been simply the result of chance. That fact was ignored in most media reports.
Reading beyond the headlines
Newspaper headlines are designed to grab the attention of readers. TV spotlights sensational sound bites and eye-popping visuals. But in science, the truth usually lies in the details, which are presented later, if at all.
Our emotions affect our interpretation of information presented to us. Once we see a picture of a child who is bedridden with leukemia, we’re unlikely to pay much attention to how or why he/she got there.
Remedy: Recognize your tendency to jump to conclusions. Read the whole story. As you do, ask yourself the same questions a good journalist would ask: who, what, where, when, why, how and how likely. If details are lacking, don’t make any judgment until you get all the facts.
What do the numbers mean?
Statistics lend an air of authority to any health-related story. But what do the numbers refer to? Where did they come from? Reporters often ignore these all-important questions.
Example: It has been widely reported that condoms have a “16% failure rate.” The figure came from a 1990 survey in which people were asked whether their birth control method had ever failed. Though 16 in 100 condom users surveyed said that condoms had failed at some point, that cumulative figure does not mean that condoms fail 16% of the time.
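The gap between an “ever failed” survey figure and a per-use failure rate comes down to simple arithmetic. The sketch below illustrates the point; the per-use probability and the number of uses are made-up illustrative numbers, not figures from the survey:

```python
# Illustrative only: the per-use failure probability and use count below
# are assumed numbers for the sake of the arithmetic, not survey data.
def ever_failed(per_use_failure: float, uses: int) -> float:
    """Probability of at least one failure across `uses` independent uses."""
    return 1 - (1 - per_use_failure) ** uses

# Even a tiny per-use failure rate accumulates over years of use:
print(round(ever_failed(0.002, 100), 3))  # 0.2% per use, 100 uses
```

A method that fails only 0.2% of the time per use would still leave roughly 18% of long-term users reporting that it had failed at least once, which is why a survey of lifetime experience says little about reliability on any given occasion.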
Be especially wary of surveys in which people “self-report” complex information. People don’t always tell the whole truth.
What’s the real risk?
To judge by the media, virtually everything puts us at risk of some dread disease.
First question to ask: Just how big is the risk? We tend to overreact to threatening, unfamiliar things, particularly when they lie outside our control.
Try to look at risk-related statistics rationally, not emotionally.
Example I: Illicit drugs can indeed be deadly. But even the scourges of heroin (6,000 deaths a year) and cocaine (8,000 deaths) pale in comparison to the toll taken by tobacco (400,000 deaths) and alcohol (90,000 deaths).
Example II: To sound dramatic, a reporter might emphasize that 2,500 people die each year from a particular disease and drive home the point with heart-wrenching interviews with their families. But spread across the entire US population, 2,500 deaths a year amount to a very small individual risk.
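Putting the 2,500 figure over the whole population shows how small the individual risk is. The population figure below is an assumed round number of about 260 million (roughly the mid-1990s US population), not one given in the article:

```python
# 2,500 annual deaths is the figure from the example above;
# the population is an assumed round number (~mid-1990s US).
deaths = 2_500
population = 260_000_000

rate_per_100k = deaths / population * 100_000
print(round(rate_per_100k, 2))  # roughly 1 death per 100,000 people per year
```

A one-in-100,000 annual risk rarely makes for gripping television, which is exactly why the denominator tends to go unmentioned.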
Example III: Most of us have heard that “one woman in eight will develop breast cancer.” But this figure is for lifetime incidence. It fails to take into account the fact that a woman’s risk rises with age. It also assumes that all women will live at least to age 85.
In fact, fewer than one 20-year-old woman in 200 will develop breast cancer by the time she reaches age 40 and only one 40-year-old in 25 will develop breast cancer by age 60.
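Converting the article’s own fractions to percentages makes the contrast plain (the fractions below are exactly the ones quoted above):

```python
# The three risk figures quoted in the text, as fractions:
lifetime = 1 / 8         # widely quoted lifetime incidence
by_40_from_20 = 1 / 200  # risk a 20-year-old faces by age 40
by_60_from_40 = 1 / 25   # risk a 40-year-old faces by age 60

for label, risk in [("lifetime", lifetime),
                    ("age 20 to 40", by_40_from_20),
                    ("age 40 to 60", by_60_from_40)]:
    print(f"{label}: {risk:.1%}")
```

The 12.5% lifetime figure is 25 times the 0.5% risk a 20-year-old actually faces over the next two decades, a difference the headline number hides.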
The reported rise in breast cancer risk sounds even less terrifying when you consider that much of it can be attributed to widespread screening, which has brought more cases to light, and to our ever-lengthening life spans, which are enabling more women to reach the age at which the disease usually strikes.
Bottom line: News reports often omit the context that gives meaning to statistics.
Respect the power of chance
Suppose ten people with advanced cancer receive a drug and have a dramatic remission. It might have been the drug, or they might have gotten better anyway.
When evaluating medical news, look for statistical significance. Statistically significant results are those unlikely to have occurred by chance alone.
The smaller the sample size (the number of people, animals, etc., participating in the study), the more likely that a dramatic result was due to chance.
When two things happen together, don’t assume that one causes the other. A question that news reports often ignore is whether confounding variables may be responsible.
Example: A study found that people who drink wine have fewer heart attacks than those who drink beer or hard liquor. Does wine have a magic ingredient? Or is there another explanation?
One explanation researchers mentioned (but news summaries didn’t) is that wine drinkers tend to be more educated and affluent than beer and whiskey drinkers. It’s well known that wealth and education go hand in hand with greater health and longevity.
This is one reason why double-blind, controlled studies are more convincing than others.
In such studies, two matched groups are compared. One group receives a drug, for example, while the other doesn’t, and neither the experimenters nor the experimental subjects know which group is which. This makes it more likely that observed effects are due to the drug and not to other factors.
On the other hand, be skeptical of anecdotal reports. Anyone can select a striking story. With no basis for comparison, however, even the most vivid event is meaningless.
Example: When a talk show guest blamed his wife’s fatal brain cancer on her cellular phone, this caused panic among cellular phone users and even sparked a decline in cellular phone stocks.
But about 10 million people now use cellular phones, and the overall brain cancer rate in this country remains at six per 100,000. By chance alone, you’d expect 600 phone users to develop the disease each year. Far fewer cases have been reported.
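The 600-cases-by-chance figure follows directly from the numbers in the story:

```python
# Figures quoted in the text: ~10 million users, brain cancer
# incidence of 6 per 100,000 per year.
users = 10_000_000
rate_per_100k = 6

expected_cases = users * rate_per_100k / 100_000
print(expected_cases)  # 600.0
```

In other words, if cell phones had no effect at all, hundreds of users would still develop brain cancer every year simply because brain cancer exists. Individual anecdotes can never distinguish that baseline from a real added risk.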
Look to the source
Any study appearing in a scientific journal like The New England Journal of Medicine, the Journal of the American Medical Association or The Lancet has undergone a rigorous quality control process known as peer review.
Peer review means that experts in the field have judged the study to be of sufficient quality and importance to merit publication. An unpublished finding reported on TV has not been subjected to peer review and is therefore less credible.
While “expert” commentary isn’t foolproof, it’s more reliable than a reporter’s interpretation. Of course, if you really want the details, you’ll need to read the original journal article.