Nutrition science isn’t broken, it’s just wicked hard

My dad is an old-school rancher who uses a flip-phone, refuses to wear a seat belt and swears by the Atkins diet. Like many Americans on both sides of the political aisle, he’s skeptical of science. But not because he thinks Al Gore invented climate change, vaccines cause autism or GMOs are an elaborate corporate conspiracy. He’s skeptical of science because of eggs.

As a truck driver in the 1970s, catching the news on late-night hauls somewhere between Willie Nelson ballads and CB radio chatter, he learned that cholesterol was public health enemy No. 1 and that eggs were golden syrupy orbs of artery-clogging cholesterol, heart disease in a shell.

Three decades, the invention of the Internet and a cornucopia of superfoods later, a few studies showed that eating an egg or two a day did not lead to high cholesterol, and Americans put eggs back on the table.

My dad is skeptical of science because scientists can’t even seem to settle on whether it’s a good idea to eat an egg, and people have been eating eggs for eons.

So why is such a seemingly simple question so difficult to answer? As someone who has spent months on experiments that ended up in the “appendix of failures” at the back of my dissertation, I can testify that science is hard. And nutrition science is really, really hard. “A wickedly difficult field,” as David Ludwig, professor of nutrition at Harvard School of Public Health, put it.

Because human nutrition is exceedingly complex, “truth can only emerge from many different studies with many different methods,” he said.

That’s why I get frustrated when I see headlines like “Drinking more coffee leads to a longer life.”

Had the authors dug deeper, they might have noticed the multitude of studies showing that coffee, like eggs, wine and practically everything else we eat, somehow simultaneously causes and prevents illness. In fact, hundreds of nutrition studies come out every week. Even more are conducted, but “only the exciting stuff gets published,” said John Dawson, assistant professor of nutrition at Texas Tech. And only the flashiest publications draw headlines.

The stereotypical problem with news covering nutrition, said David Klurfeld, a nutritional scientist with the USDA, is that studies that were never designed to answer specific questions are portrayed as though they answer them.

The coffee studies in the news last week were what scientists know as observational studies. In these studies, researchers followed coffee drinkers and non-coffee drinkers and monitored when and how they died. The problem is, when you go about searching for differences between any two groups, you’re going to find them. “That doesn’t prove that coffee is providing the benefit,” said Ludwig. For example, people who drink coffee regularly might have higher incomes, drink fewer sugary beverages, or lead more active lifestyles.

Observational studies like these are useful for identifying interesting trends, but they do not demonstrate cause and effect. To test whether coffee prevents a certain disease, the researchers would need to conduct a randomized controlled trial. In this type of experiment, volunteers (preferably hundreds of them or more) are randomly assigned to one of two groups: in this case, one that drinks coffee and one that abstains.
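The idea behind random assignment is simple enough to sketch in a few lines of code. This is a toy illustration only, with hypothetical volunteer IDs rather than data from any real trial; the point is that a coin flip, not lifestyle or income, decides who ends up in which group, so the two groups should be alike on average in everything except coffee.

```python
import random

def randomize(volunteers, seed=42):
    """Randomly split a list of volunteers into two equal groups.

    Returns (coffee_group, abstain_group). The shuffle, not any trait
    of the volunteers, determines assignment -- which is what balances
    hidden differences like income or exercise habits across groups.
    """
    rng = random.Random(seed)  # fixed seed just to make the sketch reproducible
    shuffled = volunteers[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# 400 hypothetical participants
volunteers = [f"V{i:03d}" for i in range(1, 401)]
coffee_group, abstain_group = randomize(volunteers)
print(len(coffee_group), len(abstain_group))  # 200 200
```

With large enough groups, randomization makes it unlikely that, say, all the marathon runners land on the coffee side by chance, which is exactly the problem that haunts observational comparisons.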

“Then you have to get them to comply,” Klurfeld said. People who drink coffee will be hesitant to give it up, and people who don’t might be reluctant to start the habit. And there is no way of knowing whether they’ve done what you’ve asked them to.

Assuming the volunteers in the study actually play by the rules, there’s still a possibility that some other important difference exists between the two groups by chance. In the end, “these variables make it very difficult to come up with a definitive answer,” Klurfeld said.

So while randomized controlled trials are the gold standard of nutrition research, they’re still not a silver bullet. They’re rare, long, painstaking and extraordinarily expensive. “Large clinical trials across multiple sites can easily cost millions,” Dawson said. And they sometimes raise ethical quandaries. If we had reason to suspect that coffee might be bad for you, for example, should a group of people be assigned to drink coffee regularly?

And even with randomized controlled trials, “one study can never answer a truly important question by itself,” Ludwig said. The next question he asks is, “Do we have a plausible reason to expect there might be an effect?” For example, is caffeine or some other compound in coffee known to influence cells in a way that could protect against heart disease or cancer or other common killers?

Studies in animals or cell cultures in test tubes are useful for answering this type of question. But humans have diverged a long way from mice, and the complexity of human nutrition cannot be replicated in a test tube, so these findings aren’t definitive on their own either.

Only when multiple observational studies, randomized controlled trials and experiments in animal models or individual cells all point to the same answer do responsible scientists begin to draw conclusions about nutrition. The results of all of these studies taken together can help inform us about how to improve our diets.

Unfortunately, news can’t wait until a consensus is reached. So here are a few strategies you can use to identify which headlines you should pay attention to. First, make sure the study was conducted in actual living humans. Then, determine whether the study was observational or based on a randomized controlled trial.

Stories covering observational studies will make very generalized statements about the populations studied, like: “In Europe, where the Mediterranean diet is common …” or “people who eat breakfast regularly …”

If you can’t tell from the news coverage, follow the link to the study abstract and look for terms like “prospective cohort,” “cross-sectional” or “case control.” These are hallmarks of observational studies.

Whether the news is reporting on an observational study or a randomized controlled trial, Dawson and Ludwig recommend applying the “sniff test.” Ask yourself whether the claims make sense with what you know of your own experiences and human evolution. For example, we’ve been drinking and studying coffee for decades. If it killed people or made them super healthy, wouldn’t we have noticed by now?

There are lots of dietary trends that don’t pass the sniff test. Consider the fat phobia that erupted about the same time eggs made the bad food list. Ludwig called the low-fat craze a “nutritional disaster” because it caused many Americans to give up things we now know to be exceedingly healthy, like avocados, nuts and full-fat yogurt, while reaching for sugar-packed alternatives. Claims that cutting any given food from our diets will cure us sound too good to be true because they are.

Most importantly, “don’t change your diet based on one study,” Klurfeld advised, especially if that study has a small effect or contradicts a whole lot of other studies. In the case of coffee, setting aside the fact that these two new studies are observational, they showed only about a 10 percent reduction in mortality. To put that into perspective, your odds of getting lung cancer if you smoke increase by about 1,000 percent to 3,000 percent.
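The arithmetic behind that comparison is worth making explicit. A relative risk of 0.90 is the 10 percent reduction the coffee studies found, while a relative risk of 11x to 31x corresponds to the 1,000 to 3,000 percent increase for smoking. A minimal sketch of the conversion (the relative-risk figures here are just the article's numbers restated, not new data):

```python
def percent_change(relative_risk):
    """Convert a relative risk into a percent change in risk.

    A relative risk of 1.0 means no change; below 1.0 is a reduction,
    above 1.0 is an increase.
    """
    return (relative_risk - 1) * 100

# Coffee drinkers' mortality risk: roughly 0.90x that of non-drinkers.
print(round(percent_change(0.90), 1))  # -10.0 (a modest reduction)

# Smokers' lung cancer risk: roughly 11x to 31x that of non-smokers.
print(percent_change(11.0))  # 1000.0
print(percent_change(31.0))  # 3000.0
```

Seen side by side, a 10 percent shift is the kind of effect that could easily be explained away by the confounders observational studies can’t rule out, while a 1,000-plus percent shift is hard to miss no matter how the study was designed.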

So next time you hear that chocolate will help you lose weight, cocktails protect you from heart disease, binging on sugary fruit juices cleanses your liver, ancient grains like wheat are toxic, or an extra two cups of joe a day will make you immortal, ask questions. How strong is the evidence? Are there multiple studies saying the same thing? And does it pass your common-sense sniff test?