This researcher may have discovered the antidote to health bullshit

Andy Oxman is obsessed with the study of bullshit health claims and how to prevent them from spreading.

For decades, he’s been trying to find ways to get adults to think critically about the latest diet fads, vaccine rumors, or “miracle cures.” But he realized these efforts are often in vain: Adults can be stubborn old dogs — resistant to learning new things and changing their minds.

So Oxman, now the research director at the Norwegian Institute of Public Health, started to wonder whether the best hope for bullshit prevention lay with children. To put this idea to the test, back in 2000 he visited his then-10-year-old son’s class.

“I told them that some teenagers had discovered that red M&Ms gave them a good feeling in their body and helped them write and draw more quickly,” Oxman said. “But there also were some bad effects: a little pain in their stomach, and they got dizzy if they stood up quickly.”

He challenged the kids to try to find out if the teens were right. He split the class into small groups and gave each group a bag of M&Ms.

The children quickly figured out they had to try eating M&Ms of different colors to find out what happens, but that it wouldn’t be a fair test if they could see the color of the M&Ms. In other words, they intuitively understood the concept of “blinding” in a clinical trial. (This is when researchers prevent study participants and doctors from knowing who got what treatment so they’re less likely to be biased about the outcome.)

In a short time, they were running their own blinded, randomized trials — the gold standard for testing medical claims — in the classroom. By the end of their experiment, Oxman said, “They figured out that there was little if any difference in the effects of the different colors and they asked me if the teenagers who made the claim really believed that.”
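The logic of such a test is simple enough to sketch in a few lines of code. The Python snippet below is a hypothetical illustration, not the researchers' actual classroom protocol: it randomly assigns simulated participants to a "red" or "other-color" group, keeps them "blind" to that assignment, and compares how often each group reports an effect. All numbers are invented for illustration.

```python
import random

# Illustrative simulation of a blinded, randomized comparison in the
# spirit of the classroom M&M test. Every parameter here is made up.

def run_blinded_trial(n_participants=30, true_effect=0.0, seed=1):
    random.seed(seed)
    results = {"red": [], "other": []}
    for _ in range(n_participants):
        # Randomization: group assignment is decided by chance,
        # not by the participant's (or tester's) preference.
        group = random.choice(["red", "other"])
        # Blinding: the participant never learns their group, so the
        # reported outcome can't be swayed by expectations about color.
        baseline_chance = 0.3  # assumed chance anyone reports feeling faster
        chance = baseline_chance + (true_effect if group == "red" else 0.0)
        reported_boost = random.random() < chance
        results[group].append(reported_boost)
    # Return the fraction of each group reporting a boost.
    return {g: sum(v) / len(v) for g, v in results.items() if v}

# With no true effect, the two groups' rates come out similar,
# which is roughly the conclusion the schoolchildren reached.
print(run_blinded_trial(true_effect=0.0))
```

Because neither group's reports are shaped by knowing which candy they got, any difference that does show up is easier to attribute to the treatment itself rather than to expectation.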

The little classroom visit convinced Oxman he had to start schooling people in the ways of bullshit detection early in life.

So he began working with other researchers from around the world to develop curricula — a cartoon-filled textbook and lesson plans — on critical thinking skills aimed at schoolchildren.

In 2016, Oxman tested the materials in a big trial involving 10,000 children from 120 primary schools in Uganda’s central region.

The results of the trial were just published in the Lancet, and they showed a remarkable rate of success: Kids who were taught basic concepts about how to think critically about health claims massively outperformed children in a control group.

This means Oxman now holds the best blueprint out there for how to get young people to think critically and arm them with the tools they need to spot “alternative facts” and misinformation. His work brings us closer to answering that important question that haunted him — the one that should haunt all of us who care about evidence and facts: How do you prevent fake news and bullshit from catching on in the first place?

How researchers taught kids to spot “alternative facts” about health

One indispensable read for anyone interested in evidence-based thinking in health is Testing Treatments (downloadable for free). The basic idea behind the book, as co-author Sir Iain Chalmers put it, is that “you don’t need to be a scientist to think critically and ask good questions.” In plain language, he and his co-authors explain the concepts people need to understand in order to sort reliable health advice from nonsense.

Building on the M&M experiment, in 2012, Oxman asked Chalmers whether they might adapt concepts from the book and try to teach them to primary school children in Uganda. (Oxman already had strong ties to Uganda, where he’d been leading a World Health Organization project to bring more research evidence to policymaking.)

With the book, the researchers had a template for the kinds of things they could teach. And they knew this exercise of inculcating skepticism in children, while uncommon in high-income settings, was even rarer in a developing country like Uganda, where pseudoscientific medical advice can spread with abandon, just as it can in the US.

The researchers, along with others from Uganda, Kenya, Rwanda, Norway, and England, worked to identify the most important ideas a person would need to grasp to think critically about health claims, including:

  1. Just because a treatment is popular or old does not mean it’s beneficial or safe.
  2. New, brand-name, or more expensive treatments may not be better than older ones.
  3. Treatments usually come with both harms and benefits.
  4. Beware of conflicts of interest — they can lead to misleading claims about treatments.
  5. Personal experiences, expert opinions, and anecdotes aren’t a reliable basis for assessing the effects of most treatments.
  6. Instead, health claims should be based on high-quality, randomized controlled trials.

They also drew up lesson plans and collaborated with teachers in Uganda to make materials that would resonate with local schoolchildren. Allen Nsangi, a Ugandan researcher and co-investigator on the trial, told me that a big part of that process involved mining local medical myths.

For example, she said, “Some people have been told to use locally available stuff like cow dung [on burns] — it’s almost the best known treatment.” (Spoiler alert: It doesn’t work.)

[Source: Times of India]
