Panelists:
- F. Perry Wilson, M.D., M.S.C.E., an associate professor at Yale School of Medicine
- Tara Haelle, M.A., AHCJ core topic leader/medical studies, independent journalist (moderator)
By Jake Thomas
Medical studies carry an air of scientific authority, but they’re not all created equal. Journalists should carefully scrutinize how they were conducted before reporting their findings. That was the key takeaway from this panel discussion.
During this session, Tara Haelle and Dr. F. Perry Wilson explained how to separate studies that point to effective medical treatments from those with misleading conclusions.
Haelle said there is a hierarchy of evidence when it comes to different types of studies, with randomized controlled trials at the top. These trials randomly assign participants to groups, some of which receive a drug, vaccine or other medical intervention.
One group serves as the control group and is given a placebo, an inactive substance made to resemble the treatment being tested. Comparing the control group’s results with those of the other groups can reveal whether a drug or treatment had an effect, Haelle said.
Also at the top of the evidence hierarchy are systematic reviews, which comprehensively search the existing medical literature to determine whether a protocol used in clinical trials consistently produces the same result, she explained. Alongside those are meta-analyses, which apply statistical techniques to pool the results of previous studies of a drug or treatment.
Wilson said observational dietary studies often tout the health benefits or detriments of a particular food or beverage. For example, he said he found one study suggesting that coffee causes breast cancer and another suggesting that coffee lowers the risk of breast cancer.
One problem with observational dietary studies is that they don’t randomize participants or fully measure “confounders,” factors that could skew results, Wilson noted. For example, participants who consume the food being studied may differ in the rest of their diets, he said. Participants also tend to give inaccurate answers about what they eat because of recall bias and social desirability bias.
“You might not want to say that, ‘I ate three pints of Ben and Jerry's between 2 a.m. and 4 a.m. every morning,’ right?” he said.
The session covered other medical study pitfalls. Journalists were encouraged to go beyond a study’s introduction and look at its goals, its sample size and how additional variables affected the results. Reporters should also check whether a study used biomarkers, measurable indicators of biological change such as blood pressure or body temperature.
The context of a study is also crucial. The panelists recommended finding out who funded a study, where it was published and the credentials of those who conducted it. Finally, reporters should contact outside experts and ask them about the study’s implications, any red flags and whether its findings contradict other evidence.
Jake Thomas is a senior reporter at The Lund Report, a nonprofit news outlet that covers Oregon’s health care system.