Separating the Science from the Silly...
Nutritional studies (and the media headlines that follow) are often confusing. An article titled “Is Eating Eggs Really Just as Bad as Smoking Cigarettes” certainly sounds more impressive than “Meta-analysis of Egg Consumption and Risk of Coronary Heart Disease and Stroke.” It may be easy to place the blame on the news, but studies themselves can have methodology issues that lead to confusing or even incorrect conclusions. So how can you spot bad science?
Here are some tips.
Splashy headlines and incorrect reporting of results: As I wrote above, a headline may get you to click, but the article itself may misreport the study's findings. In many cases, reporters aren't scientists, so they may not fully understand what they are writing about. Try to go back to the original study, especially if the results seem really out in left field.
Industry-funded research: Just because Coke funds a study on the health risks of drinking soda doesn't necessarily mean the results are bought and paid for. In many cases, these studies are still well done, with proper methodology. However, it's important to note who funds a study and keep that in mind when reviewing the results.
Correlation and causation confusion: This may be one of the least understood areas in research. Correlation means two things tend to occur together, but it does not necessarily mean one causes the other. It could be pure coincidence, or other factors may come into play. Look for studies that establish cause and effect.
Not enough samples in the study, or samples are not representative: A study of 15,000 people will be more broadly applicable than a study done with 15 people. Additionally, studies involving only one age group or race may not apply to others in the population.
No control group: A well-done study has a control group and a group that receives an intervention. For example, a study on low sodium diets and blood pressure should have a control group of subjects eating a regular diet.
Results can't be repeated: One study does not typically change how we treat patients, particularly if its results are very different from previous research. For example, a study concluding that people with diabetes should drink a bottle of orange juice at each meal would need to be replicated by other studies to confirm the findings.
The material is not peer-reviewed: No one polices the internet; many websites report "results" of studies that have never been through scientific peer review. Peer review is a rigorous process in which other scientists examine the work to make sure the methods and results are sound.