The following research article, published in PLOS ONE, describes the role of replication in research as well as the lack of media follow-up on the studies the press reports. I recommend asking your students to read the article and discuss the role the media plays in delivering scientific research. Does the media have an obligation to follow up on its own reports to confirm their validity?
Here are some highlights I gleaned from the article.
A key idea to understand is that the P-value from a hypothesis test is computed from the sample data. In other words, the P-value is itself a statistic. Since sample data vary from sample to sample, the P-value will vary from sample to sample as well.
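To make this concrete for students, here is a minimal simulation sketch (my own illustration, not a test from the article; it uses a simple two-sided z-test with known sigma for convenience). Repeated samples from the very same population produce noticeably different P-values.

```python
import math
import random

def two_sided_p(sample, mu0=0.0, sigma=1.0):
    """Two-sided z-test P-value for H0: mean = mu0, with sigma known."""
    n = len(sample)
    z = (sum(sample) / n - mu0) / (sigma / math.sqrt(n))
    return math.erfc(abs(z) / math.sqrt(2))  # P(|Z| >= |z|) for standard normal

random.seed(1)
# Draw 10 samples of size 30 from a population whose true mean is 0.3,
# so H0: mean = 0 is actually false -- yet the P-value still bounces around.
pvals = [two_sided_p([random.gauss(0.3, 1.0) for _ in range(30)])
         for _ in range(10)]
print([round(p, 3) for p in pvals])
```

Running this, some samples yield small P-values and others do not, even though every sample came from the same population. That sample-to-sample variability is exactly why a single "significant" result needs replication.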
P-values are getting a lot of press because some initial studies that report statistically significant results cannot be replicated in subsequent studies.
Of particular interest is that most newspapers focus on so-called lifestyle studies. These studies pertain to associations between a pathology and a risk factor (such as the risk of lung cancer from smoking, or the association between red meat consumption and colon cancer). Contrast these with non-lifestyle studies (such as the effect of a new medication on depression).
In lifestyle studies, newspapers reported on 5 of the 39 initial studies and 58 of the 600 follow-up studies. In non-lifestyle studies, newspapers reported on 48 of the 366 initial studies and 45 of the 3718 follow-up studies. What does this suggest?
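One way to help students see the pattern is to convert these counts into coverage rates. A quick sketch, using only the counts given above:

```python
# Newspaper coverage counts from the article: (covered, total) per study type.
coverage = {
    "lifestyle":     {"initial": (5, 39),   "follow-up": (58, 600)},
    "non-lifestyle": {"initial": (48, 366), "follow-up": (45, 3718)},
}

for category, stages in coverage.items():
    for stage, (covered, total) in stages.items():
        rate = 100 * covered / total
        print(f"{category:13s} {stage:9s}: {covered}/{total} = {rate:.1f}%")
```

The rates show that initial studies are covered at roughly similar rates in both categories (about 13%), but follow-up coverage collapses for non-lifestyle studies (about 1%), and even for lifestyle studies follow-ups are covered less often than initial reports.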
Replication of results is important. Among the 156 primary studies reported by newspapers, only 76 (about 49%) had results that were validated by subsequent analysis. Does this suggest that fewer than half of the initial studies reported by newspapers have their results validated by subsequent analysis?
Here is where it gets scary. Among 53 initial studies covered by newspapers, only 18 were confirmed by subsequent meta-analysis. The 35 that were not confirmed attracted 503 subsequent studies, of which 398 reported either no statistically significant effect or a significant effect in the opposite direction. Yet only 1 of those 398 studies, and only 1 of the 35 meta-analyses, was covered by newspapers.
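Working through this arithmetic in class drives the point home. A short sketch, again using only the counts stated above:

```python
# Follow-up on initial findings that were later disconfirmed.
initial_covered = 53        # initial studies covered by newspapers
confirmed = 18              # later confirmed by meta-analysis
disconfirmed = initial_covered - confirmed

subsequent = 503            # subsequent studies on the disconfirmed findings
null_or_opposite = 398      # reported no effect, or the opposite effect

print(f"Disconfirmed initial studies: {disconfirmed}")
print(f"Subsequent studies contradicting them: {100*null_or_opposite/subsequent:.0f}%")
print(f"Newspaper coverage of those contradictions: {100*1/null_or_opposite:.2f}%")
```

So roughly four out of five subsequent studies contradicted the disconfirmed initial findings, yet essentially none of those contradictions made the news.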
Citation: Dumas-Mallet E, Smith A, Boraud T, Gonon F (2017) Poor replication validity of biomedical association studies reported by newspapers. PLoS ONE 12(2): e0172650. doi:10.1371/journal.pone.0172650