Introduction
The third report of the Annenberg Science Media Monitor focuses on news coverage of three high-profile retracted scientific findings. Our analysis examines reporting on withdrawn research by four authors:
- Miguel Ángel Martínez González, whose paper, “Primary Prevention of Cardiovascular Disease with a Mediterranean Diet,” was published in February 2013 in The New England Journal of Medicine;
- Oona Lönnstedt and Peter Eklöv, whose paper, “Environmentally relevant concentrations of microplastic particles influence larval fish ecology,” was published in June 2016 in Science;
- Brian Wansink, who has had a number of papers on human eating behavior retracted, including “Bad popcorn in big buckets: portion size can influence intake as much as taste,” published in September 2005 in The Journal of Nutrition Education and Behavior.
News pieces about retractions treat those due to unintended error differently from those withdrawn for malfeasance or fraud. Here we examine one of the former and two of the latter. Reports on the Lönnstedt-Eklöv study and on Wansink’s withdrawn work adopted the counterfeit quest narrative, while coverage of the Mediterranean diet study attributed its retraction to researcher error rather than intentional deception.
Articles covering retractions often employ the counterfeit quest narrative, in which storylines chronicle the activities of a deceptive researcher who has gulled custodians of knowledge, such as journal editors and peer reviewers. Our analysis shows that, in general, counterfeit quest narratives report the circumstances leading to retractions, how errors or misconduct were uncovered, and the individuals who identified the issues, but fail to explain how retractions are evidence of the self-corrective norm of science in action. These narratives also fail to explain whether and how the scientific community has acted to prevent a recurrence.
Our research assumes that five principles should guide journalists’ and scientists’ communication about mistaken, flawed, and fraudulent science:
- Specify the finding that is being retracted.
- Indicate how the errors that led to the retraction were discovered and credit the individuals who uncovered the error.
- Indicate that detecting and retracting erroneous findings is evidence of science’s self-corrective norm in action.
- Indicate any action being taken by those in the scientific community to prevent a recurrence.
- Avoid generalizing from a few retractions to the conclusion that science as a whole is broken or in crisis.
The Annenberg Science Media Monitor is a project of the Annenberg Public Policy Center’s Science of Science Communication program. The Science Media Monitor analyzes news coverage of widely reported scientific findings in order to increase the public’s understanding of the scientific process and is supported by a grant from the Rita Allen Foundation.
Mediterranean diet
The initial report in The New York Times about the study titled “Primary Prevention of Cardiovascular Disease with a Mediterranean Diet” (PREDIMED), published on Feb. 25, 2013, in The New England Journal of Medicine, noted its significant implications for the well-being of the public.
Authored by Dr. Miguel Ángel Martínez González of the University of Navarra Medical School, the study randomized subjects into one of three conditions. Those in the first ate a Mediterranean diet high in fruits, vegetables and fish, with additional olive oil, while those in the second consumed the Mediterranean diet with additional nuts. Participants in the third condition ate a low-fat diet.
The results of the study elicited widespread coverage. “Mediterranean diet shown to ward off heart attack and stroke” read the headline in The New York Times.2 “Mediterranean diet reduces cardiovascular risk” proclaimed The Washington Post.3
The retraction
The study was retracted on June 13, 2018, after a British anesthesiologist, John Carlisle, who was unaffiliated with it, argued that its findings were overstated and its method faulty. Among other concerns, the randomization was defective. So, for example, in one of the 11 study sites, rather than being randomly assigned to a condition, all participants were given the olive oil diet. In households with multiple family members, all members of the family were assigned to the same diet.
Media coverage of the retraction
Twenty-six articles were collected using keyword searches on LexisNexis, Factiva Dow Jones, and Google News for the terms “Mediterranean diet AND retraction.”
How did the media report the reasons for the retraction, who caught the errors or misconduct, and how were the errors uncovered?
Each of the 26 articles in our sample explained why the study was retracted, 69% (18) mentioned Carlisle, the scholar who detected the problem, and 73% (19) described how the errors were caught. Carlisle’s actions were framed as heroic, having an impact beyond the immediate retraction. A National Public Radio (NPR) post said, “Carlisle’s analysis drew accolades and added him to the ranks of a small but growing number of such sleuths. The success of the approach prompted Carlisle to expand his scope to include randomized trials from other researchers and in other fields.”4
Twelve articles not only featured Carlisle’s analysis but also included evidence of the PREDIMED team’s willingness to correct their mistakes. NPR reported, “When the lead author of the paper, Dr. Miguel Ángel Martínez González … saw Carlisle’s analysis, he conducted a detailed audit of the study and quickly noticed some problems.”5 The article also explained that after Martínez González and his team reanalyzed the data, they republished the findings in The New England Journal of Medicine (NEJM), albeit with a more modest conclusion about the diet’s benefits.
How was the responsibility for the error framed by the media?
The news accounts framed the errors as unintentional rather than intentional, and as mistakes, not misconduct. In The New York Times, Donald Berry, a statistician at MD Anderson Cancer Center, said, “These people were naïve. They were sloppy and didn’t know they were being sloppy.” 6
Eight articles reported that some scientists criticized the revised version of the report. Barnett Kramer, director of the division of cancer prevention at the National Cancer Institute, told The New York Times, “Nothing they have done in this re-analyzed paper makes me more confident.” 7 Other scientists who have continued to analyze the newly updated report have identified what they consider additional errors. 8
Some news outlets quoted scientists who accepted the reanalysis. Jeffrey Drazen, editor-in-chief of NEJM, published a statement asserting, “Medical professionals and their patients can use the republished information with confidence.” 9
Are retractions evidence of self-correction in science?
Julia Belluz at Vox was the only journalist covering the retraction to suggest that it is evidence of science catching its own errors. She wrote, “PREDIMED was supposed to be an example of scientific excellence in a field filled with conflicted and flawed studies. Yet now it appears to be horribly flawed.” 10 But her analysis was ultimately optimistic: “Yes, studies with big flaws pass peer review and make [it] into high-impact journals, but the record can eventually be corrected because of skeptical researchers questioning things. It’s science working as it should…” 11
What steps are scientists taking to prevent future research error?
Belluz also was the only journalist covering the PREDIMED retraction to highlight changes that could be implemented in the relevant community to prevent a recurrence. She did not, however, note any responses already underway. Instead, she quoted John Ioannidis, a health researcher at Stanford University, who recommended that scholars “[p]erform more, long-term, large randomized control trials – like PREDIMED but do it right. We need to share the data, and make them publicly available, have people be able to look at them and see that they get the same results.” 12
Did journalists avoid generalizing from the retraction to the conclusion that science as a whole is broken or in crisis?
Nearly all of the news accounts avoided generalizing from the retraction. But the subhead of Belluz’s Vox article read: “Nutrition science is supposed to tell us how to eat. It’s in the midst of a crisis.” 13
[Chart: how often the five principles appeared in news stories about this retraction]
Hungry fish
Uppsala University (UU) postdoctoral researcher Oona Lönnstedt and Professor Peter Eklöv’s study, “Environmentally relevant concentrations of microplastic particles influence larval fish ecology,” published in Science on June 3, 2016, claimed that when microplastic particles were found in their environment in sufficient quantities, perch larvae preferred to eat them rather than plankton, their usual food source. The result, they concluded, was an increase in mortality among the fish. The study captured the attention of journalists and environmental advocates, who saw the findings as evidence of the impact of human pollution. So, for example, The Guardian ran headlines reporting “Microplastics killing fish before they reach reproductive age, study finds” 14 and “Minister says UK government ‘fully backs’ microbeads ban.” 15
In 2015, the United States banned plastic microbeads, the exfoliating particles found in common bath products. The study seemed not only to confirm the wisdom of the ban, but also to warrant additional restrictions on similar pollutants. The Washington Post reported that the study was significant because “… scientists are still largely in the dark about how these microplastics are affecting the animals that eat them and how these effects might scale up and impact whole populations. But these kinds of studies are critical for helping policymakers decide what kinds of regulations are warranted when it comes to how plastic is created and disposed of.” 16
The retraction
On June 16, 2016, Lönnstedt and Eklöv’s colleagues Josefin Sundin and Fredrik Jutfelt, who shared a research facility with them, contacted the researchers with 20 questions about the study. Four days later, Sundin and Jutfelt asked the university to launch an investigation on the grounds that the study could not have been performed as reported: the study cited more fish than had been available to the authors, and the experimental setup in their shared lab was too small to produce the reported results. 17
After an investigation by a university panel dismissed these claims, Sundin and Jutfelt raised their concerns with the Central Ethical Review Board (CEPN), a research oversight committee in Stockholm. An investigation by the CEPN, which found that Lönnstedt and Eklöv were unable to produce the data supporting their results, concluded that the researchers had committed misconduct. 18 (Lönnstedt claimed that the missing data was stored on a laptop stolen shortly after Science requested access to it.19)
Science issued a statement of concern on Dec. 1, 2016, that included the authors’ explanation for the missing data. Eklöv eventually requested that the journal retract the study. Subsequently, Lönnstedt lost a three-year, $355,000 grant from FORMAS, the Swedish research council for sustainable development. 20 Eklöv lost a four-year grant from Uppsala University and was banned from receiving funding for two years. 21
Media coverage of the retraction
Twenty-five articles were located in keyword searches on LexisNexis, Factiva Dow Jones, and Google for the terms “(Lönnstedt OR Eklöv) AND retraction.”
How did the media report the reasons for the retraction, who caught the errors or misconduct, and how were the errors uncovered?
Reporting on the retraction focused on the competing claims of the researchers. All 25 articles described why the research was retracted, 36% (9) mentioned at least one of the two scholars who uncovered the problem (Sundin or Jutfelt) by name, and 80% (20) explained how the misconduct was uncovered.
Some articles outlined in detail the suspicions raised by Sundin and Jutfelt, as well as the responses made by the authors (Lönnstedt and Eklöv) and by Uppsala University. Of special note here is work by a writer for the publication in which the original article was published. Martin Enserink, of Science, outlined the accusations and responses. After the researchers explained their failure to produce the data, Enserink wrote that “The UU panel was satisfied. Lönnstedt and Eklöv had ‘thoroughly answered and explained’ every issue in a ‘satisfactory and credible manner’… The missing data were partly the result of a misunderstanding, it [the panel] said, adding, incorrectly, that ‘all necessary raw data has been freely available … for some time.’ ” 22 Enserink also reported that the university panel leveled a counteraccusation against Sundin and Jutfelt, writing, “The whistleblowers, the panel said, ‘appear to have a very strong desire’ for a misconduct investigation, but most of their issues could have been aired in ‘normal scholarly discussion.’ ” 23
While Uppsala University’s initial response cleared Lönnstedt and Eklöv of the first accusations, Enserink wrote that a follow-up investigation by the CEPN not only implicated Lönnstedt and Eklöv, but also highlighted problems within the university’s scientific establishment. He wrote, “If UU…bungled its investigation, as the whistleblowers in this case claim, it could bolster support for a plan released last month that would take misconduct investigations out of university hands and transfer them to a new government agency.” 24
Indeed, coverage focused as much on the competing statements made by the researchers as on the role of the university and the journal in addressing concerns.
How was the responsibility for the error framed by the media?
Articles assigning fault highlighted Lönnstedt’s inability to produce the data used in the study while generally downplaying Eklöv’s involvement. (Eklöv supervised the research remotely and did not visit the site during the reported experiment.) The coverage also pitted Eklöv against Lönnstedt in the aftermath of the retraction. Nature correspondent Quirin Schiermeier reported that “Eklöv wrote in an email … that he takes full responsibility for the errors in the animal ethical permit. ‘But most of all I am very disappointed on my colleague to find out that she actually had fabricated the data… ’ ” 25
Most criticism of the study reported by the media featured the results of the CEPN investigation. Schiermeier quoted a CEPN statement saying that the researchers “are guilty of misconduct in research by violating the regulations on ethical approval for animal experimentation.” 26 Daniel Cressey’s article in Nature also highlighted the CEPN’s criticism of the journal, noting that “[t]he experiments, as described in the paper, seem to lack adequate control experiments…and ‘it is remarkable that the article, given these deficiencies, was accepted by the journal Science.’ ” 27
Are retractions evidence of self-correction in science?
A single article said that the retraction was evidence of self-correction in science. In it, Signe Dean at Science Alert wrote that, despite the disappointing revelation of misconduct, “it’s good to know that science can recognise [sic] and fix mistakes when they find them, to ensure that we’re working with the best information going forward.” 28 Twenty-two articles instead foregrounded the responsibility of individual scientists for committing – and catching – the misconduct. The headline of an article by the American Council on Science and Health declared: “A Long Time Coming: Two Swedish Scientists are Found Guilty of Scientific Misconduct.” 29
James Delingpole at Breitbart presented the retraction as evidence that scientists publish results consistent with their political beliefs. He wrote that “environmental scientists produce [a] study which is widely reported around the world because its ‘facts’ accord so perfectly with the media’s guilt-ridden hysteria when it comes to any story to do with Evil Mankind Destroying the Planet… And you wonder why I keep telling you that the whole global environmental scare is one massive financial scam?” 30
What steps is the scientific community taking to prevent future research misconduct?
Only one article about the retraction, also by Enserink, mentioned a call for reform and noted preventive steps that had already been taken. He wrote, “In a report to the Swedish government in late February, a group led by UU literature professor Margaretha Fahlgren proposed letting a new government agency, the Research Misconduct Board, handle all investigations.” 31 Enserink also reported criticism of the journal Science, noting that other researchers were disappointed that it had failed to safeguard against this kind of misconduct and had not adequately publicized the retraction. He wrote that in response to the criticism, Andrew Sugden, a deputy editor at Science, acknowledged that the journal intended to make the retraction notice more visible to readers.
The Wansink retractions
For years, research by Cornell Professor Brian Wansink, director of the university’s Food and Brand Lab, elicited widespread media coverage. His published works include the article “Bad popcorn in big buckets: portion size can influence intake as much as taste,” co-authored with Junyong Kim of the University of Central Florida and published in September 2005 in the Journal of Nutrition Education and Behavior, and the book “Slim by Design: Mindless Eating Solutions for Everyday Life,” published in March 2013. His research has also appeared in the Journal of Sensory Studies, the Journal of Product & Brand Management, and BMC Nutrition, among other journals. Not only did he serve as executive director of the U.S. Department of Agriculture’s Center for Nutrition Policy and Promotion under President George W. Bush, but he was also involved in former First Lady Michelle Obama’s “Let’s Move” program. He was a regular contributor to The New York Times and a featured guest on cable news. His studies were praised for their straightforward takes on human eating behavior. His findings included evidence that children preferred apples to cookies when the healthier option was branded with popular television characters,32 that recipes in the 2006 75th anniversary edition of “The Joy of Cooking” contained larger portion sizes than earlier editions,33 and that bottomless bowls encouraged people to eat more. 34
Wansink’s findings offered easily implemented solutions to the American obesity epidemic. In 2007, David Leonhardt wrote in The New York Times that Wansink’s “overarching conclusion is that our decisions about eating often have little to do with how hungry we are. Instead, we rely on cues like the size of a popcorn bucket – or the way we organize our refrigerator – to tell us how much to eat. These cues can add 200 calories a day to our diet, but the only way we’ll notice we are overeating is that our pants will eventually get too tight.” 35 Wansink’s straightforward approach to eating better was highly appealing; as Whitney Tilson and John Heins wrote in The Washington Post, “Wansink argues that simple disciplines [sic] can go a long way toward minimizing dumb decisions. For example, he suggests keeping healthy snacks nearby and devoting half of your plate to salad or vegetables.” 36
The retractions
In late 2016, Wansink drew criticism from other scientists when he published a blog post praising his lab’s relentless pursuit of findings. The post 37 appeared to condone a practice called HARKing, “hypothesizing after the results are known,” in which hypotheses formulated after the data have been analyzed are presented as if they had been made in advance. Forbes reported, “When the data didn’t support his initial hypothesis, he told his students to go back and try another idea, and then another, and another, until something comes up positive.” 38
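The practice Forbes describes, testing one idea after another until something reaches statistical significance, can be illustrated with a small simulation. The sketch below is our illustration only; it is not code from the report or from Wansink’s lab, and the sample sizes and number of outcomes are hypothetical. It shows how often a study with no real effect will still produce at least one “significant” result when twenty unrelated outcomes are each tested at the conventional p < 0.05 threshold.

```python
# Minimal simulation (illustrative only): why repeatedly testing new outcomes
# on null data eventually "comes up positive."
import math
import random

random.seed(42)

def two_sample_p(n=30, mu_diff=0.0):
    """Approximate two-sided p-value for a difference in means (z-test, known sd = 1)."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(mu_diff, 1) for _ in range(n)]
    z = (sum(a) / n - sum(b) / n) / math.sqrt(2 / n)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

trials, outcomes_per_study, hits = 2000, 20, 0
for _ in range(trials):
    # "Try another idea, and then another": test 20 unrelated outcomes per study.
    if any(two_sample_p() < 0.05 for _ in range(outcomes_per_study)):
        hits += 1

print(f"Simulated studies with at least one spurious 'finding': {hits / trials:.0%}")
# With no true effect, roughly 1 - 0.95**20, or about 64%, of studies
# still yield at least one nominally significant result.
```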
A number of scientists, including Jordan Anaya, a computational biologist; Nicholas Brown, a graduate student in psychology at the University of Groningen; James Heathers, a postdoctoral researcher in behavioral science at Northeastern University; and Tim van der Zee, a Ph.D. student at Leiden University, as well as BuzzFeed reporter Stephanie M. Lee, scrutinized Wansink’s past work. A New Yorker article referred to the skeptics as “swashbuckling statisticians who devote time outside of their regular work to re-analyzing too-good-to-be-true studies published by media-friendly researchers – and loudly calling public attention to any inaccuracies they find.” 39
Former colleagues and students told Lee that Wansink encouraged them to dig through data to find significant results and, when top-tier journals rejected weak results, to submit the findings to any journal willing to publish them. Lee’s investigation also found that Wansink had misreported data in a study of the eating behavior of 8- to 11-year-olds; the findings were instead the result of a study involving preschoolers. 40
BuzzFeed reporter Lee wrote that “critics the world over have pored through more than 50 of his old studies and compiled ‘the Wansink Dossier,’ a list of errors and inconsistencies that suggests he aggressively manipulated data. Cornell, after initially clearing him of misconduct, opened an investigation.” 41 As of September 2018, Wansink had been accused of statistical manipulation, methodological misreporting, and self-plagiarism and had 15 articles corrected and 13 retracted. 42
On Sept. 20, 2018, Cornell announced Wansink’s resignation. In a statement, Provost Michael Kotlikoff wrote: “Consistent with the university’s Academic Misconduct policy, a faculty committee conducted a thorough investigation into Professor Wansink’s research. The committee found that Professor Wansink committed academic misconduct in his research and scholarship, including misreporting of research data, problematic statistical techniques, failure to properly document and preserve research results, and inappropriate authorship.” 43
Media coverage of the retractions
One hundred twenty-one articles were collected in keyword searches on LexisNexis, Factiva Dow Jones, and Google for the terms “Wansink AND retraction.”
How did the media report the reasons for the retraction, who caught the errors or misconduct, and how were the errors uncovered?
Of the articles on Wansink’s retractions, 95% (115) noted why the research was withdrawn, while 35% (42) mentioned an individual or individuals responsible for uncovering the misconduct. How the errors were caught was described in 36% (44) of the articles.
BuzzFeed contributed the largest number of articles about the retractions in our sample (10) and was the most thorough in explaining the steps that led to the retractions. In her reporting, Lee not only dissected methodological issues but also included evidence from Wansink’s correspondence with colleagues and first-person accounts from his former employees.
How was the responsibility for the error framed by the media?
In coverage of the retractions, most journalists blamed Wansink directly and exclusively – in 118 of the news accounts, Wansink was deemed responsible for the errors. Although a number of other researchers were implicated in the investigation, only Wansink and his collaborator Collin Payne, formerly of New Mexico State University, are reported to have faced consequences. Payne left his position at New Mexico State in January 2018, after the emails collected by BuzzFeed revealed years of collaboration with Wansink in statistical manipulation. 44 Reports that Wansink encouraged students and collaborators to massage the data in search of findings focused primarily on him and on his role as the principal investigator on the studies.
Some of the stories about the retractions moved from a counterfeit quest narrative frame to one that indicted science more generally. So, for example, Jesse Singal at The Cut pointed out that Wansink’s retractions are part of a larger trend of unreproducible research in the social sciences. 45
This article and 21 others in our sample framed the issue as systemic rather than specific to Wansink and his lab. Singal also adopted the language of crisis when highlighting the pressure placed on researchers to publish early and often. He cited van der Zee’s report on the misconduct, writing that “…at the moment, the replication crisis is being fueled by ‘an incentive structure that rewards large numbers of publications reporting sensational findings with little penalty for being wrong.’ ” 46
Are retractions evidence of self-correction in science?
An article about the retraction in the Cornell Daily Sun, the university student newspaper, suggested that Wansink’s retractions could be understood as evidence of the self-corrective norm in action. Artur Gorokh wrote, “At first glance, [the investigation of Wansink’s work] is good news: when the evidence emerged that Wansink’s work might not be statistically sound, the scientific community took it upon itself to investigate the offender.” But Gorokh criticized the intense scrutiny visited upon Wansink, suggesting that the actions of the community are hypocritical and distract from legitimate issues in research publication. He continued, “What is problematic here is not the unfair treatment of Brian Wansink. It’s that because of this concentration on a single researcher the overall narrative gets warped. The story isn’t that Brian Wansink is a horribly unethical and ruthless scientist, it’s that social sciences are in trouble.” 47
What steps are scientists taking to prevent future research error?
Ars Technica, The Atlantic, BuzzFeed, Vox, New York Magazine’s blog The Cut, and an Agence France-Presse article published on Breitbart, totaling 8 percent of the articles (9) on the retractions, described ways that the scientific community could prevent future misconduct. Among the suggested improvements were better data-sharing practices and a less competitive funding structure, especially for young researchers. BuzzFeed quoted van der Zee as saying, “One of the fundamental principles of the scientific method is transparency – to conduct research in a way that can be assessed, verified, and reproduced… This is not optional – it is imperative.” 48 Of these articles, four mentioned actions the scientific community has already taken to prevent future research errors and misconduct.
A Vox article by science reporter Brian Resnick and senior health correspondent Julia Belluz published shortly after Wansink’s resignation outlined three changes underway among scientists: “preregistration of study designs,” “open data sharing,” and “registered replication reports.” 49 With the support of a growing number of researchers, these practices aim to guard against statistical manipulation and to bolster the rigor of the research publication process.
An article in The Chronicle of Higher Education said Wansink himself had made changes to his research review process. “His first move was to ask a postdoc in his lab to re-examine the disputed calculations using the original dataset” and “the lab will put in place new procedures to provide ‘guidance for collecting, reporting, and storing data…’ ” 50
Did journalists avoid generalizing from the retraction to the conclusion that science as a whole is broken or in crisis?
Nine articles about Wansink’s retractions described a so-called crisis in an area of science, the result of careless research methods and review, or outright misconduct. Daniel Engber at Slate not only used the word “crisis,” but also singled out Wansink for criticism. Engber wrote, “The story… reveals the best-selling author, media darling, and former U.S. Department of Agriculture official as perhaps the most egregious – or at least the most cartoonish – villain of the replication crisis in psychology, someone who seems to have embraced questionable research practices with astonishing enthusiasm.” 51
So, too, did Derek Lowe at Science. His February 2018 article outlined the issues with Wansink’s research in particular and with social science in general. He wrote, “The reproducibility crisis in social science is driven, in large part, by the fact that humans are horrendously hard to work with as objects of study.” 52 However, he did not argue that the challenges of research on human subjects justify questionable methods. Instead, his article suggested that the crisis is driven by the demands of the academic publishing environment, and warned that, the “outright misconduct” aside, some of his readers might recognize the types of statistical manipulation Wansink employed as all too familiar.
Summary of coverage of retractions
Eight news organizations were selected to represent the high-circulation, high-impact media outlets that regularly cover science – The Associated Press, The New York Times, USA Today, The Wall Street Journal, The Washington Post, Axios, Vox, and Breitbart. Three articles by the Associated Press were widely disseminated after Cornell announced Wansink’s resignation in September 2018 – they appeared on Breitbart and in The New York Times and The Washington Post and their digital editions, as well as in regional television and radio news.
The New York Times, The Washington Post, Breitbart and Vox were the only news outlets to cover more than one retraction.
The overall analysis of 172 articles from these news outlets and others found that:
- 97% of the news stories involving one of the three retractions report the circumstances leading to the retractions (Why)
- 49% report how the errors or misconduct were identified (How)
- 3% outline steps that the scientific community has taken to prevent future research mismanagement or misconduct (Actions taken)
- 2% say retractions are evidence of self-correction in science (Self-correction)
- 95% avoid generalizing from a few retractions to conclude that science is broken or in crisis (Avoid “crisis”)
Conclusion
News organizations are more likely to report on how the errors or misconduct were discovered than to credit named scholars for detecting them.
These media outlets are unlikely either to include statements indicating that a retraction is evidence of the self-corrective process at work or to indicate steps being taken to prevent a recurrence. One explanation for these omissions is academic journals’ failure to report steps they have taken to prevent the recurrence of an identified problem.
Appendix: Methods
To determine how the media have covered retractions of widely reported science, a team of coders analyzed 172 articles published after the announcement of any of the three retractions or corrections. A search was performed on Google News, LexisNexis, and Factiva Dow Jones using the following terms for each retraction: “Mediterranean diet AND retraction,” “(Lönnstedt OR Eklöv) AND retraction,” and “Wansink AND retraction.” The examination includes articles from major print and online outlets. Intercoder reliability for the coded items was 1.
| Coded items | Number of articles |
| --- | --- |
| Why was the research retracted? | 166 |
| Who identified the flaws or misconduct? | 69 |
| How were flaws or errors identified? | 83 |
| Are retractions presented as evidence of self-correction in the scientific community? | 3 |
| Are next steps outlined to prevent future mistakes or misconduct? | 13 |
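As an illustration of how the coded counts above relate to the percentages reported in the summary of coverage, and of what an intercoder reliability of 1 means in practice, here is a minimal sketch. It is our illustration only, not the Monitor’s analysis code; the report does not specify which reliability statistic was used, so simple percent agreement on hypothetical codes is shown.

```python
# A minimal sketch, not the Monitor's analysis code: converts the coded counts
# above into shares of the 172-article sample and computes simple percent
# agreement between two hypothetical coders (1.0 corresponds to the perfect
# intercoder reliability reported above).

TOTAL_ARTICLES = 172  # 26 + 25 + 121 articles across the three retractions

coded_counts = {
    "Why was the research retracted?": 166,
    "Who identified the flaws or misconduct?": 69,
    "How were flaws or errors identified?": 83,
    "Presented as evidence of self-correction?": 3,
    "Next steps outlined to prevent recurrence?": 13,
}

for item, count in coded_counts.items():
    print(f"{item} {count}/{TOTAL_ARTICLES} ({100 * count / TOTAL_ARTICLES:.0f}%)")

def percent_agreement(coder_a, coder_b):
    """Share of items on which two coders assigned the same code."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Hypothetical spot check: two coders judging whether five articles explain
# why the research was retracted (1 = yes, 0 = no).
print(percent_agreement([1, 1, 0, 1, 1], [1, 1, 0, 1, 1]))  # 1.0
```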
2 Kolata, 2013.
3 Brown, D. (2013, February 25). Mediterranean diet reduces cardiovascular risk. The Washington Post. https://www.washingtonpost.com/national/health-science/mediterranean-diet-reduces-cardiovascular-risk/2013/02/25/20396e16-7f87-11e2-a350-49866afab584_story.html?utm_term=.d1d904b4d08d.
4 McCook, A. (2018, June 13). Errors Trigger Retraction Of Study On Mediterranean Diet’s Heart Benefits. NPR.
5 McCook, 2018.
6 Kolata, G. (2018, June 13). That Huge Mediterranean Diet Study Was Flawed. But Was It Wrong?. The New York Times. Retrieved from https://www.nytimes.com/2018/06/13/health/mediterranean-diet-heart-disease.html.
7 Kolata, 2018.
8 Belluz, J. (2018, June 20). This Mediterranean diet study was hugely impactful. The science just fell apart. Vox. Retrieved from https://www.vox.com/science-and-health/2018/6/20/17464906/mediterranean-diet-science-health-predimed.
9 May, A. (2018, June 14). Landmark Mediterranean diet study was flawed. Authors retract paper published in NEJM. USA Today. Retrieved from https://www.usatoday.com/story/news/nation-now/2018/06/14/major-mediterranean-diet-study-flawed-authors-retract-nejm-paper/700833002/.
10 Belluz, 2018.
11 Belluz, 2018.
12 Belluz, 2018.
13 Belluz, 2018.
14 Harvey, F. (2016, June 2). Microplastics killing fish before they reach reproductive age, study finds. The Guardian. Retrieved from https://www.theguardian.com/environment/2016/jun/02/microplastics-killing-fish-before-they-reach-reproductive-age-study-finds.
15 Carrington, D. (2016, June 14). Minister says UK government ‘fully backs’ microbeads ban. The Guardian. Retrieved from https://www.theguardian.com/environment/2016/jun/14/nment-backs-microbeads-ban-george-eustice.
16 Harvey, C. (2016, May 4). Study retracted: What tiny plastic particles are doing to tiny fish. The Washington Post. Retrieved from https://www.washingtonpost.com/news/energy-environment/wp/2016/06/02/what-tiny-plastic-particles-are-doing-to-tiny-fish/?utm_term=.d06377fa6ea3.
17 Enserink, M. (2017a, March 21). A groundbreaking study on the dangers of ‘microplastics’ may be unraveling. Science. Retrieved from http://www.sciencemag.org/news/2017/03/groundbreaking-study-dangers-microplastics-may-be-unraveling.
18 Enserink, 2017a.
19 Enserink, 2017a.
20 Enserink, M. (2017b, December 15). Swedish plastics study fabricated, panel finds. Science. Retrieved from http://science.sciencemag.org/content/358/6369/1367.
21 (2018, January 9). Swedish gov’t rescinds grant for fish-plastics researcher [Blog post]. Retraction Watch. Retrieved from https://retractionwatch.com/2018/01/09/swedish-govt-rescinds-grant-fish-plastics-researcher/.
22 Enserink, 2017a.
23 Enserink, 2017a.
24 Enserink, 2017a.
25 Schiermeier, Q. (2017, December 7). Investigation finds Swedish scientists committed scientific misconduct. Nature. Retrieved from https://www.nature.com/articles/d41586-017-08321-2.
26 Schiermeier, 2017.
27 Cressey, D. (2017, May 2). Controversial microplastics study to be retracted. Nature. Retrieved from https://www.nature.com/news/controversial-microplastics-study-to-be-retracted-1.21929.
28 Dean, S. (2017, May 2). A Widely Reported Study on The Dangers of Microplastics in Fish Is About to Be Retracted. Science Alert. Retrieved from https://www.sciencealert.com/a-widely-reported-study-on-the-effects-of-microplastics-in-fish-is-about-to-be-retracted.
29 Lemieux, J. (2017, December 18). A Long Time Coming: Two Swedish Scientists are Found Guilty of Scientific Misconduct. American Council on Science and Health. Retrieved from https://www.acsh.org/news/2017/12/08/long-time-coming-two-swedish-scientists-are-found-guilty-scientific-misconduct-12256.
30 Delingpole, J. (2017, December 8). Delingpole: ‘Fish Prefer Plastic to Food’ Study Was #Fakenews, Science Misconduct Committee Finds. Breitbart. Retrieved from https://www.breitbart.com/big-government/2017/12/08/delingpole-fish-prefer-plastic-to-food-study-was-fakenews-science-misconduct-committee-finds/.
31 Enserink, 2017a.
32 Gupta, S. (2012, October 29). Elmo says to eat more apples. CNN. Retrieved from http://thechart.blogs.cnn.com/2012/08/29/elmo-says-to-eat-more-apples/.
33 Lang, S. (2009, February 16). ‘Joy of Cooking’ supersizes and packs more calories into home cooking. The Cornell Chronicle. Retrieved from http://news.cornell.edu/stories/2009/02/joy-cooking-larger-portions.
34 Leonhardt, D. (2007, May 2). Your Plate Is Bigger Than Your Stomach. The New York Times. Retrieved from https://www.nytimes.com/2007/05/02/business/02leonhardt.html.
35 Leonhardt, 2007.
36 Tilson, W., and Heins, J. (2010, May 23). Investors can learn from how people eat. The Washington Post. Retrieved from http://www.washingtonpost.com/wp-dyn/content/article/2010/05/22/AR2010052200277.html.
37 Wansink, B. (2016, November 21). The Grad Student Who Never Said ‘No.’ Blog post. Archived by the Internet Archive’s Wayback Machine. Retrieved from https://web.archive.org/web/20170312041524/http:/www.brianwansink.com/phd-advice/the-grad-student-who-never-said-no.
38 Salzberg, S. (2017, October 2). Cornell’s Food Lab Is Cooking Up Fake News. Forbes. Retrieved from https://www.forbes.com/sites/stevensalzberg/2017/10/02/cornells-food-lab-is-cooking-up-fake-news/#24e3fc1925cd.
39 Rosner, H. (2018, March 21). The Strange, Uplifting Tale of “Joy of Cooking” Versus the Food Scientist. The New Yorker. Retrieved from https://www.newyorker.com/culture/annals-of-gastronomy/the-strange-uplifting-tale-of-joy-of-cooking-versus-the-food-scientist.
40 Lee, S. (2018, February 25). Here’s How Cornell Scientist Brian Wansink Turned Shoddy Data Into Viral Studies About How We Eat. BuzzFeed. Retrieved from https://www.buzzfeednews.com/article/stephaniemlee/brian-wansink-cornell-p-hacking.
41 Lee, 2018.
42 Resnick, B., and Belluz, J. (2018, September 21). A top Cornell food researcher has had 13 studies retracted. That’s a lot. Vox. Retrieved from https://www.vox.com/science-and-health/2018/9/19/17879102/brian-wansink-cornell-food-brand-lab-retractions-jama.
43 (2018, September 20). Statement of Cornell University Provost Michael I. Kotlikoff. Cornell University. Retrieved from http://statements.cornell.edu/2018/20180920-statement-provost-michael-kotlikoff.cfm
44 Lee, S. (2018, February 28). A Scientist Who Worked Closely With Brian Wansink Is No Longer At His Job. BuzzFeed. Retrieved from https://www.buzzfeednews.com/article/stephaniemlee/collin-payne-new-mexico-state.
45 Singal, J. (2017, February 8). A Big Diet-Science Lab Has Been Publishing Shoddy Research. The Cut. Retrieved from https://www.thecut.com/2017/02/cornells-food-and-brand-lab-has-a-major-problem.html.
46 Singal, 2017.
47 Gorokh, A. (2018, March 12). GOROKH | In defense of Brian Wansink. The Cornell Daily Sun. Retrieved from https://cornellsun.com/2018/03/12/gorokh-in-defense-of-brian-wansink/.
48 Lee, S. (2017, September 27). Emails Show How An Ivy League Prof Tried To Do Damage Control For His Bogus Food Science. BuzzFeed. Retrieved from https://www.buzzfeednews.com/article/stephaniemlee/brian-wansink-cornell-smarter-lunchrooms-flawed-data.
49 Resnick and Belluz, 2018.
50 Bartlett, T. (2017, March 17). Spoiled Science. The Chronicle of Higher Education. Retrieved from https://www.chronicle.com/article/Spoiled-Science/239529.
51 Engber, D. (2018, February 28). Death of a Veggie Salesman. Slate. Retrieved from https://slate.com/technology/2018/02/how-brian-wansink-forgot-the-difference-between-science-and-marketing.html.
52 Lowe, D. (2018, February 26). Gotta Be a Conclusion In Here Somewhere. Science. Retrieved from http://blogs.sciencemag.org/pipeline/archives/2018/02/26/gotta-be-a-conclusion-in-here-somewhere.
The Annenberg Science Media Monitor is supported by a grant from the Rita Allen Foundation.