High impact nutrition and dietetics journals’ use of publication procedures to increase research transparency

Abstract

Background

The rigor and integrity of published research in nutrition studies have come into serious question in recent years. Concerns focus on the use of flexible data analysis practices and selective reporting, and on the failure of peer-reviewed journals to identify and correct these practices. In response, it has been proposed that journals employ editorial procedures designed to improve the transparency of published research.

Objective

The present study examines the adoption of editorial procedures designed to improve the reporting of empirical studies in the field of nutrition and dietetics research.

Design

The instructions for authors of 43 journals included in Quartiles 1 and 2 of the Clarivate Analytics’ 2018 Journal Citation Report category Nutrition and Dietetics were reviewed. For journals that published original research, six procedures were assessed: conflict of interest disclosure, recommendation of reporting guidelines, registration of clinical trials, registration of other types of studies, encouragement of data sharing, and use of the Registered Reports format. For journals that published only reviews, all of these procedures except clinical trial registration were assessed.

Results

Thirty-three journals published original research and 10 published only reviews. Conflict of interest disclosure was required by all 33 original research journals. Use of reporting guidelines, clinical trial registration and data sharing were mentioned by 30, 27 and 25 journals, respectively. Registration of other types of studies was required by eight journals, and none offered Registered Reports as a publication option at the time of the review. All 10 review journals required conflict of interest disclosure, four recommended data sharing and three recommended the use of guidelines. None mentioned the other two procedures.

Conclusions

While nutrition journals have adopted a number of procedures designed to improve the reporting of research findings, their limited effects likely result from the mechanisms through which they influence analytic flexibility and selective reporting and the extent to which they are properly implemented and enforced by journals.

Introduction

The rigor and integrity of the published research literature in the field of nutrition studies have come into serious question in recent years. One reason for this is the publicity surrounding the retraction of more than a dozen papers produced by researchers at a major nutrition laboratory at one of the top universities in the United States [1, 2]. Such a large-scale retraction of papers casts doubt on the peer review process of the journals that published this research, and is especially concerning because many of these were high impact journals with presumably stringent peer review procedures and standards.

In addition, some nutrition researchers have begun to question the quality of published research both within the discipline as a whole and within specific areas of specialization. A report from an American Society for Nutrition advisory committee highlighted the threats to research integrity that arise from competing interests among researchers, specifically the type of selective statistical analyses that are conducted and reported and the conclusions investigators draw from these (e.g., withholding unfavorable findings from publication) [3]. In a more detailed examination of such issues in the field of childhood obesity interventions, Brown et al. identified ten statistical and methodological errors frequently found in the published literature that were associated with exaggerated claims about program effectiveness in changing diet, increasing exercise and reducing weight [4]. These included reporting results only for secondary outcome variables in the light of null findings for primary outcomes, reporting only subgroup analyses in the light of no main effects, data dredging for spurious statistically significant results, using one-tailed tests of statistical significance, and claiming that null results were nonetheless “clinically significant”. Bero and colleagues also identified bias in outcome reporting in a series of systematic reviews focused on nutrition research funded by the food industry [5,6,7]. They found the reporting of results in such studies tended to be skewed in favor of the sponsoring industry, and that authors’ financial conflicts of interest were frequently undisclosed in journal publications. Such reporting bias and lack of transparency are especially problematic as they can lead to the development of inappropriate dietary guidelines [8].

In the field of nutritional epidemiology, Ioannidis contends that “good scientific principles” are lacking, with investigators being free to report selective results from multiple analyses they have conducted, very few of which will have been prespecified [9]. In a study of published case-control and cohort studies that examined the association between 40 randomly selected ingredients from recipes and cancer, Schoenfeld and Ioannidis found that the vast majority reported either an increased or decreased risk based on very weak statistical evidence [10]. They attributed this to publication bias and selective reporting of positive results in publications.

The concerns raised regarding the quality and integrity of research published in the field of nutrition research reflect those that have arisen within many academic disciplines over the past decade, and that have been described in terms of a “credibility crisis” [11, 12]. The response to this crisis has been, in part, to call for increased rigor and transparency in the editorial and peer review processes used by academic journals [13,14,15]. This has involved requiring full and detailed disclosure of conflicts of interest, use of guidelines such as the Consolidated Standards of Reporting Trials (CONSORT) [16] and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) [17] when writing up manuscripts for publication, preregistration of hypotheses, study designs and measures, and sharing of data and code as a condition of publication. Those concerned with the quality and integrity of nutrition research have also suggested that such procedures be introduced by academic journals within the discipline [3, 4].

Six studies have examined journal adoption of three editorial procedures designed to improve the quality of published research in the fields of pediatric research [18, 19], surgery [20], emergency medicine [21], orthopedic and general medicine [22], and addiction [23]. Clinical trial registration and recommendation of the CONSORT guidelines were assessed in all six studies, with adoption ranging from 23 to 86% for the former and from 20 to 81% for the latter [18,19,20,21,22]. Other reporting guidelines, such as PRISMA, were less likely to be recommended in the instructions for authors of these journals. Conflict of interest disclosure was examined in three studies and was required by 61% of online and 78% of print pediatrics journals [18, 19] and by 97% of addiction journals [23].

To our knowledge, no existing study has assessed the adoption of editorial procedures designed to improve the reporting of empirical studies in the field of nutrition research. The current study is intended to fill this gap in existing knowledge. It is an exploratory and descriptive study in which no hypotheses were prespecified or tested.

Materials and methods

In line with previous studies of journal editorial procedures [18, 20,21,22,23], the Clarivate Analytics Journal Citation Report (JCR) was used to identify high impact nutrition and dietetics journals. Specifically, the 21 journals ranked in Quartile 1 (Q1) and the 22 ranked in Quartile 2 (Q2) of the 2018 JCR category Nutrition and Dietetics were selected [24]. The impact factor of each of these 43 journals was also obtained from the JCR webpage. Q1 journals are those whose impact factors exceed those of 75% of journals in the JCR category, while Q2 journals exceed those of 50%, but not 75%, of journals in the category. The 43 Q1 and Q2 journals were selected so that their instructions for authors could be identified and reviewed in a timely manner (i.e., before they became outdated and were replaced online), and this number was considered reasonable for representing the high impact journals in the field of nutrition and dietetics research. However, recognizing the limitations of journal rankings based on impact factor [25], the Scimago Journal Ranking (SJR) h-index of each of the 43 journals was also identified. This expresses the largest number of articles (h) published by the journal that have each received at least h citations [26].
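As an illustration of the h-index definition used here, the short sketch below (in Python, not part of the study’s methods) computes h from a list of per-article citation counts; the citation values in the example are hypothetical.

```python
def h_index(citations):
    """Return the largest h such that at least h articles
    have received at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank          # rank-th article still has >= rank citations
        else:
            break
    return h

# Hypothetical journal with five articles: three of them have
# at least three citations each, so h = 3.
print(h_index([10, 5, 3, 2, 1]))  # -> 3
```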

The instructions for authors of each of the 43 journals were located on the journal’s webpage. In addition, if a journal’s instructions for authors cited additional documents that prospective authors were required to consult (e.g., publisher ethics guidelines), these too were located. Instructions for authors and related documents that were available in Portable Document Format (PDF) were downloaded. If the instructions and related documents were unavailable as a PDF, the webpage content was copied and pasted into a Word document. All the material reviewed was downloaded between April 3 and October 14, 2019. The downloaded instructions for authors of each journal, along with any relevant additional materials cited in these, were reviewed separately by each of the two authors, and mention of each publication procedure was recorded. The two reviewers met to discuss their ratings, and any discrepancies were resolved. Specifically, where there was disagreement about the presence or absence of requirements and recommendations in a particular set of instructions for authors, the reviewers re-reviewed these during the meeting and came to agreement as to whether the procedure was or was not described in the documents. The Center for Open Science list of journals that have adopted the Registered Reports publication format was also reviewed on December 19, 2019, for the inclusion of any of the 43 nutrition and dietetics journals [27].

For journals that published both original research and systematic reviews, the following six procedures were assessed: (i) conflict of interest disclosure; (ii) recommendation of specific reporting guidelines when writing up the results of studies for publication (e.g., CONSORT, PRISMA) or reference to the EQUATOR Network or FAIRsharing repositories of guidelines [28, 29]; (iii) registration of clinical trials in a registry such as ClinicalTrials.gov [30]; (iv) registration of other types of studies in a registry such as PROSPERO (for systematic reviews) [31]; (v) encouraging or requiring data sharing; and (vi) use of the Registered Reports publishing format [27]. For journals that exclusively published systematic reviews and meta-analyses, five of the six procedures were considered relevant, the exception being clinical trial registration. In addition, for two of the remaining five, it was anticipated that fewer guidelines would be relevant to journals that only publish reviews (e.g., PRISMA and Meta-analysis of Observational Studies in Epidemiology (MOOSE) [32]) and that just one registry (PROSPERO) would likely be utilized. These procedures were selected based on those used in previous studies of journal editorial practices [18,19,20,21,22,23], as well as those described in the broader literature on measures to reduce publication and reporting bias and promote research transparency [33,34,35].

The analyses presented are descriptive, pertaining to the number of publication procedures mentioned in each journal’s instructions for authors.
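As an illustration of this kind of descriptive tabulation, the hypothetical sketch below (not the authors’ actual analysis code; journal names and codings are invented) tallies, for each journal, how many of the assessed procedures its instructions for authors mention, and reports the range and mean across journals.

```python
from statistics import mean

# Hypothetical coding of three journals against the six assessed procedures
# (1 = mentioned in the instructions for authors, 0 = not mentioned).
procedures = ["conflict_of_interest", "reporting_guidelines", "trial_registration",
              "other_registration", "data_sharing", "registered_reports"]
coding = {
    "Journal A": [1, 1, 1, 0, 1, 0],
    "Journal B": [1, 1, 0, 0, 0, 0],
    "Journal C": [1, 1, 1, 1, 1, 0],
}

totals = {journal: sum(flags) for journal, flags in coding.items()}
print(totals)                                        # procedures mentioned per journal
print(min(totals.values()), max(totals.values()))    # range across journals
print(round(mean(totals.values()), 1))               # mean number of procedures
```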

Results

Table 1 shows the JCR impact factor, SJR h-index and publisher of the 43 journals that appeared in the top two quartiles (Q1 and Q2) of the 2018 JCR Nutrition and Dietetics category. Based on the aims described in their instructions for authors, ten of the journals were judged to focus exclusively on review articles, and these were reviewed separately from the 33 journals that published original research (in addition to reviews).

Table 1 2018 JCR Q1 and Q2 Nutrition and Dietetics Journals Arranged by JCR and Scimago Impact Factors and with Publisher

Table 2 shows which of the five publication procedures were mentioned in the instructions for authors of each of the ten journals that exclusively publish review articles. Conflict of interest disclosure was required by all ten journals, two of the other four procedures were mentioned by only three or four journals, and the remaining two were not mentioned by any journal in its instructions for authors.

Table 2 Specification in Instructions for Authors of Five Publication Procedures Designed to Improve Research Integrity by 2018 JCR Q1 and Q2 Nutrition and Dietetics Journals that Publish Reviews (n = 10)

Table 3 shows which of the six publication procedures were mentioned in the instructions for authors of the 33 nutrition and dietetics journals that published articles reporting original research. The number of procedures mentioned per journal ranged from 1 to 5, with a mean of 3.7. Twenty-three (70%) of the journals discussed four or more of the procedures in their instructions for authors.

Table 3 Specification in Instructions for Authors of Six Publication Procedures Designed to Improve Research Integrity by 2018 JCR Q1 and Q2 Nutrition and Dietetics Journals that Publish Original Research (n = 33)

All 33 journals required authors to disclose conflicts of interest. The second most widely recommended publication procedure was the use of reporting guidelines when writing up results from empirical studies: 30 of the 33 journals discussed guidelines in their instructions for authors or related documents, with 29 mentioning at least one specific set of guidelines and one mentioning only the EQUATOR Network. Registration of clinical trials was also common, with 27 journals requiring this of investigators. Compared to clinical trials, registration of research using other study designs was much less common, with just eight journals requiring this of investigators. Of these, seven required registration of systematic reviews in PROSPERO and one (Journal of Nutrition) required registration of observational studies but did not specify a particular registry.

Discussion of data sharing appeared in the instructions for authors of 25 of the 33 journals. For the vast majority, this involved encouraging data sharing for all types of studies. The three exceptions were Nutrition and Diabetes, for which data sharing was a condition of publication, and Obesity and Applied Physiology, Nutrition and Metabolism, both of which required a data sharing statement, but only for clinical trials. None of the journals had adopted the Registered Reports publication format at the time the review was conducted.

Table 4 shows the specific guidelines and guideline repositories recommended in each of the 30 journals that discussed these in their instructions for authors. Ten journals mentioned the EQUATOR Network repository of guidelines. Five of these journals also mentioned the FAIRsharing repository and all but one also mentioned a specific set of guidelines. A total of 19 specific guidelines were recommended, with the number mentioned by each journal that discussed these ranging from one to 12. Of the 29 that recommended a specific set of guidelines, 27 mentioned CONSORT, 21 ARRIVE, 19 PRISMA and 12 STROBE. Each of the other 15 guidelines was mentioned by fewer than 10 journals.

Table 4 Mention by Journals of the EQUATOR Network and Specific Guidelines (n = 30)

Discussion

The Registered Reports format has not been adopted at all within the discipline’s high impact academic peer-reviewed outlets, and registration of studies other than clinical trials is seldom required. Notably, however, and consistent with similar studies conducted in other academic disciplines, publication procedures such as conflict of interest disclosure, clinical trial registration, recommendation of reporting guidelines and data sharing have become standard editorial requirements in nutrition and dietetics journals.

Nevertheless, given this generally encouraging picture, the question arises as to why there is such concern about the quality and integrity of published research within the field of nutrition and dietetics studies [3,4,5,6]. In answer, the limited effects on research quality and transparency of the publication procedures examined in this study can best be explained by the mechanisms through which they influence analytic flexibility and selective reporting and by the extent to which they are properly implemented and enforced. For example, conflict of interest disclosure does not directly affect flexible data analysis and selective reporting of results. Rather, it alerts readers to the fact that one or more authors of a paper has a competing interest that may create a preference for results of a certain kind (e.g., those showing a positive effect of an intervention program). In addition, journal conflict of interest disclosure policies almost exclusively focus on financial conflicts, which is a potential limitation, especially in applied research fields where affiliation and confirmation biases may predispose investigators to favor certain types of results over others. In light of such influences, it has been argued that there is a need to focus on conflicts of interest beyond those of a financial nature, such as career-related advancement and ideological conflicts [3, 36]. The scope of these may be fairly wide in the field of nutrition and dietetics research. For example, a number of the papers retracted by the Cornell University Food and Brand Lab focused on meal sizes and the structure of eating habits, ideas for which its director was well known and about which he had written a popular book [37]. Kroeger and colleagues [3] propose that investigators who have published a book relevant to the topic of their research should report this as a financial conflict of interest.

However, even in the absence of such a book, it is likely that investigators will have a strong affiliation to the theories they have developed and their prior discoveries regarding these, and therefore their ability to objectively conduct replication studies may be limited [38]. How such affiliation bias should be covered by conflict of interest disclosures remains unclear, and it is questionable whether this is even feasible and desirable [39]. Accordingly, conflict of interest disclosures may not be the most efficient editorial procedure to employ when attempting to reduce analytic flexibility and selective outcome reporting. In the absence of being able to create a list of every possible “interest” that might influence how studies are conceptualized and conducted, it is probably easier to put in place publication procedures such as prospective registration and Registered Reports that make such practices difficult to engage in, whatever the motivation for their use.

Indeed, study registration has been presented by its advocates as a more direct way to address these problems [40] and, as was found in the present study, mandatory registration is becoming widespread among academic journals, at least in the case of randomized clinical trials. However, its application beyond this type of study has been minimal [21, 23], and this was found to be the case in the current analysis of nutrition and dietetics journals. The two main limitations of study registration are that a large proportion of registered studies are retrospectively registered and that many investigators do not adhere to key elements of their registered analysis plan when analyzing data and writing up results for publication [41,42,43,44]. So, while it is encouraging that more than 80% of journals required clinical trial registration, and while we would recommend that all journals require registration of studies using other research designs (e.g., systematic reviews, observational studies), it is essential that editors insist that registration is prospective rather than retrospective and that they require investigators to demonstrate adherence to their prospectively registered analysis plans [15, 45].

Lack of adherence and suboptimal oversight are also problems with reporting guidelines and data sharing, both of which tend to be recommended or encouraged, rather than required, by journals. As shown in the current study, a large number of reporting guidelines now exist, but it is unlikely that reviewers check manuscripts against all the items appearing in these guidelines, and research shows that adherence to many of the specific items described in CONSORT, PRISMA and STROBE remains poor [46,47,48]. If journals are to recommend the use of reporting guidelines, they should ensure that a formal procedure for assessing adherence is built into the peer-review process, with either authors or reviewers completing a compliance checklist.

While three-quarters of the nutrition and dietetics journals in the current study discussed data sharing in their instructions for authors, only one required it as a condition of publication. We recommend that all journals require a clear data sharing statement from authors giving precise reasons for their decisions as to how they will share their data, and justifying any decision not to share. If data are housed in a third-party repository, the means of accessing them should be specified; if they are not stored in such a repository, a clear procedure for obtaining the data from the authors should be described. The type of voluntary policy favored by nutrition and dietetics journals has been shown to result in fewer than one in ten investigators making raw empirical data accessible [49,50,51,52]. Moreover, even when data are shared by investigators, someone has to reanalyze them and identify discrepancies and irregularities in order for analytic flexibility and selective reporting to be detected. For this to have a deterrent effect, a great deal of such reanalysis of published studies would need to take place, something that is unlikely in the prevailing research culture, which places little value on replication studies [53]. Also, in the absence of a prespecified analysis plan, there is no way to recreate the study as initially designed and identify deviations from it in a replication study. Reproducibility also requires adherence to a prespecified analysis plan that describes the computational procedures used in the original research, as well as access to the original data and code [54].

Limitations

There are a number of limitations of this study that should be noted. First, not all nutrition and dietetics journals were included, and it is possible that other journals, such as those in Quartiles 3 and 4 of the JCR, use more or fewer of the publication procedures that were examined. Second, the instructions for authors and related materials were reviewed by just two individuals, who may each have failed to identify a publication procedure in the course of their review. Third, the paper presents a snapshot of the state of the discipline, and journals may have adopted one or more of the publication procedures assessed since the time the review was conducted. For example, we are aware that one of the journals included in our sample (the International Journal of Eating Disorders) has subsequently introduced Registered Reports as an option for authors. Fourth, the study presents no data as to whether the procedures examined were actually enforced by the journals that adopted them and adhered to by the authors who publish in these journals. As noted above, adherence and enforcement are essential to the effective operation of editorial policies in improving research transparency and quality.

Conclusions

In conclusion, publication procedures that lock investigators into prespecified hypotheses, methods, measures and statistical analyses have the greatest potential to limit analytic flexibility and selective reporting of results. Of the procedures examined in this study that best accomplish this, one (Registered Reports) was not used by any of the nutrition and dietetics journals reviewed and the other (registration) was rarely required outside of clinical trials. None of the journals that published only reviews and meta-analyses required that these be preregistered.

We recommend that all nutrition and dietetics journals adopt one or both of the procedures that lock in data analyses and allow differentiation of genuine confirmatory hypothesis testing from other types of research. Journals could then clearly distinguish (either by having separate sections or by including some indicator on the title page of each paper) confirmatory research from research that purports to be confirmatory but cannot demonstrate that the hypotheses tested were prespecified. A third category of genuine exploratory research could also be included. The journal instructions for authors could clearly specify the types of claims that can be made for studies within each of these categories. This simple procedure would prevent investigators who employ flexible data analysis practices and selective reporting from presenting the results of these as genuine hypothesis testing [13]. The goal would be, over time, to move more investigators to preregister their hypotheses and analysis plans, or to publish using the Registered Reports format, so that their papers could appear in the more prestigious confirmatory category, which allows stronger statements about the validity and implications of the study. At present, nutrition and dietetics journals make no distinction between results that emerge from flexible data analysis and selective reporting and those that do not, and therefore there is no incentive for investigators to prespecify their hypotheses and analysis plans.

Availability of data and materials

The dataset generated and analyzed during the current study is available from the corresponding author on reasonable request.

Abbreviations

ARRIVE:

Animal Research: Reporting of in Vivo Experiments

CONSORT:

Consolidated Standards of Reporting Trials

EQUATOR:

Enhancing the QUAlity and Transparency Of health Research

JCR:

Journal Citation Report

MOOSE:

Meta-analysis Of Observational Studies in Epidemiology

PDF:

Portable Document Format

PRISMA:

Preferred Reporting Items for Systematic Reviews and Meta-Analyses

SJR:

Scimago Journal Ranking

STROBE:

Strengthening the Reporting of Observational Studies in Epidemiology

References

  1. van der Zee T, Anaya J, Brown NJL. Statistical heartburn: an attempt to digest four pizza publications from the Cornell Food and Brand Lab. BMC Nutr. 2017;3(54):1–15.

  2. Munafò MR, Hollands GJ, Marteau T. Open science prevents mindless science. BMJ. 2018;363(k4309):1–2.

  3. Kroeger CM, Garza C, Lynch CJ, Myers E, Rowe S, Schneeman BO, Sharma AM, Allison DB. Scientific rigor and credibility in the nutrition research landscape. Am J Clin Nutr. 2018;107(3):484–94.

  4. Brown AW, Altman DG, Baranowski T, Bland JM, Dawson JA, Dhurandhar NV, Dowla S, Fontaine KR, Gelman A, Heymsfield SB, et al. Childhood obesity intervention studies: a narrative review and guide for investigators, authors, editors, reviewers, journalists, and readers to guard against exaggerated effectiveness claims. Obes Rev. 2019;20(11):1523–41.

  5. Mandrioli D, Kearns CE, Bero LA. Relationship between research outcomes and risk of bias, study sponsorship, and author financial conflicts of interest in reviews of the effects of artificially sweetened beverages on weight outcomes: a systematic review of reviews. PLoS One. 2016;11(9):e0162198.

  6. Fabbri A, Lai A, Grundy Q, Bero LA. The influence of industry sponsorship on the research agenda: a scoping review. Am J Public Health. 2018;108(11):e9–e16.

  7. Chartres N, Fabbri A, Bero LA. Association of industry sponsorship with outcomes of nutrition studies: a systematic review and meta-analysis. JAMA Intern Med. 2016;176(12):1769–77.

  8. Bero LA, Norris SL, Lawrence MA. Making nutrition guidelines fit for purpose. BMJ. 2019;365:l1579.

  9. Ioannidis JPA. The challenge of reforming nutritional epidemiologic research. JAMA. 2018;320(10):969–70.

  10. Schoenfeld JD, Ioannidis JPA. Is everything we eat associated with cancer? A systematic cookbook review. Am J Clin Nutr. 2013;97:127–34.

  11. Nuzzo R. Fooling ourselves. Nature. 2015;526:182–5.

  12. Shrout PE, Rodgers JL. Psychology, science, and knowledge construction: broadening perspectives from the replication crisis. Annu Rev Psychol. 2018;69:487–510.

  13. Wagenmakers E-J, Wetzels R, Borsboom D, van der Maas HLJ, Kievit RA. An agenda for purely confirmatory research. Perspect Psychol Sci. 2012;7:632–8.

  14. Miguel E, Camerer C, Casey K, Cohen J, Esterling KM, Gerber A, et al. Promoting transparency in social science research. Science. 2014;343:30–1.

  15. International Committee of Medical Journal Editors. Recommendations for the conduct, reporting, editing, and publication of scholarly work in medical journals. 2018. Available from: http://www.icmje.org/icmje-recommendations.pdf. Cited 19 Nov 2019.

  16. Schulz KF, Altman DG, Moher D, CONSORT Group. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. PLoS Med. 2010;7(3):e1000251.

  17. Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med. 2009;151:264–9.

  18. Meerpohl JJ, Wolff RF, Niemeyer CM, Antes G, von Elm E. Editorial policies for pediatric journals: survey of instructions for authors. Arch Pediatr Adolesc Med. 2010;164(3):268–72.

  19. Meerpohl JJ, Wolff RF, Antes G, von Elm E. Are pediatric open access journals promoting good publication practice? An analysis of author instructions. BMC Pediatr. 2011;11(27):1–7.

  20. Smith TA, Kulatilake P, Brown LJ, Wigley J, Hameed W, Shantikumar S. Do surgery journals insist on reporting by CONSORT and PRISMA? A follow-up survey of ‘instructions to authors’. Ann Med Surg. 2015;4(1):17–21.

  21. Sims ST, Henning NM, Wayant CC, Vassar M. Do emergency medicine journals promote trial registration and adherence to reporting guidelines? A survey of “Instructions for Authors”. Scand J Trauma Resusc Emerg Med. 2016;24(1):137.

  22. Checketts JX, Sims MT, Detweiler B, Middlemist K, Jones J, Vassar M. An evaluation of reporting guidelines and clinical trial registry requirements among orthopaedic surgery journals. J Bone Joint Surg Am. 2018;100(3):e15.

  23. Gorman DM. Use of publication procedures to improve research integrity by addiction journals. Addiction. 2019;114(8):1478–86.

  24. Clarivate Analytics. 2018 journal citation reports. 2018. Available from: https://clarivate.com/webofsciencegroup/wp-content/uploads/sites/2/2019/10/Crv_JCR_Full-Marketing-List_A4_2018_v4.pdf. Cited 29 Jan 2020.

  25. Larivière V, Sugimoto CR. The journal impact factor: a brief history, critique, and discussion of adverse effects. In: Glänzel W, Moed HF, Schmoch U, Thelwall M, editors. Springer handbook of science and technology indicators. Cham (Switzerland): Springer International Publishing; 2018. p. 3–24.

  26. Scimago Journal & Country Rank. 2019. Available from: https://www.scimagojr.com.

  27. Center for Open Science. Registered Reports: peer review before results are known to align scientific values and practices. Available from: https://cos.io/rr/. Cited 19 Dec 2019.

  28. EQUATOR Network. Enhancing the QUAlity and Transparency Of health Research. 2019. Available from: https://www.equator-network.org. Cited 19 Dec 2019.

  29. FAIRsharing.org. Standards, databases, policies. 2020. Available from: https://fairsharing.org/policies/. Cited 8 Jan 2020.

  30. U.S. National Library of Medicine. ClinicalTrials.gov. Available from: https://clinicaltrials.gov. Cited 2 Jan 2020.

  31. National Institute for Health Research. PROSPERO: international prospective register of systematic reviews. Available from: https://www.crd.york.ac.uk/prospero/. Cited 2 Jan 2020.

  32. Stroup DF, Berlin JA, Morton SC, Olkin I, Williamson GD, Rennie D, Moher D, Becker BJ, Sipe TA, Thacker SB. Meta-analysis of observational studies in epidemiology: a proposal for reporting. Meta-analysis Of Observational Studies in Epidemiology (MOOSE) group. JAMA. 2000;283(15):2008–12.

  33. Ioannidis JPA, Munafò MR, Fusar-Poli P, Nosek BA, David SP. Publication and other reporting biases in cognitive sciences: detection, prevalence and prevention. Trends Cogn Sci. 2014;18(5):235–41.

  34. Nosek BA, Alter G, Banks GC, Borsboom D, Bowman SD, Breckler SJ, et al. Promoting an open research culture. Science. 2015;348(6242):1422–5.

  35. Munafò MR, Nosek BA, Bishop D, Button KS, Chambers CD, du Sert NP, Simonsohn U, Wagenmakers E-J, Ware JJ, Ioannidis JPA. A manifesto for reproducible science. Nat Hum Behav. 2017;1:1–9.

  36. Galea S. A typology of nonfinancial conflict in population health research. Am J Public Health. 2018;108:631–2.

  37. Bero LA, Grundy Q. Why having a (nonfinancial) interest is not a conflict of interest. PLoS Biol. 2016;14(12):e2001221.

  38. Ioannidis JPA. Scientific inbreeding and same-team replication: type D personality as an example. J Psychosom Res. 2012;73:408–10.

  39. Wansink B. Mindless eating: why we eat more than we think. New York: Bantam Books; 2006.

  40. Nosek BA, Ebersole CR, DeHaven AC, Mellor DT. The preregistration revolution. PNAS. 2018;115:2600–6.

  41. Harriman SL, Patel J. When are clinical trials registered? An analysis of prospective versus retrospective registration. Trials. 2016;17:187.

  42. Hunter KE, Seidler AL, Askie LM. Prospective registration trends, reasons for retrospective registration and mechanisms to increase prospective registration compliance: descriptive analysis and survey. BMJ Open. 2018;8:e019983.

  43. van Lent M, IntHout J, Out HJ. Differences between information in registries and articles did not influence publication acceptance. J Clin Epidemiol. 2015;68:1059–67.

  44. Rankin J, Ross A, Baker J, O’Brien M, Scheckel C, Vassar M. Selective outcome reporting in obesity clinical trials: a cross-sectional review. Clin Obes. 2017;7:245–54.

  45. Smith SM, Dworkin RH. Prospective clinical trial registration: not sufficient, but always necessary. Anaesthesia. 2018;73(5):538–41.

  46. Page MJ, Moher D. Evaluations of the uptake and impact of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement and extensions: a scoping review. Syst Rev. 2017;6:263.

  47. Li G, Bhatt M, Wang M, Mbuagbaw L, Samaan Z, Thabane L. Enhancing primary reports of randomized controlled trials: three most common challenges and suggested solutions. PNAS. 2018;115:2595–9.

  48. Pouwels KB, Widyakusuma NN, Groenwold RHH, Hak E. Quality of reporting of confounding remained suboptimal after the STROBE guideline. J Clin Epidemiol. 2016;69:217–24.

  49. Wallach JD, Boyack KW, Ioannidis JPA. Reproducible research practices, transparency, and open access data in the biomedical literature, 2015–2017. PLoS Biol. 2018;16(11):e2006930.

  50. Vidal-Infer A, Aleixandre-Benavent R, Lucas-Dominguez R, Sixto-Costoya A. The availability of raw data in substance abuse scientific journals. J Subst Use. 2019;24:36–40.

  51. Walters C, Harter ZJ, Wayant C, Vo N, Warren M, Chronister J, Tritz D, Vassar M. Do oncology researchers adhere to reproducible and transparent principles? A cross-sectional survey of published oncology literature. BMJ Open. 2019;9:e033962.

  52. Gorman DM. Availability of research data in high-impact addiction journals with a data sharing policy. Sci Eng Ethics. 2020;26:1625–32.

  53. Ioannidis JPA. How to make more published research true. PLoS Med. 2014;11(10):e1001747.

  54. National Academies of Sciences, Engineering, and Medicine. Reproducibility and replicability in science. Washington, DC: The National Academies Press; 2019.

Acknowledgements

The authors have no acknowledgements to disclose.

Funding

No funding source supported this work.

Author information

Authors and Affiliations

Authors

Contributions

DMG conceptualized the study and design, participated in the analysis and interpretation of data, and drafted and revised the article. AOF participated in the analysis and interpretation of data, and revised the article. The author(s) read and approved the final manuscript.

Authors’ information

DMG is a Professor in the Department of Epidemiology & Biostatistics, Texas A&M University School of Public Health. AOF is an Associate Professor in the Department of Health Policy & Management, Texas A&M University School of Public Health.

Corresponding author

Correspondence to Alva O. Ferdinand.

Ethics declarations

Ethics approval and consent to participate

Institutional review board approval was not needed for this study because it did not involve human subjects and instead relied on publicly available Instructions for Authors for various journals.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Gorman, D.M., Ferdinand, A.O. High impact nutrition and dietetics journals’ use of publication procedures to increase research transparency. Res Integr Peer Rev 5, 12 (2020). https://doi.org/10.1186/s41073-020-00098-9
