Introduction

Science is an incremental process of creating and organizing knowledge through theories and testable predictions. Reproducibility is a core part of science: being able to repeat or recreate scientific results is essential for the complex process of knowledge accumulation. Given its relevance, different terms have been introduced to describe specific aspects of the process (e.g., “reproducibility” when the same data and methods are used, “replicability” when new data and potentially new methods are used, “robustness” when the same data but different methods are used, and “generalizability” when new data and methods are used) [The Turing Way Community et al., 2019]. Here, we use “reproducibility” as an umbrella term comprising all of these aspects, referring to the ability to recreate scientific results [Poldrack et al., 2020]. Open science tools and practices have been developed to advance reproducibility, as well as accessibility and transparency, at all stages of the research cycle and across all levels of society. Together, they remove barriers to sharing and facilitate collaboration, with the goal of improving reproducibility and, ultimately, accelerating scientific discoveries.

Issue

Empirical observations of how labs conduct research indicate that, unfortunately, the adoption of open practices and tools for reproducible and collaborative science remains in its infancy.

Even when members of a specific scientific community have taken a central role in open science advocacy and tool development, as in the neuroimaging community, the impact on the rest of that very community is limited. A recent survey [Paret et al., 2021], which included researchers who are senior and likely to hold a positive attitude towards open science, indicated that 42% had never pre-registered a neuroimaging study and 34% had never shared their raw neuroimaging data. Many of those who indicated that they had pre-registered or shared their data at least once likely did not do so in all their studies; thus, the actual rate of pre-registration and data sharing in neuroimaging is likely much lower.

The limited adoption of open science practices is at odds with the overwhelming evidence that a lack of open practices and tools can hinder reproducibility, with costs for scientific progress and for society. Indeed, reproducibility issues have been undermining the foundation of scientific research in several fields, such as psychology [Klein et al., 2018, Open Science Collaboration, 2015], the social sciences [Camerer et al., 2016, Camerer et al., 2018], neuroimaging [Botvinik-Nezer et al., 2020, Li et al., 2021, Munafò et al., 2017], preclinical cancer biology research [Errington et al., 2021, Errington et al., 2021], and more [Hutson, 2018, Nissen et al., 2016, Serra-Garcia and Gneezy, 2021]. As a response, there has been a rise in the development of tools and approaches to facilitate reproducibility and open science, in the spirit of the Findability, Accessibility, Interoperability, and Reusability (FAIR) principles [Clayson et al., 2022, Gorgolewski and Poldrack, 2016, Nosek et al., 2019, Nosek et al., 2012, Poldrack et al., 2017, Poldrack et al., 2019, Poldrack et al., 2020, Wilkinson et al., 2016]. Beyond their potential to mitigate transparency and reproducibility issues, these practices provide important benefits for individual researchers by increasing exposure, reputation, chances of publication, number of citations, media attention, potential collaborations, and position and funding opportunities [Allen and Mehler, 2019, Hunt, 2019, Markowetz, 2015, McKiernan et al., 2016, Nosek et al., 2022]. Hence, one could have expected a higher uptake of such beneficial practices and tools.

In parallel, policies are changing top-down to further support the adoption of open science practices and tools. For example, funding agencies now enforce certain open data practices for publicly funded research ([de San Román, 2021]; e.g., the NIH in the U.S. and the ERC in Europe; [de Jonge et al., 2021]), and some require a plan for research data storage and sharing, openly accessible publication formats, and dissemination plans beyond the classical journal publication. Additionally, they provide funding for the development of the software, hardware, and collaborative infrastructure needed to support the transition to open and reproducible neuroscience (e.g., the NIH BRAIN Initiative, the NIH ReproNim project [Kennedy et al., 2019], NSF CRCNS, the EU Human Brain Project, the German NFDI). These efforts by funding agencies are complemented by stakeholder institutions such as the OHBM, the International Neuroinformatics Coordinating Facility (INCF), the Chinese Open Science Network (COSN), and the Open Science Framework (OSF), which provide platforms for developing standards and best practices for open and FAIR neuroscience research, assemble training materials, support quality control, and promote open science practices. Moreover, journals have started changing their policies with regard to open access options and data sharing. Together, these institutional measures aim to foster the benefits of open science practices, and the adoption of open and reproducible science standards will be increasingly required of labs and individual researchers.

Issue

Nevertheless, multiple barriers to entry are driving the modest rate of adoption of open science practices in the general research community. Among them are a lack of knowledge or training and a lack of skills or resources.

A survey by Borghi and Van Gulick [2018] found that 65% of researchers reported openness and reproducibility as motivations for implementing research data management in MRI, but 40–50% pointed to the lack of best practices/tools and knowledge/training as the main obstacles to embracing these practices. Likewise, a more recent survey indicated that similar percentages of researchers in neuroimaging have never learned how to pre-register or share their data online and that they know too little about pre-registration platforms and suitable data repositories [Paret et al., 2021]. These latter challenges could be alleviated by a simplified overview of the open resources available. However, the information required for implementing open science practices over the full research cycle is currently scattered among many different sources. Even researchers experienced in the topic often find it hard to navigate the ecosystem of community-developed tools and to make sustainable choices.

What do we provide

This manuscript provides an integrated overview of community-developed resources critical to support open and reproducible neuroimaging throughout the entire research cycle and across different neuroimaging modalities (particularly MRI, MEG, EEG, and PET).

Most previous reviews on the topic have focused on the importance and benefits of open and reproducible science [Munafò et al., 2017, Nosek et al., 2012, Poldrack et al., 2017, McKiernan et al., 2016]. Here, we instead focus on a detailed overview, which we believe is urgently needed to accelerate the wider adoption of open science tools and practices by the general research community. The goal is to increase scientific reproducibility and openness by making it easier for scientists to select the best instruments offered by open science at every step of their research workflow. For each resource, we explain why it implements good practices and provide information on how to access it and how to integrate it into the research workflow.

What do we not provide

We do not recommend particular tools over others in this review, as the ideal tools may depend on many factors that vary between researchers. However, we can recommend points to consider when selecting tools. Typically, we want to choose tools that integrate with the tools and practices already established in the lab, are easy to learn, and provide long-term benefit.

In order to increase the likelihood of tools being sustainable, they should be relatively mature, well maintained, and supported by an active community. Another indicator is whether the tools and practices are integrated into already established toolboxes or supported by one of the larger open science organizations. If multiple tools still meet these criteria, then it might be advantageous to choose one that is used by peers and collaboration partners. When we recommend practices, we state the problems they are supposed to address.

We also encourage the readers to join the development teams and leadership of those tools, becoming an active part of the open neuroimaging community. Contributions from individuals who are experiencing barriers to the uptake of specific practices are particularly encouraged, since they can help mitigate these barriers for the benefit of everyone.

Resources table

To further guide the readers, the manuscript is accompanied by a detailed table containing links and pointers to the resources featured in the text of each section (see the resources table).

References on this page
A1

Christopher Allen and David M A Mehler. Open science challenges, benefits and tips in early career and beyond. PLoS Biology, 17(5):e3000246, May 2019.

A2

John A Borghi and Ana E Van Gulick. Data management and sharing in neuroimaging: practices and perceptions of MRI researchers. PLoS One, 13(7):e0200562, July 2018.

A3

Rotem Botvinik-Nezer, Felix Holzmeister, Colin F Camerer, Anna Dreber, Juergen Huber, Magnus Johannesson, Michael Kirchler, Roni Iwanir, Jeanette A Mumford, R Alison Adcock, Paolo Avesani, Blazej M Baczkowski, Aahana Bajracharya, Leah Bakst, Sheryl Ball, Marco Barilari, Nadège Bault, Derek Beaton, Julia Beitner, Roland G Benoit, Ruud M W J Berkers, Jamil P Bhanji, Bharat B Biswal, Sebastian Bobadilla-Suarez, Tiago Bortolini, Katherine L Bottenhorn, Alexander Bowring, Senne Braem, Hayley R Brooks, Emily G Brudner, Cristian B Calderon, Julia A Camilleri, Jaime J Castrellon, Luca Cecchetti, Edna C Cieslik, Zachary J Cole, Olivier Collignon, Robert W Cox, William A Cunningham, Stefan Czoschke, Kamalaker Dadi, Charles P Davis, Alberto De Luca, Mauricio R Delgado, Lysia Demetriou, Jeffrey B Dennison, Xin Di, Erin W Dickie, Ekaterina Dobryakova, Claire L Donnat, Juergen Dukart, Niall W Duncan, Joke Durnez, Amr Eed, Simon B Eickhoff, Andrew Erhart, Laura Fontanesi, G Matthew Fricke, Shiguang Fu, Adriana Galván, Remi Gau, Sarah Genon, Tristan Glatard, Enrico Glerean, Jelle J Goeman, Sergej A E Golowin, Carlos González-García, Krzysztof J Gorgolewski, Cheryl L Grady, Mikella A Green, João F Guassi Moreira, Olivia Guest, Shabnam Hakimi, J Paul Hamilton, Roeland Hancock, Giacomo Handjaras, Bronson B Harry, Colin Hawco, Peer Herholz, Gabrielle Herman, Stephan Heunis, Felix Hoffstaedter, Jeremy Hogeveen, Susan Holmes, Chuan-Peng Hu, Scott A Huettel, Matthew E Hughes, Vittorio Iacovella, Alexandru D Iordan, Peder M Isager, Ayse I Isik, Andrew Jahn, Matthew R Johnson, Tom Johnstone, Michael J E Joseph, Anthony C Juliano, Joseph W Kable, Michalis Kassinopoulos, Cemal Koba, Xiang-Zhen Kong, Timothy R Koscik, Nuri Erkut Kucukboyaci, Brice A Kuhl, Sebastian Kupek, Angela R Laird, Claus Lamm, Robert Langner, Nina Lauharatanahirun, Hongmi Lee, Sangil Lee, Alexander Leemans, Andrea Leo, Elise Lesage, Flora Li, Monica Y C Li, Phui Cheng Lim, Evan N Lintz, Schuyler W Liphardt, Annabel B 
Losecaat Vermeer, Bradley C Love, Michael L Mack, Norberto Malpica, Theo Marins, Camille Maumet, Kelsey McDonald, Joseph T McGuire, Helena Melero, Adriana S Méndez Leal, Benjamin Meyer, Kristin N Meyer, Glad Mihai, Georgios D Mitsis, Jorge Moll, Dylan M Nielson, Gustav Nilsonne, Michael P Notter, Emanuele Olivetti, Adrian I Onicas, Paolo Papale, Kaustubh R Patil, Jonathan E Peelle, Alexandre Pérez, Doris Pischedda, Jean-Baptiste Poline, Yanina Prystauka, Shruti Ray, Patricia A Reuter-Lorenz, Richard C Reynolds, Emiliano Ricciardi, Jenny R Rieck, Anais M Rodriguez-Thompson, Anthony Romyn, Taylor Salo, Gregory R Samanez-Larkin, Emilio Sanz-Morales, Margaret L Schlichting, Douglas H Schultz, Qiang Shen, Margaret A Sheridan, Jennifer A Silvers, Kenny Skagerlund, Alec Smith, David V Smith, Peter Sokol-Hessner, Simon R Steinkamp, Sarah M Tashjian, Bertrand Thirion, John N Thorp, Gustav Tinghög, Loreen Tisdall, Steven H Tompson, Claudio Toro-Serey, Juan Jesus Torre Tresols, Leonardo Tozzi, Vuong Truong, Luca Turella, Anna E van 't Veer, Tom Verguts, Jean M Vettel, Sagana Vijayarajah, Khoi Vo, Matthew B Wall, Wouter D Weeda, Susanne Weis, David J White, David Wisniewski, Alba Xifra-Porxas, Emily A Yearling, Sangsuk Yoon, Rui Yuan, Kenneth S L Yuen, Lei Zhang, Xu Zhang, Joshua E Zosky, Thomas E Nichols, Russell A Poldrack, and Tom Schonberg. Variability in the analysis of a single neuroimaging dataset by many teams. Nature, 582(7810):84–88, June 2020.

A4

Colin F Camerer, Anna Dreber, Eskil Forsell, Teck-Hua Ho, Jürgen Huber, Magnus Johannesson, Michael Kirchler, Johan Almenberg, Adam Altmejd, Taizan Chan, Emma Heikensten, Felix Holzmeister, Taisuke Imai, Siri Isaksson, Gideon Nave, Thomas Pfeiffer, Michael Razen, and Hang Wu. Evaluating replicability of laboratory experiments in economics. Science, 351(6280):1433–1436, March 2016.

A5

Colin F Camerer, Anna Dreber, Felix Holzmeister, Teck-Hua Ho, Jürgen Huber, Magnus Johannesson, Michael Kirchler, Gideon Nave, Brian A Nosek, Thomas Pfeiffer, Adam Altmejd, Nick Buttrick, Taizan Chan, Yiling Chen, Eskil Forsell, Anup Gampa, Emma Heikensten, Lily Hummer, Taisuke Imai, Siri Isaksson, Dylan Manfredi, Julia Rose, Eric-Jan Wagenmakers, and Hang Wu. Evaluating the replicability of social science experiments in nature and science between 2010 and 2015. Nature Human Behaviour, 2(9):637–644, September 2018.

A6

Peter E Clayson, Andreas Keil, and Michael J Larson. Open science in human electrophysiology. Int. J. Psychophysiol., 174:43–46, April 2022.

A7

Hans de Jonge, Maria Cruz, and Stephanie Holst. Funders need to credit open science. Nature, 599(7885):372, November 2021.

A8

Alea López de San Román. Open science in Horizon Europe. April 2021.

A9

Timothy M Errington, Alexandria Denis, Nicole Perfito, Elizabeth Iorns, and Brian A Nosek. Challenges for assessing replicability in preclinical cancer biology. eLife, 2021.

A10

Timothy M Errington, Maya Mathur, Courtney K Soderberg, Alexandria Denis, Nicole Perfito, Elizabeth Iorns, and Brian A Nosek. Investigating the replicability of preclinical cancer biology. eLife, December 2021.

A11

Krzysztof J Gorgolewski and Russell A Poldrack. A practical guide for improving transparency and reproducibility in neuroimaging research. PLoS Biology, 14(7):e1002506, July 2016.

A12

Laurence T Hunt. The life-changing magic of sharing your data. Nat Hum Behav, 3(4):312–315, April 2019.

A13

Matthew Hutson. Artificial intelligence faces reproducibility crisis. Science, 359(6377):725–726, February 2018.

A14

David N Kennedy, Sanu A Abraham, Julianna F Bates, Albert Crowley, Satrajit Ghosh, Tom Gillespie, Mathias Goncalves, Jeffrey S Grethe, Yaroslav O Halchenko, Michael Hanke, Christian Haselgrove, Steven M Hodge, Dorota Jarecka, Jakub Kaczmarzyk, David B Keator, Kyle Meyer, Maryann E Martone, Smruti Padhy, Jean-Baptiste Poline, Nina Preuss, Troy Sincomb, and Matt Travers. Everything matters: the ReproNim perspective on reproducible neuroimaging. Front. Neuroinform., 13:1, February 2019.

A15

Richard A Klein, Michelangelo Vianello, Fred Hasselman, Byron G Adams, Reginald B Adams, Jr, Sinan Alper, Mark Aveyard, Jordan R Axt, Mayowa T Babalola, Štěpán Bahník, Rishtee Batra, Mihály Berkics, Michael J Bernstein, Daniel R Berry, Olga Bialobrzeska, Evans Dami Binan, Konrad Bocian, Mark J Brandt, Robert Busching, Anna Cabak Rédei, Huajian Cai, Fanny Cambier, Katarzyna Cantarero, Cheryl L Carmichael, Francisco Ceric, Jesse Chandler, Jen-Ho Chang, Armand Chatard, Eva E Chen, Winnee Cheong, David C Cicero, Sharon Coen, Jennifer A Coleman, Brian Collisson, Morgan A Conway, Katherine S Corker, Paul G Curran, Fiery Cushman, Zubairu K Dagona, Ilker Dalgar, Anna Dalla Rosa, William E Davis, Maaike de Bruijn, Leander De Schutter, Thierry Devos, Marieke de Vries, Canay Doğulu, Nerisa Dozo, Kristin Nicole Dukes, Yarrow Dunham, Kevin Durrheim, Charles R Ebersole, John E Edlund, Anja Eller, Alexander Scott English, Carolyn Finck, Natalia Frankowska, Miguel-Ángel Freyre, Mike Friedman, Elisa Maria Galliani, Joshua C Gandi, Tanuka Ghoshal, Steffen R Giessner, Tripat Gill, Timo Gnambs, Ángel Gómez, Roberto González, Jesse Graham, Jon E Grahe, Ivan Grahek, Eva G T Green, Kakul Hai, Matthew Haigh, Elizabeth L Haines, Michael P Hall, Marie E Heffernan, Joshua A Hicks, Petr Houdek, Jeffrey R Huntsinger, Ho Phi Huynh, Hans IJzerman, Yoel Inbar, Åse H Innes-Ker, William Jiménez-Leal, Melissa-Sue John, Jennifer A Joy-Gaba, Roza G Kamiloğlu, Heather Barry Kappes, Serdar Karabati, Haruna Karick, Victor N Keller, Anna Kende, Nicolas Kervyn, Goran Knežević, Carrie Kovacs, Lacy E Krueger, German Kurapov, Jamie Kurtz, Daniël Lakens, Ljiljana B Lazarević, Carmel A Levitan, Neil A Lewis, Jr, Samuel Lins, Nikolette P Lipsey, Joy E Losee, Esther Maassen, Angela T Maitner, Winfrida Malingumu, Robyn K Mallett, Satia A Marotta, Janko Međedović, Fernando Mena-Pacheco, Taciano L Milfont, Wendy L Morris, Sean C Murphy, Andriy Myachykov, Nick Neave, Koen Neijenhuijs, Anthony J Nelson, Félix Neto, 
Austin Lee Nichols, Aaron Ocampo, Susan L O'Donnell, Haruka Oikawa, Masanori Oikawa, Elsie Ong, Gábor Orosz, Malgorzata Osowiecka, Grant Packard, Rolando Pérez-Sánchez, Boban Petrović, Ronaldo Pilati, Brad Pinter, Lysandra Podesta, Gabrielle Pogge, Monique M H Pollmann, Abraham M Rutchick, Patricio Saavedra, Alexander K Saeri, Erika Salomon, Kathleen Schmidt, Felix D Schönbrodt, Maciej B Sekerdej, David Sirlopú, Jeanine L M Skorinko, Michael A Smith, Vanessa Smith-Castro, Karin C H J Smolders, Agata Sobkow, Walter Sowden, Philipp Spachtholz, Manini Srivastava, Troy G Steiner, Jeroen Stouten, Chris N H Street, Oskar K Sundfelt, Stephanie Szeto, Ewa Szumowska, Andrew C W Tang, Norbert Tanzer, Morgan J Tear, Jordan Theriault, Manuela Thomae, David Torres, Jakub Traczyk, Joshua M Tybur, Adrienn Ujhelyi, Robbie C M van Aert, Marcel A L M van Assen, Marije van der Hulst, Paul A M van Lange, Anna Elisabeth van 't Veer, Alejandro Vásquez- Echeverría, Leigh Ann Vaughn, Alexandra Vázquez, Luis Diego Vega, Catherine Verniers, Mark Verschoor, Ingrid P J Voermans, Marek A Vranka, Cheryl Welch, Aaron L Wichman, Lisa A Williams, Michael Wood, Julie A Woodzicka, Marta K Wronska, Liane Young, John M Zelenski, Zeng Zhijia, and Brian A Nosek. Many labs 2: investigating variation in replicability across samples and settings. Advances in Methods and Practices in Psychological Science, 1(4):443–490, December 2018.

A16

X Li, L Ai, S Giavasis, H Jin, E Feczko, T Xu, J Clucas, A Franco, A S Heinsfeld, A Adebimpe, and others. Moving beyond processing and analysis-related variation in neuroscience. bioRxiv, 2021.

A17

Florian Markowetz. Five selfish reasons to work reproducibly. Genome Biol., 16:274, December 2015.

A18

Erin C McKiernan, Philip E Bourne, C Titus Brown, Stuart Buck, Amye Kenall, Jennifer Lin, Damon McDougall, Brian A Nosek, Karthik Ram, Courtney K Soderberg, Jeffrey R Spies, Kaitlin Thaney, Andrew Updegrove, Kara H Woo, and Tal Yarkoni. How open science helps researchers succeed. eLife, July 2016.

A19

Marcus R Munafò, Brian A Nosek, Dorothy V M Bishop, Katherine S Button, Christopher D Chambers, Nathalie Percie Du Sert, Uri Simonsohn, Eric Jan Wagenmakers, Jennifer J Ware, and John P A Ioannidis. A manifesto for reproducible science. Nature Human Behaviour, 1(1):1–9, 2017.

A20

Silas Boye Nissen, Tali Magidson, Kevin Gross, and Carl T Bergstrom. Publication bias and the canonization of false facts. eLife, December 2016.

A21

Brian A Nosek, Emorie D Beck, Lorne Campbell, Jessica K Flake, Tom E Hardwicke, David T Mellor, Anna E van 't Veer, and Simine Vazire. Preregistration is hard, and worthwhile. Trends in Cognitive Sciences, 23(10):815–818, October 2019.

A22

Brian A Nosek, Tom E Hardwicke, Hannah Moshontz, Aurélien Allard, Katherine S Corker, Anna Dreber, Fiona Fidler, Joe Hilgard, Melissa Kline Struhl, Michèle B Nuijten, Julia M Rohrer, Felipe Romero, Anne M Scheel, Laura D Scherer, Felix D Schönbrodt, and Simine Vazire. Replicability, robustness, and reproducibility in psychological science. Annu. Rev. Psychol., 73:719–748, January 2022.

A23

Brian A Nosek, Jeffrey R Spies, and Matt Motyl. Scientific utopia: II. restructuring incentives and practices to promote truth over publishability. Perspect. Psychol. Sci., 7(6):615–631, November 2012.

A24

Christian Paret, Nike Unverhau, Franklin Feingold, Russell A Poldrack, Madita Stirner, Christian Schmahl, and Maurizio Sicorello. Survey on open science practices in functional neuroimaging. bioRxiv, 2021.11.26.470115, November 2021.

A25

Russell A Poldrack, Chris I Baker, Joke Durnez, Krzysztof J Gorgolewski, Paul M Matthews, Marcus R Munafò, Thomas E Nichols, Jean Baptiste Poline, Edward Vul, and Tal Yarkoni. Scanning the horizon: towards transparent and reproducible neuroimaging research. Nat. Rev. Neurosci., 18(2):115–126, 2017.

A26

Russell A Poldrack, Franklin Feingold, Michael J Frank, Padraig Gleeson, Gilles de Hollander, Quentin Jm Huys, Bradley C Love, Christopher J Markiewicz, Rosalyn Moran, Petra Ritter, Timothy T Rogers, Brandon M Turner, Tal Yarkoni, Ming Zhan, and Jonathan D Cohen. The importance of standards for sharing of computational models and data. Computational Brain & Behavior, 2(3-4):229–232, December 2019.

A27

Russell A Poldrack, Grace Huckins, and Gael Varoquaux. Establishment of best practices for evidence for prediction: a review. JAMA Psychiatry, 77(5):534–540, May 2020.

A28

Russell A Poldrack, Kirstie Whitaker, and David Kennedy. Introduction to the special issue on reproducibility in neuroimaging. Neuroimage, 218:116357, September 2020.

A29

Marta Serra-Garcia and Uri Gneezy. Nonreplicable publications are cited more than replicable ones. Science Advances, May 2021.

A30

Mark D Wilkinson, Michel Dumontier, I Jsbrand Jan Aalbersberg, Gabrielle Appleton, Myles Axton, Arie Baak, Niklas Blomberg, Jan-Willem Boiten, Luiz Bonino da Silva Santos, Philip E Bourne, Jildau Bouwman, Anthony J Brookes, Tim Clark, Mercè Crosas, Ingrid Dillo, Olivier Dumon, Scott Edmunds, Chris T Evelo, Richard Finkers, Alejandra Gonzalez-Beltran, Alasdair J G Gray, Paul Groth, Carole Goble, Jeffrey S Grethe, Jaap Heringa, Peter A C 't Hoen, Rob Hooft, Tobias Kuhn, Ruben Kok, Joost Kok, Scott J Lusher, Maryann E Martone, Albert Mons, Abel L Packer, Bengt Persson, Philippe Rocca-Serra, Marco Roos, Rene van Schaik, Susanna-Assunta Sansone, Erik Schultes, Thierry Sengstag, Ted Slater, George Strawn, Morris A Swertz, Mark Thompson, Johan van der Lei, Erik van Mulligen, Jan Velterop, Andra Waagmeester, Peter Wittenburg, Katherine Wolstencroft, Jun Zhao, and Barend Mons. The FAIR guiding principles for scientific data management and stewardship. Scientific Data, 3:160018, March 2016.


A32

Open Science Collaboration. Estimating the reproducibility of psychological science. Science, 349(6251):aac4716, August 2015.

A33

The Turing Way Community, Becky Arnold, Louise Bowler, Sarah Gibson, Patricia Herterich, Rosie Higman, Anna Krystalli, Alexander Morley, Martin O'Reilly, and Kirstie Whitaker. The Turing Way: a handbook for reproducible data science. 2019.