

NSF 17-022

Dear Colleague Letter: Encouraging Reproducibility in Computing and Communications Research

October 21, 2016

Dear Colleagues:

The National Science Foundation (NSF) recognizes a general and growing concern among researchers [1, 2] that a number of influences have combined to reduce standards of reproducibility and rigor in research, and thus to retard the general progress of science and engineering [3]. These influences include bias toward positive results, competition to rush findings to print, an overemphasis on presenting conceptual breakthroughs in high-impact venues, and a lack of incentives for academic researchers to retract irreproducible findings.

Given that research in computing and communications is not immune to these influences, and building upon other ongoing efforts to promote reproducibility [4], the Directorate for Computer and Information Science and Engineering (CISE) announces through this Dear Colleague Letter (DCL) its intention to support research that improves the level of reproducibility in research on computer systems and networking; modeling, analysis and simulation of computing and communication systems; and cybersecurity.

Specifically, CISE encourages principal investigators (PIs) submitting new proposals to, or holding active awards in, its Computer and Network Systems (CNS) core, Computing and Communication Foundations (CCF) core, and Secure and Trustworthy Cyberspace (SaTC) programs to embrace completeness and transparency in developing rigorous protocols, as well as in making experimental parameters and collected data available to other researchers. In particular, PIs are strongly encouraged to describe, as part of their data management plans, how they will provide access to well-documented datasets, modeling and/or simulation tools, and codebases to support reproducibility of their methods.

Reproducibility can occur across different realms (numerical, empirical, computational, and statistical) and may be analytical, direct, systematic, or conceptual. Reproducibility can be interpreted to include traits such as repeatability, robustness, reliability, and generalizability [5].

Through this DCL, the participating programs noted above encourage proposals that specifically seek to reproduce, verify and/or characterize recent results in the disciplines covered under each program's ambit of research. Such proposals should identify the key results to be reproduced, motivate the importance of the results to the community and the need for independent validation, and present rigorous methodologies for experimentation with the goal of extensively and thoroughly characterizing the operating parameters under which these results can be reproduced. Where practical, proposers should also propose models and openly accessible repositories for complete data sharing of all results from these experiments. Research in Undergraduate Institutions (RUI) proposals are particularly encouraged.

PROPOSAL PREPARATION AND SUBMISSION

Full proposals relevant to this DCL should be submitted pursuant to the existing program solicitations for the CCF core, CNS core, and SaTC programs, including all proposal preparation instructions specified therein. Limits on the number of submissions per PI pursuant to these program solicitations are unchanged by this DCL. The relevant program solicitations are available on the NSF website.

REVIEW PROCESS

Proposals submitted pursuant to this DCL will be reviewed alongside all other proposals received in response to the solicitation for each participating program. Proposals will be reviewed in accordance with NSF's merit review criteria; reviewers are asked to evaluate research on both its intellectual merit and its broader impacts. Proposals pursuant to this DCL may also be evaluated together as a group, with specific review criteria focusing on the completeness of the experimental design and an assessment of data-sharing practices.

Proposers should be aware that the NSF merit review principles (https://www.nsf.gov/pubs/policydocs/pappguide/nsf16001/gpg_3.jsp#IIIA1) as well as the NSF merit review criteria (https://www.nsf.gov/pubs/policydocs/pappguide/nsf16001/gpg_3.jsp#IIIA2) support proposals that advance reproducibility in scientific research to the same extent as those that advance transformative research.

POINTS OF CONTACT

For further information, interested PIs may contact:

  • Thyaga Nandagopal [Program Director, CISE/CNS, telephone: (703) 292-8950, email: tnandago@nsf.gov];
  • Mimi McClure [Program Director, CISE/CNS, telephone: (703) 292-8950, email: mmcclure@nsf.gov];
  • Darleen Fisher [Program Director, CISE/CNS, telephone: (703) 292-8950, email: dlfisher@nsf.gov];
  • Nina Amla [Program Director, SaTC & CISE/CCF, telephone: (703) 292-8910, email: namla@nsf.gov];
  • D. Richard Brown [Program Director, CISE/CCF, telephone: (703) 292-8910, email: ribrown@nsf.gov];
  • Almadena Chtchelkanova [Program Director, CISE/CCF, telephone: (703) 292-8910, email: achtchel@nsf.gov]; and/or
  • Tracy Kimbrel [Program Director, CISE/CCF, telephone: (703) 292-8910, email: tkimbrel@nsf.gov].


Sincerely,
Jim Kurose
Assistant Director
Computer and Information Science and Engineering (CISE)

References:

[1] Ralph Cicerone, "Research Reproducibility, Replicability, Reliability", National Academy of Sciences (NAS) Annual Meeting, April 2015.

[2] "Data Replication & Reproducibility", Science, ISSN 0036-8075, http://www.sciencemag.org/site/special/data-rep/.

[3] "Further confirmation needed", Nature Biotechnology, Volume 30, issue 806, September 2012.

[4] "Big Data Regional Innovation Hubs: Establishing Spokes to Advance Big Data Applications (BD Spokes)," NSF 16-510, https://www.nsf.gov/pubs/2016/nsf16510/nsf16510.htm.

[5] "Reality check on reproducibility," Nature, Volume 533, issue 7604, May 2016.