Archived funding opportunity

This document has been archived. The latest version is NSF 15-540.

NSF 14-515: Promoting Research and Innovation in Methodologies for Evaluation (PRIME)

Program Solicitation NSF 14-515

National Science Foundation

Directorate for Education & Human Resources
     Research on Learning in Formal and Informal Settings

Full Proposal Deadline(s) (due by 5 p.m. proposer's local time):

     February 21, 2014

Important Information And Revision Notes

This solicitation has been revised to incorporate into the Other Information section a newly issued publication jointly developed by the National Science Foundation and the Institute of Education Sciences in the U.S. Department of Education entitled, Common Guidelines for Education Research and Development. The Guidelines describe six types of research studies that can generate evidence about how to increase student learning. Research types include those that generate the most fundamental understandings related to education and learning; examinations of associations between variables; iterative design and testing of strategies or interventions; and assessments of the impact of a fully-developed intervention on an education outcome. For each research type, there is a description of the purpose and the expected empirical and/or theoretical justifications, types of project outcomes, and quality of evidence.

The Guidelines publication can be found on the NSF website under the number NSF 13-126 (https://www.nsf.gov/pubs/2013/nsf13126/nsf13126.pdf). A set of FAQs regarding the Guidelines is available under the number NSF 13-127 (https://www.nsf.gov/pubs/2013/nsf13127/nsf13127.pdf). Grant proposal writers and PIs are encouraged to familiarize themselves with both documents and to use the information therein in preparing proposals to NSF.

Summary Of Program Requirements

General Information

Program Title:

Promoting Research and Innovation in Methodologies for Evaluation (PRIME)

Synopsis of Program:

The Promoting Research and Innovation in Methodologies for Evaluation (PRIME) program seeks to support research on evaluation with special emphasis on: (1) exploring innovative approaches for determining the impacts and usefulness of STEM education projects and programs; (2) building on and expanding the theoretical foundations for evaluating STEM education and workforce development initiatives, including translating and adapting approaches from other fields; and (3) growing the capacity and infrastructure of the evaluation field. Three types of proposals will be supported by the program: Exploratory Projects that include proof-of-concept and feasibility studies; more extensive Full-Scale Projects; and workshops and conferences.

Cognizant Program Officer(s):

Please note that the following information is current at the time of publishing. See program website for any updates to the points of contact.

Applicable Catalog of Federal Domestic Assistance (CFDA) Number(s):

  • 47.076 --- Education and Human Resources

Award Information

Anticipated Type of Award: Standard Grant or Continuing Grant

Estimated Number of Awards: 13 to 18

It is anticipated that between 13 and 18 projects will be awarded in FY 2014: approximately 7-10 full-scale and approximately 6-8 exploratory projects will be selected for funding. The remainder of funds will be allocated to conference and workshop projects, RAPIDs, and EAGERs, pending availability of funds.

Anticipated Funding Amount: $8,000,000 pending availability of funds.

Eligibility Information

Who May Submit Proposals:

The categories of proposers eligible to submit proposals to the National Science Foundation are identified in the Grant Proposal Guide, Chapter I, Section E.

Who May Serve as PI:

There are no restrictions or limits.

Limit on Number of Proposals per Organization:

There are no restrictions or limits.

Limit on Number of Proposals per PI or Co-PI:

There are no restrictions or limits.

Proposal Preparation and Submission Instructions

A. Proposal Preparation Instructions

  • Letters of Intent: Not Applicable
  • Preliminary Proposal Submission: Not Applicable
  • Full Proposals:
    • Full Proposals submitted via FastLane: NSF Proposal and Award Policies and Procedures Guide, Part I: Grant Proposal Guide (GPG) Guidelines apply. The complete text of the GPG is available electronically on the NSF website at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg.
    • Full Proposals submitted via Grants.gov: NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov Guidelines apply (Note: The NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=grantsgovguide)

B. Budgetary Information

  • Cost Sharing Requirements: Inclusion of voluntary committed cost sharing is prohibited.
  • Indirect Cost (F&A) Limitations: Not Applicable
  • Other Budgetary Limitations: Not Applicable

C. Due Dates

  • Full Proposal Deadline(s) (due by 5 p.m. proposer's local time):

         February 21, 2014

Proposal Review Information Criteria

Merit Review Criteria: National Science Board approved criteria apply.

Award Administration Information

Award Conditions: Standard NSF award conditions apply.

Reporting Requirements: Standard NSF reporting requirements apply.

I. Introduction

The National Science Foundation (NSF) is charged with promoting the vitality of the nation's science, technology, engineering and mathematics (STEM) research and education enterprises. As part of this mission, the Directorate for Education and Human Resources (EHR) has primary responsibility for providing national, research-based leadership in STEM education. The mission of EHR is to advance excellence in U.S. STEM education at all levels and in all settings (both formal and informal) through the support of research and development and the preparation of a diverse workforce of scientists, technicians, engineers, mathematicians and educators. Along with these advances in STEM education come growing pressures for public accountability by individual education projects and the larger programs that typically fund them. Federal agencies are being called upon to show that funding priorities are evidence-based, and to provide plans for how both project and program evaluations demonstrate or validate impact and are used to support budget priorities (US Office of Management and Budget [OMB], 2010, 2012, & 2013).

The new developments in STEM education and workforce development, along with increasing pressures for accountability, challenge evaluators to develop innovative evaluation approaches, questions, theories, methodologies, measures, analytic tools and reporting formats. Evaluation theory and practice need to complement innovations in STEM education and human resource development in order for program evaluations to inform decision-making, meet accountability requirements, and provide useful information for program improvement.

The Promoting Research and Innovation in Methodologies for Evaluation (PRIME) program seeks to advance evaluation theory and practice across all levels of the STEM education enterprise in both formal and informal settings. PRIME calls for studies with special emphasis on developing innovative STEM evaluation methodologies and identifying ways to measure or demonstrate the impacts of STEM education programs. Approaches are encouraged that address new ways to conceptualize evaluation, such as a focus on themes of national importance (e.g., teacher education, cyberlearning, innovation) rather than on particular projects or programs. Other areas of interest include assessing the cumulative effects of engaging in STEM programs over time or determining impact in the context of complex and multivariate causation that is inherent to STEM learning in real-world settings.

II. Program Description

The overarching goal of the PRIME program is to support the development, demonstration, and validation of innovative new methodologies and approaches in STEM evaluation. To address this goal, the program is interested in proposals that:

  1. Explore innovative new approaches for determining the impact and usefulness of evaluations of STEM education projects or programs, with appropriate rigor.
  2. Expand the theoretical foundations for evaluating STEM education and human resource initiatives, including translating approaches from other fields.
  3. Increase the capacity of and infrastructure for researchers and evaluators by increasing the number of individuals who can produce conceptually sound and methodologically appropriate evaluations of STEM education and workforce projects, portfolios, and programs.

Evaluation contexts, and thus problems, addressed in response to this solicitation may vary from large-scale system change to individual experiences and impacts. They may focus on any level of education and on any combination of formal or informal settings. Proposed goals can be as broad as developing new evaluation research designs, or as focused as developing metrics to measure the impacts of new learning environments (e.g., massive open online courses [MOOCs] and badges) or changes in traditional classroom practices. Some initiatives may target specific fine-grained topics within a STEM subject area, while others may focus on STEM fields more broadly. They may be purely theoretical or have large empirical components. Finally, proposals may include a wide range of design features (e.g., partnerships) and goals (e.g., broadening participation) that may exist within and across specific STEM education and workforce initiatives. These examples are presented to illustrate that the solicitation permits a broad range of entry points, issues, and settings. We encourage the field to engage these issues creatively in preparing proposals.

Examples of potential projects (meant to be suggestive, not prescriptive nor limiting) within each of the three program sub-goals are:

Explore innovative new approaches for determining the impact and usefulness of evaluations of STEM education projects or programs, with appropriate rigor.

  • Adapt methods of rapid evaluation, assessment, and appraisal for use in formal and informal STEM education at all levels.
  • Develop context-sensitive theory and methods that assess fidelity and adaptation of implementation with emphasis on assessing impact on student outcomes.
  • Develop rigorous methodologies for examining either the collective or disaggregated impact of learners' participation in multiple government-funded STEM education and broadening participation programs.
  • Create innovative approaches to determining what works, for whom, under what circumstances.
  • Develop methods for evaluating a broader range of outcomes beyond traditional achievement test scores, such as motivation, literacy, persistence in STEM careers, or broader community outcomes such as economic or societal impacts.
  • Identify methodological approaches for assessing the impact of major policy initiatives, such as College and Career Ready Standards.
  • Design methodologies for conducting meta-evaluations and addressing tensions between project- and program-level evaluations.
  • Create innovative longitudinal methodologies to track students' progress across levels of schooling.
  • Develop methods for evaluating early-stage, transformative approaches in STEM.
  • Design techniques that allow learners to demonstrate their STEM competence in informal settings without undermining the voluntary nature of learning in such settings.
  • Assess the usefulness and use of innovative evaluation approaches for decision-making, program improvement, and accountability.
  • Take advantage of large survey data from studies conducted by NCSES, NCES, and Census to provide context and meaning to new evaluation models and studies.
  • Explore the use of integrated and linked longitudinal data from state and local databases to conduct cross-state evaluations.

Expand the theoretical foundations for evaluating STEM education and workforce initiatives, including translating approaches from other fields.

  • Adapt or apply methods used in other fields (e.g., organizational theory, science of science policy, public health, economics) to STEM education and learning settings.
  • Adapt methods from epidemiology and/or media research to assess the spread of educational innovation throughout STEM education.
  • Challenge assumptions in determining causality and attribution of impact by demonstrating viable approaches.
  • Translate fundamental educational research, STEM discipline-based educational research, and/or models of evaluation into innovative methodologies to determine the impact, effectiveness, and utility of STEM education projects and programs.
  • Incorporate use of national and state databases into STEM evaluation.

Increase the capacity of and infrastructure for researchers and evaluators by increasing the number of individuals who can produce conceptually sound and methodologically appropriate evaluations of STEM education and workforce projects, portfolios, and programs.

  • Create extended systems of professional development and training to support a range of evaluation professionals.
  • Foster communities of practice that develop, test, and share evaluation approaches.
  • Develop theoretical and empirical solutions to common systemic problems such as limited use of evaluation findings or resistance to documentation of negative results.

STEM content and context should be a central factor of any evaluative approach taken. Therefore, proposal teams should include, as necessary, a range of experts such as education researchers, evaluators, methodologists, STEM disciplinary scientists and engineers, and social, behavioral, economic, and cognitive scientists. PRIME especially encourages proposals that incorporate or adapt the ways in which other disciplines conduct evaluation.

Eligible Proposal Types

1. Exploratory Projects

Exploratory projects are small-scale explorations that include proof-of-concept and feasibility studies. Exploratory projects must describe relevant literature, evaluation research questions, data to be gathered and analytic approaches to be taken. Not all Exploratory projects will result in a subsequent, full-scale proposal. However, for those that do, the results and implications of the exploratory work must be explicitly described. Exploratory projects cannot exceed $250,000 total and a duration of two years.

2. Full-Scale Projects

Full-scale projects are larger in scope and may investigate pressing issues facing the field; develop innovative evaluation methodologies or approaches; or build capacity for rigorous, useful evaluations. Full-scale projects cannot exceed $800,000 total and a duration of three years.

3. Conferences and Workshops

The PRIME program may support a few well-focused conferences and workshops that have the potential to transform the field. Budgets are expected to be related to the duration of the event and the number of participants. Typical costs are around $100,000. Proposals should include a conceptual framework for the conference, a draft agenda, a possible participant list, and the likely outcomes or products that will result from the conference. Proposals may be submitted at any time, generally at least one year in advance of when the conference would be held. Please see the NSF Grant Proposal Guide, GPG Section II D.8, for additional information about conference and workshop proposals.

All proposals (except those for conferences and workshops) are expected to specifically address the following requirements:

  • Specify the theoretical underpinnings from one or more education and social science disciplines that will drive the research and development.
  • Specify the settings or contexts where the evaluative research will occur such as geographical location, time span, population, etc.
  • Outline creative strategies for engaging communities of practitioners, evaluators, researchers, and STEM content experts (as appropriate) for the co-design of approaches and effective dissemination of project outcomes.
  • Identify desired outcomes of the research and development, such as potential products and the audiences/communities who will find them useful.
  • Specify the methodologies to be researched or developed either by pursuing new areas or by translating and applying existing approaches in creative and innovative ways.
  • Identify an evaluation plan that describes how outside feedback on the work will be obtained (external evaluation, advisory board, etc.).

References

National Research Council. (1999). How people learn: Brain, mind, experience and school. Washington, DC: National Academy Press.

National Research Council. (2007). Taking science to school: Learning and teaching science in grades K-8. Washington, DC: National Academy Press.

National Research Council. (2009). Learning science in informal environments: People, ideas, and pursuits. Washington, DC: National Academy Press.

US Office of Management and Budget. (2013). Next Steps in the Evidence and Innovation Agenda: Memorandum to the heads of Departments and Agencies. (Document M-13-17). Washington, DC: Author. Retrieved August 27, 2013 from http://www.whitehouse.gov/sites/default/files/omb/memoranda/2013/m-13-17.pdf.

US Office of Management and Budget. (2012). Use of Evidence and Evaluation in the 2014 Budget: Memorandum to the heads of Executive Departments and Agencies. (Document M-12-14). Washington, DC: Author. Retrieved August 27, 2013 from http://www.whitehouse.gov/sites/default/files/omb/memoranda/2012/m-12-14.pdf.

US Office of Management and Budget. (2010). Evaluating Programs for Efficacy and Cost Efficiency: Memorandum for the heads of Executive Departments and Agencies. (Document M-10-32). Washington, DC: Author. Retrieved September 3, 2010 from http://www.whitehouse.gov/sites/default/files/omb/memoranda/2010/m10-32.pdf.

Resources

Page, S. E. (2007). The difference: How the power of diversity creates better groups, firms, schools and societies. Princeton, NJ: Princeton University Press.

National Research Council. (2002). Scientific research in education. Washington, DC: National Academy Press.

National Research Council. (2010). Study of Teacher Preparation Programs in the U.S. Washington, DC: National Academy Press.

NSF Task Force on Cyberlearning. (2008). Fostering learning in the networked world: The cyberlearning opportunity and challenge. (NSF 08-204). Arlington, VA: National Science Foundation.

Committee on Prospering in the Global Economy of the 21st Century: An Agenda for American Science and Technology, National Academy of Sciences, National Academy of Engineering, Institute of Medicine. (2007). Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future. Washington, DC: National Academy Press.

Committee on Undergraduate Science Education, Center for Science, Mathematics, and Engineering Education, National Research Council. (1999). Transforming Undergraduate Education in Science, Mathematics, Engineering, and Technology. Washington, DC: National Academy Press.

Board on Science Education, Center for Education, National Research Council. (2008). Evidence on Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Education. Retrieved September 3, 2010 from http://www7.nationalacademies.org/bose/Promising%20Practices_Homepage.html.

Yarbrough, D. B., Shulha, L. M., Hopson, R. K., and Caruthers, F. A. (2011). The program evaluation standards: A guide for evaluators and evaluation users (3rd ed.). Thousand Oaks, California: Sage. Retrieved September 3, 2010 from http://www.jcsee.org/program-evaluation-standards.

American Evaluation Association. (2004). Guiding Principles for Evaluators. Fairhaven, MA: Author. Retrieved September 3, 2010 from http://www.eval.org/p/cm/ld/fid=51.

III. Award Information

It is anticipated that between 13 and 18 projects will be awarded in FY 2014: approximately 7-10 full-scale and approximately 6-8 exploratory projects will be selected for funding. The remainder of funds will be allocated to conference and workshop projects, RAPIDs, and EAGERs, pending availability of funds.

IV. Eligibility Information

Who May Submit Proposals:

The categories of proposers eligible to submit proposals to the National Science Foundation are identified in the Grant Proposal Guide, Chapter I, Section E.

Who May Serve as PI:

There are no restrictions or limits.

Limit on Number of Proposals per Organization:

There are no restrictions or limits.

Limit on Number of Proposals per PI or Co-PI:

There are no restrictions or limits.

V. Proposal Preparation And Submission Instructions

A. Proposal Preparation Instructions

Full Proposal Preparation Instructions: Proposers may opt to submit proposals in response to this Program Solicitation via Grants.gov or via the NSF FastLane system.

  • Full proposals submitted via FastLane: Proposals submitted in response to this program solicitation should be prepared and submitted in accordance with the general guidelines contained in the NSF Grant Proposal Guide (GPG). The complete text of the GPG is available electronically on the NSF website at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg. Paper copies of the GPG may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov. Proposers are reminded to identify this program solicitation number in the program solicitation block on the NSF Cover Sheet For Proposal to the National Science Foundation. Compliance with this requirement is critical to determining the relevant proposal processing guidelines. Failure to submit this information may delay processing.
  • Full proposals submitted via Grants.gov: Proposals submitted in response to this program solicitation via Grants.gov should be prepared and submitted in accordance with the NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov. The complete text of the NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: (https://www.nsf.gov/publications/pub_summ.jsp?ods_key=grantsgovguide). To obtain copies of the Application Guide and Application Forms Package, click on the Apply tab on the Grants.gov site, then click on the Apply Step 1: Download a Grant Application Package and Application Instructions link and enter the funding opportunity number, (the program solicitation number without the NSF prefix) and press the Download Package button. Paper copies of the Grants.gov Application Guide also may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov.

In determining which method to utilize in the electronic preparation and submission of the proposal, please note the following:

Collaborative Proposals. All collaborative proposals submitted as separate submissions from multiple organizations must be submitted via the NSF FastLane system. Chapter II, Section D.4 of the Grant Proposal Guide provides additional information on collaborative proposals.

Important Proposal Preparation Information: FastLane will check for required sections of the full proposal, in accordance with Grant Proposal Guide (GPG) instructions described in Chapter II.C.2. The GPG requires submission of: Project Summary; Project Description; References Cited; Biographical Sketch(es); Budget; Budget Justification; Current and Pending Support; Facilities, Equipment & Other Resources; Data Management Plan; and Postdoctoral Mentoring Plan, if applicable. If a required section is missing, FastLane will not accept the proposal.

Please note that the proposal preparation instructions provided in this program solicitation may deviate from the GPG instructions. If the solicitation instructions do not require a GPG-required section to be included in the proposal, insert text or upload a document in that section of the proposal that states, "Not Applicable for this Program Solicitation." Doing so will enable FastLane to accept your proposal.

Please note that per guidance in the GPG, the Project Description must contain, as a separate section within the narrative, a discussion of the broader impacts of the proposed activities. Unless otherwise specified in this solicitation, you can decide where to include this section within the Project Description.

PRIME has five additional preparation requirements that each proposal must address: (1) letters of agreement to participate, (2) research design and methodology, (3) project evaluation, (4) project personnel and management, and (5) dissemination.

Letters of agreement to participate: As appropriate, there are two types of letters that may be included in the supplementary documents section of the proposal. First, proposals are expected to include letters of agreement to participate from all appropriate organizations that provide the context for data collection and/or play a substantial role in ensuring access to required resources. Second, proposals are expected to contain letters of agreement to participate from members of advisory committees.

Research design and methodology: PRIME expects investigators to propose appropriate and rigorous research methods, whether quantitative, qualitative, or mixed-method. Investigators are expected to conduct their research so that relevant models, frameworks, data, literature, and measures are well-documented, usable, and replicable by other teams wishing to work on similar problems from other vantage points or by using other research designs. The proposed methods should be well-justified, consonant with theory, and suited to the stated questions or hypotheses. Each supported project must meet the following basic requirements:

  • The proposed topics, questions, methodologies and settings must be consistent with the overall goals of the PRIME program.
  • Investigators must demonstrate how the proposed evaluative research builds upon existing evidence obtained from relevant prior literature in one or more relevant domains.
  • PRIME encourages and supports research using a range of designs. The proposal must explicitly describe the design, including the data collection, data analysis, and data interpretation plans, along with the limitations of the study.
    • For quantitative proposals, the description of methods should include: the study sample and selection process; instruments and other means of data collection; power analyses and minimal detectable effect sizes; and the models to be tested. Information must also be provided on the reliability, validity, and appropriateness of proposed measures and instruments. If the reliability and validity of instruments are not known, the proposal must describe specific plans for establishing these measurement properties. This list is not exhaustive; it is only intended to demonstrate the level of detail expected in proposals. (A brief illustrative sketch follows this list.)
    • For proposals with qualitative components, the description of methods should include: the rationale for the populations and cases to be studied; detail on the sources of data, such as observations or artifacts; efforts to triangulate findings; the reliability and validity of instruments and protocols; procedures to code and interpret the data; and plans for ensuring the authenticity and validity of findings. This list is not exhaustive; it is only intended to demonstrate the level of detail expected in proposals.
    • For mixed-methods studies, both the qualitative and quantitative methods should be clear and follow the suggestions above. In addition, the proposal should describe the ways that the qualitative and quantitative portions connect and/or would inform each other.
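
To make the expected level of quantitative detail concrete, the sketch below illustrates one way a proposal team might report a power analysis and a minimal detectable effect size (MDES) for a simple two-group comparison. This is a hypothetical illustration only, not a required method or reporting format; the sample sizes, significance level, and target power shown are assumed values chosen for the example.

    # Hypothetical illustration: power analysis and minimal detectable effect size (MDES)
    # for a two-group comparison. The sample size, alpha, and power below are assumed
    # values for the example, not requirements of this solicitation.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()

    # Power achieved for an assumed effect size of 0.30 (Cohen's d) with 120 learners
    # per group at a 0.05 significance level.
    achieved_power = analysis.solve_power(effect_size=0.30, nobs1=120, alpha=0.05,
                                          ratio=1.0, alternative='two-sided')

    # Minimal detectable effect size: the smallest Cohen's d detectable with 80% power
    # under the same design.
    mdes = analysis.solve_power(effect_size=None, nobs1=120, alpha=0.05, power=0.80,
                                ratio=1.0, alternative='two-sided')

    print(f"Achieved power at d = 0.30: {achieved_power:.2f}")
    print(f"Minimal detectable effect size at 80% power: {mdes:.2f}")

In a proposal, such a calculation would typically be accompanied by the justification for the assumed effect size and by the sampling plan that yields the stated group sizes.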

Project Evaluation: All projects must have an evaluation plan that is appropriate to the goals of the project and explicitly describes the approach that the project team intends to use in assessing its successes and failures and in meeting its milestones and objectives. Project evaluations should be sufficiently distant from the project to be objective, yet designed to be of the most help to the project team in carrying out its responsibilities to the field.

All projects must have a substantive external expert review mechanism that provides regular critical review on the project's methods and progress, analysis procedures, interpretation of data into findings, and dissemination activities. In some cases, the expert review mechanism may not be sufficient given the nature of the project and an independent evaluator may be required. Finally, proposals should describe how evaluation input will be used to shape the project.

Project personnel and management: The research and management roles of each of the senior personnel on the project must be described in brief within the project description. Collaborative teams representing multiple disciplines are typical in PRIME projects.

Where projects request time for students and other trainees, the project description should be clear about their roles and responsibilities. Investigators are reminded that all proposals requesting funds to support postdoctoral researchers are required by NSF to include a one-page mentoring plan in the supplementary documents section of the proposal. FastLane will not permit submission of a proposal if the plan is missing.

PRIME encourages the inclusion of women, persons with disabilities, underrepresented racial and ethnic groups, and diverse viewpoints.

Dissemination: All PRIME projects are expected to accumulate and communicate knowledge to relevant evaluation, research, policy, practitioner, and other communities. As part of EHR's strong and unwavering commitment to the broader impacts of funded research, reports from successful PRIME projects must be published in peer-reviewed professional or scholarly journals, and findings (positive or negative) must be disseminated through appropriate means to audiences relevant to the goals of the project.

B. Budgetary Information

Cost Sharing: Inclusion of voluntary committed cost sharing is prohibited.

C. Due Dates

  • Full Proposal Deadline(s) (due by 5 p.m. proposer's local time):

         February 21, 2014

D. FastLane/Grants.gov Requirements

  • For Proposals Submitted Via FastLane:

To prepare and submit a proposal via FastLane, see detailed technical instructions available at: https://www.fastlane.nsf.gov/a1/newstan.htm. For FastLane user support, call the FastLane Help Desk at 1-800-673-6188 or e-mail fastlane@nsf.gov. The FastLane Help Desk answers general technical questions related to the use of the FastLane system. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this funding opportunity.

  • For Proposals Submitted Via Grants.gov:
Before using Grants.gov for the first time, each organization must register to create an institutional profile. Once registered, the applicant's organization can then apply for any federal grant on the Grants.gov website. Comprehensive information about using Grants.gov is available on the Grants.gov Applicant Resources webpage: http://www.grants.gov/web/grants/applicants.html. In addition, the NSF Grants.gov Application Guide (see link in Section V.A) provides instructions regarding the technical preparation of proposals via Grants.gov. For Grants.gov user support, contact the Grants.gov Contact Center at 1-800-518-4726 or by email: support@grants.gov. The Grants.gov Contact Center answers general technical questions related to the use of Grants.gov. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this solicitation.

    Submitting the Proposal: Once all documents have been completed, the Authorized Organizational Representative (AOR) must submit the application to Grants.gov and verify the desired funding opportunity and agency to which the application is submitted. The AOR must then sign and submit the application to Grants.gov. The completed application will be transferred to the NSF FastLane system for further processing.

Proposers that submitted via FastLane are strongly encouraged to use FastLane to verify the status of their submission to NSF. For proposers that submitted via Grants.gov, until an application has been received and validated by NSF, the Authorized Organizational Representative may check the status of an application on Grants.gov. After proposers have received an e-mail notification from NSF, Research.gov should be used to check the status of an application.

VI. NSF Proposal Processing And Review Procedures

Proposals received by NSF are assigned to the appropriate NSF program for acknowledgement and, if they meet NSF requirements, for review. All proposals are carefully reviewed by a scientist, engineer, or educator serving as an NSF Program Officer, and usually by three to ten other persons outside NSF either as ad hoc reviewers, panelists, or both, who are experts in the particular fields represented by the proposal. These reviewers are selected by Program Officers charged with oversight of the review process. Proposers are invited to suggest names of persons they believe are especially well qualified to review the proposal and/or persons they would prefer not review the proposal. These suggestions may serve as one source in the reviewer selection process at the Program Officer's discretion. Submission of such names, however, is optional. Care is taken to ensure that reviewers have no conflicts of interest with the proposal. In addition, Program Officers may obtain comments from site visits before recommending final action on proposals. Senior NSF staff further review recommendations for awards. A flowchart that depicts the entire NSF proposal and award process (and associated timeline) is included in the GPG as Exhibit III-1.

A comprehensive description of the Foundation's merit review process is available on the NSF website at: http://nsf.gov/bfa/dias/policy/merit_review/.

Proposers should also be aware of core strategies that are essential to the fulfillment of NSF's mission, as articulated in Empowering the Nation Through Discovery and Innovation: NSF Strategic Plan for Fiscal Years (FY) 2011-2016. These strategies are integrated in the program planning and implementation process, of which proposal review is one part. NSF's mission is particularly well-implemented through the integration of research and education and broadening participation in NSF programs, projects, and activities.

One of the core strategies in support of NSF's mission is to foster integration of research and education through the programs, projects and activities it supports at academic and research institutions. These institutions provide abundant opportunities where individuals may concurrently assume responsibilities as researchers, educators, and students, and where all can engage in joint efforts that infuse education with the excitement of discovery and enrich research through the variety of learning perspectives.

Another core strategy in support of NSF's mission is broadening opportunities and expanding participation of groups, institutions, and geographic regions that are underrepresented in STEM disciplines, which is essential to the health and vitality of science and engineering. NSF is committed to this principle of diversity and deems it central to the programs, projects, and activities it considers and supports.

A. Merit Review Principles and Criteria

The National Science Foundation strives to invest in a robust and diverse portfolio of projects that creates new knowledge and enables breakthroughs in understanding across all areas of science and engineering research and education. To identify which projects to support, NSF relies on a merit review process that incorporates consideration of both the technical aspects of a proposed project and its potential to contribute more broadly to advancing NSF's mission "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense; and for other purposes." NSF makes every effort to conduct a fair, competitive, transparent merit review process for the selection of projects.

1. Merit Review Principles

These principles are to be given due diligence by PIs and organizations when preparing proposals and managing projects, by reviewers when reading and evaluating proposals, and by NSF program staff when determining whether or not to recommend proposals for funding and while overseeing awards. Given that NSF is the primary federal agency charged with nurturing and supporting excellence in basic research and education, the following three principles apply:

  • All NSF projects should be of the highest quality and have the potential to advance, if not transform, the frontiers of knowledge.
  • NSF projects, in the aggregate, should contribute more broadly to achieving societal goals. These "Broader Impacts" may be accomplished through the research itself, through activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. The project activities may be based on previously established and/or innovative methods and approaches, but in either case must be well justified.
  • Meaningful assessment and evaluation of NSF funded projects should be based on appropriate metrics, keeping in mind the likely correlation between the effect of broader impacts and the resources provided to implement projects. If the size of the activity is limited, evaluation of that activity in isolation is not likely to be meaningful. Thus, assessing the effectiveness of these activities may best be done at a higher, more aggregated, level than the individual project.

With respect to the third principle, even if assessment of Broader Impacts outcomes for particular projects is done at an aggregated level, PIs are expected to be accountable for carrying out the activities described in the funded project. Thus, individual projects should include clearly stated goals, specific descriptions of the activities that the PI intends to do, and a plan in place to document the outputs of those activities.

These three merit review principles provide the basis for the merit review criteria, as well as a context within which the users of the criteria can better understand their intent.

2. Merit Review Criteria

All NSF proposals are evaluated through use of the two National Science Board approved merit review criteria. In some instances, however, NSF will employ additional criteria as required to highlight the specific objectives of certain programs and activities.

The two merit review criteria are listed below. Both criteria are to be given full consideration during the review and decision-making processes; each criterion is necessary but neither, by itself, is sufficient. Therefore, proposers must fully address both criteria. (GPG Chapter II.C.2.d.i. contains additional information for use by proposers in development of the Project Description section of the proposal.) Reviewers are strongly encouraged to review the criteria, including GPG Chapter II.C.2.d.i., prior to the review of a proposal.

When evaluating NSF proposals, reviewers will be asked to consider what the proposers want to do, why they want to do it, how they plan to do it, how they will know if they succeed, and what benefits could accrue if the project is successful. These issues apply both to the technical aspects of the proposal and the way in which the project may make broader contributions. To that end, reviewers will be asked to evaluate all proposals against two criteria:

  • Intellectual Merit: The Intellectual Merit criterion encompasses the potential to advance knowledge; and
  • Broader Impacts: The Broader Impacts criterion encompasses the potential to benefit society and contribute to the achievement of specific, desired societal outcomes.

The following elements should be considered in the review for both criteria:

  1. What is the potential for the proposed activity to
    1. Advance knowledge and understanding within its own field or across different fields (Intellectual Merit); and
    2. Benefit society or advance desired societal outcomes (Broader Impacts)?
  2. To what extent do the proposed activities suggest and explore creative, original, or potentially transformative concepts?
  3. Is the plan for carrying out the proposed activities well-reasoned, well-organized, and based on a sound rationale? Does the plan incorporate a mechanism to assess success?
  4. How well qualified is the individual, team, or organization to conduct the proposed activities?
  5. Are there adequate resources available to the PI (either at the home organization or through collaborations) to carry out the proposed activities?

Broader impacts may be accomplished through the research itself, through the activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. NSF values the advancement of scientific knowledge and activities that contribute to achievement of societally relevant outcomes. Such outcomes include, but are not limited to: full participation of women, persons with disabilities, and underrepresented minorities in science, technology, engineering, and mathematics (STEM); improved STEM education and educator development at any level; increased public scientific literacy and public engagement with science and technology; improved well-being of individuals in society; development of a diverse, globally competitive STEM workforce; increased partnerships between academia, industry, and others; improved national security; increased economic competitiveness of the United States; and enhanced infrastructure for research and education.

Proposers are reminded that reviewers will also be asked to review the Data Management Plan and the Postdoctoral Researcher Mentoring Plan, as appropriate.

B. Review and Selection Process

Proposals submitted in response to this program solicitation will be reviewed by Ad hoc Review and/or Panel Review.

Reviewers will be asked to evaluate proposals using two National Science Board approved merit review criteria and, if applicable, additional program specific criteria. A summary rating and accompanying narrative will be completed and submitted by each reviewer. The Program Officer assigned to manage the proposal's review will consider the advice of reviewers and will formulate a recommendation.

After scientific, technical and programmatic review and consideration of appropriate factors, the NSF Program Officer recommends to the cognizant Division Director whether the proposal should be declined or recommended for award. NSF strives to be able to tell applicants whether their proposals have been declined or recommended for funding within six months. Large or particularly complex proposals or proposals from new awardees may require additional review and processing time. The time interval begins on the deadline or target date, or receipt date, whichever is later. The interval ends when the Division Director acts upon the Program Officer's recommendation.

After programmatic approval has been obtained, the proposals recommended for funding will be forwarded to the Division of Grants and Agreements for review of business, financial, and policy implications. After an administrative review has occurred, Grants and Agreements Officers perform the processing and issuance of a grant or other agreement. Proposers are cautioned that only a Grants and Agreements Officer may make commitments, obligations or awards on behalf of NSF or authorize the expenditure of funds. No commitment on the part of NSF should be inferred from technical or budgetary discussions with a NSF Program Officer. A Principal Investigator or organization that makes financial or personnel commitments in the absence of a grant or cooperative agreement signed by the NSF Grants and Agreements Officer does so at their own risk.

Once an award or declination decision has been made, Principal Investigators are provided feedback about their proposals. In all cases, reviews are treated as confidential documents. Verbatim copies of reviews, excluding the names of the reviewers or any reviewer-identifying information, are sent to the Principal Investigator/Project Director by the Program Officer. In addition, the proposer will receive an explanation of the decision to award or decline funding.

VII. Award Administration Information

A. Notification of the Award

Notification of the award is made to the submitting organization by a Grants Officer in the Division of Grants and Agreements. Organizations whose proposals are declined will be advised as promptly as possible by the cognizant NSF Program administering the program. Verbatim copies of reviews, not including the identity of the reviewer, will be provided automatically to the Principal Investigator. (See Section VI.B. for additional information on the review process.)

B. Award Conditions

An NSF award consists of: (1) the award notice, which includes any special provisions applicable to the award and any numbered amendments thereto; (2) the budget, which indicates the amounts, by categories of expense, on which NSF has based its support (or otherwise communicates any specific approvals or disapprovals of proposed expenditures); (3) the proposal referenced in the award notice; (4) the applicable award conditions, such as Grant General Conditions (GC-1)*; or Research Terms and Conditions* and (5) any announcement or other NSF issuance that may be incorporated by reference in the award notice. Cooperative agreements also are administered in accordance with NSF Cooperative Agreement Financial and Administrative Terms and Conditions (CA-FATC) and the applicable Programmatic Terms and Conditions. NSF awards are electronically signed by an NSF Grants and Agreements Officer and transmitted electronically to the organization via e-mail.

*These documents may be accessed electronically on NSF's Website at https://www.nsf.gov/awards/managing/award_conditions.jsp?org=NSF. Paper copies may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov.

More comprehensive information on NSF Award Conditions and other important information on the administration of NSF awards is contained in the NSF Award & Administration Guide (AAG) Chapter II, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=aag.

C. Reporting Requirements

For all multi-year grants (including both standard and continuing grants), the Principal Investigator must submit an annual project report to the cognizant Program Officer at least 90 days prior to the end of the current budget period. (Some programs or awards require submission of more frequent project reports). Within 90 days following expiration of a grant, the PI also is required to submit a final project report, and a project outcomes report for the general public.

Failure to provide the required annual or final project reports, or the project outcomes report, will delay NSF review and processing of any future funding increments as well as any pending proposals for all identified PIs and co-PIs on a given award. PIs should examine the formats of the required reports in advance to assure availability of required data.

PIs are required to use NSF's electronic project-reporting system, available through Research.gov, for preparation and submission of annual and final project reports. Such reports provide information on accomplishments, project participants (individual and organizational), publications, and other specific products and impacts of the project. Submission of the report via Research.gov constitutes certification by the PI that the contents of the report are accurate and complete. The project outcomes report also must be prepared and submitted using Research.gov. This report serves as a brief summary, prepared specifically for the public, of the nature and outcomes of the project. This report will be posted on the NSF website exactly as it is submitted by the PI.

More comprehensive information on NSF Reporting Requirements and other important information on the administration of NSF awards is contained in the NSF Award & Administration Guide (AAG) Chapter II, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=aag.

VIII. Agency Contacts

Please note that the program contact information is current at the time of publishing. See program website for any updates to the points of contact.

General inquiries regarding this program should be made to:

For questions related to the use of FastLane, contact:

For questions relating to Grants.gov contact:

  • Grants.gov Contact Center: If the Authorized Organizational Representative (AOR) has not received a confirmation message from Grants.gov within 48 hours of submission of the application, please contact via telephone: 1-800-518-4726; e-mail: support@grants.gov.

IX. Other Information

The NSF website provides the most comprehensive source of information on NSF Directorates (including contact information), programs and funding opportunities. Use of this website by potential proposers is strongly encouraged. In addition, "NSF Update" is an information-delivery system designed to keep potential proposers and other interested parties apprised of new NSF funding opportunities and publications, important changes in proposal and award policies and procedures, and upcoming NSF Grants Conferences. Subscribers are informed through e-mail or the user's Web browser each time new publications are issued that match their identified interests. "NSF Update" also is available on NSF's website at https://public.govdelivery.com/accounts/USNSF/subscriber/new?topic_id=USNSF_179.

Grants.gov provides an additional electronic capability to search for Federal government-wide grant opportunities. NSF funding opportunities may be accessed via this new mechanism. Further information on Grants.gov may be obtained at http://www.grants.gov.

This solicitation has been revised to incorporate into the Other Information section a newly issued publication jointly developed by the National Science Foundation and the Institute of Education Sciences in the U.S. Department of Education entitled, Common Guidelines for Education Research and Development. The Guidelines describe six types of research studies that can generate evidence about how to increase student learning. Research types include those that generate the most fundamental understandings related to education and learning; examinations of associations between variables; iterative design and testing of strategies or interventions; and assessments of the impact of a fully-developed intervention on an education outcome. For each research type, there is a description of the purpose and the expected empirical and/or theoretical justifications, types of project outcomes, and quality of evidence.

The Guidelines publication can be found on the NSF website under the number NSF 13-126 (https://www.nsf.gov/pubs/2013/nsf13126/nsf13126.pdf). A set of FAQs regarding the Guidelines is available under the number NSF 13-127 (https://www.nsf.gov/pubs/2013/nsf13127/nsf13127.pdf). Grant proposal writers and PIs are encouraged to familiarize themselves with both documents and to use the information therein in preparing proposals to NSF.

About The National Science Foundation

The National Science Foundation (NSF) is an independent Federal agency created by the National Science Foundation Act of 1950, as amended (42 USC 1861-75). The Act states the purpose of the NSF is "to promote the progress of science; [and] to advance the national health, prosperity, and welfare by supporting research and education in all fields of science and engineering."

NSF funds research and education in most fields of science and engineering. It does this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the US. The Foundation accounts for about one-fourth of Federal support to academic institutions for basic research.

NSF receives approximately 55,000 proposals each year for research, education and training projects, of which approximately 11,000 are funded. In addition, the Foundation receives several thousand applications for graduate and postdoctoral fellowships. The agency operates no laboratories itself but does support National Research Centers, user facilities, certain oceanographic vessels and Arctic and Antarctic research stations. The Foundation also supports cooperative research between universities and industry, US participation in international scientific and engineering efforts, and educational activities at every academic level.

Facilitation Awards for Scientists and Engineers with Disabilities provide funding for special assistance or equipment to enable persons with disabilities to work on NSF-supported projects. See Grant Proposal Guide Chapter II, Section D.2 for instructions regarding preparation of these types of proposals.

The National Science Foundation has Telephonic Device for the Deaf (TDD) and Federal Information Relay Service (FIRS) capabilities that enable individuals with hearing impairments to communicate with the Foundation about NSF programs, employment or general information. TDD may be accessed at (703) 292-5090 and (800) 281-8749, FIRS at (800) 877-8339.

The National Science Foundation Information Center may be reached at (703) 292-5111.

The National Science Foundation promotes and advances scientific progress in the United States by competitively awarding grants and cooperative agreements for research and education in the sciences, mathematics, and engineering.

To get the latest information about program deadlines, to download copies of NSF publications, and to access abstracts of awards, visit the NSF Website at https://www.nsf.gov

  • Location:

4201 Wilson Blvd. Arlington, VA 22230

  • For General Information
    (NSF Information Center):

(703) 292-5111

  • TDD (for the hearing-impaired):

(703) 292-5090

  • To Order Publications or Forms:

Send an e-mail to:

nsfpubs@nsf.gov

or telephone:

(703) 292-7827

  • To Locate NSF Employees:

(703) 292-5111


Privacy Act And Public Burden Statements

The information requested on proposal forms and project reports is solicited under the authority of the National Science Foundation Act of 1950, as amended. The information on proposal forms will be used in connection with the selection of qualified proposals; and project reports submitted by awardees will be used for program evaluation and reporting within the Executive Branch and to Congress. The information requested may be disclosed to qualified reviewers and staff assistants as part of the proposal review process; to proposer institutions/grantees to provide or obtain data regarding the proposal review process, award decisions, or the administration of awards; to government contractors, experts, volunteers and researchers and educators as necessary to complete assigned work; to other government agencies or other entities needing information regarding applicants or nominees as part of a joint application review process, or in order to coordinate programs or policy; and to another Federal agency, court, or party in a court or Federal administrative proceeding if the government is a party. Information about Principal Investigators may be added to the Reviewer file and used to select potential candidates to serve as peer reviewers or advisory committee members. See Systems of Records, NSF-50, "Principal Investigator/Proposal File and Associated Records," 69 Federal Register 26410 (May 12, 2004), and NSF-51, "Reviewer/Proposal File and Associated Records," 69 Federal Register 26410 (May 12, 2004). Submission of the information is voluntary. Failure to provide full and complete information, however, may reduce the possibility of receiving an award.

An agency may not conduct or sponsor, and a person is not required to respond to, an information collection unless it displays a valid Office of Management and Budget (OMB) control number. The OMB control number for this collection is 3145-0058. Public reporting burden for this collection of information is estimated to average 120 hours per response, including the time for reviewing instructions. Send comments regarding the burden estimate and any other aspect of this collection of information, including suggestions for reducing this burden, to:

Suzanne H. Plimpton
Reports Clearance Officer
Office of the General Counsel
National Science Foundation
Arlington, VA 22230