Innovative Technology Experiences for Students and Teachers (ITEST)

Program Solicitation
NSF 19-583

Replaces Document(s):
NSF 17-565

National Science Foundation

Directorate for Education and Human Resources
     Research on Learning in Formal and Informal Settings

Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):

     August 19, 2019

     August 14, 2020

     August 13, 2021

IMPORTANT INFORMATION AND REVISION NOTES

Revisions to program solicitation:

  • Modified project types, including adding calls for synthesis, conference, and resource center projects
  • Revised required components for proposals

Any proposal submitted in response to this solicitation should be submitted in accordance with the revised NSF Proposal & Award Policies & Procedures Guide (PAPPG) (NSF 19-1), which is effective for proposals submitted, or due, on or after February 25, 2019.

SUMMARY OF PROGRAM REQUIREMENTS

General Information

Program Title:

Innovative Technology Experiences for Students and Teachers (ITEST)

Synopsis of Program:

ITEST is an applied research and development (R&D) program providing direct student learning opportunities in pre-kindergarten through high school (PreK-12). The learning opportunities are based on innovative use of technology to strengthen knowledge and interest in science, technology, engineering, and mathematics (STEM) and information and communication technology (ICT) careers. To achieve this purpose, ITEST supports projects that engage students in technology-rich experiences that: (1) increase awareness of and interest in STEM and ICT occupations; (2) motivate students to pursue appropriate education pathways to those occupations; and (3) develop STEM-specific disciplinary content knowledge and practices that promote critical thinking, reasoning, and communication skills needed for entering the STEM and ICT workforce of the future.

ITEST seeks proposals that pursue innovative instructional approaches and practices in formal and informal learning environments, developed in close collaboration with strategic partners. ITEST proposals should broaden the participation of all students, particularly those from underrepresented and underserved groups, in STEM fields and related education and workforce domains. ITEST supports three types of projects: (1) Exploring Theory and Design Principles (ETD); (2) Developing and Testing Innovations (DTI); and (3) Scaling, Expanding, and Iterating Innovations (SEI). ITEST also supports Synthesis and Conference proposals.

All ITEST proposals must address how they are (A) designing innovations that meet ITEST program goals, which include innovative use of technologies, innovative learning experiences, STEM workforce development, strategies for broadening participation, and strategic partnerships; and (B) measuring outcomes through high-quality research, which includes high-quality research design, project evaluation, and dissemination of findings.

Cognizant Program Officer(s):

Please note that the following information is current at the time of publishing. See program website for any updates to the points of contact.

  • Address questions to the Program, telephone: (703) 292-8620, email: DRLITEST@nsf.gov

Applicable Catalog of Federal Domestic Assistance (CFDA) Number(s):

  • 47.076 --- Education and Human Resources

Award Information

Anticipated Type of Award: Standard Grant or Continuing Grant

Estimated Number of Awards: 22 to 30

ITEST expects to fund between 22 and 30 awards per year depending on the type of proposal and funding level.

  • 6 to 8 awards for Exploring Theory and Design Principles (ETD) with durations up to three years and budgets up to $400,000;
  • 8 to 10 awards for Developing and Testing Innovations (DTI) with durations up to four years and budgets up to $1,500,000;
  • 3 to 5 awards for Scaling, Expanding, and Iterating Innovations (SEI) with durations up to five years and budgets up to $3,000,000;
  • 2 to 3 awards for Syntheses with durations up to two years and budgets up to $300,000; and
  • 2 to 3 awards for Conferences with durations of one year and budgets up to $100,000.
  • In addition, ITEST intends to fund one Resource Center with duration up to three years and total funding up to $4,000,000 in FY 2020. This award may be made as a continuing award.

Anticipated Funding Amount: $25,000,000 to $30,000,000

NSF anticipates having approximately $25,000,000 to $30,000,000 available for the FY 2020 competition and approximately $25,000,000 to $30,000,000 each year thereafter.

Estimated program budget, number of awards, and average award size/duration are subject to the availability of funds.

Eligibility Information

Who May Submit Proposals:

The categories of proposers eligible to submit proposals to the National Science Foundation are identified in the NSF Proposal & Award Policies & Procedures Guide (PAPPG), Chapter I.E.

Who May Serve as PI:

There are no restrictions or limits.

Limit on Number of Proposals per Organization:

There are no restrictions or limits.

Limit on Number of Proposals per PI or Co-PI:

There are no restrictions or limits.

Proposal Preparation and Submission Instructions

A. Proposal Preparation Instructions

  • Letters of Intent: Not required
  • Preliminary Proposal Submission: Not required

B. Budgetary Information

  • Cost Sharing Requirements:

    Inclusion of voluntary committed cost sharing is prohibited.

  • Indirect Cost (F&A) Limitations:

    Not Applicable

  • Other Budgetary Limitations:

    Other budgetary limitations apply. Please see the full text of this solicitation for further information.

C. Due Dates

  • Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):

         August 19, 2019

         August 14, 2020

         August 13, 2021

Proposal Review Information Criteria

Merit Review Criteria:

National Science Board approved criteria. Additional merit review considerations apply. Please see the full text of this solicitation for further information.

Award Administration Information

Award Conditions:

Standard NSF award conditions apply.

Reporting Requirements:

Additional reporting requirements apply. Please see the full text of this solicitation for further information.

TABLE OF CONTENTS

Summary of Program Requirements

  1. Introduction

  2. Program Description

  3. Award Information

  4. Eligibility Information

  5. Proposal Preparation and Submission Instructions
    1. Proposal Preparation Instructions
    2. Budgetary Information
    3. Due Dates
    4. FastLane/Research.gov/Grants.gov Requirements

  6. NSF Proposal Processing and Review Procedures
    1. Merit Review Principles and Criteria
    2. Review and Selection Process

  7. Award Administration Information
    1. Notification of the Award
    2. Award Conditions
    3. Reporting Requirements

  8. Agency Contacts

  9. Other Information

I. INTRODUCTION

The vision of the National Science Foundation (NSF) is for all people in the United States to have lifelong access to high-quality science, technology, engineering, and mathematics (STEM) education and for the United States to be a global leader in STEM literacy, innovation, and employment (National Science Foundation, 2018). The pace of global innovation is accelerating, as is the competition for scientific and technological talent. Today, the economic prosperity and national security of the United States rest increasingly on its capacity for continued scientific and technological innovation. The national innovation effort depends more than ever on strong, cross-sector collaborations around common STEM education interests and goals. Such collaborations will create an effective and inclusive STEM education ecosystem that prepares PreK-12 students for STEM careers, fosters entrepreneurship, and provides all people, particularly those from underserved and underrepresented populations, with access to excellent STEM education throughout their lifetimes. Recognizing that a first-class STEM education should be accessible to people of all ages, backgrounds, communities, and career paths, organizations from across the entire STEM education ecosystem, such as PreK-12 education systems, higher education institutions, industry, private and public businesses, and professional societies, are working to build capacity for a stronger future STEM and ICT workforce.

NSF initiated ITEST in 2003 through revenue from the Federal H-1B visa program. The program supports Research and Development (R&D) efforts to engage students and teachers in experiences that increase knowledge of and interest in STEM and ICT careers. Accordingly, ITEST reflects the vision of the NSF and the Directorate for Education and Human Resources (EHR). This vision includes advancing knowledge in STEM Learning and Learning Environments, Broadening Participation in STEM, and STEM Workforce Development.

The goals of ITEST support NSF's 10 Big Ideas, which can be accessed at: https://www.nsf.gov/news/special_reports/big_ideas/.

The goals of ITEST are consistent with the NSF Strategic Plan 2018-2022, which can be accessed at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf18045. They are also consistent with the federal STEM education strategic plan, Charting a Course for Success: America's Strategy for STEM Education, which can be accessed at https://www.whitehouse.gov/wp-content/uploads/2018/12/STEM-Education-Strategic-Plan-2018.pdf.

II. PROGRAM DESCRIPTION

Successful ITEST proposals will meet the intellectual merit and broader impacts review criteria by describing the ways in which proposed technology-rich learning opportunities will strengthen knowledge and interest in STEM and ICT careers among PreK-12 students. ITEST proposals must address how they are (A) designing innovations that meet ITEST program goals, and (B) measuring outcomes through high-quality research. This section further describes the required design elements, research elements, and the available project types.

A. Designing Innovations that Meet ITEST Program Goals

ITEST projects provide direct learning opportunities in formal and informal PreK-12 settings that leverage innovative use of technology to strengthen knowledge and interest in STEM and ICT careers. Measuring knowledge and interest includes attending to cognitive outcomes (such as changes in knowledge related to STEM and ICT domains and careers), and social-emotional outcomes (such as changes in motivation, engagement, interest, dispositions, or attitudes towards STEM and ICT careers) for individuals and groups of students. ITEST project innovations will accomplish these goals through strategic partnerships, and with a focus on broadening the participation of groups who are underrepresented in STEM and ICT careers. To meet ITEST program goals, proposals should include design elements that address five components: innovative use of technology, innovative learning experiences, STEM workforce development, strategies for broadening participation, and strategic partnerships.

A1. Innovative Use of Technologies

ITEST requires that learning opportunities involve the innovative use of technologies, particularly those that are relevant to the STEM workforce of the future. Addressing this component could include using new and leading-edge tools, programs, and equipment, or creative ways to use existing technologies, including software and other resources. Proposals should explain how the innovation includes students' direct use of and engagement with the technology, as well as the contexts surrounding the technology use that ensure meaningful learning.

The following details are essential for clear descriptions of this required component:

  • Providing an explicit rationale for how students’ and teachers’ engagement with innovative technologies will develop STEM knowledge, skills, and dispositions needed to contribute to the future STEM and ICT workforce. This includes specifying the anticipated learning outcomes of the technology experiences, explaining how the experiences will result in the targeted outcomes, and describing the evidence that will be used to measure the extent to which the targeted cognitive and social-emotional learning outcomes will be achieved.
  • Explaining how innovative technology experiences are developmentally and age-appropriate for students and suited for target populations of students and teachers, particularly for underserved and underrepresented student populations.

A2. Innovative Learning Experiences

The ITEST program seeks innovative STEM learning experiences in both formal and informal environments that strengthen students' knowledge and interest in STEM and ICT careers. Projects should include clear descriptions linking the design of the innovation to the potential for such learning experiences. Descriptions should address the key features of the design, articulating their basis in relevant scholarly literatures and explaining how these design features are intended to realize the innovative learning experiences. Descriptions should also articulate how the students' learning experiences will strengthen knowledge and interest in STEM and ICT careers. A work plan should outline the activities that will be carried out, the personnel that will be responsible for them, and the expected impacts they will have on the participants. Innovations should be considered broadly to involve not only students in the PreK-12 grade range, but also teachers, instructors, mentors, coaches, administrators, or any other participants involved in the innovation.

A3. STEM Workforce Development

A diverse talent pool of STEM-literate citizens prepared for the jobs of the future is essential for maintaining national innovation, growing the economy, and supporting future scientific discovery. Preparing the STEM and ICT workforce of the future is an NSF priority and entails supporting stronger learning of STEM and ICT topics and fostering increased interest in and engagement with STEM and ICT career paths.

The following details are essential for clear descriptions of this required component:

  • Focusing on workforce-oriented learning environments that connect PreK-12 learning and workforce needs.
  • Clearly defining an innovation that directly engages both students and educators in experiences that promote awareness of, interest in, and capacities to participate in STEM and ICT careers or career pathways.
  • Describing how the innovation advances knowledge of promising workforce-related activities, such as entrepreneurship, apprenticeships, internships, and mentoring, and the conditions that promote interest in and knowledge of STEM and ICT careers.

A4. Strategies for Broadening Participation

Diversity, access, equity, and inclusion are fundamental in broadening participation in STEM and advancing the nation’s capacity for STEM innovation. Strengthening the STEM and ICT workforce includes supporting students who have been historically underrepresented and underserved in the STEM career pipeline. Promoting diversity, access, equity, and inclusion in PreK-12 learning environments facilitates the merging of ideas, approaches, tools, and technologies of diverse learners and engages them with a broad range of STEM disciplines that stimulate discovery. ITEST supports proposals that target all students, particularly from underserved and underrepresented populations, and their educators. Proposals should explicitly identify effective ways to promote knowledge of and interest in STEM and ICT careers or career pathways for students from populations that are currently underserved or underrepresented in STEM.

The following details are essential for clear descriptions of this required component:

  • Describing strategies for recruiting and selecting participants from a population or populations currently underserved or underrepresented in STEM professions, careers, or education pathways.
  • Identifying the specific strengths of and challenges faced by the underserved or underrepresented populations selected to be served.
  • Articulating strategies for building on the participants' strengths and addressing the challenges they typically face in STEM learning and interest through the technology experiences, learning activities, and entrepreneurial experiences embedded in the innovation.
  • Explaining how the technology experiences and learning activities are developmentally and age-appropriate.

A5. Strategic Partnerships

Knowledge and interest in STEM and ICT careers can be strengthened by strategic partners with shared visions. Productive strategic partnerships focus on the alignment between STEM teaching and learning and the development of the current and future STEM workforce. Examples of partners include colleges and universities (including Historically Black Colleges and Universities, Hispanic-Serving Institutions, Tribal Colleges and Universities, Alaska Native-serving and Native Hawaiian-serving Institutions, and community colleges); privately- and publicly-held businesses; and libraries, museums, and other informal learning environments. The ITEST program is particularly interested in innovations that integrate appropriate entrepreneurial educational experiences that are inclusive and increase the participation of underserved and underrepresented groups. Strong proposals should stipulate the ways in which strategic partnerships contribute to the sustainability of the project and the benefits for students, educators, and each strategic partner.

The following details are essential for clear descriptions of this required component:

  • Describing how strategic partnerships strengthen existing collaborations and develop new connections between educational institutions, employers, and their communities.
  • Specifying how strategic partnerships engage students and teachers in STEM career-based learning experiences with local employers, internships, apprenticeships, or research experiences as appropriate.
  • Visibly positioning mathematics and statistics education as magnets for STEM and ICT career interest through engagement in applied partnership contexts as appropriate.
  • Describing how strategic partnerships capitalize on formal and informal learning contexts to support academic and technical learning as preparation for higher education, and also support teachers to scaffold this learning as appropriate.

B. Measuring Outcomes through High-Quality Research

ITEST projects must include a research component that measures the outcomes of the innovation relative to the goals of increasing knowledge of, and interest in, STEM and ICT careers. Measuring knowledge and interest includes attending to cognitive outcomes (such as changes in knowledge related to STEM and ICT domains and careers), and social-emotional outcomes (such as changes in motivation, engagement, interest, dispositions, or attitudes towards STEM and ICT careers) for individuals and groups of students. ITEST proposals can include a wide variety of research designs and methodologies, but must include the following three components: high-quality research design, project evaluation, and dissemination.

B1. High-Quality Research Design

To achieve both short-term and long-term impacts, ITEST proposals should clearly outline a research plan. Research may be employed not only to advance scholarly literatures but also to promote understanding of the context-specific factors that influence the impacts of the designed innovations. Research can be framed as design-based with both practical and theoretical implications. Research is considered broadly to include but not be limited to the kinds of studies described in the Common Guidelines for Education Research and Development (Institute of Education Sciences, US Department of Education and the National Science Foundation, 2013), which can be accessed at NSF 13-126.

The following details are essential for clear descriptions of this required component:

  • Research questions that are appropriately framed and motivated by scholarly literatures relevant to STEM learning, teaching, workforce preparation, broadening participation, innovative uses of technology, and/or partnerships. Research questions should be theory-oriented and should enhance the ability to explain the relation between the innovation's design features and the impacts on knowledge and interest in STEM and ICT careers. Research questions should aim to inform theory locally.
  • Specific plans for collecting quantitative and/or qualitative forms of data that are most relevant for addressing the research questions. Such data may include but are not limited to cognitive and social-emotional outcomes, mediating factors (e.g., patterns of engagement, discussion, and affect), characteristics of participants, features of the innovative technologies, and participants' interactions with them.
  • Well-defined analytical methods appropriate for drawing inferences from the collected data in order to address the research questions.

B2. Project Evaluation

ITEST proposals are expected to describe the mechanisms that will be used to assess the success of the project in developing knowledge of and interest in STEM and ICT careers through a project evaluation plan. This plan should describe the steps that will effectively provide feedback on all aspects of the work both formatively throughout the duration of the project and summatively at the end. For projects with external evaluators, PIs are encouraged to include reports of evaluation activities as part of their annual and final project reports.

Formative evaluation activities are often designed to document the extent to which project activities are being carried out as intended and to provide information on interim outcomes. Such information allows the project team not only to assess success over the course of the project, but also to make mid-course corrections iteratively to improve the project. Summative evaluation activities typically focus on documenting the final outcomes achieved and the extent to which these are in line with the original goals of the project. Evidence from ongoing evaluative activities may provide valuable insights into any discrepancies between stated and achieved outcomes.

The following details are essential for clear descriptions of this required component:

  • Articulation of evaluation questions relevant to the project's scope of work. What does the project need to learn to assess success?
  • Delineation of the activities and data that will be employed to generate evidence addressing the evaluation questions, and stipulation of the project staff who will be responsible for this evidence. How does the project propose to address these information needs? Explicit consideration should be given to the mechanisms for providing independent oversight and review of these activities (e.g., an independent, third-party evaluator or an external advisory board).
  • Description of how the project plans to use the evaluation evidence, including how feedback will be shared, with whom (e.g., project leadership, external advisors), and for what purpose (e.g., to inform ongoing project management, to supplement research findings and contribute to the generation of knowledge).

B3. Dissemination of Findings

ITEST proposals must include a creative communication strategy for reaching broad audiences, including scholars, practitioners, policymakers, and the public. While the potential results of the proposed research are expected to be of sufficient quality and significance to merit peer-reviewed publications, approaches that reach broader audiences are also expected.

The following details are essential for clear descriptions of this required component:

  • Key elements of the communication plan, such as target audiences and the channels, media, or technologies appropriate for reaching specific audiences.
  • Dissemination strategies that reach audiences appropriate to the strategic partnership, particularly audiences beyond the scholars reached through publications and presentations at conferences and other similar venues.

C. Project Types

ITEST research and development projects should include compelling objectives, sound literature reviews, clearly formulated research questions and research designs, valid and reliable measurement instruments (or plans to develop such instruments), and appropriate data collection and analysis methods that contribute to high-quality, evidence-based, and outcomes-driven projects. Proposals should include learning opportunities, educator professional development, and the study of cognitive and social-emotional outcomes related to interest in and knowledge of STEM and ICT career pathways. Proposals should clearly identify the specific age- and grade-appropriate disciplinary concepts and practices to be addressed. There is no expectation that proposals address all disciplines included in STEM.

ITEST supports three types of R&D projects, in addition to synthesis studies and conferences. Each of the three R&D project categories can be related to one or more of the research types described in the Common Guidelines for Education Research and Development (Institute of Education Sciences, US Department of Education and the National Science Foundation, 2013), which can be accessed at NSF 13-126.

Exploring Theory and Design Principles (ETD)

ETD R&D projects draw on literature in the field to develop conceptual framing for innovations that foster student knowledge of, and interest in, STEM and ICT careers. ETD projects investigate extant conditions and explore factors in the field intended to increase students' and educators' knowledge, motivation, participation, persistence, confidence, and resilience in STEM and ICT fields. The outcomes of an ETD study will be a preliminary theoretical framework or prototype, together with a set of design principles, methods, or features of innovations that increase knowledge of and interest in STEM and ICT careers for students in PreK-12 formal and informal settings, particularly students from underserved and underrepresented populations. Because both knowledge and interest are important, ITEST encourages ETD studies that collect and analyze data related to both cognitive and social-emotional student outcomes.

Up to three years, up to $400,000

ETD studies are consistent with the research described in the IES/NSF Common Guidelines as Type 2: Early-Stage or Exploratory Research. ETD studies build core knowledge by creating and shaping theory and building out design principles and methods that represent a significant step towards operationalizing theory.

Developing and Testing Innovations (DTI)

The core work of DTI R&D projects involves designing the innovation, pilot-testing or implementing the innovation, and analyzing its outcomes. DTI studies focus on direct engagement with students and educators and assessment of student outcomes. DTI implementation involves all students, particularly underserved and underrepresented student populations in PreK-12 formal or informal settings. Research on implementation explicitly attends to the ways in which the design principles or features of the innovation capitalize on the strengths and challenges those student populations bring to the learning environments and influence knowledge of and interest in STEM and ICT careers. Because both knowledge and interest are important, ITEST encourages DTI studies that focus on both cognitive and social-emotional student outcomes.

Up to four years, up to $1,500,000

DTI projects are consistent with Type 3: Design and Development Research of the IES/NSF Common Guidelines, in that they draw on existing theory and evidence to design and iteratively develop interventions, including testing individual components to provide feedback in the development process. Results from DTI studies may lead to additional work to better understand the theory behind the outcomes or may indicate that the intervention is sufficiently promising to warrant large-scale testing. DTI studies should produce a considerable amount of supporting data to warrant scaling, expansion, or iteration of innovations.

Scaling, Expanding, and Iterating Innovations (SEI)

SEI R&D projects make use of an existing innovation and investigate the mechanisms that expand its reach to broader audiences. SEI studies are designed to build on and expand DTI projects, or findings from innovations previously developed within or outside of the ITEST portfolio. SEI projects (a) expand an innovation to a significantly larger scale, five to ten times greater than the original implementation; (b) extend an innovation to different student populations, regions of the country, grade levels, or ages of students with varying skills and educators' capacities in PreK-12 formal and informal settings; (c) examine issues of transferability and generalizability and the factors that support or inhibit scaling; and (d) assess cognitive and social-emotional student outcomes and measure whether students continue to pursue further STEM and ICT education or careers.

Up to five years, up to $3,000,000

SEI projects are consistent with the three types of impact research described in the IES/NSF Common Guidelines: Type 4: Efficacy Research, which allows for testing of an intervention under "ideal" circumstances, including a higher level of support or developer involvement than would be the case under normal circumstances; Type 5: Effectiveness Research, which estimates the impacts of the intervention when implemented under conditions of routine practice; and Type 6: Scale-up Research, which examines effectiveness in a wide range of populations, contexts, and circumstances without substantial developer involvement in implementation or evaluation.

Synthesis Studies

ITEST supports synthesis studies, including qualitative syntheses and quantitative meta-analyses focused on effective technology-based models, STEM and ICT workforce development in PreK-12 STEM learning environments, and measurement of students' cognitive and social-emotional outcomes resulting from STEM learning environments.

Synthesis studies demonstrate a command of the literature on the question, issue, or topic in both breadth and depth to make a case for the amount, type, and relevance of available literature. Literature selection processes (e.g., search criteria) and quality and inclusion criteria (e.g., peer-reviewed articles, conference presentations, evaluation reports) should be discussed. Synthesis studies are expected to generate products usable by researchers and practitioners and indicate how the products serve the ITEST program goals.

Up to two years, up to $300,000

Conference Projects

ITEST supports conference projects aimed at building capacity among researchers and practitioners, particularly focused on the development of R&D agendas to advance the ITEST program goals and objectives. Proposals must demonstrate command of the literature and practice of the topic selected and describe the expertise and selection criteria of participants. Conference proposals should include a conceptual framework, a draft agenda, and the expected outcomes or products resulting from the activity, and should discuss how these products will be useful to and disseminated among communities of researchers, practitioners, and beyond.

Up to one year, up to $100,000

Please note that conference proposals may be submitted anytime during the year and reviewed accordingly. Proposers should contact a program officer prior to submission to discuss their ideas. For general guidance about conferences, follow the PAPPG guidance for preparing Conference Proposals (PAPPG II.E.7). The "Conference" type of proposal should be selected in the proposal preparation module in FastLane or Grants.gov. Conference proposals must be submitted via FastLane or Grants.gov.

D. The ITEST Resource Center

ITEST intends to fund one Resource Center in FY 2020 to build a community of educational researchers and practitioners consistent with the purpose of ITEST. An important aim of the Resource Center will be to foster a community of research and practice that is framed around NSF's current and emerging priorities. The Center is expected to provide technical support for all active and prospective ITEST awardees, facilitate dissemination of award outcomes, convene PI meetings, and advance the mission of broadening participation in STEM and ICT careers and career education pathways. The size of the request should be appropriate to the scope of work proposed. The "Center" type of proposal should be selected in the proposal preparation module in FastLane or Grants.gov. Center proposals must be submitted via FastLane or Grants.gov.

The Resource Center is expected to advance the goals of ITEST through (a) providing technical assistance that facilitates ITEST projects’ success and articulates innovative models for STEM learning environments; (b) synthesizing and disseminating ITEST projects’ findings to inform the national STEM education field; and (c) conducting outreach to broaden participation from the ITEST and NSF communities, as well as from states, organizations, and higher education institutions not currently represented in the ITEST portfolio.

Three years, up to $4,000,000

D1. Capacity Building: The ITEST Resource Center is expected to build the capacity of the ITEST community to develop and execute innovative R&D projects consistent with ITEST’s goals. This work includes facilitating discussions across a network of active and potential ITEST projects through PI convenings and other appropriate means to facilitate the development of a broader and better-connected R&D community. The Center is expected to conduct a comprehensive ITEST portfolio analysis and collaborate with other EHR-supported resource centers to broaden awareness of the various funding programs and resources in STEM education and promote synergistic efforts to advance the knowledge base and broader participation in STEM education.

D2. Technical Support: The ITEST Resource Center is expected to provide technical support to facilitate ITEST projects’ success in developing and articulating innovations that strengthen knowledge of and interest in STEM and ICT careers throughout all stages of projects’ lifespans. Technical assistance may include, but is not limited to, providing short-term online or blended professional development courses and workshops for educators, facilitating the emergence and development of communities of practice, identifying promising practices and resources (both print and digital) that may help projects in achieving their goals, and assisting prospective PIs with access to information about ITEST outcomes and resources.

D3. Dissemination: The ITEST Resource Center is expected to synthesize and disseminate ITEST projects’ findings nationally in order to inform and influence the community of stakeholders. This work includes conducting a comprehensive analysis of the ITEST portfolio for internal and external stakeholders. It also includes implementing a broad dissemination plan that informs formal and informal STEM education professional organizations, industry and policy stakeholders, and STEM education research communities about the major unique contributions of ITEST projects to the field.

D4. Broadening Participation in the ITEST PI Community: The ITEST Resource Center is expected to conduct outreach efforts to broaden participation in the ITEST community. Specifically, the resource center should seek individuals from organizations and communities not currently represented in the ITEST portfolio and facilitate increased participation in STEM workforce development through expansion of the ITEST portfolio to underrepresented geographic regions, community types (e.g., rural, suburban, or urban), and institutions (e.g., minority-serving institutions, community colleges, school districts, or formal and informal learning centers).

The lead institution of the ITEST Resource Center, whether an organization or a higher education institution, is expected to demonstrate the capacity to plan, develop, and manage a national center that provides technical support to a diverse portfolio of projects. The lead institution should have proven expertise in STEM and ICT, formal and informal STEM education, and capacity building of STEM educators and researchers. The lead institution and any identified partners should also show expertise in research and development designs and methodologies, the use of technology in PreK-12 learning environments, research-informed approaches for broadening participation with special attention to underserved and underrepresented student populations, workforce development in PreK-12 STEM education, and research-informed approaches to establishing strategic partnerships.

NSF intends to employ an appropriate funding mechanism for the ITEST Resource Center that ensures systematic communication and participation between NSF and the awardee, including continuous assessment of progress and implementation of necessary modifications.

E. Relevant References

National Academies of Sciences, Engineering, and Medicine. (2011). Expanding underrepresented minority participation: America's science and technology talent at the crossroads. Washington, DC: The National Academies Press. Retrieved from https://www.nap.edu/read/12984/chapter/1

National Academies of Sciences, Engineering, and Medicine. (2016). Developing national STEM workforce strategy: A workshop summary. Washington, DC: The National Academies Press. Retrieved from https://www.nap.edu/read/21900/chapter/1

National Academies of Sciences, Engineering, and Medicine. (2017). Building America’s skilled technical workforce. Washington, DC: The National Academies Press. Retrieved from https://www.nap.edu/read/23472/chapter/1

National Academies of Sciences, Engineering, and Medicine. (2018). Measuring the 21st century science and engineering workforce population: Evolving needs. Washington, DC: The National Academies Press. Retrieved from https://www.nap.edu/read/24968/chapter/1

National Academies of Sciences, Engineering, and Medicine. (2018). English learners in STEM subjects: Transforming classrooms, schools, and lives. Washington, DC: The National Academies Press. Retrieved from https://www.nap.edu/read/25182/chapter/1

National Research Council. (2009). Learning science in informal environments: People, places, and pursuits. Washington, DC: The National Academies Press. Retrieved from https://www.nap.edu/read/12190/chapter/1

National Research Council. (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century. Washington, DC: The National Academies Press. Retrieved from https://www.nap.edu/read/13398/chapter/1

National Research Council. (2015). Identifying and supporting productive STEM programs in out-of-school settings. Washington, DC: The National Academies Press. Retrieved from https://www.nap.edu/read/21740/chapter/1

National Science & Technology Council: Committee on STEM Education. (2018). Charting a course for success: America’s strategy for STEM education. Retrieved from https://www.whitehouse.gov/wp-content/uploads/2018/12/STEM-Education-Strategic-Plan-2018.pdf

National Science Board. (2018). Our nation’s future competitiveness relies on building a STEM-capable U.S. workforce: A policy companion statement to Science and Engineering Indicators 2018. Arlington, VA: National Science Foundation. Retrieved from https://www.nsf.gov/nsb/sei/companion-brief/NSB-2018-7.pdf

National Science Foundation. (2018). Building the future: Investing in discovery and innovation - NSF strategic plan for fiscal years (FY) 2018 - 2022 (NSF18045). Retrieved from https://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf18045.

National Science Foundation. (n.d.). Broadening participation. Retrieved from https://www.nsf.gov/od/broadeningparticipation/bp.jsp.

III. AWARD INFORMATION

Anticipated Type of Award: Standard Grant or Continuing Grant

Estimated Number of Awards: 22 to 30

ITEST expects to fund between 22 and 30 awards per year depending on the type of proposal and funding level.

  • 6 to 8 awards for Exploring Theory and Design Principles (ETD) with durations up to three years and budgets up to $400,000;
  • 8 to 10 awards for Developing and Testing Innovations (DTI) with durations up to four years and budgets up to $1,500,000;
  • 3 to 5 awards for Scaling, Expanding, and Iterating Innovations (SEI) with durations up to five years and budgets up to $3,000,000;
  • 2 to 3 awards for Syntheses with durations up to two years and budgets up to $300,000; and
  • 2 to 3 awards for Conferences with durations of one year and budgets up to $100,000.
  • In addition, ITEST intends to fund one Resource Center with duration up to three years and total funding up to $4,000,000 in FY 2020. This award may be made as a continuing award.

Anticipated Funding Amount: $25,000,000 to $30,000,000

NSF anticipates having approximately $25,000,000 to $30,000,000 available for the FY20 competition and a comparable amount each year thereafter.

Estimated program budget, number of awards, and average award size/duration are subject to the availability of funds.

IV. ELIGIBILITY INFORMATION

Who May Submit Proposals:

The categories of proposers eligible to submit proposals to the National Science Foundation are identified in the NSF Proposal & Award Policies & Procedures Guide (PAPPG), Chapter I.E.

Who May Serve as PI:

There are no restrictions or limits.

Limit on Number of Proposals per Organization:

There are no restrictions or limits.

Limit on Number of Proposals per PI or Co-PI:

There are no restrictions or limits.

V. PROPOSAL PREPARATION AND SUBMISSION INSTRUCTIONS

A. Proposal Preparation Instructions

Full Proposal Preparation Instructions: Proposers may opt to submit proposals in response to this Program Solicitation via FastLane, Research.gov, or Grants.gov.

  • Full proposals submitted via FastLane: Proposals submitted in response to this program solicitation should be prepared and submitted in accordance with the general guidelines contained in the NSF Proposal & Award Policies & Procedures Guide (PAPPG). The complete text of the PAPPG is available electronically on the NSF website at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg. Paper copies of the PAPPG may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov. Proposers are reminded to identify this program solicitation number in the program solicitation block on the NSF Cover Sheet For Proposal to the National Science Foundation. Compliance with this requirement is critical to determining the relevant proposal processing guidelines. Failure to submit this information may delay processing.
  • Full Proposals submitted via Research.gov: Proposals submitted in response to this program solicitation should be prepared and submitted in accordance with the general guidelines contained in the NSF Proposal and Award Policies and Procedures Guide (PAPPG). The complete text of the PAPPG is available electronically on the NSF website at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg. Paper copies of the PAPPG may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov. The Prepare New Proposal setup will prompt you for the program solicitation number.
  • Full proposals submitted via Grants.gov: Proposals submitted in response to this program solicitation via Grants.gov should be prepared and submitted in accordance with the NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov. The complete text of the NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: (https://www.nsf.gov/publications/pub_summ.jsp?ods_key=grantsgovguide). To obtain copies of the Application Guide and Application Forms Package, click on the Apply tab on the Grants.gov site, then click on the Apply Step 1: Download a Grant Application Package and Application Instructions link and enter the funding opportunity number, (the program solicitation number without the NSF prefix) and press the Download Package button. Paper copies of the Grants.gov Application Guide also may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov.

In determining which method to utilize in the electronic preparation and submission of the proposal, please note the following:

Collaborative Proposals. All collaborative proposals submitted as separate submissions from multiple organizations must be submitted via the NSF FastLane system. PAPPG Chapter II.D.3 provides additional information on collaborative proposals.

See PAPPG Chapter II.C.2 for guidance on the required sections of a full research proposal submitted to NSF. Please note that the proposal preparation instructions provided in this program solicitation may deviate from the PAPPG instructions.

The following information supplements the standard PAPPG or NSF Grants.gov Application Guide proposal preparation guidelines:

Cover Sheet: Prospective PIs should complete this sheet with all the requested information. If the proposer requests that another NSF program also consider the proposal, that must be indicated on the Cover Sheet. Also, in the title section, please begin the title with the type of project being submitted (i.e., Exploring Theory and Design Principles [ETD]; Developing and Testing Innovations [DTI]; Scaling, Expanding, and Iterating Innovations [SEI]; Synthesis; Conference; or Resource Center), followed by a colon and the title of the proposal. In addition, please make sure to check the human subjects box; no awards will be made without Institutional Review Board (IRB) approval or exemption.

Project Summary: A one-page Project Summary must be provided, which consists of three parts: (1) an overview, (2) a statement on the intellectual merit of the proposed activity, and (3) a statement on the broader impacts of the proposed activity. The first sentence of the overview must indicate the type of ITEST project being submitted. The overview must describe the STEM content emphases; the approach to be designed, implemented, and evaluated; and the participants to be targeted, including the age ranges or grade levels of student participants. Proposals must contain separate statements on intellectual merit and broader impacts in the Project Summary and the Project Description. Refer to the PAPPG for additional information at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg.

Project Description: This section is limited to a maximum of 15 pages. A proposal must respond fully to the ITEST Program Description in this program solicitation. The Project Description must address the following elements in any order:

Project Overview, Rationale, and Importance: The proposal must show how the project addresses critical STEM educational needs and the potential for intellectual merit and broader impacts within the context of the ITEST purpose. The proposal must provide an overview of the project goals or objectives and a rationale for how the work will improve knowledge of and interest in STEM/ICT career pathways for students and advance teachers’ understanding of STEM/ICT content and career pathways. The proposal must also address how the planned STEM education innovations differ from existing practice and why the study has the potential to improve student and teacher learning and other educational outcomes beyond what current practices provide.

Results from Prior NSF Support: In cases where the prospective PI or any Co-PI has received more than one award (excluding amendments to existing awards), please report only the one award that is most closely related to the proposal.

Expertise and Management: The project team should reflect the types of expertise needed to successfully implement and manage the project, such as interdisciplinary teams of STEM education researchers, development experts, school district personnel, or experienced teachers; STEM content experts or researchers; researchers in career and workforce development, psychology, sociology, anthropology, or any other field related to the work. An advisory group or consultants who can provide guidance in research design and methodologies, including quantitative or qualitative research methods, implementation, or development of measurement instruments are highly recommended.

Budget: PIs are encouraged to include funds in the budget for two people to attend the annual ITEST PI meeting for each year of the project.

Other Sections in Addition to the 15-page Project Description include:

References Cited: Any literature cited should be specifically related to the proposed project, and the Project Description should make clear how each reference has played a role in design of the project.

Biographical Sketches: Biographical information (no more than two pages each) must be provided for each individual identified as senior personnel. Biographical sketches should adhere to the format outlined in the PAPPG, Chapter II, Section C.2.f. (https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg).

Special Information/Supplementary Documentation: The only items permitted in the Supplementary information section are (1) letters of collaboration, (2) the data management plan, and (3) the postdoctoral researcher mentoring plan (if applicable). Refer to the PAPPG for additional information at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg.

Data Management Plan: Proposers should provide a detailed data management plan. Transparency requires that Federal agencies share how they are maximizing outcomes of Federal STEM investments and activities and ensuring broad benefit to the public. Proposers are highly encouraged to review the EHR Directorate-specific data management plan guidance, which can be accessed at https://www.nsf.gov/bfa/dias/policy/dmpdocs/ehr.pdf.

Appendices: Not permitted. The 15-page project description must contain all the information needed to describe the project. Proposals submitted with an appendix will be returned without review.

B. Budgetary Information

Cost Sharing:

Inclusion of voluntary committed cost sharing is prohibited.

Other Budgetary Limitations:

Major research equipment purchases are not supported. The ITEST program limits the purchase of equipment to software, probes, and specialized equipment needed to implement a specific project. General purpose equipment, such as computers, notepads, and cellphones, is not supported.

C. Due Dates

  • Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):

         August 19, 2019

         August 14, 2020

         August 13, 2021

D. FastLane/Research.gov/Grants.gov Requirements

For Proposals Submitted Via FastLane or Research.gov:

To prepare and submit a proposal via FastLane, see detailed technical instructions available at: https://www.fastlane.nsf.gov/a1/newstan.htm. To prepare and submit a proposal via Research.gov, see detailed technical instructions available at: https://www.research.gov/research-portal/appmanager/base/desktop?_nfpb=true&_pageLabel=research_node_display&_nodePath=/researchGov/Service/Desktop/ProposalPreparationandSubmission.html. For FastLane or Research.gov user support, call the FastLane and Research.gov Help Desk at 1-800-673-6188 or e-mail fastlane@nsf.gov or rgov@nsf.gov. The FastLane and Research.gov Help Desk answers general technical questions related to the use of the FastLane and Research.gov systems. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this funding opportunity.

For Proposals Submitted Via Grants.gov:

Before using Grants.gov for the first time, each organization must register to create an institutional profile. Once registered, the applicant's organization can then apply for any federal grant on the Grants.gov website. Comprehensive information about using Grants.gov is available on the Grants.gov Applicant Resources webpage: http://www.grants.gov/web/grants/applicants.html. In addition, the NSF Grants.gov Application Guide (see link in Section V.A) provides instructions regarding the technical preparation of proposals via Grants.gov. For Grants.gov user support, contact the Grants.gov Contact Center at 1-800-518-4726 or by email: support@grants.gov. The Grants.gov Contact Center answers general technical questions related to the use of Grants.gov. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this solicitation.

Submitting the Proposal: Once all documents have been completed, the Authorized Organizational Representative (AOR) must submit the application to Grants.gov and verify the desired funding opportunity and agency to which the application is submitted. The AOR must then sign and submit the application to Grants.gov. The completed application will be transferred to the NSF FastLane system for further processing.

Proposers that submitted via FastLane or Research.gov may use Research.gov to verify the status of their submission to NSF. For proposers that submitted via Grants.gov, until an application has been received and validated by NSF, the Authorized Organizational Representative may check the status of an application on Grants.gov. After proposers have received an e-mail notification from NSF, Research.gov should be used to check the status of an application.

VI. NSF PROPOSAL PROCESSING AND REVIEW PROCEDURES

Proposals received by NSF are assigned to the appropriate NSF program for acknowledgement and, if they meet NSF requirements, for review. All proposals are carefully reviewed by a scientist, engineer, or educator serving as an NSF Program Officer, and usually by three to ten other persons outside NSF either as ad hoc reviewers, panelists, or both, who are experts in the particular fields represented by the proposal. These reviewers are selected by Program Officers charged with oversight of the review process. Proposers are invited to suggest names of persons they believe are especially well qualified to review the proposal and/or persons they would prefer not review the proposal. These suggestions may serve as one source in the reviewer selection process at the Program Officer's discretion. Submission of such names, however, is optional. Care is taken to ensure that reviewers have no conflicts of interest with the proposal. In addition, Program Officers may obtain comments from site visits before recommending final action on proposals. Senior NSF staff further review recommendations for awards. A flowchart that depicts the entire NSF proposal and award process (and associated timeline) is included in PAPPG Exhibit III-1.

A comprehensive description of the Foundation's merit review process is available on the NSF website at: https://www.nsf.gov/bfa/dias/policy/merit_review/.

Proposers should also be aware of core strategies that are essential to the fulfillment of NSF's mission, as articulated in Building the Future: Investing in Discovery and Innovation - NSF Strategic Plan for Fiscal Years (FY) 2018 – 2022 (https://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf18045). These strategies are integrated in the program planning and implementation process, of which proposal review is one part. NSF's mission is particularly well-implemented through the integration of research and education and broadening participation in NSF programs, projects, and activities.

One of the strategic objectives in support of NSF's mission is to foster integration of research and education through the programs, projects, and activities it supports at academic and research institutions. These institutions must recruit, train, and prepare a diverse STEM workforce to advance the frontiers of science and participate in the U.S. technology-based economy. NSF's contribution to the national innovation ecosystem is to provide cutting-edge research under the guidance of the Nation's most creative scientists and engineers. NSF also supports development of a strong science, technology, engineering, and mathematics (STEM) workforce by investing in building the knowledge that informs improvements in STEM teaching and learning.

NSF's mission calls for the broadening of opportunities and expanding participation of groups, institutions, and geographic regions that are underrepresented in STEM disciplines, which is essential to the health and vitality of science and engineering. NSF is committed to this principle of diversity and deems it central to the programs, projects, and activities it considers and supports.

A. Merit Review Principles and Criteria

The National Science Foundation strives to invest in a robust and diverse portfolio of projects that creates new knowledge and enables breakthroughs in understanding across all areas of science and engineering research and education. To identify which projects to support, NSF relies on a merit review process that incorporates consideration of both the technical aspects of a proposed project and its potential to contribute more broadly to advancing NSF's mission "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense; and for other purposes." NSF makes every effort to conduct a fair, competitive, transparent merit review process for the selection of projects.

1. Merit Review Principles

These principles are to be given due diligence by PIs and organizations when preparing proposals and managing projects, by reviewers when reading and evaluating proposals, and by NSF program staff when determining whether or not to recommend proposals for funding and while overseeing awards. Given that NSF is the primary federal agency charged with nurturing and supporting excellence in basic research and education, the following three principles apply:

  • All NSF projects should be of the highest quality and have the potential to advance, if not transform, the frontiers of knowledge.
  • NSF projects, in the aggregate, should contribute more broadly to achieving societal goals. These "Broader Impacts" may be accomplished through the research itself, through activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. The project activities may be based on previously established and/or innovative methods and approaches, but in either case must be well justified.
  • Meaningful assessment and evaluation of NSF funded projects should be based on appropriate metrics, keeping in mind the likely correlation between the effect of broader impacts and the resources provided to implement projects. If the size of the activity is limited, evaluation of that activity in isolation is not likely to be meaningful. Thus, assessing the effectiveness of these activities may best be done at a higher, more aggregated, level than the individual project.

With respect to the third principle, even if assessment of Broader Impacts outcomes for particular projects is done at an aggregated level, PIs are expected to be accountable for carrying out the activities described in the funded project. Thus, individual projects should include clearly stated goals, specific descriptions of the activities that the PI intends to do, and a plan in place to document the outputs of those activities.

These three merit review principles provide the basis for the merit review criteria, as well as a context within which the users of the criteria can better understand their intent.

2. Merit Review Criteria

All NSF proposals are evaluated through use of the two National Science Board approved merit review criteria. In some instances, however, NSF will employ additional criteria as required to highlight the specific objectives of certain programs and activities.

The two merit review criteria are listed below. Both criteria are to be given full consideration during the review and decision-making processes; each criterion is necessary but neither, by itself, is sufficient. Therefore, proposers must fully address both criteria. (PAPPG Chapter II.C.2.d(i). contains additional information for use by proposers in development of the Project Description section of the proposal). Reviewers are strongly encouraged to review the criteria, including PAPPG Chapter II.C.2.d(i), prior to the review of a proposal.

When evaluating NSF proposals, reviewers will be asked to consider what the proposers want to do, why they want to do it, how they plan to do it, how they will know if they succeed, and what benefits could accrue if the project is successful. These issues apply both to the technical aspects of the proposal and the way in which the project may make broader contributions. To that end, reviewers will be asked to evaluate all proposals against two criteria:

  • Intellectual Merit: The Intellectual Merit criterion encompasses the potential to advance knowledge; and
  • Broader Impacts: The Broader Impacts criterion encompasses the potential to benefit society and contribute to the achievement of specific, desired societal outcomes.

The following elements should be considered in the review for both criteria:

  1. What is the potential for the proposed activity to
    1. Advance knowledge and understanding within its own field or across different fields (Intellectual Merit); and
    2. Benefit society or advance desired societal outcomes (Broader Impacts)?
  2. To what extent do the proposed activities suggest and explore creative, original, or potentially transformative concepts?
  3. Is the plan for carrying out the proposed activities well-reasoned, well-organized, and based on a sound rationale? Does the plan incorporate a mechanism to assess success?
  4. How well qualified is the individual, team, or organization to conduct the proposed activities?
  5. Are there adequate resources available to the PI (either at the home organization or through collaborations) to carry out the proposed activities?

Broader impacts may be accomplished through the research itself, through the activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. NSF values the advancement of scientific knowledge and activities that contribute to achievement of societally relevant outcomes. Such outcomes include, but are not limited to: full participation of women, persons with disabilities, and underrepresented minorities in science, technology, engineering, and mathematics (STEM); improved STEM education and educator development at any level; increased public scientific literacy and public engagement with science and technology; improved well-being of individuals in society; development of a diverse, globally competitive STEM workforce; increased partnerships between academia, industry, and others; improved national security; increased economic competitiveness of the United States; and enhanced infrastructure for research and education.

Proposers are reminded that reviewers will also be asked to review the Data Management Plan and the Postdoctoral Researcher Mentoring Plan, as appropriate.

Additional Solicitation Specific Review Criteria

Prospective PIs must be aware of these criteria, address them, and decide where to include the relevant information within the Project Description, using a heading for this purpose. Reviewers will be asked to evaluate proposals on how broadening participation is described in the proposal, using these solicitation-specific review criteria. Please note that while there is overlap between the suggested aspects to address under the Broadening Participation and Innovative Learning Technologies required components, proposers and reviewers must answer the following questions:

  • To what extent does the proposal include explicit and adequate strategies for recruiting and selecting participants, particularly those from underserved and underrepresented populations in STEM professions, careers, or education pathways?
  • To what extent does the proposal describe compelling approaches to address diversity, access, equity, and inclusion in PreK-12 learning environments to ensure that all students, particularly those from underserved and underrepresented populations, actively engage with a broad range of STEM disciplines and fields that stimulate effective instruction and learning?
  • To what extent does the proposal describe specific research-informed instructional approaches to build on the challenges and strengths that students and their teachers bring to classrooms and informal learning environments, particularly with students from underserved and underrepresented populations in STEM fields?
  • To what extent does the proposal explain how planned innovations with the technology are developmentally and age-appropriate for students and suited for the specific populations of students and teachers, particularly for underserved and underrepresented student populations?

In addressing the solicitation-specific review criteria, ITEST especially welcomes proposals that will pair well with the efforts of NSF Inclusion across the Nation of Communities of Learners of Underrepresented Discoverers in Engineering and Science (NSF INCLUDES) to develop STEM talent from all sectors and groups in our society (one of NSF's 10 Big Ideas). Collaborations are encouraged between ITEST proposals and existing NSF INCLUDES projects, provided the collaboration strengthens both projects.

B. Review and Selection Process

Proposals submitted in response to this program solicitation will be reviewed by Ad hoc Review and/or Panel Review.

Reviewers will be asked to evaluate proposals using two National Science Board approved merit review criteria and, if applicable, additional program specific criteria. A summary rating and accompanying narrative will generally be completed and submitted by each reviewer and/or panel. The Program Officer assigned to manage the proposal's review will consider the advice of reviewers and will formulate a recommendation.

After scientific, technical and programmatic review and consideration of appropriate factors, the NSF Program Officer recommends to the cognizant Division Director whether the proposal should be declined or recommended for award. NSF strives to be able to tell applicants whether their proposals have been declined or recommended for funding within six months. Large or particularly complex proposals or proposals from new awardees may require additional review and processing time. The time interval begins on the deadline or target date, or receipt date, whichever is later. The interval ends when the Division Director acts upon the Program Officer's recommendation.

After programmatic approval has been obtained, the proposals recommended for funding will be forwarded to the Division of Grants and Agreements for review of business, financial, and policy implications. After an administrative review has occurred, Grants and Agreements Officers perform the processing and issuance of a grant or other agreement. Proposers are cautioned that only a Grants and Agreements Officer may make commitments, obligations or awards on behalf of NSF or authorize the expenditure of funds. No commitment on the part of NSF should be inferred from technical or budgetary discussions with an NSF Program Officer. A Principal Investigator or organization that makes financial or personnel commitments in the absence of a grant or cooperative agreement signed by the NSF Grants and Agreements Officer does so at their own risk.

Once an award or declination decision has been made, Principal Investigators are provided feedback about their proposals. In all cases, reviews are treated as confidential documents. Verbatim copies of reviews, excluding the names of the reviewers or any reviewer-identifying information, are sent to the Principal Investigator/Project Director by the Program Officer. In addition, the proposer will receive an explanation of the decision to award or decline funding.

VII. AWARD ADMINISTRATION INFORMATION

A. Notification of the Award

Notification of the award is made to the submitting organization by a Grants Officer in the Division of Grants and Agreements. Organizations whose proposals are declined will be advised as promptly as possible by the cognizant NSF organization administering the program. Verbatim copies of reviews, not including the identity of the reviewer, will be provided automatically to the Principal Investigator. (See Section VI.B. for additional information on the review process.)

B. Award Conditions

An NSF award consists of: (1) the award notice, which includes any special provisions applicable to the award and any numbered amendments thereto; (2) the budget, which indicates the amounts, by categories of expense, on which NSF has based its support (or otherwise communicates any specific approvals or disapprovals of proposed expenditures); (3) the proposal referenced in the award notice; (4) the applicable award conditions, such as Grant General Conditions (GC-1)* or Research Terms and Conditions*; and (5) any announcement or other NSF issuance that may be incorporated by reference in the award notice. Cooperative agreements also are administered in accordance with NSF Cooperative Agreement Financial and Administrative Terms and Conditions (CA-FATC) and the applicable Programmatic Terms and Conditions. NSF awards are electronically signed by an NSF Grants and Agreements Officer and transmitted electronically to the organization via e-mail.

*These documents may be accessed electronically on NSF's Website at https://www.nsf.gov/awards/managing/award_conditions.jsp?org=NSF. Paper copies may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov.

More comprehensive information on NSF Award Conditions and other important information on the administration of NSF awards is contained in the NSF Proposal & Award Policies & Procedures Guide (PAPPG) Chapter VII, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg.

C. Reporting Requirements

For all multi-year grants (including both standard and continuing grants), the Principal Investigator must submit an annual project report to the cognizant Program Officer no later than 90 days prior to the end of the current budget period. (Some programs or awards require submission of more frequent project reports). No later than 120 days following expiration of a grant, the PI also is required to submit a final project report, and a project outcomes report for the general public.

Failure to provide the required annual or final project reports, or the project outcomes report, will delay NSF review and processing of any future funding increments as well as any pending proposals for all identified PIs and co-PIs on a given award. PIs should examine the formats of the required reports in advance to assure availability of required data.

PIs are required to use NSF's electronic project-reporting system, available through Research.gov, for preparation and submission of annual and final project reports. Such reports provide information on accomplishments, project participants (individual and organizational), publications, and other specific products and impacts of the project. Submission of the report via Research.gov constitutes certification by the PI that the contents of the report are accurate and complete. The project outcomes report also must be prepared and submitted using Research.gov. This report serves as a brief summary, prepared specifically for the public, of the nature and outcomes of the project. This report will be posted on the NSF website exactly as it is submitted by the PI.

More comprehensive information on NSF Reporting Requirements and other important information on the administration of NSF awards is contained in the NSF Proposal & Award Policies & Procedures Guide (PAPPG) Chapter VII, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg.

For projects with external evaluators, PIs are encouraged to include reports of evaluation activities as part of their annual and final project reports.

VIII. AGENCY CONTACTS

Please note that the program contact information is current at the time of publishing. See program website for any updates to the points of contact.

General inquiries regarding this program should be made to:

  • Address questions to the Program, telephone: (703) 292-8620, email: DRLITEST@nsf.gov

For questions related to the use of FastLane or Research.gov, contact:

  • FastLane and Research.gov Help Desk: 1-800-673-6188

    FastLane Help Desk e-mail: fastlane@nsf.gov

    Research.gov Help Desk e-mail: rgov@nsf.gov

For questions relating to Grants.gov contact:

  • Grants.gov Contact Center: If the Authorized Organizational Representative (AOR) has not received a confirmation message from Grants.gov within 48 hours of submission of the application, please contact via telephone: 1-800-518-4726; e-mail: support@grants.gov.

IX. OTHER INFORMATION

The NSF website provides the most comprehensive source of information on NSF Directorates (including contact information), programs and funding opportunities. Use of this website by potential proposers is strongly encouraged. In addition, "NSF Update" is an information-delivery system designed to keep potential proposers and other interested parties apprised of new NSF funding opportunities and publications, important changes in proposal and award policies and procedures, and upcoming NSF Grants Conferences. Subscribers are informed through e-mail or the user's Web browser each time new publications are issued that match their identified interests. "NSF Update" also is available on NSF's website.

Grants.gov provides an additional electronic capability to search for Federal government-wide grant opportunities. NSF funding opportunities may be accessed via this mechanism. Further information on Grants.gov may be obtained at http://www.grants.gov.

ABOUT THE NATIONAL SCIENCE FOUNDATION

The National Science Foundation (NSF) is an independent Federal agency created by the National Science Foundation Act of 1950, as amended (42 USC 1861-75). The Act states the purpose of the NSF is "to promote the progress of science; [and] to advance the national health, prosperity, and welfare by supporting research and education in all fields of science and engineering."

NSF funds research and education in most fields of science and engineering. It does this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the US. The Foundation accounts for about one-fourth of Federal support to academic institutions for basic research.

NSF receives approximately 55,000 proposals each year for research, education and training projects, of which approximately 11,000 are funded. In addition, the Foundation receives several thousand applications for graduate and postdoctoral fellowships. The agency operates no laboratories itself but does support National Research Centers, user facilities, certain oceanographic vessels and Arctic and Antarctic research stations. The Foundation also supports cooperative research between universities and industry, US participation in international scientific and engineering efforts, and educational activities at every academic level.

Facilitation Awards for Scientists and Engineers with Disabilities (FASED) provide funding for special assistance or equipment to enable persons with disabilities to work on NSF-supported projects. See the NSF Proposal & Award Policies & Procedures Guide Chapter II.E.6 for instructions regarding preparation of these types of proposals.

The National Science Foundation has Telephonic Device for the Deaf (TDD) and Federal Information Relay Service (FIRS) capabilities that enable individuals with hearing impairments to communicate with the Foundation about NSF programs, employment or general information. TDD may be accessed at (703) 292-5090 and (800) 281-8749, FIRS at (800) 877-8339.

The National Science Foundation Information Center may be reached at (703) 292-5111.

The National Science Foundation promotes and advances scientific progress in the United States by competitively awarding grants and cooperative agreements for research and education in the sciences, mathematics, and engineering.

To get the latest information about program deadlines, to download copies of NSF publications, and to access abstracts of awards, visit the NSF Website at https://www.nsf.gov

  • Location:

2415 Eisenhower Avenue, Alexandria, VA 22314

  • For General Information
    (NSF Information Center):

(703) 292-5111

  • TDD (for the hearing-impaired):

(703) 292-5090

  • To Order Publications or Forms:

Send an e-mail to:

nsfpubs@nsf.gov

or telephone:

(703) 292-7827

  • To Locate NSF Employees:

(703) 292-5111

PRIVACY ACT AND PUBLIC BURDEN STATEMENTS

The information requested on proposal forms and project reports is solicited under the authority of the National Science Foundation Act of 1950, as amended. The information on proposal forms will be used in connection with the selection of qualified proposals; and project reports submitted by awardees will be used for program evaluation and reporting within the Executive Branch and to Congress. The information requested may be disclosed to qualified reviewers and staff assistants as part of the proposal review process; to proposer institutions/grantees to provide or obtain data regarding the proposal review process, award decisions, or the administration of awards; to government contractors, experts, volunteers and researchers and educators as necessary to complete assigned work; to other government agencies or other entities needing information regarding applicants or nominees as part of a joint application review process, or in order to coordinate programs or policy; and to another Federal agency, court, or party in a court or Federal administrative proceeding if the government is a party. Information about Principal Investigators may be added to the Reviewer file and used to select potential candidates to serve as peer reviewers or advisory committee members. See Systems of Records, NSF-50, "Principal Investigator/Proposal File and Associated Records," 69 Federal Register 26410 (May 12, 2004), and NSF-51, "Reviewer/Proposal File and Associated Records," 69 Federal Register 26410 (May 12, 2004). Submission of the information is voluntary. Failure to provide full and complete information, however, may reduce the possibility of receiving an award.

An agency may not conduct or sponsor, and a person is not required to respond to, an information collection unless it displays a valid Office of Management and Budget (OMB) control number. The OMB control number for this collection is 3145-0058. Public reporting burden for this collection of information is estimated to average 120 hours per response, including the time for reviewing instructions. Send comments regarding the burden estimate and any other aspect of this collection of information, including suggestions for reducing this burden, to:

Suzanne H. Plimpton
Reports Clearance Officer
Office of the General Counsel
National Science Foundation
Alexandria, VA 22314


