NSF 23-600: Assessing and Predicting Technology Outcomes (APTO)

Program Solicitation

Document Information

Document History

  • Posted: June 22, 2023

Program Solicitation NSF 23-600

National Science Foundation

Directorate for Technology, Innovation and Partnerships

Preliminary Proposal Due Date(s) (required) (due by 5 p.m. submitter's local time):

     August 21, 2023

Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):

     October 30, 2023

Important Information And Revision Notes

Any proposal submitted in response to this solicitation should be submitted in accordance with the NSF Proposal & Award Policies & Procedures Guide (PAPPG) that is in effect for the relevant due date to which the proposal is being submitted. The NSF PAPPG is regularly revised and it is the responsibility of the proposer to ensure that the proposal meets the requirements specified in this solicitation and the applicable version of the PAPPG. Submitting a proposal prior to a specified deadline does not negate this requirement.

Summary Of Program Requirements

General Information

Program Title:

Assessing and Predicting Technology Outcomes (APTO)

Synopsis of Program:

The National Science Foundation (NSF), through the Directorate for Technology, Innovation and Partnerships (TIP), is launching a new program on Assessing and Predicting Technology Outcomes (APTO) to assess how investments in science and technology research and development will contribute to specific outcomes for the Nation. The APTO program will support a cohort of projects that will work together to complement each other's research and development (R&D) efforts on technology outcome models to accurately describe three types of technology outcomes: technology capabilities, technology production, and technology use. These models should be able to predict future as well as past states of technology outcomes. Of particular interest are prediction models that are generalizable across multiple technology areas. The outcome of this work will help assess and evaluate the effectiveness of U.S. R&D investments and generate information that decision makers could use to strategize and optimize investments for advancing long-term U.S. competitiveness into the future.

The APTO program serves the TIP directorate's need for technology assessment to understand where the U.S. stands — as a whole and in individual regions — vis-à-vis competitiveness in the key technology focus areas named in Sec. 10387 of the CHIPS and Science Act. TIP is interested in answers to the question of which science and technology investments would offer the greatest impact in the key technology focus areas and would be essential to the long-term national security and economic prosperity of the United States. As a key aspect of TIP's technology assessment activity, the APTO program will bring together multidisciplinary teams to help develop the data, intellectual foundations, and analytics necessary to inform decision making.

The research community has accumulated important insights about the "rate and direction of inventive activity"1 as an aggregate economic good, and about what decision makers can do to increase the overall production of that good. Meanwhile, industry has immense experience with creating specific technologies and planning how to reach intended technology outcomes over periods of several years. The APTO program aims to expand on this knowledge base spanning academia and industry to better understand and predict the long-term evolution of specific technologies over a period of a few years to decades, and specifically model how intentional, purposeful investments can change that evolution.

APTO will fund research and development of causal models that accurately describe past and future technology outcomes, specifically the capabilities, production, and use of specific technologies. These models should be able to predict likely future outcomes for specific technologies and what intentional investments could reliably change or accelerate those outcomes by correctly capturing the various causal relationships. Building and testing these models will require significant amounts of specialized data gathered from a variety of sources, e.g., historical sources, experimentation, expert elicitation, and others. Data extraction and processing tools may need to be developed as part of that effort.

APTO will support a cohort of projects that will work in collaboration on research and development of Technology Outcome Models and in development/preparation of Data Sets and related Tools.


1 "The Rate and Direction of Inventive Activity", edited by R. R. Nelson, 1962. Princeton, NJ: Princeton University Press.

Cognizant Program Officer(s):

Please note that the following information is current at the time of publishing. See program website for any updates to the points of contact.

Applicable Catalog of Federal Domestic Assistance (CFDA) Number(s):

  • 47.084 --- NSF Technology, Innovation and Partnerships

Award Information

Anticipated Type of Award: Standard Grant or Cooperative Agreement

Estimated Number of Awards: 5 to 20

Anticipated Funding Amount: $30,000,000

Subject to availability of funds

Eligibility Information

Who May Submit Proposals:

Proposals may only be submitted by the following:

  • Institutions of Higher Education (IHEs) - Two- and four-year IHEs (including community colleges) accredited in, and having a campus located in the US, acting on behalf of their faculty members. Special Instructions for International Branch Campuses of US IHEs: If the proposal includes funding to be provided to an international branch campus of a US institution of higher education (including through use of subawards and consultant arrangements), the proposer must explain the benefit(s) to the project of performance at the international branch campus, and justify why the project activities cannot be performed at the US campus.
  • Non-profit, non-academic organizations: Independent museums, observatories, research laboratories, professional societies and similar organizations located in the U.S. that are directly associated with educational or research activities.
  • For-profit organizations: U.S.-based commercial organizations, including small businesses, with strong capabilities in scientific or engineering research or education and a passion for innovation.

Who May Serve as PI:

There are no restrictions or limits.

Limit on Number of Proposals per Organization:

There are no restrictions or limits.

Limit on Number of Proposals per PI or co-PI:

There are no restrictions or limits.

Proposal Preparation and Submission Instructions

A. Proposal Preparation Instructions

  • Letters of Intent: Not required
  • Preliminary Proposals: Submission of Preliminary Proposals is required. Please see the full text of this solicitation for further information.
  • Full Proposals:

B. Budgetary Information

  • Cost Sharing Requirements:

    Inclusion of voluntary committed cost sharing is prohibited.

  • Indirect Cost (F&A) Limitations:

    Not Applicable

  • Other Budgetary Limitations:

    Other budgetary limitations apply. Please see the full text of this solicitation for further information.

C. Due Dates

  • Preliminary Proposal Due Date(s) (required) (due by 5 p.m. submitter's local time):

         August 21, 2023

  • Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):

         October 30, 2023

Proposal Review Information Criteria

Merit Review Criteria:

National Science Board approved criteria. Additional merit review criteria apply. Please see the full text of this solicitation for further information.

Award Administration Information

Award Conditions:

Additional award conditions apply. Please see the full text of this solicitation for further information.

Reporting Requirements:

Standard NSF reporting requirements apply.

I. Introduction

The National Science Foundation (NSF), through the Directorate for Technology, Innovation and Partnerships (TIP), is launching a new program on Assessing and Predicting Technology Outcomes (APTO) to assess how investments in science and technology research and development will contribute to specific outcomes for the Nation. The APTO program will support a cohort of projects that will work together to complement each other's research and development (R&D) efforts on technology outcome models to accurately describe three types of technology outcomes: technology capabilities, technology production, and technology use. These models should be able to predict future as well as past states of technology outcomes. Of particular interest are prediction models that are generalizable across multiple technology areas. The outcome of this work will help assess and evaluate the effectiveness of U.S. R&D investments and generate information that decision makers could use to strategize and optimize investments for advancing long-term U.S. competitiveness into the future.

The APTO program serves the TIP directorate's need for technology assessment to understand where the U.S. stands — as a whole and in individual regions — vis-à-vis competitiveness in the key technology focus areas named in Sec. 10387 of the CHIPS and Science Act. TIP is interested in answers to the question of which science and technology investments would offer the greatest impact in the key technology focus areas and would be essential to the long-term national security and economic prosperity of the United States. As a key aspect of TIP's technology assessment activity, the APTO program will bring together multidisciplinary teams to help develop the data, intellectual foundations, and analytics necessary to inform decision making.

The research community has accumulated important insights about the "rate and direction of inventive activity"2 as an aggregate economic good, and about what decision makers can do to increase the overall production of that good. Meanwhile, industry has immense experience with creating specific technologies and planning how to reach intended technology outcomes over periods of several years. The APTO program aims to expand on this knowledge base spanning academia and industry to better understand and predict the long-term evolution of specific technologies over a period of a few years to decades, and specifically model how intentional, purposeful investments can change that evolution. This knowledge will set the foundation for future data-driven investment approaches that can advance U.S. competitiveness, foster innovation and benefit all of America.

APTO will fund a cohort of projects to conduct R&D in models that can accurately describe past and future outcomes including the capabilities, production, and use of specific technologies. These models will be able to predict likely outcomes for specific future technologies and what investments would reliably change or accelerate those outcomes. In doing so, the models would need to capture the important causal structure between investments and technology outcomes.

Building effective technology outcome models will likely require a generalized understanding of technology evolution across all technologies. However, the key technologies of interest to the APTO program include those from Sec. 10387 of the CHIPS and Science Act of 2022:

  • Artificial intelligence, machine learning, autonomy, and related advances;
  • High performance computing, semiconductors, and advanced computer hardware and software;
  • Quantum information science and technology;
  • Robotics, automation, and advanced manufacturing;
  • Technology for natural and anthropogenic disaster prevention or mitigation;
  • Advanced communications technology and immersive technology;
  • Biotechnology, medical technology, genomics, and synthetic biology;
  • Data storage, data management, distributed ledger technologies, and cybersecurity, including biometrics;
  • Advanced energy and industrial efficiency technologies, such as batteries and advanced nuclear technologies, including but not limited to for the purposes of electric generation; and
  • Advanced materials science, including composites, 2D materials, other next-generation materials, and related manufacturing technologies.

The models and methods produced in APTO will enable assessing and evaluating the effectiveness of R&D investments, and whether they are optimized for advancing long-term U.S. competitiveness, in these technology areas as well as others.


2 "The Rate and Direction of Inventive Activity", edited by R. R. Nelson, 1962. Princeton, NJ: Princeton University Press.

II. Program Description

The APTO program is interested in supporting R&D in causal models for assessing and predicting outcomes related to technology capabilities, technology production, and technology use. To that end, the program supports two Lines of Effort (LOE) — one on Technology Outcome Models and the other on Data Sets and Tools. The program will select a cohort of projects which, taken together, are best able to cover the full landscape of technology assessment and prediction across the key technology focus areas listed above.

Individual projects may focus on assessment and prediction of one or more technology outcomes (i.e., technology Capability, Production, Use) and on one or both Lines of Effort (i.e., Technology Outcome Models, Data Sets & Tools). The program will strive to cover the full breadth of areas, as best as possible, across the funded cohort of projects.

1. Technology Outcomes and Lines of Effort

The technical goals of the APTO program are methods, with related data sets and accompanying tools, that help to accurately assess and predict three types of technology outcomes:

  • Capabilities refers to the state-of-the-art technical performance of an artifact or method;
  • Production is the quantity of creation of an artifact or adoption of a method with a given level of Capabilities; this could be a global total or the total for different regions, industries, etc.; and
  • Use is the count of instances or volume of utilization of an artifact or method with a given level of Capabilities; this could also be a global total or the total for different regions, consumer groups, etc.

The APTO program will support the development of causal models that accurately describe how different types and scales of investments change technology outcomes, and by how much. For example, consider a model for high-performance computing that could be built with data until 1980 and make predictions on the state of the art of computer chip capabilities in 1990 — how many would be produced and what parts of the world would use them. Importantly, the models should be capable of including descriptions of what cause(s) led to these outcomes and be able to make accurate predictions of the outcomes when the purported cause(s) change. This causal understanding is essential for correctly assessing the effectiveness of previous investments and for identifying how future investments might change future outcomes.

Building and testing technology outcome models will likely require significant quantities of specialized data gathered from a variety of sources including historical data, data from experiments, expert elicitation, and others, to serve as model inputs or to test model outputs. Collecting and preparing these data for use in the models will likely require specialized tools. The APTO program will support two Lines of Effort (LOEs) – Development of Technology Outcome Models and Development of reference Data Sets along with associated Tools:

  • Technology Outcome Models – Causal models that accurately predict technology outcomes and what investments change them, along with accompanying software and a reference to data sets employed; and
  • Data Sets and Tools – Derived and/or newly created data sets used for creation and testing of Technology Outcome Models. May include data on technology outcomes or predictors of those outcomes, along with tools developed for efficiently creating such datasets.

The three technology outcomes and two LOEs combine to create several different types of activities that could take place in the APTO program. Examples include:

Capabilities

  • Technology Outcome Models: Create a model that predicts the probability that state-of-the-art telecommunication hardware in 2030 has bandwidth between X and Y TB/s. Train this model with data from 1730 to 1995 and test it on data from 1995 to 2010.
  • Data Sets and Tools: Data: e.g., aggregate historical data from lab notebooks and academic journals from 1730 to 2010 on telecommunication technologies' bandwidth, latency, and power requirements. Tools: e.g., data extraction software that reads a document about, say, a telecommunication artifact and extracts the artifact's bandwidth, latency, and power, expressed in TB/s, s, and Watts, together with the year of the document.

Production

  • Technology Outcome Models: Create a model that predicts the probability that the volume of U.S. production of quantum computers with between X and Y logically-independent qubits in 2035 is between N and M. The model includes parameters inferred from experimental data.
  • Data Sets and Tools: Data: e.g., data from a randomized, controlled experiment conducted with, say, 50 companies on increasing production in a single technology using 1 of 3 different techniques; the output is a record of production changes and techniques used. Tools: e.g., data processing software to make disparate company production data comparable across companies, for instance, by reconciling data formats, product ontologies, and reporting errors.

Use

  • Technology Outcome Models: Create a model that predicts the probability that country Z's consumption of nucleic acid synthesis between X and Y base pairs per unit time in 2040 is between J and K. One of the model's inputs as a predictor is the amount the technology is discussed in popular discourse.
  • Data Sets and Tools: Data: e.g., a dataset obtained by parsing popular press journalism from different countries from 1990 to 2000 for references to, say, synthetic biology, quantifying the density of references by country for each year. Tools: e.g., text understanding software that identifies when a technical document is referring to, say, nucleic acid synthesis, regardless of the exact terminology employed or the document language.

Proposals may focus on one or more Technology Outcomes and one or both LOEs. Given the nature of the effort involved, it is possible that the initial projects in the APTO program will be more focused on the Data Sets and Tools LOE, though proposals in the Technology Outcome Models LOE will be equally considered. Proposals spanning both LOEs are also encouraged, though not required. The APTO program will strive to fund a cohort of projects that covers all Technology Outcomes and both LOEs. Proposals may propose R&D in the above areas, or other mechanisms such as prize challenges issued to achieve specific objectives as part of the project effort.

Based on the preliminary proposals received, NSF may also choose to identify and suggest opportunities for teaming among proposers for the full proposal, including across proposals in different LOEs.

2. Technology Outcome Models

Projects focusing on technology outcome models should create models that accurately predict future technology outcomes. The ultimate goal is predictive causal models that describe what actions decision makers could take to change technology outcomes and by how much. Pursuit of this goal may require intermediate modeling efforts that only identify correlation instead of causation, that have low predictive power outside of the data upon which they were built, or that have unexplainable mechanics.

Model Inputs

The focus of APTO is on the role of monetary investments. It is possible that the effectiveness of such investments is heavily mediated and moderated by additional factors. Frequent candidates for predictors of future technology outcomes include:

  • R&D funding (public, private), in aggregate or specific to the technology;
  • how the R&D funding is spent (e.g., object-level research vs. collective research tools, grants vs. contracts, etc.);
  • academic publications;
  • patents;
  • publication norms;
  • intellectual property law;
  • the modularity of the technology;
  • scaling relationships in the design of the technology;
  • level of agreement on the dominant technology design;
  • education spending (public, private);
  • graduation rates and numbers of graduates;
  • population growth;
  • size of the working-age population;
  • historical industrial production of similar technologies;
  • market concentration of producers;
  • market concentration of consumers;
  • changes in outcomes of other technologies; and
  • changes in other outcomes for the same technology.

Model Outputs

Model outputs would typically be the probability that a technology outcome (Capability, Production, or Use) will be within several possible values, by a particular date. For example, model outputs may be expressed as probability distributions across possible outcome values for each future date. APTO is interested in model outputs that are at least 5 years into the model's future (see Accuracy Evaluation) — and ideally decades out.
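As an illustration, a model output of this form can be represented as a discrete probability distribution over outcome bins for a target year. The sketch below is a hypothetical representation, not a format prescribed by this solicitation; all names and values are invented.

```python
# Hypothetical representation of an APTO-style model output: a probability
# distribution over possible outcome values for one future date.
from dataclasses import dataclass

@dataclass
class OutcomePrediction:
    outcome_type: str    # "capability", "production", or "use"
    target_year: int     # the future date the prediction refers to
    bin_edges: list      # boundaries of possible outcome values (len = bins + 1)
    probabilities: list  # P(outcome falls in each bin); should sum to 1

    def prob_between(self, lo: float, hi: float) -> float:
        """Total probability that the outcome lies in [lo, hi),
        counting only bins fully contained in the interval."""
        total = 0.0
        for i, p in enumerate(self.probabilities):
            if self.bin_edges[i] >= lo and self.bin_edges[i + 1] <= hi:
                total += p
        return total

# e.g., predicted 2030 bandwidth of state-of-the-art telecom hardware (TB/s)
pred = OutcomePrediction(
    outcome_type="capability",
    target_year=2030,
    bin_edges=[0, 1, 10, 100, 1000],
    probabilities=[0.1, 0.3, 0.4, 0.2],
)
```

A statement such as "the probability that bandwidth in 2030 is between X and Y TB/s" then corresponds to summing the probability mass between those bounds.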

Accuracy Evaluation

Prediction accuracy improvements are a key goal of the program and will be considered both during proposal review and in the evaluation of multi-year efforts. Prediction accuracy should be tested using methods such as out-of-sample prediction, in which the model is trained with data through year N and then predicts the outcome in year N+5. Prediction accuracy should be quantified using generalized metrics that comprehensively describe the information gap between predicted values and true outcomes (e.g., Kullback-Leibler divergence, cross-entropy, log-loss). These may be supplemented by additional metrics of accuracy.
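The metrics named above can be computed directly from a model's predicted distribution and the realized outcome. Below is a minimal sketch assuming a model that outputs one probability per outcome bin; the numbers are invented.

```python
# Minimal sketch of distribution-based accuracy metrics for a model that
# emits one probability per outcome bin.
import math

def log_loss(predicted_probs, realized_bin):
    """Negative log probability assigned to the bin that actually occurred
    (equivalent to cross-entropy against a one-hot outcome)."""
    return -math.log(predicted_probs[realized_bin])

def kl_divergence(p, q):
    """Information gap D_KL(p || q) between two distributions over the same
    outcome bins (p: true/empirical, q: predicted); assumes q > 0 where p > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Out-of-sample style check: the model was fit on data through year N and
# emits a distribution for year N+5; the realized outcome fell in bin 2.
predicted = [0.1, 0.2, 0.5, 0.2]
print(log_loss(predicted, 2))                              # lower is better
print(kl_divergence([0.0, 0.0, 1.0, 0.0], predicted))      # lower is better
```

With a one-hot "true" distribution, the KL divergence reduces to the log-loss, which is why the solicitation can list these metrics together.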

It is anticipated that the vast majority of model development and testing will be done with historical data, which is useful for assessing past technology investments. However, for testing out-of-sample predictions, using past data creates the possibility of over-fitting, even if accidental. As such, model development and testing should make full use of techniques to increase the robustness of the resulting model. This can include training a model on data from a technology and testing on holdout data from the same technology, training a model on data from one set of technologies and then testing it on another set of technologies, and other relevant methods.

Benchmarks

APTO is interested in development of benchmarks for technology outcome models. For example, a "theory-free" approach may build extrapolation models using only historical trend data, such as ARIMA (Autoregressive Integrated Moving Average) models. Technology outcome models could be compared with such trend extrapolation models to measure their performance.
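As a concrete example of such a benchmark, the sketch below implements a random-walk-with-drift extrapolation, the drift-only special case of an ARIMA-style model; a full ARIMA baseline would typically use a library such as statsmodels. The series is invented.

```python
# A minimal "theory-free" trend-extrapolation benchmark: project the mean
# historical step forward (random walk with drift, i.e., ARIMA(0,1,0) + drift).
def drift_forecast(history, horizon):
    """Extrapolate `horizon` steps ahead using the mean per-step change."""
    drift = (history[-1] - history[0]) / (len(history) - 1)
    return [history[-1] + drift * k for k in range(1, horizon + 1)]

# e.g., an annual capability series trained "until year N", predicting N+1..N+5
series = [1.0, 1.4, 2.1, 2.9, 4.2, 5.0]
baseline = drift_forecast(series, horizon=5)
```

A proposed technology outcome model would then be credited only for accuracy beyond what such a trend-only baseline already achieves.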

Time Scale

Predictions and prediction accuracy may be reported at a time scale no shorter than annually, though models may choose to make use of faster time scales.

Time Horizon

Predictions should be made and evaluated at least 5 years into the future from the latest data used in the model. For example, if a technology outcome model is constructed using data until 1980, prediction accuracy from 1985 and onward should be reported.

For technology outcome models of Production and Use, the aim is understanding what actions decision makers can take to affect these outcomes prior to the Capability first being created.

Granularity and Generalization

The ultimate aim of the APTO program is accurate predictions across many key technologies, but those key technologies are specific: models should make predictions about Capabilities, Production or Use of specific key technologies, and not aggregated measures of technology as a whole like total factor productivity. However, technology outcome models may range in granularity from bespoke models for specific technologies (e.g., thin film photovoltaics) to coarser models for broader classes of technologies (e.g., all photovoltaics). It is possible (though not yet robustly demonstrated) that increased predictive accuracy for any key technology can be reliably accomplished through leveraging technical facts unique to that technology (e.g., photovoltaic capabilities being a function of modules, made from wafers, made from silicon). Generalizable models may require switching between technology-specific modes when specific information and frameworks are useful and available, and coarser-grained modes when they are not.

The APTO program is interested in assessing transferability of models, i.e., whether models developed for one key technology focus area could be rapidly transferred to another focus area.

3. Data Sets & Tools

Awards under the Data Sets & Tools LOE will create data sets, with associated tools, that are useful for creating and testing technology outcome models that accurately predict future technology outcomes. Data sets can be created on the technology outcomes to be predicted (Capabilities, Production, and Use), as well as possible predictors of technology outcomes; these will serve as model outputs or model inputs for the Technology Outcome Models LOE. Data sets can be created through aggregation, preparation, and curation of historical data, including through digitization, archival research, acquisition from data vendors, etc. Data sets can also be built through creating and collecting novel data, such as through experiments designed to test a particular hypothesized predictor.

Examples of possible data sources for outcomes and predictors include (this is an illustrative list and not meant to be exhaustive):

  • academic journals;
  • trade journals;
  • patents;
  • vendor product data sheets;
  • company accounting records;
  • government survey data; and
  • expert elicitation.

Important Metadata

Outcome data should be timestamped with when the outcome occurred, at a resolution of a year or finer, to be useful for testing (and likely development) of predictive models. Each outcome data point should also be described with as much additional specificity and metadata as possible (e.g., "thin-film photovoltaic with efficiency X created in year Y in city Z with technique N, according to journal article M"). Among other uses, such information will help in building an ontology of the technologies for which outcomes are being predicted.
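One possible shape for such a timestamped, metadata-rich outcome record is sketched below; the field names and example values are hypothetical, not a prescribed schema.

```python
# Hypothetical record format for a single outcome data point, carrying the
# timestamp and descriptive metadata discussed above.
from dataclasses import dataclass, field

@dataclass
class OutcomeDataPoint:
    outcome_type: str   # "capability", "production", or "use"
    technology: str     # e.g., "thin-film photovoltaic"
    value: float        # e.g., efficiency X
    units: str          # e.g., "percent"
    year: int           # when the outcome occurred (yearly resolution minimum)
    location: str = ""  # e.g., city Z
    technique: str = "" # e.g., technique N
    source: str = ""    # e.g., journal article M
    extra: dict = field(default_factory=dict)  # any further specificity

point = OutcomeDataPoint(
    outcome_type="capability",
    technology="thin-film photovoltaic",
    value=18.7, units="percent", year=2003,
    source="hypothetical journal article",
)
```

Recording technology names and techniques as explicit fields is one way such data could later feed into a technology ontology.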

Technology Coverage

Data sets covering a large swath of technologies are encouraged; depending on the level of resolution, one could identify thousands or even millions of technology domains. Data set size can be increased by increasing samples within a single, focused domain (e.g., from 1,000 data points to 1,000,000 data points on internal combustion engines) or by increasing the breadth of domains sampled (e.g., 1,000 data points on internal combustion engines plus 1,000 data points on search algorithms). Whether additional data are more valuable in a focused data set or a broad one will be determined by their utility in increasing the predictive accuracy of generalizable technology outcome models; investigation of how to make this trade-off is encouraged.

Geographic and Temporal Coverage

The APTO program is chiefly aimed at knowledge most relevant to U.S. decision makers. However, projects are encouraged to also incorporate data with geographic coverage beyond the United States or even OECD (Organisation for Economic Co-operation and Development) member nations, and temporal coverage extending back decades or more.

Tools

Data tools exist in conjunction with data sets. The APTO program is interested in the use of commercially available tools as well as the creation of new tools to enable access and use of relevant data sets.

Data tools may be needed that can scour large sets of documents, including technical documents, (i) to identify those that contain relevant information, such as documents pertaining to a specific technology, and (ii) to extract the relevant data points from those documents.
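A toy sketch of these two steps, with an invented keyword list and extraction pattern, might look like:

```python
# Toy sketch of (i) a relevance filter and (ii) a data-point extractor over a
# set of documents. The keywords and pattern are illustrative inventions.
import re

KEYWORDS = ("nucleic acid synthesis", "dna synthesis")
VALUE_PATTERN = re.compile(r"(\d+(?:\.\d+)?)\s*base pairs", re.IGNORECASE)

def extract(documents):
    points = []
    for doc in documents:
        text = doc.lower()
        if not any(k in text for k in KEYWORDS):  # (i) relevance filter
            continue
        m = VALUE_PATTERN.search(doc)             # (ii) data extraction
        if m:
            points.append(float(m.group(1)))
    return points

docs = [
    "A review of DNA synthesis reporting 120 base pairs per cycle.",
    "An unrelated article about turbines.",
]
```

In practice both steps would be far more sophisticated (and likely imperfect), which is exactly why the error-handling tools discussed below matter.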

Technical data, particularly capabilities data, are often presented in a variety of ways, such as expressing the same capability type in different units (e.g., pounds vs. kilograms). Data extraction tools will likely also be needed to automatically convert such disparate descriptions of data into a single representation to help with interpretation of data sets and making of technology outcome models.
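Such a conversion step can be as simple as a lookup table mapping each extracted unit expression onto a canonical one; the table below is deliberately tiny and illustrative.

```python
# Illustrative unit-normalization step: map disparate (value, unit) pairs
# extracted from documents onto a single canonical representation.
CANONICAL = {
    "lb": ("kg", 0.45359237),
    "pounds": ("kg", 0.45359237),
    "kg": ("kg", 1.0),
    "gb/s": ("tb/s", 0.001),
    "tb/s": ("tb/s", 1.0),
}

def normalize(value, unit):
    """Convert an extracted (value, unit) pair into canonical units."""
    target, factor = CANONICAL[unit.lower()]
    return value * factor, target

# e.g., two extracted capability figures that describe the same quantity
a = normalize(10.0, "lb")
b = normalize(4.5359237, "kg")
```

A real tool would need a far larger table, dimension checking, and handling of unknown units, but the canonical-representation idea is the same.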

As long as data discovery, extraction, and conversion are imperfect, they will produce false negatives, false positives, and other errors that must be accounted for when creating a data set. Data processing tools may need to be created to further clean or characterize the outputs of the other tools so that those outputs can become a technology data set used for building technology outcome models.

While data discovery, extraction, conversion, and processing are described here, other data tools may be needed to create an effective workflow for creating large, diverse data sets relevant for building models to predict technology outcomes. Data tools are not strictly required to be in the form of software nor entirely automatic. However, tools will ideally be able to be rapidly applied to large numbers of diverse documents and/or diverse technologies.

4. Considerations for Use-Inspired Research in the APTO Program

Gaps Between Capabilities, Production and Use

The temporal gap between Capabilities, Production and Use can be very small or very large. For example, it is possible to create a new machine learning capability and automatically push it out to a billion personal electronic devices overnight. In other cases, however, there can be decades between a capability first being created and its being produced at any notable scale, or between its production and its use by multiple users. How meaningful the distinctions between these technology outcomes are will thus depend on the technology and the scenario.

Ontologies of Technologies

Defining what constitutes a distinct technology can be fraught, especially as technology performance changes over time and thus the technology itself evolves. The complexities of ontologies for technology are thus at least somewhat analogous to the complexities of ontologies of biology, with questions of what constitutes a single species or even a single organism. It may be that making predictions that are accurate and interpretable will require also making advances in technology ontologies, and so submissions that aim to make such advances are welcomed, provided that the advances are leveraged to make more accurate predictions.

Ontologies of technologies may be hierarchical, with various kinds of aggregation. However, the objects of interest for the APTO program will be technologies themselves, and not overarching metrics of all technological progress like total factor productivity. Nevertheless, such overarching metrics may still be useful as a predictor of individual technologies' future Capabilities, Production, or Use.

Multidimensional Technology Performance

Technologies' capabilities are typically multidimensional, even if some dimensions are not explicitly reported or considered. For example, a battery's energy storage capacity may be the only feature reported, because its weight and volume are assumed, standardized, and/or deemed irrelevant. However, these unattended-to dimensions of capabilities may be barriers that prevent a technology from being more widely produced or used; once the capability improves the barriers may drop and the technology may be produced or used more.

Technical vs. Economic Performance

A technology in isolation can be described in terms of technical outputs vs. technical inputs. For example, a photovoltaic may be characterized by power produced per unit time per unit of surface area at solar noon. However, economic performance metrics can be additionally relevant: how much the technology costs to produce, acquire, etc. Economic metrics are potentially valuable data because they aggregate a large amount of information (e.g., about the many supply chains and manufacturing processes that feed into that technology). However, economic performance metrics are also fraught because they can vary for reasons unrelated to the technological capability itself (e.g., profit margins changing due to competition, products being sold at a loss to gain market share, changes in interest rates, input cost changes due to technical improvements further up the supply chain, etc.). The APTO program will consider prediction of both technical and economic metrics, but care should be taken that economic metrics are interpreted in light of these additional complexities.

Interactions and Dependencies Between Technologies

The objects of interest for the APTO program are individual key technologies and their outcomes. However, one technology's outcomes are likely impacted by those of another technology. For example, improvements in computing power made more maneuverable airplanes possible, because more maneuverable but unstable airplane designs could be stabilized through rapid surface adjustments made by a computer. As such, understanding and predicting individual technology outcomes may require tracking and anticipating multiple technologies' interactions. Such broader technology coverage would also allow for understanding and predicting scenarios of multiple technologies' outcomes together.

Outcome Frontiers vs. Populations

For assessment and prediction of Capabilities, the APTO program's primary object of interest is the frontier of what has ever been accomplished. This may be a single value for key technologies with only one performance dimension, or a multi-dimensional surface for key technologies with multiple performance dimensions. For assessment and prediction of Production or Use, the APTO program's object of interest is both the frontier and the population. For example, predictions can address not only whether and when Production or Use of a technology first begins in any location, but also how that Production or Use subsequently expands to other areas or increases in quantity within the areas where it is already present.
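The frontier/population distinction can be sketched computationally: given dated capability observations on a single performance dimension, the population is the full set of observations, while the frontier at any time is the best value achieved up to that time. The data below are hypothetical (year, performance) pairs, with higher performance being better.

```python
# Separate the capability frontier (best value ever achieved) from the
# population of all observations, for one performance dimension.
# Records are hypothetical (year, performance) pairs; higher is better.
records = [(2018, 10.0), (2019, 12.5), (2019, 9.0),
           (2020, 11.0), (2021, 14.0), (2022, 13.5)]

def frontier_by_year(observations):
    """Running maximum of performance, keyed by year."""
    best = float("-inf")
    frontier = {}
    for year, perf in sorted(observations):
        best = max(best, perf)
        frontier[year] = best
    return frontier

print(frontier_by_year(records))
# The population keeps all six points; the frontier advances only in
# 2019 (12.5) and 2021 (14.0) and stays flat in the other years.
```

For technologies with multiple performance dimensions, the running maximum would be replaced by a set of non-dominated points, and the frontier becomes a surface rather than a single value per year.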

Theoretical Advances

It is expected that advancing predictive accuracy may require significant advances in the theory of technological change. Submissions that aim to make such theoretical advances are welcomed, provided that such insights are leveraged to make more accurate predictions or demonstrate when theoretical limits have been reached.

5. Outputs

All models, data, and tools created for the APTO program should be made openly accessible and available. Certain data sets may be proprietary in nature and cannot be made open access; any such exceptions should be clearly justified.

III. Award Information

Up to a total of $30 million may be awarded annually, subject to the availability of funds. Awards may be 1 to 5 years in duration, with an expected annual budget of $500,000 to $4 million. Awards of longer than 1 year will be made as cooperative agreements. One-year awards may be made for pilots and for planning and preparation of larger subsequent efforts to be submitted to future APTO program solicitations.

For multi-year awards, support of each year beyond the first will be contingent on satisfactory review of progress towards targeted goals. Progress will be presented to NSF in a mid-year check-in and to NSF and/or external reviewers towards the end of each year. This review will determine if subsequent-year funding will be awarded.

IV. Eligibility Information

Who May Submit Proposals:

Proposals may only be submitted by the following:

  • Institutions of Higher Education (IHEs) - Two- and four-year IHEs (including community colleges) accredited in, and having a campus located in the US, acting on behalf of their faculty members. Special Instructions for International Branch Campuses of US IHEs: If the proposal includes funding to be provided to an international branch campus of a US institution of higher education (including through use of subawards and consultant arrangements), the proposer must explain the benefit(s) to the project of performance at the international branch campus, and justify why the project activities cannot be performed at the US campus.
  • Non-profit, non-academic organizations: Independent museums, observatories, research laboratories, professional societies and similar organizations located in the U.S. that are directly associated with educational or research activities.
  • For-profit organizations: U.S.-based commercial organizations, including small businesses, with strong capabilities in scientific or engineering research or education and a passion for innovation.

Who May Serve as PI:

There are no restrictions or limits.

Limit on Number of Proposals per Organization:

There are no restrictions or limits.

Limit on Number of Proposals per PI or co-PI:

There are no restrictions or limits.

V. Proposal Preparation And Submission Instructions

A. Proposal Preparation Instructions

Preliminary Proposals (required): Preliminary proposals are required and must be submitted via Research.gov, even if full proposals will be submitted via Grants.gov.

Submission of a preliminary proposal conveys an understanding on the part of the preliminary proposer that their preliminary proposal and associated contact information will be shared with other preliminary proposers for the purposes of teaming. All preliminary proposals should therefore additionally include, in the last line of the Overview section of the Project Summary, a contact email address to be shared along with the preliminary proposal for this purpose.

Preliminary proposals must include:

  1. Cover Sheet (1 page) (For planning purposes, a start date as soon as November 1, 2023 should be shown.)
  2. Project Summary (1 page)
    1. Overview
      1. LOEs addressed (Technology Outcome Models and/or Data Sets and Associated Data Creation/Preparation Tools)
      2. Technology Outcomes addressed (Capabilities, Production, and/or Use)
      3. Years of effort proposed (1-5) (Projects of more than 1 year will be awarded as cooperative agreements.)
      4. Abstract (up to 125 words)
      5. Contact email address
    2. Intellectual Merit
    3. Broader Impacts
  3. Project Description (up to 6 pages)
    1. Project (up to 3 pages): For each LOE addressed, describe the goal(s), basic idea(s), and technical approach. Clearly describe the models, data or tools targeted by the effort. If there are initial results, describe them. If there are technical challenges to the proposed approach, describe how they will be addressed.
    2. Personnel (1 page): Please provide a Table of Investigators, including the names, institutions and expertise (keywords) of the PI and all Faculty Associates. Non-funded collaborators may also be included on this table but should be clearly marked as such.
    3. Budget Planning (up to 2 pages): For each LOE addressed, include a rough estimate of costs per year, to no greater precision than $100,000 per year. Include 2-3 options for funding levels for each proposed year of effort. For each funding option, provide one or more targeted goals and an estimated probability for the goals being attained by year.
  4. References Cited (1 page)

Other sections specified in the PAPPG are not required and will not be reviewed. Preliminary proposals must not contain proprietary data.

Preliminary proposals will be reviewed internally by NSF, which will send a notice of encouragement or discouragement to submit a full proposal. Only those encouraged to submit a full proposal should consider investing the significant time and effort required to develop one.

Preliminary proposals will be returned without review if deemed inappropriate for funding by NSF, not responsive to this funding opportunity, or not within areas set forth for support in this program solicitation.

Full Proposal Preparation Instructions: Proposers may opt to submit proposals in response to this Program Solicitation via Research.gov or Grants.gov.

  • Full Proposals submitted via Research.gov: Proposals submitted in response to this program solicitation should be prepared and submitted in accordance with the general guidelines contained in the NSF Proposal and Award Policies and Procedures Guide (PAPPG). The complete text of the PAPPG is available electronically on the NSF website at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg. Paper copies of the PAPPG may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-8134 or by e-mail from nsfpubs@nsf.gov. The Prepare New Proposal setup will prompt you for the program solicitation number.
  • Full proposals submitted via Grants.gov: Proposals submitted in response to this program solicitation via Grants.gov should be prepared and submitted in accordance with the NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov. The complete text of the NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: (https://www.nsf.gov/publications/pub_summ.jsp?ods_key=grantsgovguide). To obtain copies of the Application Guide and Application Forms Package, click on the Apply tab on the Grants.gov site, then click on the Apply Step 1: Download a Grant Application Package and Application Instructions link and enter the funding opportunity number, (the program solicitation number without the NSF prefix) and press the Download Package button. Paper copies of the Grants.gov Application Guide also may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-8134 or by e-mail from nsfpubs@nsf.gov.

In determining which method to utilize in the electronic preparation and submission of the proposal, please note the following:

Collaborative Proposals. All collaborative proposals submitted as separate submissions from multiple organizations must be submitted via Research.gov. PAPPG Chapter II.E.3 provides additional information on collaborative proposals.

See PAPPG Chapter II.D.2 for guidance on the required sections of a full research proposal submitted to NSF. Please note that the proposal preparation instructions provided in this program solicitation may deviate from the PAPPG instructions.

B. Budgetary Information

Cost Sharing:

Inclusion of voluntary committed cost sharing is prohibited.

Other Budgetary Limitations:

Proposals for efforts longer than 1 year should include a year-by-year budget, with each year ranging from $500,000 to $4 million. For multi-year awards, support of each year beyond the first will be contingent on satisfactory review of progress towards intended goals.

C. Due Dates

  • Preliminary Proposal Due Date(s) (required) (due by 5 p.m. submitter's local time):

         August 21, 2023

  • Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):

         October 30, 2023

D. Research.gov/Grants.gov Requirements

For Proposals Submitted Via Research.gov:

To prepare and submit a proposal via Research.gov, see detailed technical instructions available at: https://www.research.gov/research-portal/appmanager/base/desktop?_nfpb=true&_pageLabel=research_node_display&_nodePath=/researchGov/Service/Desktop/ProposalPreparationandSubmission.html. For Research.gov user support, call the Research.gov Help Desk at 1-800-673-6188 or e-mail rgov@nsf.gov. The Research.gov Help Desk answers general technical questions related to the use of the Research.gov system. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this funding opportunity.

For Proposals Submitted Via Grants.gov:

Before using Grants.gov for the first time, each organization must register to create an institutional profile. Once registered, the applicant's organization can then apply for any federal grant on the Grants.gov website. Comprehensive information about using Grants.gov is available on the Grants.gov Applicant Resources webpage: https://www.grants.gov/web/grants/applicants.html. In addition, the NSF Grants.gov Application Guide (see link in Section V.A) provides instructions regarding the technical preparation of proposals via Grants.gov. For Grants.gov user support, contact the Grants.gov Contact Center at 1-800-518-4726 or by email: support@grants.gov. The Grants.gov Contact Center answers general technical questions related to the use of Grants.gov. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this solicitation.

Submitting the Proposal: Once all documents have been completed, the Authorized Organizational Representative (AOR) must submit the application to Grants.gov and verify the desired funding opportunity and agency to which the application is submitted. The AOR must then sign and submit the application to Grants.gov. The completed application will be transferred to Research.gov for further processing.

Proposers that submitted via Research.gov may use Research.gov to verify the status of their submission to NSF. For proposers that submitted via Grants.gov, until an application has been received and validated by NSF, the Authorized Organizational Representative may check the status of an application on Grants.gov. After proposers have received an e-mail notification from NSF, Research.gov should be used to check the status of an application.

VI. NSF Proposal Processing And Review Procedures

Proposals received by NSF are assigned to the appropriate NSF program for acknowledgement and, if they meet NSF requirements, for review. All proposals are carefully reviewed by a scientist, engineer, or educator serving as an NSF Program Officer, and usually by three to ten other persons outside NSF either as ad hoc reviewers, panelists, or both, who are experts in the particular fields represented by the proposal. These reviewers are selected by Program Officers charged with oversight of the review process. Proposers are invited to suggest names of persons they believe are especially well qualified to review the proposal and/or persons they would prefer not review the proposal. These suggestions may serve as one source in the reviewer selection process at the Program Officer's discretion. Submission of such names, however, is optional. Care is taken to ensure that reviewers have no conflicts of interest with the proposal. In addition, Program Officers may obtain comments from site visits before recommending final action on proposals. Senior NSF staff further review recommendations for awards. A flowchart that depicts the entire NSF proposal and award process (and associated timeline) is included in PAPPG Exhibit III-1.

A comprehensive description of the Foundation's merit review process is available on the NSF website at: https://www.nsf.gov/bfa/dias/policy/merit_review/.

Proposers should also be aware of core strategies that are essential to the fulfillment of NSF's mission, as articulated in Leading the World in Discovery and Innovation, STEM Talent Development and the Delivery of Benefits from Research - NSF Strategic Plan for Fiscal Years (FY) 2022 - 2026. These strategies are integrated in the program planning and implementation process, of which proposal review is one part. NSF's mission is particularly well-implemented through the integration of research and education and broadening participation in NSF programs, projects, and activities.

One of the strategic objectives in support of NSF's mission is to foster integration of research and education through the programs, projects, and activities it supports at academic and research institutions. These institutions must recruit, train, and prepare a diverse STEM workforce to advance the frontiers of science and participate in the U.S. technology-based economy. NSF's contribution to the national innovation ecosystem is to provide cutting-edge research under the guidance of the Nation's most creative scientists and engineers. NSF also supports development of a strong science, technology, engineering, and mathematics (STEM) workforce by investing in building the knowledge that informs improvements in STEM teaching and learning.

NSF's mission calls for the broadening of opportunities and expanding participation of groups, institutions, and geographic regions that are underrepresented in STEM disciplines, which is essential to the health and vitality of science and engineering. NSF is committed to this principle of diversity and deems it central to the programs, projects, and activities it considers and supports.

A. Merit Review Principles and Criteria

The National Science Foundation strives to invest in a robust and diverse portfolio of projects that creates new knowledge and enables breakthroughs in understanding across all areas of science and engineering research and education. To identify which projects to support, NSF relies on a merit review process that incorporates consideration of both the technical aspects of a proposed project and its potential to contribute more broadly to advancing NSF's mission "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense; and for other purposes." NSF makes every effort to conduct a fair, competitive, transparent merit review process for the selection of projects.

1. Merit Review Principles

These principles are to be given due diligence by PIs and organizations when preparing proposals and managing projects, by reviewers when reading and evaluating proposals, and by NSF program staff when determining whether or not to recommend proposals for funding and while overseeing awards. Given that NSF is the primary federal agency charged with nurturing and supporting excellence in basic research and education, the following three principles apply:

  • All NSF projects should be of the highest quality and have the potential to advance, if not transform, the frontiers of knowledge.
  • NSF projects, in the aggregate, should contribute more broadly to achieving societal goals. These "Broader Impacts" may be accomplished through the research itself, through activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. The project activities may be based on previously established and/or innovative methods and approaches, but in either case must be well justified.
  • Meaningful assessment and evaluation of NSF funded projects should be based on appropriate metrics, keeping in mind the likely correlation between the effect of broader impacts and the resources provided to implement projects. If the size of the activity is limited, evaluation of that activity in isolation is not likely to be meaningful. Thus, assessing the effectiveness of these activities may best be done at a higher, more aggregated, level than the individual project.

With respect to the third principle, even if assessment of Broader Impacts outcomes for particular projects is done at an aggregated level, PIs are expected to be accountable for carrying out the activities described in the funded project. Thus, individual projects should include clearly stated goals, specific descriptions of the activities that the PI intends to do, and a plan in place to document the outputs of those activities.

These three merit review principles provide the basis for the merit review criteria, as well as a context within which the users of the criteria can better understand their intent.

2. Merit Review Criteria

All NSF proposals are evaluated through use of the two National Science Board approved merit review criteria. In some instances, however, NSF will employ additional criteria as required to highlight the specific objectives of certain programs and activities.

The two merit review criteria are listed below. Both criteria are to be given full consideration during the review and decision-making processes; each criterion is necessary but neither, by itself, is sufficient. Therefore, proposers must fully address both criteria. (PAPPG Chapter II.D.2.d(i). contains additional information for use by proposers in development of the Project Description section of the proposal). Reviewers are strongly encouraged to review the criteria, including PAPPG Chapter II.D.2.d(i), prior to the review of a proposal.

When evaluating NSF proposals, reviewers will be asked to consider what the proposers want to do, why they want to do it, how they plan to do it, how they will know if they succeed, and what benefits could accrue if the project is successful. These issues apply both to the technical aspects of the proposal and the way in which the project may make broader contributions. To that end, reviewers will be asked to evaluate all proposals against two criteria:

  • Intellectual Merit: The Intellectual Merit criterion encompasses the potential to advance knowledge; and
  • Broader Impacts: The Broader Impacts criterion encompasses the potential to benefit society and contribute to the achievement of specific, desired societal outcomes.

The following elements should be considered in the review for both criteria:

  1. What is the potential for the proposed activity to
    1. Advance knowledge and understanding within its own field or across different fields (Intellectual Merit); and
    2. Benefit society or advance desired societal outcomes (Broader Impacts)?
  2. To what extent do the proposed activities suggest and explore creative, original, or potentially transformative concepts?
  3. Is the plan for carrying out the proposed activities well-reasoned, well-organized, and based on a sound rationale? Does the plan incorporate a mechanism to assess success?
  4. How well qualified is the individual, team, or organization to conduct the proposed activities?
  5. Are there adequate resources available to the PI (either at the home organization or through collaborations) to carry out the proposed activities?

Broader impacts may be accomplished through the research itself, through the activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. NSF values the advancement of scientific knowledge and activities that contribute to achievement of societally relevant outcomes. Such outcomes include, but are not limited to: full participation of women, persons with disabilities, and other underrepresented groups in science, technology, engineering, and mathematics (STEM); improved STEM education and educator development at any level; increased public scientific literacy and public engagement with science and technology; improved well-being of individuals in society; development of a diverse, globally competitive STEM workforce; increased partnerships between academia, industry, and others; improved national security; increased economic competitiveness of the United States; and enhanced infrastructure for research and education.

Proposers are reminded that reviewers will also be asked to review the Data Management Plan and the Postdoctoral Researcher Mentoring Plan, as appropriate.

Additional Solicitation Specific Review Criteria

In addition to the Intellectual Merit and Broader Impacts criteria, reviewers will be asked to address the following questions:

  1. Contribution to goals of the program: How well will the proposed research contribute to the goals of the program? For example, will research under the Technology Outcome Models LOE result in predictive causal models, or something else? Will research under the Data Sets and Tools LOE make data sets and tools relevant to the key technologies of interest, or to other technologies? Research that does not make immediate contributions to the ultimate program goals may still make high-impact contributions to those goals in the long term, but a theory is needed for how those long-term contributions will be made.
  2. Probability of success: What are the probabilities that the proposal's goals will be reached? What are the probabilities that accomplishing these goals will result in success at the goals of the program?
  3. Integration across LOEs: Will research under the Data Sets and Tools LOE usefully contribute to other research under the Technology Outcome Models LOE? Will research under the Technology Outcome Models LOE make use of advances under the Data Sets and Tools LOE? Integration across LOEs may be achieved within a single proposal or across a portfolio of proposals.

B. Review and Selection Process

Proposals submitted in response to this program solicitation will be reviewed by Ad hoc Review.

Reviewers will be asked to evaluate proposals using two National Science Board approved merit review criteria and, if applicable, additional program specific criteria. A summary rating and accompanying narrative will generally be completed and submitted by each reviewer and/or panel. The Program Officer assigned to manage the proposal's review will consider the advice of reviewers and will formulate a recommendation.

After scientific, technical and programmatic review and consideration of appropriate factors, the NSF Program Officer recommends to the cognizant Division Director whether the proposal should be declined or recommended for award. NSF strives to be able to tell applicants whether their proposals have been declined or recommended for funding within six months. Large or particularly complex proposals or proposals from new awardees may require additional review and processing time. The time interval begins on the deadline or target date, or receipt date, whichever is later. The interval ends when the Division Director acts upon the Program Officer's recommendation.

After programmatic approval has been obtained, the proposals recommended for funding will be forwarded to the Division of Grants and Agreements or the Division of Acquisition and Cooperative Support for review of business, financial, and policy implications. After an administrative review has occurred, Grants and Agreements Officers perform the processing and issuance of a grant or other agreement. Proposers are cautioned that only a Grants and Agreements Officer may make commitments, obligations or awards on behalf of NSF or authorize the expenditure of funds. No commitment on the part of NSF should be inferred from technical or budgetary discussions with a NSF Program Officer. A Principal Investigator or organization that makes financial or personnel commitments in the absence of a grant or cooperative agreement signed by the NSF Grants and Agreements Officer does so at their own risk.

Once an award or declination decision has been made, Principal Investigators are provided feedback about their proposals. In all cases, reviews are treated as confidential documents. Verbatim copies of reviews, excluding the names of the reviewers or any reviewer-identifying information, are sent to the Principal Investigator/Project Director by the Program Officer. In addition, the proposer will receive an explanation of the decision to award or decline funding.

VII. Award Administration Information

A. Notification of the Award

Notification of the award is made to the submitting organization by an NSF Grants and Agreements Officer. Organizations whose proposals are declined will be advised as promptly as possible by the cognizant NSF Program administering the program. Verbatim copies of reviews, not including the identity of the reviewer, will be provided automatically to the Principal Investigator. (See Section VI.B. for additional information on the review process.)

B. Award Conditions

An NSF award consists of: (1) the award notice, which includes any special provisions applicable to the award and any numbered amendments thereto; (2) the budget, which indicates the amounts, by categories of expense, on which NSF has based its support (or otherwise communicates any specific approvals or disapprovals of proposed expenditures); (3) the proposal referenced in the award notice; (4) the applicable award conditions, such as Grant General Conditions (GC-1)*; or Research Terms and Conditions* and (5) any announcement or other NSF issuance that may be incorporated by reference in the award notice. Cooperative agreements also are administered in accordance with NSF Cooperative Agreement Financial and Administrative Terms and Conditions (CA-FATC) and the applicable Programmatic Terms and Conditions. NSF awards are electronically signed by an NSF Grants and Agreements Officer and transmitted electronically to the organization via e-mail.

*These documents may be accessed electronically on NSF's Website at https://www.nsf.gov/awards/managing/award_conditions.jsp?org=NSF. Paper copies may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-8134 or by e-mail from nsfpubs@nsf.gov.

More comprehensive information on NSF Award Conditions and other important information on the administration of NSF awards is contained in the NSF Proposal & Award Policies & Procedures Guide (PAPPG) Chapter VII, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg.

Administrative and National Policy Requirements

Build America, Buy America

As expressed in Executive Order 14005, Ensuring the Future is Made in All of America by All of America's Workers (86 FR 7475), it is the policy of the executive branch to use terms and conditions of Federal financial assistance awards to maximize, consistent with law, the use of goods, products, and materials produced in, and services offered in, the United States.

Consistent with the requirements of the Build America, Buy America Act (Pub. L. 117-58, Division G, Title IX, Subtitle A, November 15, 2021), no funding made available through this funding opportunity may be obligated for an award unless all iron, steel, manufactured products, and construction materials used in the project are produced in the United States. For additional information, visit NSF's Build America, Buy America webpage.

Special Award Conditions:

Multi-year awards will be made as cooperative agreements. The cooperative agreement awards may include Special Conditions relating to the period of performance, statement of work, awardee responsibilities, NSF responsibilities, joint NSF-awardee responsibilities, funding and funding schedule, reporting requirements, and other conditions. PIs will be required to participate in several virtual meetings:

  1. Kickoff meeting: 2-day meeting within approximately first 30 days of the Award. All PIs of all awards.
  2. Check-in meeting: 1-day meeting with NSF midway through each award year. Each awarded PI individually.
  3. Evaluation meeting: 2-day meeting with NSF and reviewers in the final part of each award year. All PIs of all awards. The purpose of the evaluation meeting is to assess the progress the awardees have made towards their target goals. Each awardee will prepare briefing material (expected to be 10 pages or less) describing project accomplishments and make a short presentation, which will be followed by questions and answers. The reviewers will evaluate the awardee's progress towards its target goals. Taking into account the reviewers' input, NSF will determine whether the project will receive funding for the subsequent year.

C. Reporting Requirements

For all multi-year grants (including both standard and continuing grants), the Principal Investigator must submit an annual project report to the cognizant Program Officer no later than 90 days prior to the end of the current budget period. (Some programs or awards require submission of more frequent project reports). No later than 120 days following expiration of a grant, the PI also is required to submit a final project report, and a project outcomes report for the general public.

Failure to provide the required annual or final project reports, or the project outcomes report, will delay NSF review and processing of any future funding increments as well as any pending proposals for all identified PIs and co-PIs on a given award. PIs should examine the formats of the required reports in advance to assure availability of required data.

PIs are required to use NSF's electronic project-reporting system, available through Research.gov, for preparation and submission of annual and final project reports. Such reports provide information on accomplishments, project participants (individual and organizational), publications, and other specific products and impacts of the project. Submission of the report via Research.gov constitutes certification by the PI that the contents of the report are accurate and complete. The project outcomes report also must be prepared and submitted using Research.gov. This report serves as a brief summary, prepared specifically for the public, of the nature and outcomes of the project. This report will be posted on the NSF website exactly as it is submitted by the PI.

More comprehensive information on NSF Reporting Requirements and other important information on the administration of NSF awards is contained in the NSF Proposal & Award Policies & Procedures Guide (PAPPG) Chapter VII, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg.

VIII. Agency Contacts

Please note that the program contact information is current at the time of publishing. See program website for any updates to the points of contact.

General inquiries regarding this program should be made to:

For questions related to the use of NSF systems contact:

  • NSF Help Desk: 1-800-673-6188
  • Research.gov Help Desk e-mail: rgov@nsf.gov

For questions relating to Grants.gov contact:

  • Grants.gov Contact Center: If the Authorized Organizational Representative (AOR) has not received a confirmation message from Grants.gov within 48 hours of submission of the application, please contact via telephone: 1-800-518-4726; e-mail: support@grants.gov.

IX. Other Information

The NSF website provides the most comprehensive source of information on NSF Directorates (including contact information), programs and funding opportunities. Use of this website by potential proposers is strongly encouraged. In addition, "NSF Update" is an information-delivery system designed to keep potential proposers and other interested parties apprised of new NSF funding opportunities and publications, important changes in proposal and award policies and procedures, and upcoming NSF Grants Conferences. Subscribers are informed through e-mail or the user's Web browser each time new publications are issued that match their identified interests. "NSF Update" also is available on NSF's website.

Grants.gov provides an additional electronic capability to search for Federal government-wide grant opportunities. NSF funding opportunities may be accessed via this mechanism. Further information on Grants.gov may be obtained at https://www.grants.gov.

About The National Science Foundation

The National Science Foundation (NSF) is an independent Federal agency created by the National Science Foundation Act of 1950, as amended (42 USC 1861-75). The Act states the purpose of the NSF is "to promote the progress of science; [and] to advance the national health, prosperity, and welfare by supporting research and education in all fields of science and engineering."

NSF funds research and education in most fields of science and engineering. It does this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the US. The Foundation accounts for about one-fourth of Federal support to academic institutions for basic research.

NSF receives approximately 55,000 proposals each year for research, education and training projects, of which approximately 11,000 are funded. In addition, the Foundation receives several thousand applications for graduate and postdoctoral fellowships. The agency operates no laboratories itself but does support National Research Centers, user facilities, certain oceanographic vessels and Arctic and Antarctic research stations. The Foundation also supports cooperative research between universities and industry, US participation in international scientific and engineering efforts, and educational activities at every academic level.

Facilitation Awards for Scientists and Engineers with Disabilities (FASED) provide funding for special assistance or equipment to enable persons with disabilities to work on NSF-supported projects. See the NSF Proposal & Award Policies & Procedures Guide Chapter II.F.7 for instructions regarding preparation of these types of proposals.

The National Science Foundation has Telephonic Device for the Deaf (TDD) and Federal Information Relay Service (FIRS) capabilities that enable individuals with hearing impairments to communicate with the Foundation about NSF programs, employment or general information. TDD may be accessed at (703) 292-5090 and (800) 281-8749, FIRS at (800) 877-8339.

The National Science Foundation Information Center may be reached at (703) 292-5111.

The National Science Foundation promotes and advances scientific progress in the United States by competitively awarding grants and cooperative agreements for research and education in the sciences, mathematics, and engineering.

To get the latest information about program deadlines, to download copies of NSF publications, and to access abstracts of awards, visit the NSF website at https://www.nsf.gov.

  • Location: 2415 Eisenhower Avenue, Alexandria, VA 22314

  • For General Information (NSF Information Center): (703) 292-5111

  • TDD (for the hearing-impaired): (703) 292-5090

  • To Order Publications or Forms: send an e-mail to nsfpubs@nsf.gov or telephone (703) 292-8134

  • To Locate NSF Employees: (703) 292-5111

Privacy Act And Public Burden Statements

The information requested on proposal forms and project reports is solicited under the authority of the National Science Foundation Act of 1950, as amended. The information on proposal forms will be used in connection with the selection of qualified proposals; and project reports submitted by awardees will be used for program evaluation and reporting within the Executive Branch and to Congress. The information requested may be disclosed to qualified reviewers and staff assistants as part of the proposal review process; to proposer institutions/grantees to provide or obtain data regarding the proposal review process, award decisions, or the administration of awards; to government contractors, experts, volunteers and researchers and educators as necessary to complete assigned work; to other government agencies or other entities needing information regarding applicants or nominees as part of a joint application review process, or in order to coordinate programs or policy; and to another Federal agency, court, or party in a court or Federal administrative proceeding if the government is a party. Information about Principal Investigators may be added to the Reviewer file and used to select potential candidates to serve as peer reviewers or advisory committee members. See System of Record Notices, NSF-50, "Principal Investigator/Proposal File and Associated Records," and NSF-51, "Reviewer/Proposal File and Associated Records." Submission of the information is voluntary. Failure to provide full and complete information, however, may reduce the possibility of receiving an award.

An agency may not conduct or sponsor, and a person is not required to respond to, an information collection unless it displays a valid Office of Management and Budget (OMB) control number. The OMB control number for this collection is 3145-0058. Public reporting burden for this collection of information is estimated to average 120 hours per response, including the time for reviewing instructions. Send comments regarding the burden estimate and any other aspect of this collection of information, including suggestions for reducing this burden, to:

Suzanne H. Plimpton
Reports Clearance Officer
Policy Office, Division of Institution and Award Support
Office of Budget, Finance, and Award Management
National Science Foundation
Alexandria, VA 22314