
NSF 25-541: Test Bed: Toward a Network of Programmable Cloud Laboratories (PCL Test Bed)

Program Solicitation

Document Information

Document History

  • Posted: July 16, 2025

Program Solicitation NSF 25-541


U.S. National Science Foundation

Directorate for Technology, Innovation and Partnerships

Directorate for Mathematical and Physical Sciences

Full Proposal Deadline(s) (due by 5 p.m. submitting organization's local time):

     November 20, 2025

Important Information And Revision Notes

Any proposal submitted in response to this solicitation should be submitted in accordance with the NSF Proposal & Award Policies & Procedures Guide (PAPPG) that is in effect for the relevant due date to which the proposal is being submitted. The NSF PAPPG is regularly revised and it is the responsibility of the proposer to ensure that the proposal meets the requirements specified in this solicitation and the applicable version of the PAPPG. Submitting a proposal prior to a specified deadline does not negate this requirement.

Summary Of Program Requirements

General Information

Program Title:

Test Bed: Toward a Network of Programmable Cloud Laboratories (PCL Test Bed)

Synopsis of Program:

Autonomous experimentation is poised to accelerate research and unlock critical scientific advances that bolster U.S. competitiveness and address pressing societal needs. Programmable Cloud Laboratories can execute automated workstreams, including self-driving lab workflows, to efficiently advance research goals through artificial intelligence (AI)-enabled experiment design, laboratory preparation, data collection, and data analysis and interpretation. While limited-scale efforts have shown promise, versatile programmable and self-driving labs capable of addressing complex research questions with trustworthy results will require coordinated technological advances and an engaged research community. Additional challenges include the limited availability of automated laboratory infrastructure, the need for standardized approaches to data collection for interoperability, advances in AI for data interpretation and experimental design, and more. This solicitation aims to address such gaps and realize the potential of autonomous experimentation.

The Test Bed: Toward a Network of Programmable Cloud Laboratories (PCL Test Bed) program seeks to establish and facilitate the operation of distributed autonomous laboratory facilities. These laboratories will combine technological and human capacity to enable integration, testing, evaluation, validation, and translation of cutting-edge technology solutions in automated science and engineering. The PCL Test Bed will consist of a set of Programmable Cloud Laboratory Nodes (PCL Nodes) that can be remotely accessed to run custom workflows specified and programmed by users and that are linked together via computational networking, shared science questions, and data and artificial intelligence (AI) standards.

The PCL Test Bed will facilitate access to advanced scientific equipment, accelerate translation and scaling of basic research into industry applications, enhance reproducibility and the exchange of experimental data, and assist in training the next generation of scientists and engineers in state-of-the-art methodologies. It will help develop community norms, best practices, and formal standards for automated laboratory procedures, workflows, and instrument testing and validation. It will also advance consistent practices for the collection, sharing, and use of metadata and training data and the use and exploitation of AI methods. This program will also support the development of automated laboratory methods, including self-driving autonomous experiment workflows.

Proposals must have a set of well-defined science drivers poised to derive significant benefit from targeted use of the PCL Test Bed capabilities, including but not limited to synthesis, optimization, and/or characterization experiments, in specific sub-disciplines within materials science, biotechnology, chemistry or other areas of science and engineering. These science drivers will guide the protocols and standards necessary for each node and facilitate collaboration across the Test Bed. For example, science drivers could include but are not limited to:

  • Materials science, materials synthesis and characterization efforts that advance U.S. competitiveness.
  • Biotechnology experiments in scalable, high-throughput engineering and characterization services for proteins or microbes with novel applications in the U.S. bioeconomy.
  • High-throughput experimentation for the accelerated development of catalysts to support more efficient chemical synthesis to address urgent national needs.

User Recruitment and On-Boarding Workshops will be a key component of the PCL Test Bed program. They will serve to recruit users to individual PCL Nodes and the Test Bed to help make progress on the proposed science drivers, provide access to technology, test the limits of the experimental set-up of the nodes, and explore new research opportunities between the PCL Nodes and institutions including, but not limited to, R2 universities, primarily undergraduate institutions (PUIs), and two-year institutions.

The PCL Test Bed will be available to researchers in academia as well as industry, including current and former awardees from the Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) programs. The SBIR/STTR portfolio of projects is available at https://seedfund.nsf.gov/portfolio.

PCL Nodes are expected to develop and implement plans for continued operation after the period of this award.

Cognizant Program Officer(s):

Please note that the following information is current at the time of publishing. See program website for any updates to the points of contact.

Applicable Catalog of Federal Domestic Assistance (CFDA) Number(s):

  • 47.041 --- Engineering
  • 47.049 --- Mathematical and Physical Sciences
  • 47.050 --- Geosciences
  • 47.070 --- Computer and Information Science and Engineering
  • 47.074 --- Biological Sciences
  • 47.075 --- Social, Behavioral and Economic Sciences
  • 47.076 --- STEM Education
  • 47.079 --- Office of International Science and Engineering
  • 47.083 --- Office of Integrative Activities (OIA)
  • 47.084 --- NSF Technology, Innovation and Partnerships

Award Information

Anticipated Type of Award: Cooperative Agreement

Estimated Number of Awards: 4 to 6

PCL Nodes

Up to 6 PCL Node awards will be made. Only existing shared instrument facilities, or labs of similar capabilities, may submit a proposal to this program. The current solicitation does not support ab initio creation of new lab facilities. Plans for expansion or modification of existing shared instrument facilities can be included with justification for how this will advance the use of the PCL Nodes for the benefit of the proposed science drivers and the PCL Test Bed.

A PCL Node is an independent site at one physical location. Each PCL Node will be funded up to $5M/year for 4 years, for a total budget not to exceed $20M per PCL Node. The requested amount should be well justified based on the specific science drivers, experimental capabilities, and a broad user community that will be supported and, in general, on the level of expertise and resources that will be made available by the PCL Node to proposed groups of users.

A PCL Node is expected to provide remote access to a suite of instruments via open standardized interfaces (e.g., application programmable interfaces, or APIs); adhere to best practices and standards developed in collaboration with other nodes across the PCL Test Bed, e.g., for instrument use and validation, metadata, data, and AI models; allow users to specify and run bespoke experiments; and provide the necessary technical expertise to assist users with the facility. The set of PCL Nodes will be linked together — via computational networking as well as common science problems, data and AI standards — to form the PCL Test Bed. PCL Nodes will support experiments in one or more Science Drivers covering sub-disciplines of science and/or engineering in biotechnology, chemistry, materials science, and others. Notably, NSF will require PCL Nodes to work together on the development and deployment of conceptually common experimental design protocols, laboratory protocols, and metadata and data standards.
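To make the idea of open, programmable interfaces concrete, the sketch below shows how a user might specify a bespoke workflow for submission to a PCL Node. This is purely illustrative: the `Workflow`/`WorkflowStep` structure, the instrument identifiers, the parameter names, and the example endpoint are all assumptions, not an actual Test Bed interface, which the funded nodes would define jointly.

```python
# Hypothetical sketch of a user-programmed workflow expressed against an
# open, standardized node API. All names and fields here are illustrative
# assumptions, not part of any actual PCL Node interface.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class WorkflowStep:
    instrument: str                        # instrument identifier the node exposes
    operation: str                         # operation advertised by that instrument
    params: dict = field(default_factory=dict)

@dataclass
class Workflow:
    name: str
    steps: list                            # ordered list of WorkflowStep
    metadata: dict = field(default_factory=dict)  # standardized metadata for interoperability

    def to_payload(self) -> str:
        """Serialize the workflow to the JSON body a node API might accept."""
        return json.dumps(asdict(self), indent=2)

# A toy synthesis-then-characterization workflow for a single node.
wf = Workflow(
    name="perovskite-screen-001",
    steps=[
        WorkflowStep("liquid-handler-1", "dispense", {"volume_uL": 50, "reagent": "PbI2"}),
        WorkflowStep("furnace-2", "anneal", {"temp_C": 120, "duration_min": 30}),
        WorkflowStep("xrd-1", "scan", {"two_theta_range": [10, 60]}),
    ],
    metadata={"driver": "advanced-materials", "units": "SI"},
)
payload = wf.to_payload()
# The payload would then be submitted to the node's (hypothetical) workflow
# endpoint, e.g. https://node.example.edu/api/v1/workflows
```

Because the payload is plain, schema-described JSON rather than instrument-specific scripts, the same workflow description could in principle be validated against another node's advertised capabilities, which is the kind of interoperability the shared Test Bed standards are meant to enable.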

Each PCL Node will recruit new users for the Test Bed via Recruitment Workshops to further research and translation activities within the science drivers, provide access to new technology to accelerate research progress at all levels and from across the entire United States to support all Americans, explore the capacity and capabilities of the Nodes and the Test Bed, and advance research programs at under-resourced institutions. Following the Recruitment Workshops, led by the PCL Node, On-Boarding Workshops will instruct the selected outside users on how to access the Node and Test Bed resources.

Workshops will be expected to be publicized to the appropriate user communities by the PIs. NSF may also support public dissemination of the progress of PCL Nodes moving to Recruitment Workshops by issuance of Dear Colleague Letters (DCLs).

Anticipated Funding Amount: $100,000,000

Awards will be made as Cooperative Agreements and funds will be allocated one year at a time, subject to availability of funds, quality of proposals received, and progress against proposer- and NSF-defined metrics.

Eligibility Information

Who May Submit Proposals:

Proposals may only be submitted by the following:

  • Institutions of Higher Education (IHEs): Two- and four-year IHEs (including community colleges) accredited in, and having a campus located in the US, acting on behalf of their faculty members. Special Instructions for International Branch Campuses of US IHEs: If the proposal includes funding to be provided to an international branch campus of a US institution of higher education (including through use of sub-awards and consultant arrangements), the proposer must explain the benefit(s) to the project of performance at the international branch campus, and justify why the project activities cannot be performed at the US campus.
  • Non-profit, non-academic organizations: Independent museums, observatories, research laboratories, professional societies and similar organizations located in the U.S. that are directly associated with educational or research activities.
  • For-profit organizations: U.S.-based commercial organizations, including small businesses, with strong capabilities in scientific or engineering research or education and a passion for innovation.

Who May Serve as PI:

There are no restrictions or limits.

Limit on Number of Proposals per Organization: 1

An institution may submit only a single proposal in response to this solicitation, as the lead institution.

If more than one proposal is submitted from an institution, the first proposal submitted from that institution will be considered, and remaining proposals will be returned without review.

An institution may serve as a non-lead institution on more than one proposal.

Limit on Number of Proposals per PI or co-PI: 1

An individual may serve as PI, co-PI, or Senior Personnel only on one proposal submitted in response to this solicitation.

Proposal Preparation and Submission Instructions

A. Proposal Preparation Instructions

  • Letters of Intent: Not required
  • Preliminary Proposal Submission: Not required
  • Full Proposals:

B. Budgetary Information

  • Cost Sharing Requirements:

    Inclusion of voluntary committed cost sharing is prohibited.

  • Indirect Cost (F&A) Limitations:

    Not Applicable

  • Other Budgetary Limitations:

    Not Applicable

C. Due Dates

  • Full Proposal Deadline(s) (due by 5 p.m. submitting organization's local time):

         November 20, 2025

Proposal Review Information Criteria

Merit Review Criteria:

National Science Board approved criteria. Additional merit review criteria apply. Please see the full text of this solicitation for further information.

Award Administration Information

Award Conditions:

Additional award conditions apply. Please see the full text of this solicitation for further information.

Reporting Requirements:

Standard NSF reporting requirements apply.

I. Introduction

A. Automated Laboratory Science

The NSF Directorate for Technology, Innovation and Partnerships (NSF TIP) seeks to establish test beds to advance the development, operation, integration, deployment, and demonstration of new, innovative critical technologies. These test beds would provide access to innovative technologies that support new modes of work across research, development, and industry while also providing the opportunity to demonstrate commercial viability of new technologies and prospects for establishment of new enterprises and/or industry sectors. Such test beds would additionally nurture a workforce with the skills needed to operate the test beds with the expectation that the operation would continue after NSF and any other Federal funding ends.

This NSF solicitation calls for the establishment of a Programmable Cloud Laboratories (PCL) Test Bed consisting of a network of Programmable Cloud Laboratory Nodes (PCL Nodes) with the key objectives of accelerating automated science and engineering and democratizing access to state-of-the-art instruments, including AI-based methods, across multiple domains of science and engineering. The National Security Commission on Emerging Biotechnology, created and tasked by Congress to examine the critical intersection of emerging biotechnology and national security, recommended that NSF establish a network of "cloud labs" to enable access to cutting-edge tools and accelerate data generation to support biotechnology R&D. This PCL Test Bed funding opportunity aligns with that recommendation.

Progress in automated laboratory science has already begun, as demonstrated by efforts such as the Acceleration Consortium at the University of Toronto (https://acceleration.utoronto.ca/), the NIST Autonomous Formulation Lab (AFL, https://www.nist.gov/ncnr/ncnr-facility-upgrades/autonomous-formulation-lab-afl), and CAPeX: Pioneer Center for Accelerating P2X Materials Discovery in Denmark (https://capex.dtu.dk/). Workshops organized by the NSF TIP Directorate in October 2023 and January 2024 (Creating a National Network of Cloud and Self-Driving Labs, https://events.mcs.cmu.edu/ac-sdl_workshop/, and the FUTURE Labs Workshop, https://research.ncsu.edu/futurelabsworkshop/program/) and by the NSF Computer and Information Science and Engineering Directorate (https://nsf-sdl-2023.github.io) have identified the need to develop supporting infrastructure for automated research facilities and to encourage early adopters in this space, in order to establish a successful and sustainable cohort of cloud labs with capabilities to execute distributed experiments as well as self-driving experiments.

B. The Programmable Cloud Laboratories Test Bed

The vision of the PCL Test Bed program is to establish and operate distributed lab facilities with technological and human capacity to enable integration, testing, evaluation, validation, and translation of cutting-edge technology solutions in automated science and engineering. The PCL Test Bed consists of a set of independent PCL Nodes. For the current solicitation, only organizations with pre-existing instrument facilities are eligible to apply to this program to be a PCL Node.

Some or all of the instruments in an existing facility may be dedicated for use by the PCL Node, including the possibility of time-sharing of existing instruments. Proposals may include costs of enhancing existing instruments or other lab capabilities in their budgets, and/or the cost of acquisition of new instruments. Any new instruments and/or capability acquired under this program must be fully available for use in this program, and the usage must comply with the guidelines specified in the Code of Federal Regulations, 2 CFR § 200.313, including provisions related to equipment use, as detailed in 2 CFR § 200.313(c).

The PCL Test Bed will focus initially on specific Science Drivers chosen from biotechnology, chemistry, materials science, or other well-justified areas of science and engineering that are able to benefit immediately from the cloud lab approach. Once awarded, PCL Node projects will collaborate to establish common protocols and standards for laboratory workflows, data management, instrument validation, experiment verification, use of AI tools and models, and for other areas. Each PCL Node will have the ability to support the data needs of the Node and be equipped to develop and/or implement new AI tools for improved operation and experimentation.

A key goal of this program is to advance the testing and scale-up in real-world environments of automated science and engineering technologies and techniques (e.g., reproducibility of experiments, curation and exchange of experimental results, AI/ML training data, etc.). In addition, the PCL Test Bed will serve to demonstrate to potential users new techniques and technologies in automated science and engineering in a neutral, realistic, and rigorous environment, minimizing their need to perform additional rounds of in-house evaluation prior to use. To fulfill these goals, PCL Nodes are expected to propose pilot activities for testing the reliability and reproducibility of methods for automated science and engineering; testing scale-up of experiments, including high-throughput runs; performing failure analyses; and assessing safety and research security of the Node.

The PCL Test Bed will help accelerate scientific discovery by improving access to automated instrumentation for design-build-test-learn cycles, facilitating the collection of experimental data for machine learning to be integrated with traditional scientific workflows, and enabling the development of "self-driving" labs. Researchers, including early-career scientists and engineers, would be empowered by the potentially increased pace of experimentation and by the anticipated improvements in the reproducibility of experiments. The capabilities provided by the Test Bed would help expedite commercialization and serve to lower costs for deep-technology startups.

II. Program Description

This solicitation invites organizations with pre-existing instrument facilities to apply to be a PCL Node, as part of a distributed PCL Test Bed. The solicitation also requires awardees to run Recruitment and On-Boarding Workshops to attract new users, including those from under-resourced institutions, who will conduct science on the PCL Nodes and Test Bed, leading to accomplishments in basic and translational research, education, and workforce development.

A PCL Node may incorporate some or all of the instruments at an existing facility, including time-sharing of instruments with the PCL Node. In limited circumstances, and with adequate justification in the context of specified science drivers, a PCL Node may propose modest requests for acquiring a new instrument and/or enhancement of existing instruments or other lab capabilities that would become part of the PCL Node.

Proposals may not request support for physical plant infrastructure, e.g., building, utilities, and related costs. Proposals requesting funds for these items as part of the proposal budget will be considered non-compliant and will be returned without review.

A. Test Bed Objectives

The PCL Test Bed will help realize the vision of automated science and engineering through activities such as advancing the design, development, testing, and refinement of the laboratory protocols and standards necessary for creating the AI-driven autonomous laboratories of the future. Experiments identified in proposed science drivers will provide the driving force toward measurable progress in automated science and engineering.

The PCL Test Bed will help develop critical elements essential to the success of autonomous labs and automated science and engineering. These may include (but are not limited to):

  1. Execution of programmable workflows based on open instrument APIs and open lab protocols for experiment design, instrument validation, workflow verification, and others. Scientific workflows may be executed within a single PCL Node or across multiple nodes. PCL Nodes may be heterogeneous in their size (i.e., number of instruments) and capabilities (i.e., types of instruments).
  2. Development of common open metadata and data standards for science drivers, including for publication of data, models, and results. All projects should plan to collaborate across the Test Bed to establish protocols and standards that facilitate laboratory automation. PCL Nodes will collaborate with other (external) partners spanning industry, academia, government labs, and others.
  3. Use of AI for "self-driving" experiments in which automated systems design, execute, and analyze experiments using AI and Machine Learning.
  4. Use of AI in lab automation to improve operational efficiency.
  5. Support for distinct usage modes that can be supported by the capabilities and capacity of the PCL Nodes including, for example:
    1. Power users executing complex experiment workflows across multiple nodes.
    2. High-throughput users executing many variants of the same workflow.
    3. First-time users experimenting with lab automation in a cloud lab. These users would have demonstrated lab experience and would be looking to solve scientific problems of significance.
    4. Prototyping users testing specific workflows prior to replicating the entire instrument setup and workflow in their own labs for "on premise" use.
    5. Industry users including, but not limited to, startups and small businesses funded by the federal Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) programs using the system for research and testing.
    6. Education users using the Test Bed for undergraduate and graduate research and to augment classroom instruction.
    7. Research users conducting basic and translational research in science, e.g., biotechnology, chemistry, materials science, as well as in enabling technologies, e.g., development of new instrumentation, robotics, and AI/ML.
  6. Exploring challenges in research security. While ease of access and use are important considerations, all PCL Nodes in the Test Bed must also implement measures to prevent inappropriate use of laboratory facilities. This includes developing protocols for allowing secure access to the overall Test Bed as well as protocols for monitoring use of the Test Bed.
  7. Demonstrating ease of access. Like cloud computing, the PCL Test Bed has the potential to provide easy access to a set of powerful resources and capabilities for a larger set of science and engineering users. There is also tremendous potential to increase the productivity of Test Bed users and to provide access to communities of users that have not previously had easy access to such resources.
  8. Experimentation in Enabling Technologies. PCL Node proposals may choose to incorporate experimentation with enabling technologies for improved lab operation/productivity, increased reliability, reproducibility, and other aspects related to the PCL Node. For example, this may include the development of digital twins for the laboratory and laboratory workflows to enable pre-execution feasibility analysis experiments, supporting use in classroom instruction, and identifying key bottlenecks. The emphasis of such experimentation should be on improving the effectiveness of the PCL Test Bed in achieving its goals. This is another area where collaborations across the Test Bed and/or with external collaborators, e.g., industry partners, are encouraged.

B. Science Drivers

Each PCL Node proposal should include a clear description of one or more specific science driver(s) that will provide a framework for the development of the PCL Node, especially in Years 1 and 2. This should include examples of commonly used and key experimental protocols/workflows, and their mapping to the node's available facilities and expertise. Experiments may range from synthesis to optimization to characterization. PCL Nodes are encouraged to include at least some "self-driving" experiments in this phase, where AI and/or other automated methods are employed to use data output from one experiment to determine the next step(s) in the experiment workflow.
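The "self-driving" pattern described above, where data output from one experiment determines the next step, can be sketched as a minimal closed loop. In this toy example the instrument is a simulated yield curve and the planner is a simple greedy hill-climber; all functions, parameter values, and the choice of planner are illustrative assumptions (a real node might instead couple Bayesian optimization to actual instrument output).

```python
# Minimal sketch of a self-driving design-build-test-learn loop.
# The "instrument" is a toy simulator; on a real PCL Node, run_experiment
# would execute an automated synthesis + characterization workflow.
def run_experiment(temp_C: float) -> float:
    """Toy stand-in for an automated experiment.
    Simulated yield peaks at 150 C (purely illustrative)."""
    return 100.0 - (temp_C - 150.0) ** 2 / 25.0

def propose_next(history):
    """Greedy hill-climbing planner: move toward higher measured yield.
    A real planner might use Bayesian optimization or active learning."""
    if len(history) < 2:
        return history[-1][0] + 20.0
    (t0, y0), (t1, y1) = history[-2], history[-1]
    step = 20.0 if y1 > y0 else -10.0       # keep going if improving, else back off
    return t1 + step * (1 if t1 >= t0 else -1)

# Closed loop: each measurement feeds the planner that picks the next condition.
history = [(100.0, run_experiment(100.0))]
for _ in range(8):
    t_next = propose_next(history)
    history.append((t_next, run_experiment(t_next)))

best_temp, best_yield = max(history, key=lambda p: p[1])
# With this toy landscape the loop converges on the 150 C optimum.
```

The point of the sketch is the control flow, not the planner: the same loop structure applies whether the next step is chosen by a heuristic, a surrogate model, or a human-in-the-loop reviewer, and whether it executes on one node or across several.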

As described in section V.B. Budgetary Information below, the proposal budget should include support for scientists (senior researchers, postdocs, and/or students) working towards the science drivers, with the clear understanding that these project participants are being supported to synergistically test and improve the capabilities of nodes while making progress towards science drivers and supporting the broader Test Bed.

As described in item E. Recruitment and On-Boarding Workshops below, PCL Node proposals must incorporate new users of the Test Bed in Years 3 and 4 of this effort, or earlier if the equipment is fully operational. This will be facilitated by holding Recruitment and On-Boarding Workshops targeted at new users. Thus, early active outreach to this community of new users is encouraged to help cultivate that user base. In general, PCL Node proposals should include a quantitative assessment of the pool of likely users of the proposed Node/Test Bed, their projected utilization of the Node over time, and the value that such users will derive from utilizing the capabilities of the PCL Test Bed.

While examples of science drivers are provided below, proposals with other science drivers of similar specificity and potential impact are also encouraged. Nodes working on science drivers in the same general areas of science will be required to collaborate with each other.

Support for Science Drivers

For any science driver, users must be able to program their corresponding experiment workflows. Tools must be available at the respective PCL Node to enable users to analyze data generated from their science workflows, including applying AI/ML methods to determine the next steps of an experiment, which could be performed at the same Node or at other Nodes in the PCL Test Bed. To facilitate interoperability among PCL Nodes, workflows must generate detailed reports for auditability and reproducibility, summarizing methods used, quality control procedures, data analysis procedures and other experiment details. PCL Nodes must adhere to standards, such as Good Manufacturing Practice (GMP) or Good Laboratory Practice (GLP), to ensure and document reproducibility, traceability, and regulatory compliance. The nodes must also implement appropriate biosafety and biosecurity measures in accordance with Dual Use Research of Concern/Pathogens with Enhanced Pandemic Potential (DURC/PEPP) policies.
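The auditable, machine-readable run report described above might look something like the following sketch. The field names, layout, and example values are assumptions for illustration only; the actual report schema would be among the standards developed collaboratively by the funded PCL Nodes.

```python
# Illustrative sketch of a machine-readable run report supporting
# auditability and reproducibility. All field names and values are
# assumptions, not an actual Test Bed standard.
import json
from datetime import datetime, timezone

def build_run_report(workflow_id, steps, qc_checks, analysis):
    """Assemble a single structured record summarizing methods used,
    quality control procedures, and data analysis for one workflow run."""
    return {
        "workflow_id": workflow_id,
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "methods": [s["method"] for s in steps],   # quick summary of methods used
        "steps": steps,                            # full per-step record for audit
        "quality_control": qc_checks,              # QC procedures and outcomes
        "analysis": analysis,                      # data-analysis procedures used
    }

report = build_run_report(
    workflow_id="pcl-run-0042",
    steps=[{"method": "protein_expression", "instrument": "bioreactor-3",
            "params": {"temp_C": 37, "duration_h": 18}}],
    qc_checks=[{"check": "purity", "procedure": "SDS-PAGE", "passed": True}],
    analysis={"pipeline": "peak-fit-v2", "software_version": "2.1.0"},
)
report_json = json.dumps(report, indent=2)  # archived alongside the experiment data
```

Emitting such a record from every workflow run, in a format shared across nodes, is what would let a result obtained at one node be audited or reproduced at another.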

Science Driver Example 1: Advanced Materials

Advancing materials development for many segments of materials science requires manipulations to process solid input raw materials and make measurements on solid material outputs. Examples of key advanced materials include ceramics, polymers, superconductors, electronic materials, 2D materials, and alloys. Whereas a wet chemistry PCL Node could employ liquid transfer protocols in its synthesis and characterization sections, solid materials require solid proportioning, blending, and transfer. Materials like ceramics and composites can also require high-temperature melting or sintering, with PCL handling of the hot output material.

Characterizations of solid materials in a PCL lab will generally involve surface analytical measurements such as X-ray methods, microscopy, and other methods potentially involving pulverization, dispersion, or solubilization. Advanced materials are also commonly characterized by their properties under use conditions. Other types of testing, such as physical-mechanical, thermal, rheological, and optical testing can be contemplated in this type of PCL Node.

It is also possible to envision a PCL Node for Advanced Materials where synthesis is performed by wet or dry methods on different "front end" instruments and results are fed to a shared characterization "back end" set of instruments that are agnostic to the synthetic method employed for the material.

Science Driver Example 2: High-throughput characterization services for bioeconomy supply chain (Biotechnology)

Biotechnology offers a path to overcome supply chain issues that impact economic and national security. High-throughput characterization infrastructure is essential for both academia and industry as they test and characterize cellular, acellular, and engineered protein solutions for use-inspired applications including, but not limited to, production of active pharmaceutical ingredients, recovery of critical minerals, and food security applications that benefit the U.S. bioeconomy. However, existing automated lab facilities often have limited flexibility in terms of remote access and programmability, and the diversity of experiments they can support. The PCL Test Bed could provide a comprehensive set of remotely accessible, end-to-end services for high-quality recombinant protein production; advanced molecular biological, biochemical and biophysical assays; plant transformation; and/or prototyping of engineered organisms. A scalable, high-throughput PCL platform may include capabilities to facilitate experimentation underpinning applications of biotechnology to bioeconomy solutions. While each proposal should detail the specific use case(s) or supply chains that it would advance, potential PCLs could address areas such as, but not limited to, synthetic biology, protein design and production, -omics analysis, phenotyping, and/or characterization. Such a PCL could include capabilities like DNA synthesis; cloning; plasmid preparation; protein expression in cell-based and/or cell-free systems; post-translational modifications; and protein purification. Capabilities might also include cell sorting, cell screening, and cell phenotyping. In addition, structural and functional characterization services should provide a broad range of molecular biological, biochemical and biophysical assays including, but not limited to, protein identification and quantitation; analysis of post-translational modifications; and structure, function, and material property analysis. A key component of work like this would be the curation of the acquired data in ways that make it amenable to further training/refinement of AI/ML models.

Science Driver Example 3: High-Throughput Experimentation for Catalyst Discovery (Chemistry)

High-throughput experimentation (HTE) has become an established technique for chemical reaction catalyst optimization that integrates consistent, reproducible experimental data with catalyst descriptors and data science/machine learning tools for optimal catalyst prediction across a range of substrates. Access to automated HTE tools and data-rich analysis is currently limited to a relatively small number of labs with the required technology, and further limited by the need for extensive catalyst libraries as well as specialized instrumentation that supports light-catalyzed, electrochemical, or pressurized processes. An HTE PCL Node would expand the accessibility of HTE tools and provide access to large catalyst libraries with corresponding descriptor libraries. Complementary facilities for catalyst and/or ligand synthesis could be envisioned. Expanded access to this technology would significantly enhance the ability of a wider range of experimentalists to adopt data science tools and interpret results of machine-assisted chemical catalyst optimization, supporting more efficient chemical synthesis to address urgent societal needs.

C. Data and AI Capabilities

Each team must demonstrate expertise and experience in data management and data curation and in relevant state-of-the-art AI methods, including deep learning and knowledge representation. PCL Nodes will enable the creation of AI models to support efficient and effective use of the Node and Test Bed, and to meet other needs in support of the Science Driver.

Data and AI leads within each PCL Node should be able to assist with the Node's metadata and data design and collection efforts, while recommending use of available, applicable AI models and/or development of new AI models, as needed. The scope for using AI methods across the PCL Test Bed is large — including for improving the overall efficiency and performance of individual PCL Nodes and the overall Test Bed itself, and for exploring an experiment "search space", determining the next step in an experiment, and developing "self-driving" experiments.

Data and AI lead staff should be prepared to assist users with "in-loop" data and AI issues, where data management and AI issues may arise as part of an on-going experiment. This may be a combination of assistance during experiment design as well as quick response during experiment execution, in a time frame that is commensurate with the experiment at hand. Additionally, the data and AI staff should be prepared to assist with "post experiment" support where data management and AI issues may arise after an experiment has completed. This includes support for publishing data, models, and outcomes of experiments.

Data sharing among PCL Nodes should be initiated from the very beginning of the overall effort, and should encompass all data, including instrument calibration data, environmental data, operational data, experiment data, and all other associated data and metadata. Establishing a data sharing culture in the PCL Test Bed will assist in the development of relevant standards and protocols and help instill trust in the overall Test Bed. A Node may propose to lead data and AI coordination activities across the Test Bed.

D. Biosecurity and AI Considerations that Apply to all Proposals

NSF acknowledges the rapidly evolving nature of computational biology and the advances in artificial intelligence in biology, and the potential for such technologies both to advance science and to produce dual-use biological knowledge, technology, and products. Proposals should therefore articulate how they will achieve a balance between enabling innovation and ensuring appropriate handling of safety, security, and responsibility considerations.

E. User Support and Experiment Assistance

PCL Nodes should incorporate the workforce necessary for successful operation of the Node — including for user support and facility operation. Support for staff to assist with experiment design, execution, and post-experiment assistance should be included. This effort may involve assisting users with metadata/data design; creation, re-purposing, or re-use of AI models and training data associated with experiments; programming of laboratory workflows; and other user support activities.

In-loop assistance. Scientific and technical staff should be prepared to support users with "in-loop" data and AI issues, where data management and AI issues may arise as part of an on-going experiment. This may be a combination of assistance during experiment design as well as quick response during experiment execution in a time frame that is commensurate with the experiment at hand.

Post-experiment assistance. Science and technical staff should be prepared to assist users with "post-experiment" support where data management and AI issues may arise after an experiment has completed. This includes support for publishing data, models, and outcomes of experiments, as well as re-design of experiments.

Test Bed-wide Outreach, Training and Education. PCL Nodes should plan to survey the potential user community actively, in academia as well as industry, to gather needs and requirements. This outreach should include recruiting early adopter users to the facility from academia and industry. PCL Nodes should be prepared to provide training related to the use of the facility. They should also be prepared to support educational activities at all levels that utilize the Nodes. Outreach, training and education are expected to be done in collaboration with Nodes across the PCL Test Bed.

F. Recruitment and On-Boarding Workshops

PCL Nodes must plan to organize Recruitment Workshops and On-Boarding Workshops to recruit new users, including but not limited to users from under-resourced institutions. PCL Nodes must establish a review process to select new users from the Recruitment Workshops; selected users will be required to attend the On-Boarding Workshops, which will be designed to train them in Node operations.

These workshops will identify users with interesting and impactful experiments that are suitable for the PCL Test Bed. They will also help identify the necessary education and training pathways needed to enable such users to effectively use the Test Bed in their research and education activities. PCL Nodes are required to set aside funds in Years 2, 3 and 4 to facilitate the on-boarding of these new users to the Test Bed.

Section V. A. Proposal Preparation Instructions provides details on all the sections that must be included in the proposal. This includes a Management Plan section with an Implementation Timeline; a Collaboration Plan section describing at what level and in which ways the PCL Node might collaborate with other awarded Nodes in the Test Bed working on similar science drivers, as well as more broadly on data and AI issues; and a Post-Award Continued Operation Plan section outlining plans for operation of the facility after the current award period. Please refer to that section of the solicitation for full details and instructions for proposal preparation.

III. Award Information

Estimated program budget, number of awards and average award size/duration are subject to the availability of funds.

IV. Eligibility Information

Who May Submit Proposals:

Proposals may only be submitted by the following:

  • Institutions of Higher Education (IHEs): Two- and four-year IHEs (including community colleges) accredited in, and having a campus located in the U.S., acting on behalf of their faculty members. Special Instructions for International Branch Campuses of US IHEs: If the proposal includes funding to be provided to an international branch campus of a U.S. institution of higher education (including through use of sub-awards and consultant arrangements), the proposer must explain the benefit(s) to the project of performance at the international branch campus, and justify why the project activities cannot be performed at the US campus.
  • Non-profit, non-academic organizations: Independent museums, observatories, research laboratories, professional societies and similar organizations located in the U.S. that are directly associated with educational or research activities.
  • For-profit organizations: U.S.-based commercial organizations, including small businesses, with strong capabilities in scientific or engineering research or education and a passion for innovation.

Who May Serve as PI:

There are no restrictions or limits.

Limit on Number of Proposals per Organization: 1

An institution may submit only a single proposal in response to this solicitation, as the lead institution.

If more than one proposal is submitted from an institution, the first proposal submitted from that institution will be considered, and remaining proposals will be returned without review.

An institution may serve as a non-lead institution on more than one proposal.

Limit on Number of Proposals per PI or co-PI: 1

An individual may serve as PI, co-PI, or Senior Personnel on only one proposal submitted in response to this solicitation.

Additional Eligibility Info:

  • Only organizations with pre-existing instrument facilities are eligible to apply to the program to be a PCL Node.
  • Proposals must include at least one Co-PI with relevant expertise in data management and AI to support the activities described in the proposal.

V. Proposal Preparation And Submission Instructions

A. Proposal Preparation Instructions

Full Proposal Preparation Instructions: Proposers may opt to submit proposals in response to this Program Solicitation via Research.gov or Grants.gov.

  • Full Proposals submitted via Research.gov: Proposals submitted in response to this program solicitation should be prepared and submitted in accordance with the general guidelines contained in the NSF Proposal and Award Policies and Procedures Guide (PAPPG). The complete text of the PAPPG is available electronically on the NSF website at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg. Paper copies of the PAPPG may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-8134 or by e-mail from nsfpubs@nsf.gov. The Prepare New Proposal setup will prompt you for the program solicitation number.
  • Full proposals submitted via Grants.gov: Proposals submitted in response to this program solicitation via Grants.gov should be prepared and submitted in accordance with the NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov. The complete text of the NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: (https://www.nsf.gov/publications/pub_summ.jsp?ods_key=grantsgovguide). To obtain copies of the Application Guide and Application Forms Package, click on the Apply tab on the Grants.gov site, then click on the Apply Step 1: Download a Grant Application Package and Application Instructions link and enter the funding opportunity number, (the program solicitation number without the NSF prefix) and press the Download Package button. Paper copies of the Grants.gov Application Guide also may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-8134 or by e-mail from nsfpubs@nsf.gov.

In determining which method to utilize in the electronic preparation and submission of the proposal, please note the following:

Collaborative Proposals. All collaborative proposals submitted as separate submissions from multiple organizations must be submitted via Research.gov. PAPPG Chapter II.E.3 provides additional information on collaborative proposals.

See PAPPG Chapter II.D.2 for guidance on the required sections of a full research proposal submitted to NSF. Please note that the proposal preparation instructions provided in this program solicitation may deviate from the PAPPG instructions.

The following information supplements the guidelines and requirements in the NSF PAPPG and NSF Grants.gov Application Guide:

Multi-Institutional Proposals: For projects involving a collaboration among multiple institutions, the proposal must be submitted by a single lead institution with funding for all other participating institutions made through subawards. See PAPPG Chapter II.E.3.a for additional information.

Proposals submitted as separately submitted NSF "collaborative proposals" as described under PAPPG Chapter II.E.3.b will be returned without review.

Page Limits: The Project Description section may be up to 20 pages. In addition, all proposals must include an Instrument Inventory Table, as described under Node Capabilities below, which is separate from the Project Description and not counted in the 20-page limit for the Project Description.

Proposal Title: The title of the proposal must begin with "PCL-Test Bed:".

All PCL Node proposals should clearly include sections for each of the following aspects:

Science drivers. What are the science drivers that will guide the development of the PCL Node, and which aspects might be pursued in collaboration with other PCL Nodes to complement the overall Test Bed? Are these drivers clearly articulated and specified? Are there well-identified users or user groups/communities in the identified science areas? How well do the capabilities offered by the PCL Node help transform the science/engineering areas? What specifically would users of the PCL Node gain from using those facilities, and the envisaged Test Bed? How will the PCL Node, as part of the national PCL Test Bed, impact U.S. national competitiveness in science and/or national security?

Node capabilities. This section of the proposal should describe what makes the PCL Node capable, unique, and/or comprehensive in supporting the identified science driver, and how that would benefit users. Can the instruments in the PCL Node facility adequately address the needs of the science drivers that are planned to be supported? Why are users unable to get the capabilities they need in the current ecosystem? The Node Capabilities section should include the following information:

  1. Instrument Inventory Table. A separate Instrument Inventory Table must be submitted listing all instruments that will be made available for use in the PCL Test Bed as part of the Node. The Instrument Inventory Table is not included in the page count for the Project Description. Pre-existing instruments may be fully or partially shared for use in the Test Bed. Any new instruments and/or capability acquired under this program must be fully available for use in this program and the usage must comply with the guidelines specified in 2 CFR § 200.313, including provisions related to equipment use, as detailed in 2 CFR § 200.313(c). Instrument acquisitions may also be partially funded by this project, in which case the corresponding percentage of that instrument should be available as part of the PCL Node/Test Bed.

    The proposal should describe the types of work, acceleration of R&D, and other new opportunities that will be made possible by the specific instruments and other innovative technologies to be fielded by the node.

    The Instrument Inventory Table should include at least the following information for each instrument available to the PCL Test Bed:

    1. Type and description of the instrument and total number of such instruments.
    2. Relevance. Importance of this instrument for the science driver(s), PCL Node, and overall Test Bed. Examples of specific workflows/experiment/protocols they will support.
    3. Available time. Description of the instrument duty cycle and the amount of "active instrument time" that will be made available, e.g., in hours per day. Instruments that will be acquired via this program should be fully available for use in this program and the usage must comply with the guidelines specified in 2 CFR § 200.313, including provisions related to equipment use, as detailed in 2 CFR § 200.313(c).
  2. Node Expertise. This section should describe the team and corresponding expertise that will be available at the PCL Node to facilitate the science driver(s) identified as well as to deal with the various data and AI issues.
  3. Node Partnerships. This section should describe the external partnerships that the PCL Node plans to establish in support of its effort. What are the key partnerships that will enable the success of the Node as part of the Test Bed?

    Some of these partnerships may already be intrinsic to the projects in the form of participation by Co-PIs and other Senior Personnel. Other partnerships can be illustrated via Letters of Collaboration. Note that only Letters of Collaboration are permitted, while Letters of Support, as defined by NSF, are not permitted. Please see section II.D.2.i.(iii) on Documentation of Collaborative Arrangements of Significance to the Proposal through Letters of Collaboration in the NSF PAPPG (NSF 24-1) for further explanation. Letters of Collaboration could be provided, for example, by potential users of the Test Bed from industry, academia, and elsewhere, by parties interested in supporting specific instruments in the Test Bed, those interested in supporting the AI and software, those interested in supporting Test Bed operations, and others.

  4. Node Access control, user organization and recruitment plan, and research security. This section should describe what efforts the Node plans to undertake in recruiting, organizing, and vetting users as well as the science problems and experiments that will run at a Node, and across Nodes in the Test Bed. What types of users will be recruited to test Node capabilities and capacities and make important progress towards specified science drivers? What types of access controls will be in place to ensure safe and secure use of the facilities? What are the potential research security issues and how will they be addressed? How will access time be balanced across the user groups to ensure adequate support to a broader user base?
  5. Data and AI capabilities and expertise. This section should describe how the proposers plan to support the data management and AI needs of the Node, both to support reliability and consistency of Node operation and to support the range of scientific experiments executed on the Node. A strong proposal will include a co-PI with extensive data and AI expertise. How will the proposers account for the range of data inputs and outputs across the Node and Test Bed? What AI expertise will be leveraged to support the deployment of AI into experimental design?

Cross-Node Collaborations. As a distributed, multi-component facility, collaboration at every level is essential to the success of the PCL Test Bed. Recognizing this, proposals should set aside some time and effort, and corresponding budget, for this anticipated collaboration. The details of the collaborative activities will be worked out after all PCL Node awards have been made. Collaborations may occur in a range of areas including data/metadata standards, data sharing, AI model development and use, sharing of data for training, and sharing of the models themselves.

Training and education. Providing access to technology alone will typically not address the needs of Test Bed users. What are the plans for providing users access to relevant expertise — e.g., in instruments, lab operations, AI, software, and other areas; consultation; and training to enable efficient use of the Test Bed? What are the plans for providing education and training in areas of science represented by the science drivers, in relevant Test Bed technologies, and in relevant AI strategies for automated science and engineering and cloud labs?

Broadening access. In addition to planning to participate in the Recruitment and On-Boarding Workshops that are scheduled to occur in Years 1 and 2, does the project have other plans for reaching out to new users/user communities and raising awareness of the PCL Test Bed itself as well as of automated science and engineering?

Metrics. What are the metrics that will be utilized to evaluate success of the Test Bed, especially considering the above-mentioned criteria? What metrics will be used to evaluate the ability of the PCL Node to efficiently run the instruments and other equipment that power the PCL Node workflows? Is there a plan to collect user satisfaction data and a strategy for responding to user requests/suggestions and improving services over time, if and where needed?

Management Plan. A strong management structure, and a corresponding plan, will be necessary to successfully execute all aspects of this project. The Management Plan should describe the specific roles and responsibilities of the PI, any co-PIs, Senior Personnel, staff and paid consultants in ensuring that tasks are completed, and deliverables are delivered on time. Projects should set aside time and effort for regular coordination meetings, including video conferences, in-person meetings, in-person project workshops and all-hands meetings, and other collaboration and coordination activities over the duration of the funded effort. The plan should anticipate time and effort involved in collaborating with other projects on the same science driver(s) and with all PCL Nodes across the entire Test Bed on software environments, experiment workflows, instrumentation, data, and AI.

The management plan should also include an Implementation Timeline that provides the timeline and description of activities over the course of the project, with at least these milestones included:

  • Project Kickoff – at start of the award.
  • Alpha release to users – no later than 1 year after start. PCL Nodes are expected to focus on general organization/development of instruments, software, APIs, instrument workflows and users at their sites during the alpha phase, using the science drivers as use cases, wherever applicable.
  • Beta – no later than 1.5 years after start. PCL Nodes are expected to have completed running end-to-end workflows for the specific, identified science drivers during the beta phase.
  • Robust User Service – no later than 2 years after start. At the end of 2 years, PCL Nodes should be ready to receive new, "friendly" users who are not already part of the original proposal/project and assist them with their new workflows and data and AI needs and requirements.
  • Test Bed 2.0 – Based on outcomes from the PCL Test Bed Recruitment Workshops to be held during Year 2, the overall PCL Test Bed should be ready to on-board new users from under-resourced institutions. This will require availability of training and education materials, e.g., user guides, needed for effective use of PCL Test Bed resources and capabilities by first-time users of the facility.
  • Project end – at end of Year 4: deployment plan for Post-Award Continued Operation. The individual PCL Nodes, along with other nodes in the PCL Test Bed as a whole, should develop plans for sustained operation after the initial funding from NSF is completed.

The details of the criteria at each stage, i.e., the alpha, beta, and robust user service stages, should be established by the cohort of awarded projects at the Project Kickoff Meeting, and at annual project meetings, in conjunction with program staff at NSF.

Plan for Post-Award Continued Operation. The management plan should also include a plan for operation of the facility after the current award period. What is the post-award plan for continued operation of the PCL Node? The thoughtfulness of, and attention to, the financial viability of the effort after the conclusion of NSF and any other federal funding will be a key evaluation criterion for proposals. The goal of this plan is to establish a PCL Test Bed that can sustain itself after the completion of any awards that result from this solicitation.

B. Budgetary Information

Cost Sharing:

Inclusion of voluntary committed cost sharing is prohibited.

Budget Preparation Instructions:

The level of funding provided to a PCL Node will depend upon the instruments and capabilities provided by the proposed Node, including the number and diversity of instruments, number of science drivers supported, and the corresponding staffing and other supporting resources.

PCL Node proposals may request funding for the following items as part of their project budget:

  • PCL Node Operation
    • Operations staff to operate the facility and facilitate execution of laboratory workflows.
    • Instrument maintenance costs.
    • Materials and supplies necessary for the various experiments supported by the node.
    • Interfacing with and incorporating pre-existing instruments into a PCL Node for joining the PCL Test Bed. The proposal should make clear why each instrument is essential for the Test Bed and what percentage of that instrument will be made available to the Test Bed (see V. A. 1. PCL Node Instrument Inventory).
    • Support of AI for node-level operations.
  • Science driver support
    • Technical support staff to support users of the Test Bed throughout the entire process of design, execution, and analysis of PCL-based experiments.
  • Data and AI support
    • Technical staff with expertise and experience in data management and data curation, as well as in AI/ML, to provide experiment assistance as described in E. User Support and Experiment Assistance above.
    • Data infrastructure and access to sufficient computing resources.
  • Years 3 and 4 budget set aside for on-boarding new users and new science drivers
    • A percentage of PCL Node resources (instrument time, staff) should be set aside in Years 3 and 4 for on-boarding new users from the Recruitment workshops, and to support their science drivers. This may include use of the Test Bed for teaching undergraduate or graduate courses.
  • Acquisition of new instruments or other lab resources
    • Such acquisitions must be fully justified in the context of the science drivers. Any new instruments and/or capability acquired under this program must be fully available for use in this program and the usage must comply with the guidelines specified in 2 CFR § 200.313, including provisions related to equipment use, as detailed in 2 CFR § 200.313(c).

Annual PI Meeting (required). All proposals must include a budget for an annual in-person PI meeting. The project PI team as well as key personnel should plan to travel to this meeting each year. The PI meetings will be held at a different PCL Node site each year. Since the site locations are not yet known, proposers may use average- or worst-case travel estimates when preparing the budget.

Additionally, each site should include the cost of hosting one PI meeting during the duration of this award. Cost for hosting a meeting may include charges for the venue, food expenses, and other meeting-related expenses at their respective site.

C. Due Dates

  • Full Proposal Deadline(s) (due by 5 p.m. submitting organization's local time):

         November 20, 2025

D. Research.gov/Grants.gov Requirements

For Proposals Submitted Via Research.gov:

To prepare and submit a proposal via Research.gov, see detailed technical instructions available at: https://www.research.gov/research-portal/appmanager/base/desktop?_nfpb=true&_pageLabel=research_node_display&_nodePath=/researchGov/Service/Desktop/ProposalPreparationandSubmission.html. For Research.gov user support, call the Research.gov Help Desk at 1-800-381-1532 or e-mail rgov@nsf.gov. The Research.gov Help Desk answers general technical questions related to the use of the Research.gov system. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this funding opportunity.

For Proposals Submitted Via Grants.gov:

Before using Grants.gov for the first time, each organization must register to create an institutional profile. Once registered, the applicant's organization can then apply for any federal grant on the Grants.gov website. Comprehensive information about using Grants.gov is available on the Grants.gov Applicant Resources webpage: https://www.grants.gov/applicants. In addition, the NSF Grants.gov Application Guide (see link in Section V.A) provides instructions regarding the technical preparation of proposals via Grants.gov. For Grants.gov user support, contact the Grants.gov Contact Center at 1-800-518-4726 or by email: support@grants.gov. The Grants.gov Contact Center answers general technical questions related to the use of Grants.gov. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this solicitation.

Submitting the Proposal: Once all documents have been completed, the Authorized Organizational Representative (AOR) must submit the application to Grants.gov and verify the desired funding opportunity and agency to which the application is submitted. The AOR must then sign and submit the application to Grants.gov. The completed application will be transferred to Research.gov for further processing.

The NSF Grants.gov Proposal Processing in Research.gov informational page provides submission guidance to applicants and links to helpful resources including the NSF Grants.gov Application Guide, Grants.gov Proposal Processing in Research.gov how-to guide, and Grants.gov Submitted Proposals Frequently Asked Questions. Grants.gov proposals must pass all NSF pre-check and post-check validations in order to be accepted by Research.gov at NSF.

When submitting via Grants.gov, NSF strongly recommends applicants initiate proposal submission at least five business days in advance of a deadline to allow adequate time to address NSF compliance errors and resubmissions by 5:00 p.m. submitting organization's local time on the deadline. Please note that some errors cannot be corrected in Grants.gov. Once a proposal passes pre-checks but fails any post-check, an applicant can only correct and submit the in-progress proposal in Research.gov.

Proposers that submitted via Research.gov may use Research.gov to verify the status of their submission to NSF. For proposers that submitted via Grants.gov, until an application has been received and validated by NSF, the Authorized Organizational Representative may check the status of an application on Grants.gov. After proposers have received an e-mail notification from NSF, Research.gov should be used to check the status of an application.

VI. NSF Proposal Processing And Review Procedures

Proposals received by NSF are assigned to the appropriate NSF program for acknowledgement and, if they meet NSF requirements, for review. All proposals are carefully reviewed by a scientist, engineer, or educator serving as an NSF Program Officer, and usually by three to ten other persons outside NSF either as ad hoc reviewers, panelists, or both, who are experts in the particular fields represented by the proposal. These reviewers are selected by Program Officers charged with oversight of the review process. Proposers are invited to suggest names of persons they believe are especially well qualified to review the proposal and/or persons they would prefer not review the proposal. These suggestions may serve as one source in the reviewer selection process at the Program Officer's discretion. Submission of such names, however, is optional. Care is taken to ensure that reviewers have no conflicts of interest with the proposal. In addition, Program Officers may obtain comments from site visits before recommending final action on proposals. Senior NSF staff further review recommendations for awards. A flowchart that depicts the entire NSF proposal and award process (and associated timeline) is included in PAPPG Exhibit III-1.

A comprehensive description of the Foundation's merit review process is available on the NSF website at: https://www.nsf.gov/funding/overview.

A. Merit Review Principles and Criteria

The National Science Foundation strives to invest in a robust and diverse portfolio of projects that creates new knowledge and enables breakthroughs in understanding across all areas of science and engineering research and education. To identify which projects to support, NSF relies on a merit review process that incorporates consideration of both the technical aspects of a proposed project and its potential to contribute more broadly to advancing NSF's mission "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense; and for other purposes." NSF makes every effort to conduct a fair, competitive, transparent merit review process for the selection of projects.

1. Merit Review Principles

These principles are to be given due diligence by PIs and organizations when preparing proposals and managing projects, by reviewers when reading and evaluating proposals, and by NSF program staff when determining whether or not to recommend proposals for funding and while overseeing awards. Given that NSF is the primary federal agency charged with nurturing and supporting excellence in basic research and education, the following three principles apply:

  • All NSF projects should be of the highest quality and have the potential to advance, if not transform, the frontiers of knowledge.
  • NSF projects, in the aggregate, should contribute more broadly to achieving societal goals. These "Broader Impacts" may be accomplished through the research itself, through activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. The project activities may be based on previously established and/or innovative methods and approaches, but in either case must be well justified.
  • Meaningful assessment and evaluation of NSF funded projects should be based on appropriate metrics, keeping in mind the likely correlation between the effect of broader impacts and the resources provided to implement projects. If the size of the activity is limited, evaluation of that activity in isolation is not likely to be meaningful. Thus, assessing the effectiveness of these activities may best be done at a higher, more aggregated, level than the individual project.

With respect to the third principle, even if assessment of Broader Impacts outcomes for particular projects is done at an aggregated level, PIs are expected to be accountable for carrying out the activities described in the funded project. Thus, individual projects should include clearly stated goals, specific descriptions of the activities that the PI intends to do, and a plan in place to document the outputs of those activities.

These three merit review principles provide the basis for the merit review criteria, as well as a context within which the users of the criteria can better understand their intent.

2. Merit Review Criteria

All NSF proposals are evaluated through use of the two National Science Board approved merit review criteria. In some instances, however, NSF will employ additional criteria as required to highlight the specific objectives of certain programs and activities.

The two merit review criteria are listed below. Both criteria are to be given full consideration during the review and decision-making processes; each criterion is necessary but neither, by itself, is sufficient. Therefore, proposers must fully address both criteria. (PAPPG Chapter II.D.2.d(i). contains additional information for use by proposers in development of the Project Description section of the proposal). Reviewers are strongly encouraged to review the criteria, including PAPPG Chapter II.D.2.d(i), prior to the review of a proposal.

When evaluating NSF proposals, reviewers will be asked to consider what the proposers want to do, why they want to do it, how they plan to do it, how they will know if they succeed, and what benefits could accrue if the project is successful. These issues apply both to the technical aspects of the proposal and the way in which the project may make broader contributions. To that end, reviewers will be asked to evaluate all proposals against two criteria:

  • Intellectual Merit: The Intellectual Merit criterion encompasses the potential to advance knowledge; and
  • Broader Impacts: The Broader Impacts criterion encompasses the potential to benefit society and contribute to the achievement of specific, desired societal outcomes.

The following elements should be considered in the review for both criteria:

  1. What is the potential for the proposed activity to
    1. Advance knowledge and understanding within its own field or across different fields (Intellectual Merit); and
    2. Benefit society or advance desired societal outcomes (Broader Impacts)?
  2. To what extent do the proposed activities suggest and explore creative, original, or potentially transformative concepts?
  3. Is the plan for carrying out the proposed activities well-reasoned, well-organized, and based on a sound rationale? Does the plan incorporate a mechanism to assess success?
  4. How well qualified is the individual, team, or organization to conduct the proposed activities?
  5. Are there adequate resources available to the PI (either at the home organization or through collaborations) to carry out the proposed activities?

Broader impacts may be accomplished through the research itself, through the activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. NSF values the advancement of scientific knowledge and activities that contribute to achievement of societally relevant outcomes. Such outcomes include, but are not limited to: improved STEM education and educator development at any level; increased public scientific literacy and public engagement with science and technology; improved well-being of individuals in society; development of a globally competitive STEM workforce; increased partnerships between academia, industry, and others; improved national security; increased economic competitiveness of the United States; and enhanced infrastructure for research and education.

Proposers are reminded that reviewers will also be asked to review the Data Management and Sharing Plan and the Mentoring Plan, as appropriate.

Additional Solicitation Specific Review Criteria

In preparing your proposal, please pay attention to the following sections in Section V. Proposal Preparation and Submission Instructions, above:

  • the Proposal Preparation Instructions section; and
  • the Budgetary Information section.

The information provided therein constitutes the additional solicitation-specific review criterion for the proposal and will be carefully evaluated.

All the sections and details requested in those sections are necessary and must be present in the proposal.

B. Review and Selection Process

Proposals submitted in response to this program solicitation will be reviewed by Panel Review.

Reviewers will be asked to evaluate proposals using two National Science Board approved merit review criteria and, if applicable, additional program specific criteria. A summary rating and accompanying narrative will generally be completed and submitted by each reviewer and/or panel. The Program Officer assigned to manage the proposal's review will consider the advice of reviewers and will formulate a recommendation.

After scientific, technical and programmatic review and consideration of appropriate factors, the NSF Program Officer recommends to the cognizant Division Director whether the proposal should be declined or recommended for award. NSF strives to be able to tell proposers whether their proposals have been declined or recommended for funding within six months. Large or particularly complex proposals or proposals from new recipients may require additional review and processing time. The time interval begins on the deadline or target date, or receipt date, whichever is later. The interval ends when the Division Director acts upon the Program Officer's recommendation.

After programmatic approval has been obtained, the proposals recommended for funding will be forwarded to the Division of Grants and Agreements or the Division of Acquisition and Cooperative Support for review of business, financial, and policy implications. After an administrative review has occurred, Grants and Agreements Officers perform the processing and issuance of a grant or other agreement. Proposers are cautioned that only a Grants and Agreements Officer may make commitments, obligations or awards on behalf of NSF or authorize the expenditure of funds. No commitment on the part of NSF should be inferred from technical or budgetary discussions with an NSF Program Officer. A Principal Investigator or organization that makes financial or personnel commitments in the absence of a grant or cooperative agreement signed by the NSF Grants and Agreements Officer does so at their own risk.

Once an award or declination decision has been made, Principal Investigators are provided feedback about their proposals. In all cases, reviews are treated as confidential documents. Verbatim copies of reviews, excluding the names of the reviewers or any reviewer-identifying information, are sent to the Principal Investigator/Project Director by the Program Officer. In addition, the proposer will receive an explanation of the decision to award or decline funding.

VII. Award Administration Information

A. Notification of the Award

Notification of the award is made to the submitting organization by an NSF Grants and Agreements Officer. Organizations whose proposals are declined will be advised as promptly as possible by the cognizant NSF Program administering the program. Verbatim copies of reviews, not including the identity of the reviewer, will be provided automatically to the Principal Investigator. (See Section VI.B. for additional information on the review process.)

B. Award Conditions

An NSF award consists of: (1) the award notice, which includes any special provisions applicable to the award and any numbered amendments thereto; (2) the budget, which indicates the amounts, by categories of expense, on which NSF has based its support (or otherwise communicates any specific approvals or disapprovals of proposed expenditures); (3) the proposal referenced in the award notice; (4) the applicable award conditions, such as Grant General Conditions (GC-1)*; and (5) any announcement or other NSF issuance that may be incorporated by reference in the award notice. Cooperative agreements also are administered in accordance with NSF Cooperative Agreement Financial and Administrative Terms and Conditions (CA-FATC) and the applicable Programmatic Terms and Conditions. NSF awards are electronically signed by an NSF Grants and Agreements Officer and transmitted electronically to the organization via e-mail.

*These documents may be accessed electronically on NSF's Website at https://www.nsf.gov/awards/managing/award_conditions.jsp?org=NSF. Paper copies may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-8134 or by e-mail from nsfpubs@nsf.gov.

More comprehensive information on NSF Award Conditions and other important information on the administration of NSF awards is contained in the NSF Proposal & Award Policies & Procedures Guide (PAPPG) Chapter VII, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg.

Administrative and National Policy Requirements

Build America, Buy America

As expressed in Executive Order 14005, Ensuring the Future is Made in All of America by All of America's Workers (86 FR 7475), it is the policy of the executive branch to use terms and conditions of Federal financial assistance awards to maximize, consistent with law, the use of goods, products, and materials produced in, and services offered in, the United States.

Consistent with the requirements of the Build America, Buy America Act (Pub. L. 117-58, Division G, Title IX, Subtitle A, November 15, 2021), no funding made available through this funding opportunity may be obligated for infrastructure projects under an award unless all iron, steel, manufactured products, and construction materials used in the project are produced in the United States. For additional information, visit NSF's Build America, Buy America webpage.

Special Award Conditions:

In compliance with the CHIPS and Science Act of 2022, Section 10636 (Person or entity of concern prohibition) (42 U.S.C. 19235): No person published on the list under section 1237(b) of the Strom Thurmond National Defense Authorization Act for Fiscal Year 1999 (Public Law 105-261; 50 U.S.C. 1701 note) or entity identified under section 1260H of the William M. (Mac) Thornberry National Defense Authorization Act for Fiscal Year 2021 (10 U.S.C. 113 note; Public Law 116-283) may receive or participate in any grant, award, program, support, or other activity under the U.S. National Science Foundation Directorate for Technology, Innovation and Partnerships. See here for more details.

C. Reporting Requirements

For all multi-year grants (including both standard and continuing grants), the Principal Investigator must submit an annual project report to the cognizant Program Officer no later than 90 days prior to the end of the current budget period. (Some programs or awards require submission of more frequent project reports). No later than 120 days following expiration of a grant, the PI also is required to submit a final annual project report, and a project outcomes report for the general public.

Failure to provide the required annual or final annual project reports, or the project outcomes report, will delay NSF review and processing of any future funding increments as well as any pending proposals for all identified PIs and co-PIs on a given award. PIs should examine the formats of the required reports in advance to assure availability of required data.

PIs are required to use NSF's electronic project-reporting system, available through Research.gov, for preparation and submission of annual and final annual project reports. Such reports provide information on accomplishments, project participants (individual and organizational), publications, and other specific products and impacts of the project. Submission of the report via Research.gov constitutes certification by the PI that the contents of the report are accurate and complete. The project outcomes report also must be prepared and submitted using Research.gov. This report serves as a brief summary, prepared specifically for the public, of the nature and outcomes of the project. This report will be posted on the NSF website exactly as it is submitted by the PI.

More comprehensive information on NSF Reporting Requirements and other important information on the administration of NSF awards is contained in the NSF Proposal & Award Policies & Procedures Guide (PAPPG) Chapter VII, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg.

VIII. Agency Contacts

Please note that the program contact information is current at the time of publishing. See program website for any updates to the points of contact.

General inquiries regarding this program should be made to:

For questions related to the use of NSF systems contact:

  • NSF Help Desk: 1-800-381-1532
  • Research.gov Help Desk e-mail: rgov@nsf.gov

For questions relating to Grants.gov contact:

  • Grants.gov Contact Center: If the Authorized Organizational Representative (AOR) has not received a confirmation message from Grants.gov within 48 hours of submission of the application, please contact via telephone: 1-800-518-4726; e-mail: support@grants.gov.

IX. Other Information

The NSF website provides the most comprehensive source of information on NSF Directorates (including contact information), programs and funding opportunities. Use of this website by potential proposers is strongly encouraged. In addition, "NSF Update" is an information-delivery system designed to keep potential proposers and other interested parties apprised of new NSF funding opportunities and publications, important changes in proposal and award policies and procedures, and upcoming NSF Grants Conferences. Subscribers are informed through e-mail or the user's Web browser each time new publications are issued that match their identified interests. "NSF Update" also is available on NSF's website.

Grants.gov provides an additional electronic capability to search for Federal government-wide grant opportunities. NSF funding opportunities may be accessed via this mechanism. Further information on Grants.gov may be obtained at https://www.grants.gov.

About The National Science Foundation

The National Science Foundation (NSF) is an independent Federal agency created by the National Science Foundation Act of 1950, as amended (42 USC 1861-75). The Act states the purpose of the NSF is "to promote the progress of science; [and] to advance the national health, prosperity, and welfare by supporting research and education in all fields of science and engineering."

NSF funds research and education in most fields of science and engineering. It does this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the US. The Foundation accounts for about one-fourth of Federal support to academic institutions for basic research.

NSF receives approximately 55,000 proposals each year for research, education and training projects, of which approximately 11,000 are funded. In addition, the Foundation receives several thousand applications for graduate and postdoctoral fellowships. The agency operates no laboratories itself but does support National Research Centers, user facilities, certain oceanographic vessels and Arctic and Antarctic research stations. The Foundation also supports cooperative research between universities and industry, US participation in international scientific and engineering efforts, and educational activities at every academic level.

Facilitation Awards for Scientists and Engineers with Disabilities (FASED) provide funding for special assistance or equipment to enable persons with disabilities to work on NSF-supported projects. See the NSF Proposal & Award Policies & Procedures Guide Chapter II.F.7 for instructions regarding preparation of these types of proposals.

The National Science Foundation has Telephonic Device for the Deaf (TDD) and Federal Information Relay Service (FIRS) capabilities that enable individuals with hearing impairments to communicate with the Foundation about NSF programs, employment or general information. TDD may be accessed at (703) 292-5090 and (800) 281-8749, FIRS at (800) 877-8339.

The National Science Foundation Information Center may be reached at (703) 292-5111.

Privacy Act And Public Burden Statements

The information requested on proposal forms and project reports is solicited under the authority of the National Science Foundation Act of 1950, as amended. The information on proposal forms will be used in connection with the selection of qualified proposals; and project reports submitted by proposers will be used for program evaluation and reporting within the Executive Branch and to Congress. The information requested may be disclosed to qualified reviewers and staff assistants as part of the proposal review process; to proposer institutions/recipients to provide or obtain data regarding the proposal review process, award decisions, or the administration of awards; to government contractors, experts, volunteers and researchers and educators as necessary to complete assigned work; to other government agencies or other entities needing information regarding proposers or nominees as part of a joint application review process, or in order to coordinate programs or policy; and to another Federal agency, court, or party in a court or Federal administrative proceeding if the government is a party. Information about Principal Investigators may be added to the Reviewer file and used to select potential candidates to serve as peer reviewers or advisory committee members. See System of Record Notices, NSF-50, "Principal Investigator/Proposal File and Associated Records," and NSF-51, "Reviewer/Proposal File and Associated Records." Submission of the information is voluntary. Failure to provide full and complete information, however, may reduce the possibility of receiving an award.

An agency may not conduct or sponsor, and a person is not required to respond to, an information collection unless it displays a valid Office of Management and Budget (OMB) control number. The OMB control number for this collection is 3145-0058. Public reporting burden for this collection of information is estimated to average 120 hours per response, including the time for reviewing instructions. Send comments regarding the burden estimate and any other aspect of this collection of information, including suggestions for reducing this burden, to:

Suzanne H. Plimpton
Reports Clearance Officer
Policy Office, Division of Institution and Award Support
Office of Budget, Finance, and Award Management
National Science Foundation
Alexandria, VA 22314