This document has been archived.
Course, Curriculum, and Laboratory Improvement (CCLI)
National Science Foundation
Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):
May 09, 2006
For Phase 1 proposals from submitting organizations located in states or territories beginning with A through M.
May 10, 2006
For Phase 1 proposals from submitting organizations located in states or territories beginning with N through W.
January 10, 2007
For Phase 2 and 3 proposals.
In furtherance of the President's Management Agenda, in Fiscal Year 2006, NSF has identified programs that will offer proposers the option to utilize Grants.gov to prepare and submit proposals, or will require that proposers utilize Grants.gov to prepare and submit proposals. Grants.gov provides a single Government-wide portal for finding and applying for Federal grants online.
Proposers may opt to submit proposals in response to this Program Solicitation via Grants.gov or via the NSF FastLane system. In determining which method to utilize in the electronic preparation and submission of the proposal, please note that collaborative proposals may not be submitted via Grants.gov. Collaborative proposals are those submitted:
- by one organization (and which include one or more subawards); or
- as separate submissions from multiple organizations.
Proposers are advised that collaborative proposals submitted in response to this Program Solicitation via Grants.gov will be requested to be withdrawn and proposers will need to resubmit these proposals via FastLane. (Chapter II, Section D.3 of the Grant Proposal Guide provides additional information on collaborative proposals.)
The following items are major revisions to the previous program solicitation:
The additional review criteria in Section VI.A have been revised.
This solicitation allows, as supplementary documents, letters showing collaborator commitments and organizational endorsement, as well as samples of final products. The sample materials should be concise and relevant.
Course, Curriculum, and Laboratory Improvement (CCLI)
Synopsis of Program:
The Course, Curriculum, and Laboratory Improvement (CCLI) program seeks to improve the quality of science, technology, engineering, and mathematics (STEM) education for all undergraduate students. The program supports efforts to create new learning materials and teaching strategies, develop faculty expertise, implement educational innovations, assess learning and evaluate innovations, and conduct research on STEM teaching and learning. The program supports three types of projects representing three different phases of development, ranging from small, exploratory investigations to large, comprehensive projects.
Cognizant Program Officer(s):
Applicable Catalog of Federal Domestic Assistance (CFDA) Number(s):
An individual may be the main Principal Investigator (PI) on only one proposal submitted for any deadline. There is no restriction on the number of proposals for which an individual may serve as a co-PI.
Full proposals submitted via FastLane:
Grant Proposal Guide (GPG) Guidelines apply
Full proposals submitted via Grants.gov:
NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov Guidelines apply. (Note: The NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: https://www.nsf.gov/bfa/dias/policy/docs/grantsgovguide.pdf.) To obtain copies of the Application Guide and Application Forms Package: click on the Apply tab on the Grants.gov website, then click on the Apply Step 1: Download a Grant Application Package and Application Instructions link, enter the funding opportunity number (the program solicitation number without the NSF prefix), and press the Download Package button.
The vision of the Course, Curriculum, and Laboratory Improvement (CCLI) program is excellent science, technology, engineering, and mathematics (STEM) education for all undergraduate students. Towards this vision the program supports projects based on high-quality science, technology, engineering or mathematics and recent advances in research on undergraduate STEM learning and teaching. The program seeks to stimulate, disseminate, and institutionalize innovative and effective developments in undergraduate STEM education through the introduction of new content reflecting cutting edge developments in STEM fields, the production of knowledge about learning, and the improvement of educational practice. The CCLI program design reflects current challenges and promising approaches reported in recent seminal meetings and publications sponsored by organizations concerned with the health of national STEM education.
The National Research Council (NRC) notes several challenges to effective undergraduate education in STEM disciplines. These challenges include providing engaging laboratory, classroom and field experiences; teaching large numbers of students from diverse backgrounds; improving assessment of learning outcomes; and informing science faculty about research on effective teaching (2003, "Evaluating and Improving Undergraduate Teaching in Science, Technology, Engineering, and Mathematics," http://www.nap.edu/books/0309072778/html/).
Promising approaches to meeting these challenges have been identified by several national organizations. The NRC emphasizes the importance of teaching subject matter in depth, eliciting and working with students' preexisting knowledge, and helping students develop the skills of self-monitoring and reflection (2005, “How Students Learn”, http://www.nap.edu/books/0309074339/html/ and 2000, “How People Learn” http://books.nap.edu/catalog/9853/html). They further emphasize the importance of creating a body of knowledge about effective practices in STEM undergraduate education (“a STEM education knowledge base”) and of creating a community of scholars who can act as resources for each other and for those seeking information.
The NRC also describes several strategies for improving the assessment of learning outcomes. They recommend that research on effective teaching should: pose significant questions that can be investigated using empirical techniques; have the potential for replication and generalization across educational settings; and be publicized and subjected to professional critique (2002, "Scientific Research in Education," http://www.nap.edu/books/0309082919/html/).
The value of working with a community of people within or across specific STEM disciplines, or pursuing similar educational innovations, is highlighted in a recent report from Project Kaleidoscope that calls for "collective action" to share ideas and materials so that projects build on, connect to, and enhance the work of others (Project Kaleidoscope, 2002, "Recommendations for Action in Support of Undergraduate Science, Technology, Engineering, and Mathematics," http://www.pkal.org/documents/ReportonReports.pdf). The need for collective action is also emphasized in a report from the National Academies (2003, "Improving Undergraduate Instruction in Science, Technology, Engineering and Mathematics," http://www.nap.edu/books/0309089298/html), which identifies the importance of expanding faculty and scholarly networks to promote effective instruction and to support rapid dissemination and adaptation of successful educational innovations.
The CCLI Program acknowledges the need both for the development of exemplary courses and teaching practices and for assessment and research efforts in undergraduate STEM education that build on and contribute to the pool of knowledge concerning effective approaches in STEM undergraduate education. The Program recognizes the value of a cadre of STEM faculty committed to improving undergraduate STEM education and sharing their findings with each other, "the community of scholars" described by Project Kaleidoscope. The report "Invention and Impact: Building Excellence in Undergraduate Science, Technology, Engineering and Mathematics Education" (2005, http://www.aaas.org/publications/books_reports/CCLI) describes some of the successful efforts supported by the CCLI program and its predecessors (the Course and Curriculum Development (CCD), Instruction and Laboratory Improvement (ILI), and Undergraduate Faculty Enhancement (UFE) programs). It is based on knowledge shared at a 2004 meeting of Principal Investigators from these programs.
The CCLI program is based on a cyclic model depicting the relationship between knowledge production and improvement of practice in undergraduate STEM education. The model is adapted from the report, “Mathematical Proficiency for All Students” (see http://www.rand.org/publications/MR/MR1643/). In this model, research findings about learning and teaching challenge existing approaches, thus leading to new educational materials and teaching strategies. New materials and teaching strategies that show promise give rise to faculty development programs and methods that incorporate these materials. The most promising of these developments are first tested in limited environments and then implemented and adapted in diverse curricula and educational institutions. These innovations are carefully evaluated by assessing their impact on teaching and learning. In turn, these implementations and assessments generate new insights and research questions, initiating a new cycle of innovation.
All proposals must contribute to the development of exemplary undergraduate STEM education. Proposals may focus on one or more of the components of this cycle.
Creating Learning Materials and Teaching Strategies. Guided by research on teaching and learning, by evaluations of previous efforts, and by advances within the disciplines, projects should develop new learning materials and tools, or create new and innovative teaching methods and strategies. Projects may also revise or enhance existing educational materials and teaching strategies, based on prior results. All projects should lead to exemplary models that address the varied needs of the Nation's diverse undergraduate student population. They may include activities that help faculty develop expertise in adapting these innovations and incorporating them effectively into their courses, the next step in the cycle.
Developing Faculty Expertise. Using new learning materials and teaching strategies often requires faculty to acquire new knowledge and skills and to revise their curricula and teaching practices. Projects should design and implement methods that enable faculty to gain such expertise. These can range from short-term workshops to sustained activities that foster new communities or networks of practicing educators. Successful projects should provide professional development for a diverse group of faculty so that new materials and teaching strategies can be widely implemented.
Implementing Educational Innovations. To ensure their broad based adoption, successful educational innovations (such as learning materials, teaching strategies, faculty development materials, assessment and evaluation tools) and the research relating to them should be widely disseminated. These innovations may come from CCLI projects or from other sources in the STEM community. Funds may be requested for local adaptation and implementation projects, including instrumentation to support such projects. Results from implementation projects should illuminate the challenges to and opportunities for adapting innovations in diverse educational settings, and may provide a foundation for the development of new tools and processes for dissemination. They also may provide a foundation for assessments of learning and teaching.
Assessing Learning and Evaluating Innovations. Implementing educational innovations will create new needs to assess student learning and faculty development. Projects should design and test new assessment and evaluation tools and processes. Projects that apply new and existing tools to conduct broad-based assessments or evaluations may also be considered, provided they span multiple institutions and are of general interest. Results obtained using these tools and processes should provide a foundation that leads to new questions for conducting research on teaching and learning.
Conducting Research on Undergraduate STEM Teaching and Learning. Results from assessments of learning and teaching, and from projects emphasizing other components in the cyclic model, provide a foundation for developing new and revised models of how undergraduate students learn STEM concepts and for exploring how effective teaching strategies and curricula enhance that learning. Projects should have a practical focus; they should lead to testable new ideas for creating learning materials and teaching strategies that have the potential for a direct impact on STEM educational practices.
In all projects, testing to determine the effectiveness of the innovation should be appropriate to the stage of the project’s development and guide its further development and implementation. In addition, evaluation and assessment results from within one component should influence the design of other components. For example, results from faculty development efforts may lead to refinement of learning materials and teaching strategies, and results from projects implementing educational innovations may identify the need for new approaches for developing faculty expertise.
The CCLI program is accepting proposals under this solicitation for three types of projects representing different phases of development. These phases reflect the number of components of the cyclic model included in the project (scope); the number of academic institutions, students and faculty members involved in the project (scale); and the maturity of the proposed educational innovation (state).
Phase 1 Projects – total budget up to $150,000 ($200,000 when four-year colleges and universities collaborate with two-year colleges) for 1 to 3 years.
Phase 1 projects typically will address one program component and involve a limited number of students and faculty members at one academic institution. Projects with a broader scope or larger scale can be proposed provided they can be done within the budget limitations. Proposed evaluation efforts should be informative, based on the project's specific expected outcomes, and consistent with the scope of a Phase 1 project. An extensive evaluation of student learning or use of an independent external evaluator may be included as appropriate but is not a requirement. In order to encourage collaboration between four-year colleges and universities and two-year colleges, projects involving such collaboration may request an additional $50,000. The distribution of effort and funds should reflect a genuine collaboration. Results from Phase 1 projects are expected to be significant enough to contribute to the undergraduate STEM education knowledge base.
Phase 2 Projects – total budget up to $500,000 for 2 to 4 years.
Phase 2 projects build on smaller-scale successful innovations or implementations, such as those produced by Phase 1 projects, and refine and test them with diverse users in several settings. In terms of scope, their focus ordinarily includes two or more components of the cyclic model, with the connections between components explicitly addressed. Phase 2 projects carry the development to a state where the results are conclusive, so that successful products and processes can be distributed widely or commercialized where appropriate. At a minimum, the innovation, if successful, should be institutionalized at the participating colleges and universities.
Phase 3 Projects – total budget up to $2,000,000 for 3 to 5 years.
Phase 3 projects combine established results and mature products from several components of the cyclic model. Such projects involve several diverse academic institutions, often bringing different kinds of expertise to the project. Evaluation activities are deep and broad, demonstrating the impact of the project’s innovations on many students and faculty at a wide range of academic institutions. Dissemination and outreach activities that have national impact are an especially important element of Phase 3 projects, as are the opportunities for faculty to learn how to best adapt project innovations to the needs of their students and academic institutions.
Connections Between Phases
Although it is expected that some Phase 1 projects will lead to Phase 2 projects, and some Phase 2 projects to Phase 3 projects, there is no requirement that a proposal be based on CCLI-funded work; however, the antecedent(s) for all projects should be cited and discussed. While it is unlikely that the program would be able to support a single multi-year project addressing all components in depth at a large scale, a succession of grants might support such an effort. In all cases, the funds requested should be consistent with the scope and scale of the project.
Although projects may vary considerably in the number of components they address, in the number of academic institutions involved, in the number of faculty and students that participate, and in their stage of development, all promising projects should share certain characteristics.
Quality, Relevance, and Impact: Projects should address a recognized need or opportunity in the discipline, clearly indicate how they will meet this need, and be innovative in their production and use of new materials, processes, and ideas, or in their implementation of tested ones. They should have the potential to produce exemplary materials, processes, and models, or important assessment and research findings. They should be based on an accurate and current understanding of the disciplinary field and utilize appropriate technology in student laboratories, classrooms and other learning environments. These projects, even those that involve a local implementation, should address issues that have the potential for broad application in undergraduate STEM education. The results of these projects should advance knowledge and understanding within the discipline and within STEM education in general.
Student Focus: Projects should have a clear relation to student learning, with definite links between project activities and improvements in STEM learning. Moreover, they should involve approaches that are consistent with the nature of today’s students, reflect the students’ perspective and, when possible, solicit student input in the design of the project.
Use of and Contribution to Knowledge about STEM Education: Projects should reflect high quality science, technology, engineering, and mathematics. They should have a rationale and use methods derived from existing knowledge concerning undergraduate STEM education and acknowledge existing projects of a similar nature. They also should have an effective approach for adding to this knowledge by disseminating their results.
STEM Education Community-Building: Projects should include interactions between the investigators and others in the undergraduate STEM education community. As appropriate to the scope and scale of the project, these interactions may range from informal contacts with a few colleagues to the establishment of a formal body of scholars. These interactions should enable the project to benefit from the knowledge and experience of others in developing and evaluating the educational innovation. This collaborating network should involve investigators working on similar or related approaches in the proposer's discipline or in other STEM disciplines and may also include experts in evaluation, educational psychology or other related fields.
Expected Measurable Outcomes: Projects should have goals and objectives that have been translated into a set of expected measurable outcomes that can be monitored using quantitative or qualitative approaches or both. These outcomes should be used to track progress, guide the project, and evaluate its ultimate success. Some of the expected measurable outcomes should pay particular attention to student learning, contributions to the knowledge base, and community building.
Project Evaluation: Projects should have an evaluation plan that includes both a strategy for monitoring the project as it evolves to provide feedback to guide these efforts (formative evaluation) and a strategy for evaluating the effectiveness of the project in achieving its goals and for identifying positive and negative findings when the project is completed (summative evaluation). These efforts should be based on the project's specific expected measurable outcomes defined in the proposal and should rely on an appropriate mix of qualitative and quantitative approaches in measuring the outcomes.
The Division of Undergraduate Education (DUE) conducts an on-going program evaluation to determine how effectively the CCLI program is achieving its goal to stimulate, disseminate, and institutionalize innovative developments in STEM education through the production of knowledge and the improvement of practice. In particular, the program seeks to understand how effectively its projects are using current learning models in developing their innovations, contributing to the knowledge base on STEM education, and building a community of scholars in undergraduate STEM education. In addition to project-specific evaluations, all projects are expected to cooperate with this third party program evaluation and respond to all inquiries, including requests to participate in surveys, interviews and other approaches for collecting evaluation data.
Proposals are invited from all organizations and in any field eligible under the standard GPG guidelines. Specifically excluded are projects that address solely professional training in clinical fields such as medicine, nursing, and clinical psychology. There is no limit on the number of proposals an organization may submit. An individual may be the main Principal Investigator (PI) on only one proposal submitted for any deadline. There is no restriction on the number of proposals for which an individual may serve as a co-PI.
NSF anticipates having $31 million for all CCLI awards, pending the availability of funds. The awards will be made as standard or continuing grants. The number and size of awards will depend on the quality of the proposals received and the availability of funds. The expected number of awards, and duration and range of total NSF/DUE support over the lifetime of a CCLI project, including indirect costs, are as follows:
For collaborative projects, these limits apply to the total project budget.
Full Proposal Instructions:
Proposers may opt to submit proposals in response to this Program Solicitation via Grants.gov or via the NSF FastLane system.
Proposals submitted in response to this program announcement/solicitation should be prepared and submitted in accordance with the general guidelines contained in the NSF Grant Proposal Guide (GPG). The complete text of the GPG is available electronically on the NSF Website at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg. Paper copies of the GPG may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from firstname.lastname@example.org. Proposers are reminded to identify this program announcement/solicitation number in the program announcement/solicitation block on the NSF Cover Sheet For Proposal to the National Science Foundation. Compliance with this requirement is critical to determining the relevant proposal processing guidelines. Failure to submit this information may delay processing.
Proposals submitted in response to this program solicitation via Grants.gov should be prepared and submitted in accordance with the NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov. The complete text of the NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: https://www.nsf.gov/bfa/dias/policy/docs/grantsgovguide.pdf. To obtain copies of the Application Guide and Application Forms Package, click on the Apply tab on the Grants.gov site, then click on the Apply Step 1: Download a Grant Application Package and Application Instructions link and enter the funding opportunity number (the program solicitation number without the NSF prefix) and press the Download Package button. Paper copies of the Grants.gov Application Guide also may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from email@example.com.
The following information supplements the GPG and the NSF Grants.gov Application Guide:
Cost sharing is not required by NSF in proposals submitted under this Program Solicitation.
Proposals must be submitted by the following date(s):
Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):
Proposers should allow sufficient time for all organizational approvals and for correction of errors in uploading the proposal in FastLane. No corrections to submitted proposals will be accepted after the deadline. Proposals received after the deadline will be returned without review. PROPOSALS THAT DO NOT MEET THE GPG REQUIREMENT FOR SEPARATELY AND EXPLICITLY ADDRESSING INTELLECTUAL MERIT AND BROADER IMPACTS IN THE PROJECT SUMMARY WILL BE RETURNED WITHOUT REVIEW. Proposals that do not comply with the formatting requirements (e.g., page limitation, font size, margin limits, and organizational structure) specified in the GPG will be returned without review.
Detailed technical instructions for proposal preparation and submission via FastLane are available at: https://www.fastlane.nsf.gov/a1/newstan.htm. For FastLane user support, call the FastLane Help Desk at 1-800-673-6188 or e-mail firstname.lastname@example.org. The FastLane Help Desk answers general technical questions related to the use of the FastLane system. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this solicitation.
Submission of Electronically Signed Cover Sheets. The Authorized Organizational Representative (AOR) must electronically sign the proposal Cover Sheet to submit the required proposal certifications (see Chapter II, Section C of the Grant Proposal Guide for a listing of the certifications). The AOR must provide the required electronic certifications within five working days following the electronic submission of the proposal. Proposers are no longer required to provide a paper copy of the signed Proposal Cover Sheet to NSF. Further instructions regarding this process are available on the FastLane Website at: http://www.fastlane.nsf.gov/
Before using Grants.gov for the first time, each organization must register to create an institutional profile. Once registered, the applicant’s organization can then apply for any federal grant on the Grants.gov website. The Grants.gov’s Grant Community User Guide is a comprehensive reference document that provides technical information about Grants.gov. Proposers can download the User Guide as a Microsoft Word document or as a PDF document. The Grants.gov User Guide is available at: http://www.grants.gov/CustomerSupport. In addition, the NSF Grants.gov Application Guide provides additional technical guidance regarding preparation of proposals via Grants.gov. For Grants.gov user support, contact the Grants.gov Contact Center at 1-800-518-4726 or by email: email@example.com. The Grants.gov Contact Center answers general technical questions related to the use of Grants.gov. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this solicitation.
Submitting the Proposal: Once all documents have been completed, the Authorized Organizational Representative (AOR) must submit the application to Grants.gov and verify the desired funding opportunity and agency to which the application is submitted. The AOR must then sign and submit the application to Grants.gov. The completed application will be transferred to the NSF FastLane system for further processing.
Reviews of proposals submitted to NSF are solicited from peers with expertise in the substantive area of the proposed research or education project. These reviewers are selected by Program Officers charged with oversight of the review process. NSF invites the proposer to suggest, at the time of submission, the names of appropriate or inappropriate reviewers. Care is taken to ensure that reviewers have no conflicts of interest with the proposer. Special efforts are made to recruit reviewers from non-academic institutions, minority-serving institutions, and disciplines adjacent to that principally addressed in the proposal.
The National Science Board approved revised criteria for evaluating proposals at its meeting on March 28, 1997 (NSB 97-72). All NSF proposals are evaluated through use of the two merit review criteria. In some instances, however, NSF will employ additional criteria as required to highlight the specific objectives of certain programs and activities.
On July 8, 2002, the NSF Director issued Important Notice 127, Implementation of new Grant Proposal Guide Requirements Related to the Broader Impacts Criterion. This Important Notice reinforces the importance of addressing both criteria in the preparation and review of all proposals submitted to NSF. NSF continues to strengthen its internal processes to ensure that both of the merit review criteria are addressed when making funding decisions.
In an effort to increase compliance with these requirements, the January 2002 issuance of the GPG incorporated revised proposal preparation guidelines relating to the development of the Project Summary and Project Description. Chapter II of the GPG specifies that Principal Investigators (PIs) must address both merit review criteria in separate statements within the one-page Project Summary. This chapter also reiterates that broader impacts resulting from the proposed project must be addressed in the Project Description and described as an integral part of the narrative.
Effective October 1, 2002, NSF will return without review proposals that do not separately address both merit review criteria within the Project Summary. It is believed that these changes to NSF proposal preparation and processing guidelines will more clearly articulate the importance of broader impacts to NSF-funded projects.
The two National Science Board approved merit review criteria are listed below (see the Grant Proposal Guide, Chapter III.A, for further information). The criteria include considerations that help define them; these considerations are suggestions, and not all will apply to any given proposal. While proposers must address both merit review criteria, reviewers will be asked to address only those considerations that are relevant to the proposal being considered and for which they are qualified to make judgments.
NSF staff will give careful consideration to the following in making funding decisions:
Intellectual Merit: In addition to the items listed above in the standard review criteria for intellectual merit, the reviewers will consider the following questions: Will the project produce exemplary material, processes, or models that enhance student learning and will it yield important assessment or research findings related to student learning, as appropriate to the goals of the project? Does the project build on the existing STEM education knowledge base? Are appropriate expected measurable outcomes explicitly stated and are they integrated into an evaluation plan likely to produce useful information?
Broader Impacts: In addition to the items listed above in the standard review criteria for broader impacts, the reviewers will consider the following questions: Will the project contribute to the STEM education knowledge base? Will the project help build the STEM education community? Will the project have a broad impact on STEM education in an area of recognized need or opportunity?
All proposals are carefully reviewed by at least three other persons outside NSF who are experts in the particular field represented by the proposal. Proposals submitted in response to this announcement/solicitation will be reviewed by Panel Review.
Reviewers will be asked to formulate a recommendation to either support or decline each proposal. The Program Officer assigned to manage the proposal's review will consider the advice of reviewers and will formulate a recommendation.
A summary rating and accompanying narrative will be completed and submitted by each reviewer. In all cases, reviews are treated as confidential documents. Verbatim copies of reviews, excluding the names of the reviewers, are sent to the Principal Investigator/Project Director by the Program Director. In addition, the proposer will receive an explanation of the decision to award or decline funding.
NSF is striving to be able to tell proposers within six months whether their proposals have been declined or recommended for funding. The time interval begins on the closing date of an announcement/solicitation, or the date of proposal receipt, whichever is later. The interval ends when the Division Director accepts the Program Officer's recommendation.
In all cases, after programmatic approval has been obtained, the proposals recommended for funding will be forwarded to the Division of Grants and Agreements for review of business, financial, and policy implications and the processing and issuance of a grant or other agreement. Proposers are cautioned that only a Grants and Agreements Officer may make commitments, obligations or awards on behalf of NSF or authorize the expenditure of funds. No commitment on the part of NSF should be inferred from technical or budgetary discussions with a NSF Program Officer. A Principal Investigator or organization that makes financial or personnel commitments in the absence of a grant or cooperative agreement signed by the NSF Grants and Agreements Officer does so at their own risk.
Notification of the award is made to the submitting organization by a Grants Officer in the Division of Grants and Agreements. Organizations whose proposals are declined will be advised as promptly as possible by the cognizant NSF Program Division administering the program. Verbatim copies of reviews, not including the identity of the reviewer, will be provided automatically to the Principal Investigator. (See section VI.A. for additional information on the review process.)
An NSF award consists of: (1) the award letter, which includes any special provisions applicable to the award and any numbered amendments thereto; (2) the budget, which indicates the amounts, by categories of expense, on which NSF has based its support (or otherwise communicates any specific approvals or disapprovals of proposed expenditures); (3) the proposal referenced in the award letter; (4) the applicable award conditions, such as Grant General Conditions (NSF-GC-1)* or Federal Demonstration Partnership (FDP) Terms and Conditions;* and (5) any announcement or other NSF issuance that may be incorporated by reference in the award letter. Cooperative agreement awards are administered in accordance with NSF Cooperative Agreement Financial and Administrative Terms and Conditions (CA-FATC). Electronic mail notification is the preferred way to transmit NSF awards to organizations that have electronic mail capabilities and have requested such notification from the Division of Grants and Agreements.
Consistent with the requirements of OMB Circular A-16, Coordination of Geographic Information and Related Spatial Data Activities, and the Federal Geographic Data Committee, all NSF awards that result in relevant geospatial data must be submitted to Geospatial One-Stop in accordance with the guidelines provided at: www.geodata.gov.
More comprehensive information on NSF Award Conditions is contained in the NSF Grant Policy Manual (GPM) Chapter II, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpm. The GPM is also for sale through the Superintendent of Documents, Government Printing Office (GPO), Washington, DC 20402. The telephone number at GPO for subscription information is (202) 512-1800. The GPM may be ordered through the GPO Website at http://www.gpo.gov/.
*These documents may be accessed electronically on NSF's Website at https://www.nsf.gov/awards/managing/. Paper copies of these documents may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from firstname.lastname@example.org.
For all multi-year grants (including both standard and continuing grants), the PI must submit an annual project report to the cognizant Program Officer at least 90 days before the end of the current budget period.
There are two special CCLI reporting requirements. When CCLI PIs submit interim and final reports through FastLane, they will be asked to provide additional information for the Project Information Resource System (PIRS). In addition, PIs of CCLI grants will be expected to cooperate with data collection associated with the CCLI program evaluation conducted by a third-party organization supported by NSF.
Within 90 days after the expiration of an award, the PI also is required to submit a final project report. Failure to provide final technical reports delays NSF review and processing of pending proposals for the PI and all Co-PIs. PIs should examine the formats of the required reports in advance to assure availability of required data.
PIs are required to use NSF's electronic project reporting system, available through FastLane, for preparation and submission of annual and final project reports. This system permits electronic submission and updating of project reports, including information on project participants (individual and organizational), activities and findings, publications, and other specific products and contributions. PIs will not be required to re-enter information previously provided, either with a proposal or in earlier updates using the electronic system.
General inquiries regarding this program should be made to:
Myles G. Boylan, Lead Program Director, telephone: (703) 292-4617, email: email@example.com
Russell L. Pimmel, Lead Program Director, telephone: (703) 292-4618, email: firstname.lastname@example.org
Terry S. Woodin, Lead Program Director, telephone: (703) 292-4657, email: email@example.com
Proposers are encouraged to contact a DUE Program Director in their discipline:
Jeanne Rudzki Small, Program Director, telephone: (703) 292-4641, email: firstname.lastname@example.org
Terry S. Woodin, Program Director, telephone: (703) 292-4657, email: email@example.com
Susan H. Hixson, Program Director, telephone: (703) 292-4623, email: firstname.lastname@example.org
Kathleen A. Parson, Program Director, telephone: (703) 292-4653, email: email@example.com
Harry G. Ungar, Program Director, telephone: (703) 292-4647, email: firstname.lastname@example.org
Mark Burge, Program Director, telephone: (703) 292-4645, email: email@example.com
Diana Gant, Program Director, telephone: (703) 292-4642, email: firstname.lastname@example.org
Barbara N. Anderegg, Program Director, telephone: (703) 292-4634, email: email@example.com
Susan L. Burkett, Program Director, telephone: (703) 292-4629, email: firstname.lastname@example.org
Russell L. Pimmel, Program Director, telephone: (703) 292-4618, email: email@example.com
Bevlee A. Watford, Program Director, telephone: (703) 292-5323, email: firstname.lastname@example.org
Keith A. Sverdrup, Program Director, telephone: (703) 292-4644, email: email@example.com
John R. Haddock, Program Director, telephone: (703) 292-8670, email: firstname.lastname@example.org
Elizabeth J. Teles, Program Director, telephone: (703) 292-8670, email: email@example.com
Lee L. Zia, Program Director, telephone: (703) 292-5140, email: firstname.lastname@example.org
R. Corby Hovis, Program Director, telephone: (703) 292-4625, email: email@example.com
Duncan E. McBride, Program Director, telephone: (703) 292-4630, email: firstname.lastname@example.org
Myles G. Boylan, Program Director, telephone: (703) 292-4617, email: email@example.com
Russell L. Pimmel, Program Director, telephone: (703) 292-4618, email: firstname.lastname@example.org
Myles G. Boylan, Program Director, telephone: (703) 292-4617, email: email@example.com
David McArthur, Program Director, telephone: (703) 292-4622, email: firstname.lastname@example.org
Submission of proposals via Grants.gov is OPTIONAL. For questions relating to Grants.gov, contact:
For questions related to the use of FastLane, contact:
FastLane Help Desk, telephone: 1-800-673-6188, email: email@example.com
Ms. Antoinette T. Allen, Division of Undergraduate Education, telephone: 703-292-4646, email: firstname.lastname@example.org
The NSF Guide to Programs is a compilation of funding for research and education in science, mathematics, and engineering. The NSF Guide to Programs is available electronically at https://www.nsf.gov/cgi-bin/getpub?gp. General descriptions of NSF programs, research areas, and eligibility information for proposal submission are provided in each chapter.
Many NSF programs offer announcements or solicitations concerning specific proposal requirements. To obtain additional information about these requirements, contact the appropriate NSF program offices. Any changes in NSF's fiscal year programs occurring after press time for the Guide to Programs will be announced in the NSF E-Bulletin, which is updated daily on the NSF Website at https://www.nsf.gov/home/ebulletin, and in individual program announcements/solicitations. Subscribers can also sign up for NSF's MyNSF News Service (https://www.nsf.gov/mynsf/) to be notified of new funding opportunities that become available.
The National Science Foundation (NSF) funds research and education in most fields of science and engineering. Awardees are wholly responsible for conducting their project activities and preparing the results for publication. Thus, the Foundation does not assume responsibility for such findings or their interpretation.
NSF welcomes proposals from all qualified scientists, engineers and educators. The Foundation strongly encourages women, minorities and persons with disabilities to compete fully in its programs. In accordance with Federal statutes, regulations and NSF policies, no person on grounds of race, color, age, sex, national origin or disability shall be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving financial assistance from NSF, although some programs may have special requirements that limit eligibility.
Facilitation Awards for Scientists and Engineers with Disabilities (FASED) provide funding for special assistance or equipment to enable persons with disabilities (investigators and other staff, including student research assistants) to work on NSF-supported projects. See the GPG Chapter II, Section D.2 for instructions regarding preparation of these types of proposals.
The National Science Foundation promotes and advances scientific progress in the United States by competitively awarding grants and cooperative agreements for research and education in the sciences, mathematics, and engineering.
To get the latest information about program deadlines, to download copies of NSF publications, and to access abstracts of awards, visit the NSF Website at https://www.nsf.gov.
The information requested on proposal forms and project reports is solicited under the authority of the National Science Foundation Act of 1950, as amended. The information on proposal forms will be used in connection with the selection of qualified proposals; project reports submitted by awardees will be used for program evaluation and reporting within the Executive Branch and to Congress. The information requested may be disclosed to qualified reviewers and staff assistants as part of the proposal review process; to applicant institutions/grantees to provide or obtain data regarding the proposal review process, award decisions, or the administration of awards; to government contractors, experts, volunteers and researchers and educators as necessary to complete assigned work; to other government agencies needing information as part of the review process or in order to coordinate programs; and to another Federal agency, court or party in a court or Federal administrative proceeding if the government is a party. Information about Principal Investigators may be added to the Reviewer file and used to select potential candidates to serve as peer reviewers or advisory committee members. See Systems of Records, NSF-50, "Principal Investigator/Proposal File and Associated Records," 63 Federal Register 267 (January 5, 1998), and NSF-51, "Reviewer/Proposal File and Associated Records," 63 Federal Register 268 (January 5, 1998). Submission of the information is voluntary. Failure to provide full and complete information, however, may reduce the possibility of receiving an award.
An agency may not conduct or sponsor, and a person is not required to respond to an information collection unless it displays a valid OMB control number. The OMB control number for this collection is 3145-0058. Public reporting burden for this collection of information is estimated to average 120 hours per response, including the time for reviewing instructions. Send comments regarding this burden estimate and any other aspect of this collection of information, including suggestions for reducing this burden, to: Suzanne Plimpton, Reports Clearance Officer, Division of Administrative Services, National Science Foundation, Arlington, VA 22230.
The National Science Foundation, 4201 Wilson Boulevard, Arlington, Virginia 22230, USA