Course, Curriculum, and Laboratory Improvement (CCLI)


Program Solicitation
NSF 07-543

Replaces Document(s):
NSF 06-536

 


National Science Foundation

Directorate for Education & Human Resources
     Division of Undergraduate Education

 

Full Proposal Deadline(s) (due by 5 p.m. proposer's local time):

May 08, 2007

For Phase 1 proposals from submitting organizations located in states or territories beginning with A through M.

May 09, 2007

For Phase 1 proposals from submitting organizations located in states or territories beginning with N through W.

January 10, 2008

For Phase 2 and 3 proposals.

REVISION NOTES

In furtherance of the President's Management Agenda, NSF has identified programs that will offer proposers the option to utilize Grants.gov to prepare and submit proposals, or will require that proposers utilize Grants.gov to prepare and submit proposals. Grants.gov provides a single Government-wide portal for finding and applying for Federal grants online.

In response to this program solicitation, proposers may opt to submit proposals via Grants.gov or via the NSF FastLane system. In determining which method to utilize in the electronic preparation and submission of the proposal, please note the following:

Collaborative Proposals. All collaborative proposals submitted as separate submissions from multiple organizations must be submitted via the NSF FastLane system. Chapter II, Section D.3 of the Grant Proposal Guide provides additional information on collaborative proposals.

The following items are major revisions to the previous program solicitation:

Projects that plan to use the World Wide Web as a component of the overall dissemination strategy are required to connect the project's website to the National Science Digital Library, and the proposal should describe the plans for accomplishing this.

The description of the potential project components (Section II-A) has been revised to rename and redefine the last two bullets. The fourth bullet is now named Assessing Student Achievement and has been redefined to include projects that: (1) design tools to measure the effectiveness of new materials and instructional methods, (2) develop and share valid and reliable tests of STEM knowledge, (3) collect, synthesize, and interpret information about student reasoning, practical skills, interests, or other valued outcomes, and (4) apply new and existing tools to conduct broad-based evaluations of educational programs or practices if they span multiple institutions and are of general interest. Bullet five is now named Conducting Research on Undergraduate STEM Education and has been redefined to include projects that: (1) develop and revise models of how undergraduate STEM students learn and (2) explore how effective teaching strategies and curricula enhance learning.

The Proposal Preparation Instructions Section now indicates that the proposer should mark the Human Subjects box on the cover sheet and then indicate that the proposed project is exempt, approved, or pending.  The process is pending if the IRB has not yet approved a submitted application or if the proposer has not yet submitted an application to the IRB.  This section should not be left blank.

SUMMARY OF PROGRAM REQUIREMENTS

General Information

Program Title: 

Course, Curriculum, and Laboratory Improvement  (CCLI)

Synopsis of Program:

The Course, Curriculum, and Laboratory Improvement (CCLI) program seeks to improve the quality of science, technology, engineering, and mathematics (STEM) education for all undergraduate students. The program supports efforts to create new learning materials and teaching strategies, develop faculty expertise, implement educational innovations, assess learning and evaluate innovations, and conduct research on STEM teaching and learning. The program supports three types of projects representing three different phases of development, ranging from small, exploratory investigations to large, comprehensive projects.

Cognizant Program Officer(s):

  • Russell L. Pimmel, Lead Program Director, CCLI, Division of Undergraduate Education, 835 N, telephone: (703) 292-4618, email: rpimmel@nsf.gov

  • Myles G. Boylan, Co-Lead Program Director, CCLI, Division of Undergraduate Education, 835 N, telephone: (703) 292-4617, email: mboylan@nsf.gov

  • Jill K. Singer, Co-Lead Program Director, CCLI, Division of Undergraduate Education, 835 N, telephone: (703) 292-5323, email: jksinger@nsf.gov

  • Sheryl A. Sorby, Co-Lead Program Director, CCLI, Division of Undergraduate Education, 835 N, telephone: (703) 292-4647, email: ssorby@nsf.gov

  • Terry S. Woodin, Co-Lead Program Director, CCLI, Division of Undergraduate Education, 835 N, telephone: (703) 292-4657, email: twoodin@nsf.gov

Applicable Catalog of Federal Domestic Assistance (CFDA) Number(s):

  • 47.076 --- Education and Human Resources

Award Information

Anticipated Type of Award:  Standard Grant or Continuing Grant

Estimated Number of Awards: 92 to 125, including 70 to 90 Phase 1 awards, 20 to 30 Phase 2 awards, and 2 to 5 Phase 3 awards

Anticipated Funding Amount: $34,000,000 for new and ongoing awards, pending availability of funding.

Eligibility Information

Organization Limit: 

None Specified

PI Limit: 

None Specified

Limit on Number of Proposals per Organization: 

None Specified

Limit on Number of Proposals per PI:  

An individual may be the Principal Investigator (PI) on only one proposal submitted for any deadline. In applying this eligibility criterion, each proposal in a collaborative submission will be considered a separate proposal with a distinct PI.  There is no restriction on the number of proposals for which an individual may serve as a co-PI.

Proposal Preparation and Submission Instructions

A. Proposal Preparation Instructions

  • Letters of Intent: Not Applicable
  • Full Proposals:

    • Full Proposals submitted via FastLane: NSF Proposal and Award Policies and Procedures Guide, Part I: Grant Proposal Guide (GPG) Guidelines apply. The complete text of the GPG is available electronically on the NSF website at: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg.

    • Full Proposals submitted via Grants.gov: NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov Guidelines apply. (Note: The NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: http://www.nsf.gov/bfa/dias/policy/docs/grantsgovguide.pdf)

B. Budgetary Information

  • Cost Sharing Requirements: Cost Sharing is not required by NSF.  
  • Indirect Cost (F&A) Limitations:  Not Applicable
  • Other Budgetary Limitations: Not Applicable

C. Due Dates

  • Full Proposal Deadline(s) (due by 5 p.m. proposer's local time):

    May 08, 2007

    For Phase 1 proposals from submitting organizations located in states or territories beginning with A through M.

    May 09, 2007

    For Phase 1 proposals from submitting organizations located in states or territories beginning with N through W.

    January 10, 2008

    For Phase 2 and 3 proposals.

Proposal Review Information Criteria

Merit Review Criteria:   National Science Board approved criteria. Additional merit review considerations apply. Please see the full text of this solicitation for further information.

Award Administration Information

Award Conditions:   Standard NSF award conditions apply

Reporting Requirements:   Additional reporting requirements apply. Please see the full text of this solicitation for further information.

TABLE OF CONTENTS

Summary of Program Requirements

  1. Introduction

  2. Program Description

  3. Award Information

  4. Eligibility Information

  5. Proposal Preparation and Submission Instructions
    1. Proposal Preparation Instructions
    2. Budgetary Information
    3. Due Dates
    4. FastLane/Grants.gov Requirements

  6. NSF Proposal Processing and Review Procedures
    1. NSF Merit Review Criteria
    2. Review and Selection Process

  7. Award Administration Information
    1. Notification of the Award
    2. Award Conditions
    3. Reporting Requirements

  8. Agency Contacts

  9. Other Information

I. INTRODUCTION

The vision of the Course, Curriculum, and Laboratory Improvement (CCLI) program is excellent science, technology, engineering, and mathematics (STEM) education for all undergraduate students. Toward this vision, the program supports projects based on high-quality science, technology, engineering, or mathematics and on recent advances in research on undergraduate STEM learning and teaching. The program seeks to stimulate, evaluate, and disseminate innovative and effective developments in undergraduate STEM education through the introduction of new content reflecting cutting-edge developments in STEM fields, the production of knowledge about learning, and the improvement of educational practice. The CCLI program design reflects current challenges and promising approaches reported in recent seminal meetings and publications sponsored by organizations concerned with the health of national STEM education.

The National Research Council (NRC) notes several challenges to effective undergraduate education in STEM disciplines. These challenges include providing engaging laboratory, classroom and field experiences; teaching large numbers of students from diverse backgrounds; improving assessment of learning outcomes; and informing science faculty about research on effective teaching (2003, "Evaluating and Improving Undergraduate Teaching in Science, Technology, Engineering, and Mathematics," http://www.nap.edu/books/0309072778/html/).

Promising approaches to meeting these challenges have been identified by several national organizations. The NRC emphasizes the importance of teaching subject matter in depth, eliciting and working with students' preexisting knowledge, and helping students develop the skills of self-monitoring and reflection (2005, “How Students Learn”, http://www.nap.edu/books/0309074339/html/ and 2000, “How People Learn” http://books.nap.edu/catalog/9853/html). They further emphasize the importance of creating a body of knowledge about effective practices in STEM undergraduate education (“a STEM education knowledge base”) and of creating a community of scholars who can act as resources for each other and for those seeking information.

The NRC also describes several strategies for improving the assessment of learning outcomes. They recommend that research on effective teaching should: pose significant questions that can be investigated using empirical techniques; have the potential for replication and generalization across educational settings; and be publicized and subjected to professional critique (2002 “Scientific Research in Education", http://www.nap.edu/books/0309082919/html/).

The value of working with a community of people, within or across specific STEM disciplines, who are pursuing similar educational innovations is highlighted in a recent report from Project Kaleidoscope that calls for "collective action" to share ideas and materials so that projects build on, connect to, and enhance the work of others (Project Kaleidoscope, 2002, "Recommendations for Action in Support of Undergraduate Science, Technology, Engineering, and Mathematics," http://www.pkal.org/documents/ReportonReports.pdf). The need for collective action is also emphasized in a report from the National Academies (2003, "Improving Undergraduate Instruction in Science, Technology, Engineering and Mathematics," http://www.nap.edu/books/0309089298/html/), which identifies the importance of expanding faculty and scholarly networks to promote effective instruction and to support rapid dissemination and adaptation of successful educational innovations.

Subsequent reports have restated the need for changes in undergraduate STEM education and emphasized the importance of increasing innovation and diversity in STEM education programs and the increasingly urgent need for change (2005, "Innovating America: Thriving in a World of Challenge and Change," Council on Competitiveness, http://innovateamerica.org/webscr/report.asp and 2006, "Recommendations for Urgent Action," Project Kaleidoscope, http://www.pkal.org/documents/ReportOnReportsII.cfm).

The CCLI Program acknowledges the need both for the development of exemplary courses and teaching practices and for assessment and research efforts in undergraduate STEM education that build on and contribute to the pool of knowledge concerning effective approaches in STEM undergraduate education. The Program recognizes the value of a cadre of STEM faculty committed to improving undergraduate STEM education and sharing their findings with each other, "the community of scholars" described by Project Kaleidoscope. The report "Invention and Impact: Building Excellence in Undergraduate Science, Technology, Engineering and Mathematics Education" (2005, http://www.aaas.org/publications/books_reports/CCLI) describes some of the successful efforts supported by the CCLI program and its predecessors (the Course and Curriculum Development (CCD), Instruction and Laboratory Improvement (ILI), and Undergraduate Faculty Enhancement (UFE) programs). It is based on knowledge shared at a 2004 meeting of Principal Investigators from these programs.

II. PROGRAM DESCRIPTION

The CCLI program is based on a cyclic model depicting the relationship between knowledge production and improvement of practice in undergraduate STEM education. The model is adapted from the report, “Mathematical Proficiency for All Students” (see http://www.rand.org/publications/MR/MR1643/). In this model, research findings about learning and teaching challenge existing approaches, thus leading to new educational materials and teaching strategies. New materials and teaching strategies that show promise give rise to faculty development programs and methods that incorporate these materials. The most promising of these developments are first tested in limited environments and then implemented and adapted in diverse curricula and educational institutions. These innovations are carefully evaluated by assessing their impact on teaching and learning. In turn, these implementations and assessments generate new insights and research questions, initiating a new cycle of innovation.

[Cyclic model of knowledge production and improvement of practice in STEM education]

  1. Project Components

All proposals must contribute to the development of exemplary undergraduate STEM education. Proposals may focus on one or more of the components of this cycle.

  • Creating Learning Materials and Teaching Strategies. Guided by research on teaching and learning, by evaluations of previous efforts, and by advances within the disciplines, projects should develop new learning materials and tools, or create new and innovative teaching methods and strategies. Projects may also revise or enhance existing educational materials and teaching strategies, based on prior results. All projects should lead to exemplary models that address the varied needs of the Nation's diverse undergraduate student population. They may include activities that help faculty develop expertise in adapting these innovations and incorporating them effectively into their courses, the next step in the cycle.
  • Developing Faculty Expertise. Using new learning materials and teaching strategies often requires faculty to acquire new knowledge and skills and to revise their curricula and teaching practices. Projects should design and implement methods that enable faculty to gain such expertise. These can range from short-term workshops to sustained activities that foster new communities or networks of practicing educators. Successful projects should provide professional development for a diverse group of faculty so that new materials and teaching strategies can be widely implemented.
  • Implementing Educational Innovations. To ensure their broad based adoption, successful educational innovations (such as learning materials, teaching strategies, faculty development materials, assessment and evaluation tools) and the research relating to them should be widely disseminated. These innovations may come from CCLI projects or from other sources in the STEM community. Funds may be requested for local adaptation and implementation projects, including instrumentation to support such projects. Results from implementation projects should illuminate the challenges to and opportunities for adapting innovations in diverse educational settings, and may provide a foundation for the development of new tools and processes for dissemination. They also may provide a foundation for assessments of learning and teaching.
  • Assessing Student Achievement. Implementing educational innovations will create new needs to assess student learning. Projects for designing tools to measure the effectiveness of new materials and instructional methods are appropriate. Some projects may develop and share valid and reliable tests of STEM knowledge; other projects may collect, synthesize, and interpret information about student reasoning, practical skills, interests, or other valued outcomes. Projects that apply new and existing tools to conduct broad-based evaluations of educational programs or practices are appropriate if they span multiple institutions and are of general interest. Projects should carefully document population characteristics and context so that generalizable findings can be identified. Results obtained using these tools and processes should provide a foundation that leads to new questions for conducting research on teaching and learning. Assessment projects likely to have only a local impact are discouraged.
  • Conducting Research on Undergraduate STEM Education. Results from assessments of learning and teaching as well as from projects emphasizing other components in the cyclic model provide a foundation for developing new and revised models of how undergraduate STEM students learn. Research to explore how effective teaching strategies and curricula enhance learning is appropriate. Some research results may compel faculty to rethink STEM education for the future. Other projects will have a practical focus. All projects should lead to testable new ideas for creating learning materials and teaching strategies that have the potential for a direct impact on STEM educational practices.

In all projects, testing to determine the effectiveness of the innovation should be appropriate to the stage of the project’s development and guide its further development and implementation. In addition, evaluation and assessment results from within one component should influence the design of other components. For example, results from faculty development efforts may lead to refinement of learning materials and teaching strategies, and results from projects implementing educational innovations may identify the need for new approaches for developing faculty expertise.

  2. Project Types

The CCLI program is accepting proposals under this solicitation for three types of projects representing different phases of development. These phases reflect the number of components of the cyclic model included in the project (scope); the number of academic institutions, students and faculty members involved in the project (scale); and the maturity of the proposed educational innovation (state).

Phase 1 Projects – total budget up to $150,000 ($200,000 when four-year colleges and universities collaborate with two-year colleges) for 1 to 3 years.

Phase 1 projects typically will address one program component and involve a limited number of students and faculty members at one academic institution. Projects with a broader scope or larger scale can be proposed provided they can be done within the budget limitations. Proposed evaluation efforts should be informative, based on the project's specific expected outcomes, and consistent with the scope of a Phase 1 project. An extensive evaluation of student learning or use of an independent external evaluator may be included as appropriate but is not a requirement. In order to encourage collaboration between four-year colleges and universities and two-year colleges, projects involving such collaboration may request an additional $50,000. The distribution of effort and funds between the four-year institution and the community college should reflect a genuine collaboration. Results from Phase 1 projects are expected to be significant enough to contribute to the undergraduate STEM education knowledge base.

Phase 2 Projects – total budget up to $500,000 for 2 to 4 years.

Phase 2 projects build on smaller-scale successful innovations or implementations, such as those produced by Phase 1 projects, and refine and test these on diverse users in several settings. In terms of scope, their focus ordinarily includes two or more components of the cyclic model with the connections between components explicitly addressed. Phase 2 projects carry the development to a state where the results are conclusive so that successful products and processes can be distributed widely or commercialized when appropriate. At a minimum, the innovation, if successful, should be institutionalized at the participating colleges and universities.

Phase 3 Projects – total budget up to $2,000,000 for 3 to 5 years.

Phase 3 projects combine established results and mature products from several components of the cyclic model. These projects should include an explicit discussion of the results and evidence produced by the work on which the proposed project is based. Such projects include a diversity of academic institutions and student populations. Evaluation activities are deep and broad, demonstrating the impact of the project’s innovations on many students and faculty at a wide range of academic institutions. Dissemination and outreach activities that have national impact are an especially important element of Phase 3 projects, as are the opportunities for faculty to learn how to best adapt project innovations to the needs of their students and academic institutions.

Connections Between Phases

Although it is expected that some Phase 1 projects will lead to Phase 2 projects and some Phase 2 projects to Phase 3 projects, there is no requirement that a proposal be based on CCLI-funded work; however, the antecedent(s) for all projects should be cited and discussed. While it is unlikely that the program would be able to support a single multi-year project to address all components in depth at a large scale, a succession of grants might support such an effort. In all cases, the funds requested should be consistent with the scope and scale of the project.

  3. Important Project Features

Although projects may vary considerably in the number of components they address, in the number of academic institutions involved, in the number of faculty and students that participate, and in their stage of development, all promising projects should share certain characteristics.

  • Quality, Relevance, and Impact: Projects should address a recognized need or opportunity in the discipline, clearly indicate how they will meet this need, and be innovative in their production and use of new materials, processes, and ideas, or in their implementation of tested ones. They should have the potential to produce exemplary materials, processes, and models, or important assessment and research findings. They should be based on an accurate and current understanding of the disciplinary field and utilize appropriate technology in student laboratories, classrooms and other learning environments. These projects, even those that involve a local implementation, should address issues that have the potential for broad application in undergraduate STEM education. The results of these projects should advance knowledge and understanding within the discipline and within STEM education in general.
  • Student Focus: Projects should have a clear relation to student learning, with definite links between project activities and improvements in STEM learning. Moreover, they should involve approaches that are consistent with the nature of today’s students, reflect the students’ perspective and, when possible, solicit student input in the design of the project.
  • Use of and Contribution to Knowledge about STEM Education: Projects should reflect high quality science, technology, engineering, and mathematics. They should have a clear and compelling rationale and use methods derived from existing knowledge concerning undergraduate STEM education and acknowledge existing projects of a similar nature. They also should have an effective approach for adding to this knowledge by disseminating their results.
  • STEM Education Community-Building: Projects should include interactions between the investigators and others in the undergraduate STEM education community. As appropriate to the scope and scale of the project, these interactions may range from informal contacts with a few colleagues to the establishment of a formal body of scholars. These interactions should enable the project to benefit from the knowledge and experience of others in developing and evaluating the educational innovation. This collaborating network should involve investigators working on similar or related approaches in the proposer's discipline or in other STEM disciplines and may also include experts in evaluation, educational psychology or other related fields.
  • Expected Measurable Outcomes: Projects should have goals and objectives that have been translated into a set of expected measurable outcomes that can be monitored using quantitative or qualitative approaches or both. These outcomes should be used to track progress, guide the project, and evaluate its ultimate success. Some of the expected measurable outcomes should pay particular attention to student learning, contributions to the knowledge base, and community building.
  • Project Evaluation: All projects, regardless of the phase or main component of the cyclic model they represent, should have an evaluation plan that includes both a strategy for monitoring the project as it evolves to provide feedback to guide these efforts (formative evaluation) and a strategy for evaluating the effectiveness of the project in achieving its goals and for identifying positive and negative findings when the project is completed (summative evaluation). These efforts should be based on the project’s specific expected measurable outcomes defined in the proposal and should rely on an appropriate mix of qualitative and quantitative approaches in measuring the outcomes.

  4. Program Evaluation

The Division of Undergraduate Education (DUE) conducts an ongoing program evaluation to determine how effectively the CCLI program is achieving its goal to stimulate, disseminate, and institutionalize innovative developments in STEM education through the production of knowledge and the improvement of practice. In particular, the program seeks to understand how effectively its projects are using current learning models in developing their innovations, contributing to the knowledge base on STEM education, and building a community of scholars in undergraduate STEM education. In addition to project-specific evaluations, all projects are expected to cooperate with this third-party program evaluation and respond to all inquiries, including requests to participate in surveys, interviews, and other approaches for collecting evaluation data.

III. AWARD INFORMATION

NSF anticipates having $34 million for new and ongoing CCLI awards, pending the availability of funds. The awards will be made as standard or continuing grants. The number and size of awards will depend on the quality of the proposals received and the availability of funds. The expected number of awards, and duration and range of total NSF/DUE support over the lifetime of a CCLI project, including indirect costs, are as follows:

  • Phase 1: Exploratory Projects – 70 to 90 awards expected, each with a total budget up to $150,000 ($200,000 when four-year colleges and universities collaborate with two-year colleges) for 1 to 3 years.
  • Phase 2: Expansion Projects – 20 to 30 awards expected, each with a total budget up to $500,000 for 2 to 4 years.
  • Phase 3: Comprehensive Projects – 2 to 5 awards expected, each with a total budget up to $2,000,000 for 3 to 5 years.

For collaborative projects, these limits apply to the total project budget.

IV. ELIGIBILITY INFORMATION

Organization Limit: 

None Specified

PI Limit: 

None Specified

Limit on Number of Proposals per Organization: 

None Specified

Limit on Number of Proposals per PI:  

An individual may be the Principal Investigator (PI) on only one proposal submitted for any deadline. In applying this eligibility criterion, each proposal in a collaborative submission will be considered a separate proposal with a distinct PI.  There is no restriction on the number of proposals for which an individual may serve as a co-PI.

Additional Eligibility Info:

Proposals are invited from all organizations and in any field eligible under the standard GPG guidelines. Specifically excluded are projects that address solely professional training in clinical fields such as medicine, nursing, and clinical psychology. There is no limit on the number of proposals an organization may submit.

V. PROPOSAL PREPARATION AND SUBMISSION INSTRUCTIONS

A. Proposal Preparation Instructions

Full Proposal Preparation Instructions: Proposers may opt to submit proposals in response to this Program Solicitation via Grants.gov or via the NSF FastLane system.

  • Full proposals submitted via FastLane: Proposals submitted in response to this program solicitation should be prepared and submitted in accordance with the general guidelines contained in the NSF Grant Proposal Guide (GPG). The complete text of the GPG is available electronically on the NSF website at: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg. Paper copies of the GPG may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from pubs@nsf.gov. Proposers are reminded to identify this program solicitation number in the program solicitation block on the NSF Cover Sheet For Proposal to the National Science Foundation. Compliance with this requirement is critical to determining the relevant proposal processing guidelines. Failure to submit this information may delay processing.
  • Full proposals submitted via Grants.gov: Proposals submitted in response to this program solicitation via Grants.gov should be prepared and submitted in accordance with the NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov. The complete text of the NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: (http://www.nsf.gov/bfa/dias/policy/docs/grantsgovguide.pdf). To obtain copies of the Application Guide and Application Forms Package, click on the Apply tab on the Grants.gov site, then click on the Apply Step 1: Download a Grant Application Package and Application Instructions link and enter the funding opportunity number, (the program solicitation number without the NSF prefix) and press the Download Package button. Paper copies of the Grants.gov Application Guide also may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from pubs@nsf.gov.

In determining which method to utilize in the electronic preparation and submission of the proposal, please note the following:

Collaborative Proposals. All collaborative proposals submitted as separate submissions from multiple organizations must be submitted via the NSF FastLane system. Chapter II, Section D.3 of the Grant Proposal Guide provides additional information on collaborative proposals.

Additional Full Proposal Instructions:

The following information supplements the GPG and the NSF Grants.gov Application Guide:

  • Proposers should make sure that their proposals respond to the list of questions provided both in the NSB general review criteria and in the additional program-specific review criteria in Section VI.A below. They should review the discussion of the components, phases, and important features in Section II above. Additional information on writing proposals can be found in "A Guide for Proposal Writing" (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04016).
  • Principal Investigators are strongly encouraged to match their proposed budgets carefully to the scope and scale of the project. Excessive or poorly justified budgets suggest that the project is not well designed.
  • Projects that plan to use the World Wide Web as a component of their overall dissemination strategy should connect the project's website to the National Science Digital Library (NSDL). The proposal should describe how the web pages will be tagged with descriptive metadata (see http://dublincore.org) so that the material becomes part of the NSDL. Information and instructions for connecting a project's website are available at http://nsdl.org/.
  • All projects must comply with the section of the GPG on Proposals Involving Human Subjects (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg). The proposer should mark the Human Subjects box on the cover sheet and then indicate that the proposed project is exempt, approved, or pending. The status is pending if the IRB has not yet approved a submitted application or if the proposer has not yet submitted an application. This section should not be left blank.
  • While all material relevant to determining the quality of the proposed work must be included within the 15-page Project Description or as part of the budget justification, proposers may, as a part of the Supplementary Documentation, include letters showing collaborator commitments and organizational endorsement. In addition, for those projects whose deliverables include a final product, samples of these products (such as excerpts from book chapters, assessment tools, screen shots of software, sample teaching modules and other project deliverables) may be placed within the Supplementary Documentation section. These sample materials should be concise and relevant.
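As an illustration of the metadata tagging mentioned above (this is a sketch, not a requirement of this solicitation): Dublin Core descriptive metadata is commonly embedded in a web page's head as meta elements, with a link element identifying the DC element set (see http://dublincore.org for the authoritative guidelines). All field values below are hypothetical placeholders.

```python
# Hypothetical sketch: render Dublin Core descriptive metadata as HTML
# <meta> elements for a project web page. Field values are placeholders,
# not part of this solicitation; consult http://dublincore.org for the
# actual element definitions and usage guidelines.

DC_FIELDS = {
    "DC.title": "Sample CCLI Project Materials",
    "DC.creator": "Principal Investigator Name",
    "DC.subject": "undergraduate STEM education",
    "DC.description": "Curriculum materials developed under a CCLI award.",
    "DC.date": "2007-05-08",
    "DC.type": "Text",
}


def dublin_core_meta_tags(fields):
    """Render a dict of Dublin Core fields as HTML <link>/<meta> elements."""
    # The <link rel="schema.DC"> element declares the DC element set
    # so that the DC.* meta names below can be interpreted.
    lines = ['<link rel="schema.DC" href="http://purl.org/dc/elements/1.1/" />']
    for name, content in fields.items():
        lines.append('<meta name="{0}" content="{1}" />'.format(name, content))
    return "\n".join(lines)


print(dublin_core_meta_tags(DC_FIELDS))
```

The resulting elements would be placed inside the page's head section; harvesting services such as the NSDL can then read the descriptive fields without parsing the page body.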

B. Budgetary Information

Cost Sharing: Cost sharing is not required by NSF.

C. Due Dates

  • Full Proposal Deadline(s) (due by 5 p.m. proposer's local time):

    May 08, 2007

    For Phase 1 proposals from submitting organizations located in states or territories beginning with A through M.

    May 09, 2007

    For Phase 1 proposals from submitting organizations located in states or territories beginning with N through W.

    January 10, 2008

    For Phase 2 and 3 proposals

Proposers should allow sufficient time for all organizational approvals and for correction of errors in uploading the proposal in FastLane or Grants.gov. No corrections to submitted proposals will be accepted after the deadline. Proposals received after the deadline will be returned without review. PROPOSALS THAT DO NOT MEET THE REQUIREMENT FOR SEPARATELY AND EXPLICITLY ADDRESSING INTELLECTUAL MERIT AND BROADER IMPACTS IN THE PROJECT SUMMARY WILL BE RETURNED WITHOUT REVIEW. Proposals that do not comply with the formatting requirements (e.g., page limitation, font size, margin limits, and organizational structure) specified in the GPG or NSF Grants.gov Application Guide will be returned without review.

D. FastLane/Grants.gov Requirements

  • For Proposals Submitted Via FastLane:

    Detailed technical instructions regarding the technical aspects of preparation and submission via FastLane are available at: https://www.fastlane.nsf.gov/a1/newstan.htm. For FastLane user support, call the FastLane Help Desk at 1-800-673-6188 or e-mail fastlane@nsf.gov. The FastLane Help Desk answers general technical questions related to the use of the FastLane system. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this funding opportunity.

    Submission of Electronically Signed Cover Sheets. The Authorized Organizational Representative (AOR) must electronically sign the proposal Cover Sheet to submit the required proposal certifications (see Chapter II, Section C of the Grant Proposal Guide for a listing of the certifications). The AOR must provide the required electronic certifications within five working days following the electronic submission of the proposal. Further instructions regarding this process are available on the FastLane Website at: https://www.fastlane.nsf.gov/fastlane.jsp.

  • For Proposals Submitted Via Grants.gov:
    Before using Grants.gov for the first time, each organization must register to create an institutional profile. Once registered, the applicant's organization can apply for any federal grant on the Grants.gov website. The Grants.gov Grant Community User Guide is a comprehensive reference document that provides technical information about Grants.gov; proposers can download it as a Microsoft Word document or as a PDF document at: http://www.grants.gov/CustomerSupport. In addition, the NSF Grants.gov Application Guide provides additional technical guidance regarding preparation of proposals via Grants.gov. For Grants.gov user support, contact the Grants.gov Contact Center at 1-800-518-4726 or by email: support@grants.gov. The Grants.gov Contact Center answers general technical questions related to the use of Grants.gov. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this solicitation.

    Submitting the Proposal: Once all documents have been completed, the Authorized Organizational Representative (AOR) must submit the application to Grants.gov and verify the desired funding opportunity and agency to which the application is submitted. The AOR must then sign and submit the application to Grants.gov. The completed application will be transferred to the NSF FastLane system for further processing.

VI. NSF PROPOSAL PROCESSING AND REVIEW PROCEDURES   

Proposals received by NSF are assigned to the appropriate NSF program and, if they meet NSF proposal preparation requirements, are reviewed. All proposals are carefully reviewed by a scientist, engineer, or educator serving as an NSF Program Officer, and usually by three to ten other persons outside NSF who are experts in the particular fields represented by the proposal. These reviewers are selected by Program Officers charged with oversight of the review process. Proposers may suggest names of persons they believe are especially well qualified to review the proposal and/or persons they would prefer not to review it. These suggestions may serve as one source in the reviewer selection process, at the Program Officer's discretion. Submission of such names, however, is optional. Care is taken to ensure that reviewers have no conflicts of interest with the proposer.

A. NSF Merit Review Criteria

All NSF proposals are evaluated through use of the two National Science Board (NSB)-approved merit review criteria: intellectual merit and the broader impacts of the proposed effort. In some instances, however, NSF will employ additional criteria as required to highlight the specific objectives of certain programs and activities.

The two NSB-approved merit review criteria are listed below. The criteria include considerations that help define them. These considerations are suggestions, and not all will apply to any given proposal. While proposers must address both merit review criteria, reviewers will be asked to address only those considerations that are relevant to the proposal being considered and for which the reviewer is qualified to make judgments.

What is the intellectual merit of the proposed activity?
How important is the proposed activity to advancing knowledge and understanding within its own field or across different fields? How well qualified is the proposer (individual or team) to conduct the project? (If appropriate, the reviewer will comment on the quality of the prior work.) To what extent does the proposed activity suggest and explore creative, original, or potentially transformative concepts? How well conceived and organized is the proposed activity? Is there sufficient access to resources?

What are the broader impacts of the proposed activity?
How well does the activity advance discovery and understanding while promoting teaching, training, and learning? How well does the proposed activity broaden the participation of underrepresented groups (e.g., gender, ethnicity, disability, geographic, etc.)? To what extent will it enhance the infrastructure for research and education, such as facilities, instrumentation, networks, and partnerships? Will the results be disseminated broadly to enhance scientific and technological understanding? What may be the benefits of the proposed activity to society?

Examples illustrating activities likely to demonstrate broader impacts are available electronically on the NSF website at: http://www.nsf.gov/pubs/gpg/broaderimpacts.pdf.

NSF staff will give careful consideration to the following in making funding decisions:

Integration of Research and Education
One of the principal strategies in support of NSF's goals is to foster integration of research and education through the programs, projects, and activities it supports at academic and research institutions. These institutions provide abundant opportunities where individuals may concurrently assume responsibilities as researchers, educators, and students and where all can engage in joint efforts that infuse education with the excitement of discovery and enrich research through the diversity of learning perspectives.

Integrating Diversity into NSF Programs, Projects, and Activities
Broadening opportunities and enabling the participation of all citizens -- women and men, underrepresented minorities, and persons with disabilities -- is essential to the health and vitality of science and engineering. NSF is committed to this principle of diversity and deems it central to the programs, projects, and activities it considers and supports.

Additional Review Criteria:

In reviewing CCLI proposals, the standard criteria will be expanded to include the following additional review criteria, as appropriate to the phase and main component of the proposed work:

Intellectual Merit: Will the project produce exemplary materials, processes, or models that enhance student learning? Will it yield important assessment or research findings related to student learning, as appropriate to the goals of the project? Does the project build on the existing STEM education knowledge base? Are appropriate measurable outcomes explicitly stated, and are they integrated into an evaluation plan? Is the evaluation effort likely to produce useful information?

Broader Impacts: Will the project contribute to the STEM education knowledge base? Will the project help build the STEM education community? Will the project have a broad impact on STEM education in an area of recognized need or opportunity?

B. Review and Selection Process

Proposals submitted in response to this program solicitation will be reviewed by Panel Review.

Reviewers will be asked to formulate a recommendation to either support or decline each proposal. The Program Officer assigned to manage the proposal's review will consider the advice of reviewers and will formulate a recommendation.

After scientific, technical, and programmatic review and consideration of appropriate factors, the NSF Program Officer recommends to the cognizant Division Director whether the proposal should be declined or recommended for award. NSF strives to inform applicants whether their proposals have been declined or recommended for funding within six months. The interval begins on the date of receipt and ends when the Division Director accepts the Program Officer's recommendation.

A summary rating and accompanying narrative will be completed and submitted by each reviewer. In all cases, reviews are treated as confidential documents. Verbatim copies of reviews, excluding the names of the reviewers, are sent to the Principal Investigator/Project Director by the Program Officer. In addition, the proposer will receive an explanation of the decision to award or decline funding.

In all cases, after programmatic approval has been obtained, the proposals recommended for funding will be forwarded to the Division of Grants and Agreements for review of business, financial, and policy implications and the processing and issuance of a grant or other agreement. Proposers are cautioned that only a Grants and Agreements Officer may make commitments, obligations or awards on behalf of NSF or authorize the expenditure of funds. No commitment on the part of NSF should be inferred from technical or budgetary discussions with a NSF Program Officer. A Principal Investigator or organization that makes financial or personnel commitments in the absence of a grant or cooperative agreement signed by the NSF Grants and Agreements Officer does so at their own risk.

VII. AWARD ADMINISTRATION INFORMATION

A. Notification of the Award

Notification of the award is made to the submitting organization by a Grants Officer in the Division of Grants and Agreements. Organizations whose proposals are declined will be advised as promptly as possible by the cognizant NSF Program administering the program. Verbatim copies of reviews, not including the identity of the reviewer, will be provided automatically to the Principal Investigator. (See Section VI.B. for additional information on the review process.)

B. Award Conditions

An NSF award consists of: (1) the award letter, which includes any special provisions applicable to the award and any numbered amendments thereto; (2) the budget, which indicates the amounts, by categories of expense, on which NSF has based its support (or otherwise communicates any specific approvals or disapprovals of proposed expenditures); (3) the proposal referenced in the award letter; (4) the applicable award conditions, such as Grant General Conditions (GC-1)* or Federal Demonstration Partnership (FDP) Terms and Conditions*; and (5) any announcement or other NSF issuance that may be incorporated by reference in the award letter. Cooperative agreements also are administered in accordance with NSF Cooperative Agreement Financial and Administrative Terms and Conditions (CA-FATC) and the applicable Programmatic Terms and Conditions. NSF awards are electronically signed by an NSF Grants and Agreements Officer and transmitted electronically to the organization via e-mail.

*These documents may be accessed electronically on NSF's Website at http://www.nsf.gov/awards/managing/general_conditions.jsp?org=NSF. Paper copies may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from pubs@nsf.gov.

More comprehensive information on NSF Award Conditions and other important information on the administration of NSF awards is contained in the NSF Award & Administration Guide (AAG) Chapter II, available electronically on the NSF Website at http://www.nsf.gov/publications/pub_summ.jsp?ods_key=aag.

C. Reporting Requirements

For all multi-year grants (including both standard and continuing grants), the Principal Investigator must submit an annual project report to the cognizant Program Officer at least 90 days before the end of the current budget period. (Some programs or awards require more frequent project reports). Within 90 days after expiration of a grant, the PI also is required to submit a final project report.

Failure to provide the required annual or final project reports will delay NSF review and processing of any future funding increments as well as any pending proposals for that PI. PIs should examine the formats of the required reports in advance to assure availability of required data.

PIs are required to use NSF's electronic project-reporting system, available through FastLane, for preparation and submission of annual and final project reports. Such reports provide information on activities and findings; project participants (individual and organizational); publications; and other specific products and contributions. PIs will not be required to re-enter information previously provided, either with a proposal or in earlier updates using the electronic system. Submission of the report via FastLane constitutes certification by the PI that the contents of the report are accurate and complete.

There are two special CCLI reporting requirements. When CCLI PIs submit interim and final reports through FastLane, they will be asked to provide additional information for the Project Information Resource System (PIRS). In addition, PIs of CCLI grants will be expected to cooperate with data collection associated with the CCLI program evaluation conducted by a third-party organization supported by NSF.

VIII. AGENCY CONTACTS

General inquiries regarding this program should be made to:

  • Russell L. Pimmel, Lead Program Director, CCLI, Division of Undergraduate Education, 835 N, telephone: (703) 292-4618, email: rpimmel@nsf.gov

  • Myles G. Boylan, Co-Lead Program Director, CCLI, Division of Undergraduate Education, 835 N, telephone: (703) 292-4617, email: mboylan@nsf.gov

  • Jill K. Singer, Co-Lead Program Director, CCLI, Division of Undergraduate Education, 835 N, telephone: (703) 292-5323, email: jksinger@nsf.gov

  • Sheryl A. Sorby, Co-Lead Program Director, CCLI, Division of Undergraduate Education, 835 N, telephone: (703) 292-4647, email: ssorby@nsf.gov

  • Terry S. Woodin, Co-Lead Program Director, CCLI, Division of Undergraduate Education, 835 N, telephone: (703) 292-4657, email: twoodin@nsf.gov

For questions related to the use of FastLane, contact:

  • Antoinette T. Allen, Information Technology Specialist, Division of Undergraduate Education, 835 N, telephone: 703-292-4646, email: duefl@nsf.gov

For questions relating to Grants.gov contact:

  • Grants.gov Contact Center: If the Authorized Organizational Representative (AOR) has not received a confirmation message from Grants.gov within 48 hours of submission of the application, contact the Contact Center via telephone: 1-800-518-4726; e-mail: support@grants.gov.

Proposers are encouraged to contact a DUE Program Director in their discipline:

Biological Sciences

  • V. Celeste Carter, Program Director, telephone: (703) 292-4634, email: vccarter@nsf.gov

  • Joan T. Prival, Program Director, telephone: (703) 292-4635, email: jprival@nsf.gov

  • Daniel Udovic, Program Director, telephone: (703) 292-4766, email: dudovic@nsf.gov

  • Terry S. Woodin, Program Director, telephone: (703) 292-4657, email: twoodin@nsf.gov

Chemistry

  • Susan H. Hixson, Program Director, telephone: (703) 292-4623, email: shixson@nsf.gov

  • Eileen L. Lewis, Program Director, telephone: (703) 292-4627, email: ellewis@nsf.gov

  • Pratibha Varma-Nelson, Program Director, telephone: (703) 292-4653, email: pvarmane@nsf.gov

Computer Science

  • Stephen C. Cooper, Program Director, telephone: (703) 292-4645, email: sccooper@nsf.gov

  • Timothy V. Fossum, Program Director, telephone: (703) 292-5141, email: tfossum@nsf.gov

Engineering

  • Kathleen A. Alfano, Program Director, telephone: (703) 292-4641, email: kalfano@nsf.gov

  • Lesia L. Crumpton-Young, Program Director, telephone: (703) 292-4629, email: lcrumpto@nsf.gov

  • Russell L. Pimmel, Program Director, telephone: (703) 292-4618, email: rpimmel@nsf.gov

  • Sheryl A. Sorby, Program Director, telephone: (703) 292-4647, email: ssorby@nsf.gov

Geological Sciences

  • Jill K. Singer, Program Director, telephone: (703) 292-5323, email: jksinger@nsf.gov

Interdisciplinary

  • Herbert H. Richtol, Program Director, telephone: (703) 292-4648, email: hrichtol@nsf.gov
  • Curtis T. Sears, Program Director, telephone: (703) 292-4639, email: csears@nsf.gov

Mathematics

  • Daniel P. Maki, Program Director, telephone: (703) 292-4620, email: dmaki@nsf.gov

  • Elizabeth J. Teles, Program Director, telephone: (703) 292-8670, email: ejteles@nsf.gov

  • Lee L. Zia, Program Director, telephone: (703) 292-5140, email: lzia@nsf.gov

Physics/Astronomy

  • Warren W. Hein, Program Director, telephone: (703) 292-4644, email: whein@nsf.gov

  • R. Corby Hovis, Program Director, telephone: (703) 292-4625, email: chovis@nsf.gov

  • Duncan E. McBride, Program Director, telephone: (703) 292-4630, email: dmcbride@nsf.gov

Research/Assessment

  • Myles G. Boylan, Program Director, telephone: (703) 292-4617, email: mboylan@nsf.gov

  • Russell L. Pimmel, Program Director, telephone: (703) 292-4618, email: rpimmel@nsf.gov

Social Sciences

  • Myles G. Boylan, Program Director, telephone: (703) 292-4617, email: mboylan@nsf.gov

IX. OTHER INFORMATION

The NSF Website provides the most comprehensive source of information on NSF Directorates (including contact information), programs and funding opportunities. Use of this Website by potential proposers is strongly encouraged. In addition, MyNSF (formerly the Custom News Service) is an information-delivery system designed to keep potential proposers and other interested parties apprised of new NSF funding opportunities and publications, important changes in proposal and award policies and procedures, and upcoming NSF Regional Grants Conferences. Subscribers are informed through e-mail or the user's Web browser each time new publications are issued that match their identified interests. MyNSF also is available on NSF's Website at http://www.nsf.gov/mynsf/.

Grants.gov provides an additional electronic capability to search for Federal government-wide grant opportunities. NSF funding opportunities may be accessed via this new mechanism. Further information on Grants.gov may be obtained at http://www.grants.gov.

ABOUT THE NATIONAL SCIENCE FOUNDATION

The National Science Foundation (NSF) is an independent Federal agency created by the National Science Foundation Act of 1950, as amended (42 USC 1861-75). The Act states the purpose of the NSF is "to promote the progress of science; [and] to advance the national health, prosperity, and welfare by supporting research and education in all fields of science and engineering."

NSF funds research and education in most fields of science and engineering. It does this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the US. The Foundation accounts for about one-fourth of Federal support to academic institutions for basic research.

NSF receives approximately 40,000 proposals each year for research, education and training projects, of which approximately 11,000 are funded. In addition, the Foundation receives several thousand applications for graduate and postdoctoral fellowships. The agency operates no laboratories itself but does support National Research Centers, user facilities, certain oceanographic vessels and Antarctic research stations. The Foundation also supports cooperative research between universities and industry, US participation in international scientific and engineering efforts, and educational activities at every academic level.

Facilitation Awards for Scientists and Engineers with Disabilities provide funding for special assistance or equipment to enable persons with disabilities to work on NSF-supported projects. See Grant Proposal Guide Chapter II, Section D.2 for instructions regarding preparation of these types of proposals.

The National Science Foundation has Telephonic Device for the Deaf (TDD) and Federal Information Relay Service (FIRS) capabilities that enable individuals with hearing impairments to communicate with the Foundation about NSF programs, employment or general information. TDD may be accessed at (703) 292-5090 and (800) 281-8749, FIRS at (800) 877-8339.

The National Science Foundation Information Center may be reached at (703) 292-5111.

The National Science Foundation promotes and advances scientific progress in the United States by competitively awarding grants and cooperative agreements for research and education in the sciences, mathematics, and engineering.

To get the latest information about program deadlines, to download copies of NSF publications, and to access abstracts of awards, visit the NSF Website at http://www.nsf.gov

  • Location: 4201 Wilson Blvd., Arlington, VA 22230

  • For General Information (NSF Information Center): (703) 292-5111

  • TDD (for the hearing-impaired): (703) 292-5090

  • To Order Publications or Forms: send an e-mail to pubs@nsf.gov or telephone (703) 292-7827

  • To Locate NSF Employees: (703) 292-5111


PRIVACY ACT AND PUBLIC BURDEN STATEMENTS

The information requested on proposal forms and project reports is solicited under the authority of the National Science Foundation Act of 1950, as amended. The information on proposal forms will be used in connection with the selection of qualified proposals; and project reports submitted by awardees will be used for program evaluation and reporting within the Executive Branch and to Congress. The information requested may be disclosed to qualified reviewers and staff assistants as part of the proposal review process; to proposer institutions/grantees to provide or obtain data regarding the proposal review process, award decisions, or the administration of awards; to government contractors, experts, volunteers and researchers and educators as necessary to complete assigned work; to other government agencies or other entities needing information regarding applicants or nominees as part of a joint application review process, or in order to coordinate programs or policy; and to another Federal agency, court, or party in a court or Federal administrative proceeding if the government is a party. Information about Principal Investigators may be added to the Reviewer file and used to select potential candidates to serve as peer reviewers or advisory committee members. See Systems of Records, NSF-50, "Principal Investigator/Proposal File and Associated Records," 69 Federal Register 26410 (May 12, 2004), and NSF-51, "Reviewer/Proposal File and Associated Records," 69 Federal Register 26410 (May 12, 2004). Submission of the information is voluntary. Failure to provide full and complete information, however, may reduce the possibility of receiving an award.

An agency may not conduct or sponsor, and a person is not required to respond to, an information collection unless it displays a valid Office of Management and Budget (OMB) control number. The OMB control number for this collection is 3145-0058. Public reporting burden for this collection of information is estimated to average 120 hours per response, including the time for reviewing instructions. Send comments regarding the burden estimate and any other aspect of this collection of information, including suggestions for reducing this burden, to:

Suzanne H. Plimpton
Reports Clearance Officer
Division of Administrative Services
National Science Foundation
Arlington, VA 22230



 

Last Updated: 11/07/06