Archived funding opportunity

This solicitation is archived.

NSF 03-584: Course, Curriculum, and Laboratory Improvement: Assessment of Student Achievement in Undergraduate Education

Program Solicitation

Document Information

Document History

  • Posted: July 31, 2003


Course, Curriculum, and Laboratory Improvement (CCLI)
Assessment of Student Achievement in Undergraduate Education (ASA) Track

Program Solicitation
NSF 03-584
Replaces Document 01-82


National Science Foundation
Directorate for Education and Human Resources
Division of Undergraduate Education



Full Proposal Deadline(s) (due by 5 p.m. proposer's local time):



    October 29, 2003

Summary Of Program Requirements

General Information

Program Title:

Course, Curriculum, and Laboratory Improvement (CCLI)
Assessment of Student Achievement (ASA) Track

Synopsis of Program:

The Course, Curriculum, and Laboratory Improvement (CCLI) program seeks to improve the quality of Science, Technology, Engineering, and Mathematics (STEM) education for all students, based on research concerning needs and opportunities in undergraduate education and effective ways to address them. It targets activities affecting learning environments, course content, curricula, and educational practices, with the aim of contributing to the relevant research base that will support efforts to enhance STEM education.

The program has four tracks. Assessment of Student Achievement in Undergraduate Education (CCLI - ASA), which began as a separate program in March 2001, is now one of these four tracks. With this addition, CCLI is more effectively positioned as a self-contained program to support the cycle of research and innovation that leads to improvement in undergraduate education. The other three tracks are Educational Materials Development (EMD), National Dissemination (ND), and Adaptation and Implementation (A&I). These other tracks and the relationships among the four tracks are described in greater detail in the Introduction, under the heading "Rationale for CCLI Program." The program solicitation for CCLI - EMD/ND is published as NSF 03-558 and is available at: https://www.nsf.gov/pubsys/ods/getpub.cfm?nsf03558. The program solicitation for CCLI - A&I is anticipated in August 2003, with a November or December 2003 proposal deadline. These three program solicitations have different proposal deadlines in order to spread the reviewing and processing of proposals more evenly throughout the year.

CCLI - ASA (or ASA) supports research on assessment, and the development and dissemination of assessment practices, materials (tools), and measures to guide efforts that improve the effectiveness of courses, curricula, programs of study, and academic institutions in promoting student achievement, particularly in science, technology, engineering, and mathematics. ASA also promotes the full integration of assessment with these educational efforts. ASA projects may be integrated with research on learning, particularly research focused in the STEM disciplines. It supports projects in three areas:

1. New Development: developing and validating new assessment materials (tools) and practices for use in single or multiple undergraduate disciplines.

2. Adaptation: adapting assessment materials and practices that have proven effective for one setting or audience for use in a new setting or with a different audience.

3. Dissemination: spreading the use of effective assessment practices through workshops or Web-based materials that have been validated and are thoroughly documented with detailed instructions.

Cognizant Program Officer(s):

  • Myles G. Boylan, Lead Program Director (CCLI-ASA), Directorate for Education & Human Resources, Division of Undergraduate Education, 812 N, telephone: (703) 292-4617, fax: (703) 292-9015, email: mboylan@nsf.gov

  • Susan H. Hixson, Program Director (CHEM), Directorate for Education & Human Resources, Division of Undergraduate Education, 835 N, telephone: (703) 292-4623, fax: (703) 292-9015, email: shixson@nsf.gov

  • Theodore W. Hodapp, Program Director (PHY), Directorate for Education & Human Resources, Division of Undergraduate Education, 835 N, telephone: (703) 292-4640, email: thodapp@nsf.gov

  • Ernest L. McDuffie, Program Director (CS), Directorate for Education & Human Resources, Division of Undergraduate Education, 835 N, telephone: (703) 292-4655, fax: (703) 292-9016, email: emcduffi@nsf.gov

  • Calvin L. Williams, Program Director (MATH), Directorate for Education & Human Resources, Division of Undergraduate Education, 835 N, telephone: (703) 292-4642, email: cwilliam@nsf.gov

  • Division of Undergraduate Education, telephone: 703-292-8666, email: undergrad@nsf.gov

Applicable Catalog of Federal Domestic Assistance (CFDA) Number(s):

  • 47.076 --- Education and Human Resources

Eligibility Information

  • Organization Limit: None Specified.
  • PI Eligibility Limit: To help ensure that project results will effectively serve the STEM community, at least one investigator (PI or Co-PI) in a project must be a STEM faculty member.
  • Limit on Number of Proposals: None Specified.

Award Information

  • Anticipated Type of Award: Standard or Continuing Grant
  • Estimated Number of Awards: 12
  • Anticipated Funding Amount: $3,000,000 in FY 2004, pending availability of funding

Proposal Preparation and Submission Instructions

A. Proposal Preparation Instructions
  • Full Proposal Preparation Instructions: Standard GPG Guidelines apply.
B. Budgetary Information
  • Cost Sharing Requirements: Cost Sharing is not required.
  • Indirect Cost (F&A) Limitations: Not Applicable.
  • Other Budgetary Limitations: Not Applicable.
C. Due Dates
  • Full Proposal Deadline Date(s) (due by 5 p.m. proposer's local time):
    • October 29, 2003

Proposal Review Information

  • Merit Review Criteria: National Science Board approved criteria apply.

Award Administration Information

  • Award Conditions: Standard NSF award conditions apply.
  • Reporting Requirements: Standard NSF reporting requirements apply.

I. Introduction

Undergraduate education is central to the National Science Foundation's mission in human resource development. Whether preparing students to participate as citizens in a technological society, to enter the workforce with two- or four-year degrees, to continue their formal education in graduate school, or to further their education in response to new career goals or workplace expectations, undergraduate education provides the critical link between the Nation's secondary schools and a society increasingly dependent upon science and technology. CCLI’s major goal is to support efforts in undergraduate institutions to develop the capacity to meet the learning needs of all undergraduate students in science, technology, engineering, and mathematics (STEM), including in particular:

  • STEM majors
  • Students preparing for the technological workplace
  • All students, as citizens in a society increasingly dependent upon science and technology
  • Prospective pre-Kindergarten through grade 12 (preK-12) teachers

Although prospective teachers are not the primary focus of the CCLI program, they may nevertheless be included as target populations in CCLI proposals. The Teacher Professional Continuum (TPC) program, new in 2003, supports projects addressing research issues related to teacher pre-service and in-service education, including the development of professional education materials. (See: www.nsf.gov/pubs/2003/nsf03534/nsf03534.htm)

The CCLI program has four tracks:

  • Educational Materials Development (EMD): the development of new educational materials and practices for a national audience.
  • Adaptation and Implementation (A&I): the local adaptation and implementation of previously developed exemplary materials and practices, including laboratory experiences and support for instrumentation.
  • National Dissemination (ND): the national dissemination of exemplary materials and/or practices through faculty professional development.
  • Assessment of Student Achievement (ASA): the assessment of student achievement, including research on assessment and the development of assessment tools and practices.

Projects may address the needs of a single discipline or cut across disciplinary boundaries. Abstracts of previously funded projects can be found at http://www.ehr.nsf.gov/pirs_prs_web/search/.

The most competitive proposals will be those that either refer to an existing research base or provide credible plans to a) research the needs and opportunities that exist, b) assess the impact of innovative educational practices and materials on student learning, c) develop assessment tools and practices to support efforts to measure student achievement, and d) study how best to prepare faculty to use the materials effectively.

This program solicitation describes the Assessment of Student Achievement in Undergraduate Education Track. A separate solicitation issued on March 20, 2003, describes the characteristics of the Educational Materials Development and the National Dissemination Tracks (NSF 03-558). (See www.nsf.gov/cgi-bin/getpub?nsf03558.) A separate solicitation describing the Adaptation and Implementation Track is anticipated in August 2003. The three program solicitations have different proposal deadlines in order to spread the reviewing and processing of proposals more evenly throughout the year.

Rationale for CCLI Program

The component tracks of CCLI collectively encourage and nurture innovative improvements in undergraduate education through processes that reflect a “cycle of innovation.” Effective educational innovation is a multi-faceted and challenging process. Knowledge is advancing about effective instructional practices and how students learn. [See, for example, John Bransford, A.L. Brown, and R.R. Cocking, editors, "How People Learn: Brain, Mind, Experience, and School," Expanded Edition (NAS Press, 2000).] Educational technology has made large gains that have not been fully explored. Basic scientific and technological knowledge is advancing at a rapid pace and modern technology has made it possible for researchers and students to gain access to data and information reflecting those advances. As a consequence, there is an ever present opportunity and need to develop new undergraduate modules, courses, curricula, and instructional methods, supported by modern technology, and to evaluate the impact of these innovations on student learning and achievement.

The Educational Materials Development (EMD) track seeks to support this needed innovation. EMD "Proof-of-Concept" projects are intended to develop the scientific and educational feasibility of an idea. Some of these will lead to "full scale" EMD projects to develop "products" and practices that by the end of project have been tested in diverse institutions and carefully evaluated. That development process contributes to our understanding of what works and under what circumstances, so that others can use and build on this base of knowledge. (The CCLI - EMD program solicitation is located at www.nsf.gov/cgi-bin/getpub?nsf03558.) The completion of an EMD project is often only a starting point in the further development and testing of innovative products and practices. The other three tracks of CCLI are designed to encourage further research and development and to support broad dissemination of those new courses, curricula, and instructional methods that have been proven effective in some circumstances.

The Assessment of Student Achievement (ASA) track, the focus of this solicitation, invites projects to develop tools and practices to measure learning achievement. The ASA track is the newest track of the CCLI program. Established initially as a separate program in March 2001, it is now combined with the other three tracks because of its potential for providing formative guidance to EMD, A&I, and ND projects and because it is an important link to research on learning in the STEM disciplines. Some current ASA projects grew out of evaluation issues that arose in earlier CCLI-EMD projects by the same principal investigator. CCLI staff anticipate that EMD, A&I, and ND projects will increasingly use assessment materials created under the ASA track (or other validated materials) as they become available, in order to support project-level efforts to evaluate and assess their impact (for example, as reflected in "Evaluating and Improving Undergraduate Teaching in Science, Technology, Engineering, and Mathematics," National Research Council, National Academy Press, 2003). DUE expects that CCLI projects will draw from research on learning in STEM disciplines. For example, the design and sequencing of content knowledge, "end-of-chapter" problems, and student experiments in an EMD project might draw on findings from current research on learning (e.g., on models of how students acquire increasing expertise in a subject area). In turn, evaluations of EMD projects (as judged through the use of assessment instruments and practices) have the potential to contribute useful knowledge to basic and applied research on learning in STEM disciplines. Similarly, the design and testing of new assessment tools and practices should ideally be built on (and aligned with) both evidence of how students learn (learning models) and the subject matter that students are expected to master.
There are currently three NSF programs that support research on learning on a large scale: Research on Learning and Education (ROLE) and Evaluative Research and Evaluation Capacity Building (EREC), both managed by the Division of Research, Evaluation, and Communication [see www.nsf.gov/pubsys/ods/getpub.cfm?nsf03542], and the new Science of Learning Centers (SLC) program, a cross-directorate program [see www.nsf.gov/pubsys/ods/getpub.cfm?nsf03573].

Through the Adaptation and Implementation (A&I) track, CCLI supports and encourages faculty, departments, and institutions to use innovative materials and practices of demonstrated effectiveness, including laboratory experiences and instrumentation, in their own courses and curricula. This track has been designed to increase the pace of innovation diffusion and address the "not invented here" syndrome that is often cited as an impediment to the diffusion of innovation. A&I proposals are expected to provide evidence that the innovative materials and practices they propose to implement have been effective on other campuses, to explain why they are anticipated to be effective for the students enrolled on the applicant's campus, and to provide evaluation data by the end of the project. Through the subsequent synthesis of evaluation results, we seek to continue building knowledge about effective practices and materials (what works under what circumstances), thereby strengthening the cycle of innovation. The A&I track is prepared to support educational improvement efforts at the departmental and even the institutional level. It is our vehicle for supporting major efforts at reform. (Its program solicitation is anticipated to be available in August 2003.)

The National Dissemination (ND) track provides STEM faculty with professional development opportunities, and is both a complement and stimulant to the A&I track. ND projects are required to be offered at the national level, open to all faculty, and focused on offering knowledge about new courses, educational materials, laboratory practices, instructional methods, and assessment tools and practices that have been used and evaluated to the point where they can be considered to be proven effective in many circumstances. The logic of ND is to provide faculty with enough information about new effective materials and practices so that they can make informed decisions about their own courses and teaching activities, and become part of a national network of faculty working to institute modern teaching methods in their departments. ND projects are encouraged to create and nurture these networks. (Its program solicitation is located at www.nsf.gov/cgi-bin/getpub?nsf03558.)

II. Program Description

In its "Best Practices in Assessment," the American Psychological Association (APA) observes that: "Assessment can serve dual purposes: Assessment can promote student learning or provide evidence for accountability requirements through an evaluation of strengths and weaknesses." [See: "Understanding Assessment; Best Practices in Assessment: Top 10 Task Force Recommendations" in "The Assessment CyberGuide for Learning Goals and Outcomes in the Undergraduate Psychology Major," Task Force on Undergraduate Psychology Major Competencies, Board of Educational Affairs, APA, www.apa.org/ed/guide_outline.html.] The CCLI - ASA track supports both of these purposes. In broad terms, the CCLI - ASA track is intended to serve faculty, departments, administrators, and education officials interested in the measurement of student achievement in courses, curricula, programs of study, and the cumulative undergraduate experience. A focal point is student learning within a major field of study. Other examples of possible projects are assessment of the general education core curriculum in STEM and student mastery of essential skills in quantitative literacy. In general, there are four broad areas on which CCLI - ASA projects can focus:

1. Developing and adapting assessment materials that can be used to:

  • measure student achievement in courses and curricula, and
  • improve student achievement of explicit learning objectives.

2. Developing tools, procedures, and measures for assessing student achievement derived from a group of courses constituting a minor or major field of study.

3. Assessing the effectiveness of educational practices intended to improve the undergraduate learning environment. Examples are:

  • integrating knowledge from multiple disciplines,
  • using technology of various types (or not),
  • increasing the use of student teams,
  • strengthening co-curricular activities (e.g., service learning),
  • organizing and implementing learning communities, and
  • increasing laboratory and field experiences (i.e., direct experiences with research methods).

4. Developing indicators of student learning and achievement within certain domains (e.g., the general education core) or measures of institutional program quality based on student achievement.

CCLI - ASA Program Emphases

Within any of the areas outlined above, applicants may propose a project that emphasizes one of the following:

New Development:
Developing new assessment materials (tools) and practices for use in single or multiple undergraduate disciplines. Proposed projects may be pilot projects to experiment with new tools and practices; broader efforts to develop new assessment materials and measures for formative use by disciplinary faculty in improving course design and instructional practices; or projects to develop and refine assessment tools to assist in STEM program and institutional accreditation activities or performance reviews. Proposals should specify explicitly the planned levels of assessment in terms of student achievement (e.g., basic knowledge, deeper comprehension, application skills, the ability to analyze complex ideas) and scope (e.g., within a single course, a course sequence, core STEM general education courses, a STEM major, or an entire degree program).

Adaptation:
Adapting assessment materials and practices that have been shown to be effective for one setting or audience for use in a new setting or with a different audience. This may include projects to adapt and integrate effective assessment practices into the undergraduate instructional activities of a department or college. Proposals should review what is known about the effectiveness of the assessment materials and practices that are to be adapted. They should also include an explicit statement of the proposed interpretation of the assessment results and a rationale for its use in the new setting or with a different audience, a plan to evaluate the success of the adapted materials and practices, and a plan to disseminate information about project results.

Dissemination:
Spreading the use of effective assessment practices through workshops or Web-based materials that have been validated and that are thoroughly documented with detailed instructions. For example, workshops for faculty may be proposed to deepen their understanding and strengthen their skills in using existing assessment tools and practices in their discipline. A second example is workshops for disciplinary faculty, educational researchers, and employers to define areas where assessment needs and opportunities exist.

Applicants are requested to identify on the project data form (NSF Form 1295) which of these emphases most closely fits their project. Projects pursuing other objectives will also be considered.

Desired Characteristics of CCLI - ASA Projects

Portability:
Given the broad need for assessment tools and procedures that are portable to other academic settings and with different populations, it is requested that ASA proposals submitted under the first two areas develop a plan to document the reliability of assessment tools and/or procedures and the validity of the interpretation and use of the assessment results. In order to meet this goal, proposals are expected to describe a process for testing the reliability of the product and/or procedure and for evaluating the soundness or trustworthiness of the interpretations and use of students' assessment results. [See, for example, American Educational Research Association, American Psychological Association and National Council on Measurement in Education, "Standards for Educational and Psychological Testing" (AERA, APA, and NCME, 1999) and The Joint Committee on Standards for Education Evaluation, "The Program Evaluation Standards" (Sage Publications, 1994)]. Proposals should review the literature on assessment and existing assessment practices and materials (tools) in their application area and describe the gaps they are filling. A specific plan for preparing others to use the product and/or procedures in a responsible manner should be described.

Formative Feedback:
A key feature of highly rated ASA proposals during the first three cycles of review has been the purposeful creation of formative feedback loops into the curricular areas and instructional methods being assessed in order to improve the underlying courses. It is anticipated that many of the tools and measures developed under ASA will have the potential to be used for formative assessment. In fact, the APA recommends this: "Wherever possible, students should experience a direct, positive benefit from their participation in assessment activities. Sharing [learning] expectations explicitly with students can provide an effective learning scaffold on which students can build their experiences and render effective performance. Outcomes can be specified in a developmental hierarchy, where possible." [APA, op. cit.] In order to facilitate this process of improvement, the tools and measures proposed should be designed for use by faculty and administrators as well as assessment experts.

Focus on Higher-Order Thinking Skills:
Basic STEM course design is increasingly reflecting the recommendation in Bransford et al. (op. cit.) that instructors should ensure that students learn some subject matter in depth by providing them with many examples in which the same concept is at work as well as providing a firm foundation of factual knowledge. Modern STEM course and curriculum design seeks to improve students' problem-solving skills and to strengthen their ability to transform knowledge into new (related) domains of practice. For example, this is reflected in the "outcomes" orientation of ABET standards and in the AAHE perspective: "Learning is a complex process. It entails not only what students know but what they can do with what they know." (in Alexander Astin et al., "AAHE's 9 Principles of Good Practice for Assessing Student Learning," www.aahe.org/assessment/principl.htm.) Traditional course examinations often do not measure the deeper conceptual knowledge that is gained from well-designed courses that engage students in a variety of active learning experiences. For example, improvements in introductory STEM courses supported by CCLI - EMD funding have often been evaluated comparatively, using traditional end-of-course examinations for both "traditional" and modified courses to allow direct comparison of the results. A challenge for CCLI - ASA projects is to design more appropriate assessment tools that are able to document students' full learning in the modified courses and curricula. Possible guidelines for assessment of higher-order thinking skills can be found in the work of Bransford et al. (op. cit.) and Lorin Anderson and David Krathwohl: "A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives" (New York: Longman, 2001). CCLI - ASA projects are expected to address pertinent validity and reliability issues.

Validity:
The American Educational Research Association, American Psychological Association and National Council on Measurement in Education define validity as the degree to which evidence and theory support the interpretations and use of information gathered by assessment tools and procedures. In other words, validation involves establishing a sound scientific basis for the proposed interpretation and use of the information gathered; it is the interpretation and use, rather than the test or assessment itself, that are validated. [See, for example, American Educational Research Association, American Psychological Association and National Council on Measurement in Education, "Standards for Educational and Psychological Testing" (AERA, APA, and NCME, 1999) and The Joint Committee on Standards for Education Evaluation, "The Program Evaluation Standards" (Sage Publications, 1994)].

Reliability:
The American Educational Research Association (AERA) defines reliability as "the degree to which a set of items consistently measures the same thing across respondents and institutional settings." AERA further notes: "Another characteristic of a reliable instrument is stability, the degree to which the students respond in similar ways at two different points in time. One approach to measuring stability is test-retest, wherein the same students are asked to take the [assessment test] two or more times within a reasonably short period of time." [See: AERA, APA, & National Council on Measurement in Education, "Standards for educational and psychological testing" (Washington, DC: American Educational Research Association, 1999).] The degree to which an instrument is reliable is an indicator of its psychometric quality.
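The test-retest approach described above is commonly summarized as the correlation between the two administrations. As an illustrative sketch only (not part of this solicitation, and using hypothetical scores), the stability estimate can be computed as a Pearson correlation coefficient:

```python
# Illustrative sketch: estimating test-retest stability as the Pearson
# correlation between two administrations of the same assessment.
# All student scores below are hypothetical.

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sum((a - mean_x) ** 2 for a in x) ** 0.5
    sd_y = sum((b - mean_y) ** 2 for b in y) ** 0.5
    return cov / (sd_x * sd_y)

# The same ten students tested twice within a short interval.
first_administration = [72, 85, 90, 64, 78, 88, 70, 95, 60, 82]
second_administration = [70, 88, 87, 66, 75, 90, 72, 93, 63, 80]

stability = pearson_r(first_administration, second_administration)
print(f"test-retest stability estimate: r = {stability:.2f}")
```

A coefficient near 1.0 indicates that students respond in similar ways at the two time points; low values suggest the instrument's scores are unstable. In practice, projects would supplement this with internal-consistency measures and the standards cited above.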

ROLE, a Related Program Supporting Research on STEM Learning

NSF's Division of Research, Evaluation and Communication offers a related program, Research on Learning and Education (ROLE). It has four areas, or "quadrants," of concentration. The third quadrant supports "Research on STEM Learning in Educational Settings," which has broad objectives similar to those of the CCLI - ASA track. In describing the third quadrant, the ROLE program solicitation (NSF 03-542) states: "Many educational approaches, curriculum materials, assessments, and technological tools have been developed to mediate the learning process without the benefit of a strong research foundation. In some instances, this is because the appropriate research does not exist. In other cases, this is because of insufficient exchange of information and knowledge between research, development and implementation communities."

"A principal expectation for research related to this quadrant is to provide a stronger evidentiary base to support sustained improvement in STEM educational practice both in formal classroom settings and in informal learning sites (including the home). Additionally, ROLE seeks proposals that promise to build a stronger research base in adult workplace STEM learning and in other educational settings, such as e-learning or distributed environments. All submissions should identify critical, practice-derived research questions and should provide a means for interacting significantly and in partnership with STEM educational practitioners. ROLE seeks significant national progress in the integration of research and practice." [See "Project Description" in www.nsf.gov/pubs/2003/nsf03542/nsf03542.htm] The third quadrant of ROLE has supported, and will continue to support, some projects that contribute to some of the goals of CCLI - ASA. However, this quadrant is more broadly focused on learning processes at all educational levels. By contrast, the prime focus of ASA is on undergraduate STEM programs.

III. Eligibility Information

The categories of proposers identified in the Grant Proposal Guide are eligible to submit proposals under this program announcement/solicitation.  To help ensure that project results will effectively serve the STEM community, at least one investigator (PI or Co-PI) must be a STEM faculty member.

IV. Award Information

  • Anticipated Type of Award: Standard or Continuing Grant
  • Estimated Number of Awards: 12
  • Anticipated Funding Amount: $3,000,000

V. Proposal Preparation And Submission Instructions

A. Proposal Preparation Instructions

Full Proposal Instructions:

Proposals submitted in response to this program announcement/solicitation should be prepared and submitted in accordance with the general guidelines contained in the NSF Grant Proposal Guide (GPG). The complete text of the GPG is available electronically on the NSF Website at: https://www.nsf.gov/cgi-bin/getpub?gpg. Paper copies of the GPG may be obtained from the NSF Publications Clearinghouse, telephone (301) 947-2722 or by e-mail from pubs@nsf.gov.

Proposers are reminded to identify the program announcement/solicitation number (03-584) in the program announcement/solicitation block on the NSF Cover Sheet For Proposal to the National Science Foundation. Compliance with this requirement is critical to determining the relevant proposal processing guidelines. Failure to submit this information may delay processing.

B. Budgetary Information

Cost Sharing:

Cost sharing is not required in proposals submitted under this Program Solicitation.

C. Due Dates

Proposals must be submitted by the following date(s):

Full Proposal Deadline(s) (due by 5 p.m. proposer's local time):

    October 29, 2003

D. FastLane Requirements

Proposers are required to prepare and submit all proposals for this announcement/solicitation through the FastLane system. Detailed instructions for proposal preparation and submission via FastLane are available at: http://www.fastlane.nsf.gov/a1/newstan.htm. For FastLane user support, call the FastLane Help Desk at 1-800-673-6188 or e-mail fastlane@nsf.gov. The FastLane Help Desk answers general technical questions related to the use of the FastLane system. Specific questions related to this program announcement/solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this announcement/solicitation.

Submission of Electronically Signed Cover Sheets. The Authorized Organizational Representative (AOR) must electronically sign the proposal Cover Sheet to submit the required proposal certifications (see Chapter II, Section C of the Grant Proposal Guide for a listing of the certifications). The AOR must provide the required electronic certifications within five working days following the electronic submission of the proposal. Proposers are no longer required to provide a paper copy of the signed Proposal Cover Sheet to NSF. Further instructions regarding this process are available on the FastLane Website at: http://www.fastlane.nsf.gov

VI. Proposal Review Information

A. NSF Proposal Review Process

Reviews of proposals submitted to NSF are solicited from peers with expertise in the substantive area of the proposed research or education project. These reviewers are selected by Program Officers charged with the oversight of the review process. NSF invites the proposer to suggest, at the time of submission, the names of appropriate or inappropriate reviewers. Care is taken to ensure that reviewers have no conflicts with the proposer. Special efforts are made to recruit reviewers from non-academic institutions, minority-serving institutions, or disciplines adjacent to the one principally addressed in the proposal.

The National Science Board approved revised criteria for evaluating proposals at its meeting on March 28, 1997 (NSB 97-72). All NSF proposals are evaluated using these two merit review criteria. In some instances, however, NSF will employ additional criteria as required to highlight the specific objectives of certain programs and activities.

On July 8, 2002, the NSF Director issued Important Notice 127, Implementation of new Grant Proposal Guide Requirements Related to the Broader Impacts Criterion. This Important Notice reinforces the importance of addressing both criteria in the preparation and review of all proposals submitted to NSF. NSF continues to strengthen its internal processes to ensure that both of the merit review criteria are addressed when making funding decisions.

In an effort to increase compliance with these requirements, the January 2002 issuance of the GPG incorporated revised proposal preparation guidelines relating to the development of the Project Summary and Project Description. Chapter II of the GPG specifies that Principal Investigators (PIs) must address both merit review criteria in separate statements within the one-page Project Summary. This chapter also reiterates that broader impacts resulting from the proposed project must be addressed in the Project Description and described as an integral part of the narrative.

Effective October 1, 2002, NSF will return without review proposals that do not separately address both merit review criteria within the Project Summary. It is believed that these changes to NSF proposal preparation and processing guidelines will more clearly articulate the importance of broader impacts to NSF-funded projects.

The two National Science Board approved merit review criteria are listed below (see the Grant Proposal Guide Chapter III.A for further information). The criteria include considerations that help define them. These considerations are suggestions, and not all will apply to any given proposal. While proposers must address both merit review criteria, reviewers will be asked to address only those considerations that are relevant to the proposal being considered and for which they are qualified to make judgments.

    What is the intellectual merit of the proposed activity?
    How important is the proposed activity to advancing knowledge and understanding within its own field or across different fields? How well qualified is the proposer (individual or team) to conduct the project? (If appropriate, the reviewer will comment on the quality of the prior work.) To what extent does the proposed activity suggest and explore creative and original concepts? How well conceived and organized is the proposed activity? Is there sufficient access to resources?
    What are the broader impacts of the proposed activity?
    How well does the activity advance discovery and understanding while promoting teaching, training, and learning? How well does the proposed activity broaden the participation of underrepresented groups (e.g., with respect to gender, ethnicity, disability, or geographic location)? To what extent will it enhance the infrastructure for research and education, such as facilities, instrumentation, networks, and partnerships? Will the results be disseminated broadly to enhance scientific and technological understanding? What may be the benefits of the proposed activity to society?

NSF staff will give careful consideration to the following in making funding decisions:

    Integration of Research and Education
    One of the principal strategies in support of NSF's goals is to foster integration of research and education through the programs, projects, and activities it supports at academic and research institutions. These institutions provide abundant opportunities where individuals may concurrently assume responsibilities as researchers, educators, and students and where all can engage in joint efforts that infuse education with the excitement of discovery and enrich research through the diversity of learning perspectives.
    Integrating Diversity into NSF Programs, Projects, and Activities
    Broadening opportunities and enabling the participation of all citizens -- women and men, underrepresented minorities, and persons with disabilities -- is essential to the health and vitality of science and engineering. NSF is committed to this principle of diversity and deems it central to the programs, projects, and activities it considers and supports.

B. Review Protocol and Associated Customer Service Standard

All proposals are carefully reviewed by at least three persons outside NSF who are experts in the particular field represented by the proposal. Proposals submitted in response to this announcement/solicitation will be reviewed by Panel Review.

Reviewers will be asked to formulate a recommendation to either support or decline each proposal. The Program Officer assigned to manage the proposal's review will consider the advice of reviewers and will formulate a recommendation.

A summary rating and accompanying narrative will be completed and submitted by each reviewer. In all cases, reviews are treated as confidential documents. Verbatim copies of reviews, excluding the names of the reviewers, are sent to the Principal Investigator/Project Director by the Program Director. In addition, the proposer will receive an explanation of the decision to award or decline funding.

NSF strives to inform applicants within six months whether their proposals have been declined or recommended for funding. The time interval begins on the date of receipt. The interval ends when the Division Director accepts the Program Officer's recommendation.

In all cases, after programmatic approval has been obtained, the proposals recommended for funding will be forwarded to the Division of Grants and Agreements for review of business, financial, and policy implications and the processing and issuance of a grant or other agreement. Proposers are cautioned that only a Grants and Agreements Officer may make commitments, obligations, or awards on behalf of NSF or authorize the expenditure of funds. No commitment on the part of NSF should be inferred from technical or budgetary discussions with an NSF Program Officer. A Principal Investigator or organization that makes financial or personnel commitments in the absence of a grant or cooperative agreement signed by the NSF Grants and Agreements Officer does so at its own risk.

VII. Award Administration Information

A. Notification of the Award

Notification of the award is made to the submitting organization by a Grants Officer in the Division of Grants and Agreements. Organizations whose proposals are declined will be advised as promptly as possible by the cognizant NSF Program Division administering the program. Verbatim copies of reviews, not including the identity of the reviewer, will be provided automatically to the Principal Investigator. (See section VI.A. for additional information on the review process.)

B. Award Conditions

An NSF award consists of: (1) the award letter, which includes any special provisions applicable to the award and any numbered amendments thereto; (2) the budget, which indicates the amounts, by categories of expense, on which NSF has based its support (or otherwise communicates any specific approvals or disapprovals of proposed expenditures); (3) the proposal referenced in the award letter; (4) the applicable award conditions, such as Grant General Conditions (NSF-GC-1)* or Federal Demonstration Partnership (FDP) Terms and Conditions;* and (5) any announcement or other NSF issuance that may be incorporated by reference in the award letter. Cooperative agreement awards also are administered in accordance with NSF Cooperative Agreement Terms and Conditions (CA-1). Electronic mail notification is the preferred way to transmit NSF awards to organizations that have electronic mail capabilities and have requested such notification from the Division of Grants and Agreements.

*These documents may be accessed electronically on NSF's Website at https://www.nsf.gov/home/grants/grants_gac.htm. Paper copies may be obtained from the NSF Publications Clearinghouse, telephone (301) 947-2722 or by e-mail from pubs@nsf.gov.

More comprehensive information on NSF Award Conditions is contained in the NSF Grant Policy Manual (GPM) Chapter II, available electronically on the NSF Website at https://www.nsf.gov/cgi-bin/getpub?gpm. The GPM is also for sale through the Superintendent of Documents, Government Printing Office (GPO), Washington, DC 20402. The telephone number at GPO for subscription information is (202) 512-1800. The GPM may be ordered through the GPO Website at http://www.gpo.gov.

C. Reporting Requirements

For all multi-year grants (including both standard and continuing grants), the PI must submit an annual project report to the cognizant Program Officer at least 90 days before the end of the current budget period.

Within 90 days after the expiration of an award, the PI also is required to submit a final project report. Failure to provide final technical reports delays NSF review and processing of pending proposals for the PI and all Co-PIs. PIs should examine the formats of the required reports in advance to assure availability of required data.

PIs are required to use NSF's electronic project reporting system, available through FastLane, for preparation and submission of annual and final project reports. This system permits electronic submission and updating of project reports, including information on project participants (individual and organizational), activities and findings, publications, and other specific products and contributions. PIs will not be required to re-enter information previously provided, either with a proposal or in earlier updates using the electronic system.

VIII. Contacts For Additional Information

General inquiries regarding this program should be made to:

  • Myles G. Boylan, Lead Program Director (CCLI-ASA), Directorate for Education & Human Resources, Division of Undergraduate Education, 812 N, telephone: (703) 292-4617, fax: (703) 292-9015, email: mboylan@nsf.gov

  • Susan H. Hixson, Program Director (CHEM), Directorate for Education & Human Resources, Division of Undergraduate Education, 835 N, telephone: (703) 292-4623, fax: (703) 292-9015, email: shixson@nsf.gov

  • Theodore W. Hodapp, Program Director (PHY), Directorate for Education & Human Resources, Division of Undergraduate Education, 835 N, telephone: (703) 292-4640, email: thodapp@nsf.gov

  • Ernest L. McDuffie, Program Director (CS), Directorate for Education & Human Resources, Division of Undergraduate Education, 835 N, telephone: (703) 292-4655, fax: (703) 292-9016, email: emcduffi@nsf.gov

  • Calvin L. Williams, Program Director (MATH), Directorate for Education & Human Resources, Division of Undergraduate Education, 835 N, telephone: (703) 292-4642, email: cwilliam@nsf.gov

  • Division of Undergraduate Education, telephone: 703-292-8666, email: undergrad@nsf.gov

For questions related to the use of FastLane, contact:

  • FastLane Help Desk, telephone: 1-800-673-6188, email: fastlane@nsf.gov

  • Ms. Antoinette Allen, Division of Undergraduate Education, telephone: 703-292-4646, email: duefl@nsf.gov

IX. Other Programs Of Interest

The NSF Guide to Programs is a compilation of funding opportunities for research and education in science, mathematics, and engineering. The NSF Guide to Programs is available electronically at https://www.nsf.gov/cgi-bin/getpub?gp. General descriptions of NSF programs, research areas, and eligibility information for proposal submission are provided in each chapter.

Many NSF programs offer announcements or solicitations concerning specific proposal requirements. To obtain additional information about these requirements, contact the appropriate NSF program offices. Any changes in NSF's fiscal year programs occurring after press time for the Guide to Programs will be announced in the NSF E-Bulletin, which is updated daily on the NSF Website at https://www.nsf.gov/home/ebulletin, and in individual program announcements/solicitations. Subscribers can also sign up for NSF's Custom News Service (https://www.nsf.gov/home/cns/start.htm) to be notified of new funding opportunities that become available.

There are currently three NSF programs that support large-scale research on learning: Research on Learning and Education (ROLE) and Evaluative Research and Evaluation Capacity Building (EREC), both managed by the Division of Research, Evaluation, and Communication [see www.nsf.gov/pubsys/ods/getpub.cfm?nsf03542]; and the new Science of Learning Centers (SLC) program, a cross-directorate program [see www.nsf.gov/pubsys/ods/getpub.cfm?nsf03573].

About The National Science Foundation

The National Science Foundation (NSF) funds research and education in most fields of science and engineering. Awardees are wholly responsible for conducting their project activities and preparing the results for publication. Thus, the Foundation does not assume responsibility for such findings or their interpretation.

NSF welcomes proposals from all qualified scientists, engineers and educators. The Foundation strongly encourages women, minorities and persons with disabilities to compete fully in its programs. In accordance with Federal statutes, regulations and NSF policies, no person on grounds of race, color, age, sex, national origin or disability shall be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving financial assistance from NSF, although some programs may have special requirements that limit eligibility.

Facilitation Awards for Scientists and Engineers with Disabilities (FASED) provide funding for special assistance or equipment to enable persons with disabilities (investigators and other staff, including student research assistants) to work on NSF-supported projects. See the GPG Chapter II, Section D.2 for instructions regarding preparation of these types of proposals.

Privacy Act And Public Burden Statements

The information requested on proposal forms and project reports is solicited under the authority of the National Science Foundation Act of 1950, as amended. The information on proposal forms will be used in connection with the selection of qualified proposals; project reports submitted by awardees will be used for program evaluation and reporting within the Executive Branch and to Congress. The information requested may be disclosed to qualified reviewers and staff assistants as part of the proposal review process; to applicant institutions/grantees to provide or obtain data regarding the proposal review process, award decisions, or the administration of awards; to government contractors, experts, volunteers and researchers and educators as necessary to complete assigned work; to other government agencies needing information as part of the review process or in order to coordinate programs; and to another Federal agency, court or party in a court or Federal administrative proceeding if the government is a party. Information about Principal Investigators may be added to the Reviewer file and used to select potential candidates to serve as peer reviewers or advisory committee members. See Systems of Records, NSF-50, "Principal Investigator/Proposal File and Associated Records," 63 Federal Register 267 (January 5, 1998), and NSF-51, "Reviewer/Proposal File and Associated Records," 63 Federal Register 268 (January 5, 1998). Submission of the information is voluntary. Failure to provide full and complete information, however, may reduce the possibility of receiving an award.

An agency may not conduct or sponsor, and a person is not required to respond to an information collection unless it displays a valid OMB control number. The OMB control number for this collection is 3145-0058. Public reporting burden for this collection of information is estimated to average 120 hours per response, including the time for reviewing instructions. Send comments regarding this burden estimate and any other aspect of this collection of information, including suggestions for reducing this burden, to: Suzanne Plimpton, Reports Clearance Officer, Division of Administrative Services, National Science Foundation, Arlington, VA 22230.