NSF GPRA Strategic Plan
FY 2001 - 2006

APPENDIX 3:  ASSESSING NSF’s PERFORMANCE

The challenge of performance assessment for NSF is that both the substance and the timing of outcomes from research and education activities are largely unpredictable. NSF staff members do not conduct the research and education projects. They provide support for others to undertake these activities based on proposals for the work to be done, the best information available as to the likely outputs and outcomes, and their knowledge of NSF’s outcome goals and the strategies for achieving them. They influence rather than control the outputs and outcomes.

OMB authorized NSF to use alternative format performance goals for NSF's outcomes in research and education. This approach allows expert judgment to consider both quantitative and qualitative information on performance and to weigh that information in a balanced assessment. NSF uses these descriptive performance goals in its management process through a combination of internal self-assessment and review by independent external panels of experts and peers.

For the three outcome goals, NSF performance is successful if the outcomes of NSF investments for a given period are judged to have achieved, or to have made significant progress in achieving, the specific performance goals. These assessments are made by independent external panels of experts, who use their collective experience-based norms to determine the level of "significance" necessary for a rating of successful.

Assessment of goal achievement by external groups of peers and experts will take into account such factors as (1) identified performance indicators for each performance goal; (2) the extent to which NSF strategies and plans are successfully implemented; (3) the level of resources invested; (4) external events beyond the agency's control; and (5) the agency's capability to be flexible and respond rapidly to emerging opportunities. NSF makes use of the following stages in the grant award cycle to assess performance (a schematic sketch of these stages follows the list):

  • Applicant and Grantee Information/Merit Review
    All applicants and grantees provide results from previous NSF support, information about existing facilities and equipment available to conduct the proposed research, the location where the research is to be conducted, biographical information on the principal investigators, other sources of support, and certifications specific to NSF. This information is required at the time of application, at the time of an award, and in annual and final project reports. Awards are made on the basis of merit review by peers who are experts in the field, applying NSF's merit review criteria, and on the availability of resources.

  • Program Evaluation by Committees of Visitors (COVs)
    To ensure the highest quality in processing and recommending proposals for award, qualified external experts review each program every three years. COVs report on the integrity and efficiency of the processes for proposal review and on the quality of results of NSF's programs, in the form of outputs and outcomes that appear over time. COVs report on each year's noteworthy achievements, the ways in which projects have collectively affected progress, and expectations for future performance. COV recommendations are reviewed by management and taken into consideration when NSF evaluates existing programs and future directions for the Foundation.

  • Directorate Assessment by Advisory Committees
    Directorate advisory committees review internal self-assessments, COV reports, available external evaluations, and annual directorate performance reports; they judge program effectiveness and describe strengths and weaknesses. The advisory committees' reports are reviewed by NSF management, which then integrates committee recommendations into the NSF Annual Performance Report.

  • Portfolio Assessment
    Much of this performance assessment is retrospective, addressing investments made at some point in the past. To tie assessment effectively to current issues in program management, it must also address the quality of the set of awards made in the fiscal year under consideration. The focus of this portfolio assessment is the likelihood that the package of awards will produce strong results in the future. Special emphases within the plans for the fiscal year merit particular attention in the assessment process.

    NSF staff has control over budget allocations and the decision processes that determine the set of awards. NSF performance goals for investment processes, along with those for management of the agency, are generally quantitative. They refer to processes conducted during the fiscal year that are generally captured in NSF systems.
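As a purely illustrative sketch, and not a description of any actual NSF system, the stages above could be modeled as a simple record that accumulates assessment evidence across the award cycle; all names and fields below are hypothetical:

    from dataclasses import dataclass, field
    from enum import Enum

    class Stage(Enum):
        """Stages of the grant award cycle at which performance evidence is gathered."""
        MERIT_REVIEW = "applicant/grantee information and merit review"
        COV_REVIEW = "program evaluation by a Committee of Visitors"
        AC_REVIEW = "directorate assessment by an Advisory Committee"
        PORTFOLIO = "prospective assessment of the current award portfolio"

    @dataclass
    class AssessmentRecord:
        """One piece of assessment evidence tied to a program and a cycle stage."""
        program: str              # hypothetical program name
        stage: Stage
        fiscal_year: int
        findings: list[str] = field(default_factory=list)

    # Example: a COV finding feeding into aggregate performance reporting.
    record = AssessmentRecord(
        program="Example Program",
        stage=Stage.COV_REVIEW,
        fiscal_year=2000,
        findings=["Proposal review processes judged efficient and of high integrity."],
    )
    print(record.stage.value)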

Data Collection, Verification, and Validation for NSF’s Results Goals

Two types of data are used to assess goal performance: (a) non-quantitative output and outcome information, collected and reported using the alternative format, which is used to assess the Outcome Goals and the implementation of the new merit review criteria; and (b) quantitative data, collected through NSF systems, for the performance target levels of the Investment Process and Management Goals. A rough illustration of this distinction follows.
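The following minimal sketch, with hypothetical type and field names rather than NSF's actual data model, shows one way the two categories might be represented:

    from dataclasses import dataclass

    @dataclass
    class QualitativeAssessment:
        """Alternative-format evidence for an Outcome Goal, judged by external experts."""
        goal: str
        narrative: str   # descriptive output and outcome information
        rating: str      # e.g. "successful" under the single-standard approach

    @dataclass
    class QuantitativeMeasure:
        """System-collected measure for an Investment Process or Management Goal."""
        goal: str
        target: float
        actual: float

        def met(self) -> bool:
            # A quantitative goal is met when the actual value reaches the target.
            return self.actual >= self.target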

NSF's sources of data include central databases, such as the electronic Project Reporting System, the Enterprise Information System, the FastLane system, the Proposal system, the Awards system, the Reviewer System, the Integrated Personnel System, the Finance System, the Online Document System, and the Performance Reporting System; and distributed sources, such as scientific publications, press releases, independent assessments (including Committee of Visitors (COV) and Advisory Committee (AC) reports), program and division annual reports, directorate annual reports, and internally maintained local databases. In a few cases, NSF makes use of externally maintained contractor databases.

Through these sources, output indicators such as the following will be available to program staff, third-party evaluators, and advisory committees (a sketch of one way to organize such indicators appears after the list):

  • Related to Ideas:  Results, published and disseminated: journal publications, books, software, audio or video products; contributions within and across disciplines; organizations of participants and collaborators (including collaborations with industry); contributions to other disciplines, infrastructure, and beyond science and engineering; use beyond the research group of specific products, instruments, and equipment resulting from NSF awards; role of NSF-sponsored activities in stimulating innovation and policy development.

  • Related to People:  student participants; demographics of participants; descriptions of student involvement; education and outreach activities under grants; demographics of science and engineering students and workforce; numbers and quality of educational models, products and practices; number and quality of teachers trained; student outcomes including enrollments in mathematics and science courses, retention, achievement, and science and mathematics degrees received.

  • Related to Tools:  new tools and technologies; multidisciplinary databases; software, newly developed instrumentation, and other inventions; data, samples, specimens, germ lines, and related products of awards placed in shared repositories; facilities construction and upgrade costs and schedules; operating efficiency of shared-use facilities.
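Purely as a hypothetical illustration, the indicators above could be grouped under the three outcome areas so that counts from individual project reports can be rolled up; the indicator strings and function below are illustrative, not part of any NSF system:

    from collections import defaultdict

    # Output indicators grouped by the three outcome areas named above.
    # A real reporting system would define these far more precisely.
    INDICATOR_CATEGORIES = {
        "Ideas": ["journal publications", "books", "software products",
                  "contributions across disciplines"],
        "People": ["student participants", "teachers trained", "degrees received"],
        "Tools": ["new instrumentation", "shared repository deposits",
                  "facility operating efficiency reports"],
    }

    def aggregate(reports):
        """Tally indicator counts across project reports, keyed by outcome area."""
        totals = defaultdict(int)
        for report in reports:  # each report maps indicator name -> count
            for area, indicators in INDICATOR_CATEGORIES.items():
                totals[area] += sum(report.get(name, 0) for name in indicators)
        return dict(totals)

    print(aggregate([{"journal publications": 3, "teachers trained": 12}]))
    # -> {'Ideas': 3, 'People': 12, 'Tools': 0}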

NSF's electronic Project Reporting System permits organized reporting of aggregate information. FY 1999 was the first year of its full implementation, and electronic submission of project reports is required beginning in FY 2000. We anticipate that the reliability of the information in the system will improve over time as investigators and institutions become comfortable with its use.

The scientific data from the reporting system will be tested for plausibility as a natural part of the external assessment process. In addition, data from the reporting system will be used to address progress under prior support when investigators reapply to NSF. Thus, the investigators have a strong incentive to provide accurate information that reviewers may rely upon.

Issues Specific to NSF

Because research results cannot be predicted in advance, quantified, or reported in a timely way, NSF's Outcome Goals are expressed in an OMB-approved alternative format. The time frame for reporting outcomes typically extends well beyond the fiscal year in which an award was made; for example, a grant provided in one fiscal year might not produce a reportable outcome for five years or more, if at all.

While NSF initially used the alternative format with the two-standard approach required by the Act ("successful" or "minimally effective"), it found little to be gained in defining "minimally effective," which in many instances confused the evaluators.

Therefore, for FY 2000 and beyond, NSF will define only one standard: "successful." Programs will be evaluated on whether they succeed in achieving the target goals and on their impact.

Collection of data for all goals takes place throughout the year and is completed near the end of the fiscal year. The group responsible for each goal collects the relevant data, assembles them into a report for that goal, and organizes them for reporting. Senior NSF management reviews the data on a continuing basis throughout the year to observe whether results are as expected or need improvement, and whether the information being obtained is useful to the agency. Data collection systems are also under continual observation and refinement, as in the case of the new FastLane reporting system. A minimal sketch of this review loop appears below.
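The sketch below, assuming a hypothetical per-goal data feed and threshold (nothing here corresponds to an actual NSF system), illustrates the kind of in-year check management might apply:

    def review_goal_data(goal: str, observed: float, expected: float,
                         tolerance: float = 0.1) -> str:
        """Flag whether in-year results for a goal are tracking expectations."""
        if observed >= expected:
            return f"{goal}: results as expected"
        if observed >= expected * (1 - tolerance):
            return f"{goal}: slightly below target; continue monitoring"
        return f"{goal}: below expectations; needs improvement"

    # Hypothetical quantitative goal: fraction of proposals processed within target time.
    print(review_goal_data("proposal processing time", observed=0.68, expected=0.70))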

During FY 1999, NSF staff began to implement a Data Quality Project for the quantitative Investment Process and Management goals. This project is currently underway with the first priority placed on the central data systems used to support the performance plan.

In addition, NSF staff implemented new guidelines and reporting procedures for collecting data for the qualitative Outcome Goals. The Committee of Visitors (COV) guidelines were revised in FY 1999 to incorporate the GPRA-related reporting requirements. Reporting templates were developed for the COVs so that they address the performance of programs in a systematic way, allowing information to be aggregated across NSF. COVs address a common set of questions for all programs reviewed in a fiscal year; a schematic illustration of how such a common template supports aggregation follows.
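This sketch is hypothetical (the actual COV template questions are different and more detailed); it shows why a common question set makes responses aggregable across programs:

    # A shared set of questions lets COV responses be aggregated across programs.
    # These questions are illustrative placeholders, not the actual COV template.
    COV_TEMPLATE = [
        "Is the proposal review process efficient and of high integrity?",
        "Are the merit review criteria applied consistently?",
        "Do program outputs and outcomes show progress toward NSF's outcome goals?",
    ]

    def aggregate_cov_responses(responses: dict[str, list[bool]]) -> dict[str, int]:
        """Count affirmative answers per template question across reviewed programs."""
        counts = [0] * len(COV_TEMPLATE)
        for answers in responses.values():
            for i, yes in enumerate(answers):
                counts[i] += int(yes)
        return dict(zip(COV_TEMPLATE, counts))

    print(aggregate_cov_responses({"Program A": [True, True, False],
                                   "Program B": [True, False, True]}))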

Reporting guidelines were also developed for Advisory Committees to allow for a systematic aggregation of information. Use of the new procedures has identified areas for improvement that are being incorporated into the FY 2000 reporting guidelines, and many of the lessons learned while conducting these assessments have informed the revised FY 2000 performance goals and this revised strategic plan.

NSF Program Assessment/Evaluation Schedule

Assessed Activity                                Frequency         Conducted by                         Use in Strategic Planning
Program-level assessment (1)                     30% per year      External Committee of Visitors       Yes
Directorate-level assessment (2)                 100% per year     External Advisory Committees         Yes
Special programs (NSF-wide activities such as    Varies annually   External Committee of Visitors or    Yes
  MRI, CAREER, STC, PFF, GRF, GRT, IGERT) (3)                      external contractor
All-agency GPRA-related activities (4)           Weekly            Internal senior management           Yes
                                                                   (DPG, GIIC, GIIC WG)

__________________________

(1) One-third of NSF programs are assessed annually, with assessments taking place throughout the fiscal year; all programs are assessed on a three-year cycle. COVs address the management of programs and the achievement of outcome goals; this information is used by senior management and, in aggregate, for performance reporting.

(2) Advisory committees review directorate activities annually and approve COV reports; assess the contributions of the directorate toward achieving NSF's goals; and provide reports for use by NSF management and in aggregating NSF performance results. They meet twice annually, with assessment at the end of the fiscal year, and use COV reports as the basis for strategic planning discussions with directorates.

(3) NSF-wide programs are evaluated by external contractors to assess their impact; the schedule varies by program. MRI = Major Research Instrumentation (external contractor review in FY 2000); CAREER = Faculty Early Career Development (external contractor review being organized in FY 2000 for evaluation in FY 2001); STC = Science and Technology Centers; PFF = Presidential Faculty Fellows; GRF = Graduate Research Fellowships; GRT = Graduate Research Traineeships (program evaluation completed in FY 2000); IGERT = Integrative Graduate Education and Research Training (program evaluation ongoing in FY 2000).

(4) Internal staff meetings review GPRA activities across NSF and make recommendations for implementing GPRA. DPG = Director's Policy Group; GIIC = GPRA Infrastructure Implementation Council; GIIC WG = GPRA Infrastructure Implementation Council Working Group.
