FY 1996 Report on the NSF Merit Review System
The FY 1996 Report on the NSF Merit Review System responds to a National Science Board (NSB) policy endorsed in 1977 and amended in 1984, requesting that the Director of the National Science Foundation (NSF) submit an annual report on the NSF proposal review system. This report provides summary information about levels of proposal and award activity and the process by which proposals are reviewed and awarded.
During FY 1996, NSF, with the cooperation of external peer reviewers, competitively reviewed 29,953 research and education proposals. NSF funding was awarded to 8,796 of the proposals, producing a funding rate of 29 percent. The number of proposals reviewed annually by NSF has been reasonably stable at about 30,000 since 1992. The number of awards, however, has dropped by 13 percent, resulting in a declining funding rate as shown in Text Figure 1. Funding rates vary among directorates, ranging from 23 percent to 37 percent as shown in Appendix Table 1.
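The funding-rate figure above follows directly from the counts given. A minimal sketch of the arithmetic (variable names are illustrative, not NSF terminology):

```python
# FY 1996 counts as stated in the report.
proposals_reviewed = 29_953
awards_made = 8_796

# Funding rate = awards made / proposals competitively reviewed.
funding_rate = awards_made / proposals_reviewed
print(f"{funding_rate:.0%}")  # rounds to the 29 percent cited in the text
```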
In addition to funding proposals that were competitively reviewed during FY 1996, NSF awarded 7,708 continuing grant increments (CGIs) based on proposals that had been competitively reviewed in earlier years. CGIs are funded in annual increments from current-year appropriations. The CGI procedure complements the other major award instrument, the standard grant, in which all funds for a multi-year project are obligated from a single year's appropriation. NSF policy limits the amount of future-year CGI commitments to 65 percent of a program's current fiscal year operating plan.
Characteristics of Principal Investigators
The number of proposals received from female and minority Principal Investigators (PIs) has increased slowly since 1990. Proposals, awards, funding rates and trends by PI characteristics are shown in Appendix Table 2. During FY 1996, about 17 percent of competitively reviewed proposals were from female PIs. Funding rates of proposals from female PIs have been higher than proposals from male PIs for five of the past seven years. Funding rate trends for female and minority PIs are graphed in Text Figure 2.
Five percent of competitively reviewed proposals during FY 1996 were from minority PIs. The funding rate for proposals from minority PIs was below the NSF rate from FY 1990 through FY 1995, but exceeded the NSF rate in FY 1996.
Forty-five percent of competitively reviewed proposals in FY 1996 were from PIs who had not received an NSF award in a previous fiscal year (new PIs), down from 51 percent in FY 1990. The funding rate for proposals from PIs who had previously received an NSF award (prior PIs) was higher than that for proposals from new PIs (36 percent and 21 percent, respectively, in FY 1996). The difference in funding rates between proposals from new and prior PIs has been declining since 1991.
The median annual award amount (adjusted for multiple year projects) among competitive awards made during FY 1996 was $52,313. The average award amount among the same set of awards was $85,385. The difference between the median and average award amounts is due to a combination of numerous small awards pulling the median down, and large awards for centers and facilities pulling the average up. Award amounts have been consistent over the past decade, when adjusted to constant dollars as measured by the Consumer Price Index. There are considerable variations among directorates as shown in Appendix Table 3.
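The gap between median and average described above is a standard consequence of a right-skewed distribution. An illustrative sketch with invented award amounts (not NSF data) shows how a single large center or facility award pulls the average well above the median:

```python
from statistics import mean, median

# Hypothetical award amounts in dollars; the one large award represents a
# center/facility grant. These values are invented for illustration only.
awards = [20_000, 35_000, 52_000, 60_000, 75_000, 500_000]

print(f"median: ${median(awards):,.0f}")
print(f"mean:   ${mean(awards):,.0f}")  # substantially above the median
```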
Peer Review and Merit Review
There is a tendency to use the terms "peer review" and "merit review" interchangeably. It is more accurate to refer to the NSF proposal review process as "merit review with peer evaluation." The involvement of knowledgeable peers from outside the Foundation in the review of proposals is the keystone of NSF's proposal review system. Their judgments of the extent to which proposals address established criteria are vital for informing NSF staff and influencing funding recommendations. However, development of a portfolio of program awards that address all of NSF's diverse objectives requires a broader perspective than can be achieved solely on the basis of a set of independent or panel-generated proposal reviews. Therefore, NSF relies on the judgment of qualified program officers to make funding recommendations which, on the whole, produce a portfolio of awards addressing NSF's strategic goals and related factors such as:
Each program officer funding recommendation is subject to a programmatic review by a higher level reviewing official (usually the division director), and an administrative review by a grants officer. Awards in excess of a $3 million commitment during a project year, or $15 million over five years, require approval by the National Science Board.
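The NSB approval thresholds in the paragraph above amount to a simple two-condition rule. A minimal sketch, with an assumed function name and interface (this is not an NSF system):

```python
def requires_nsb_approval(max_annual_commitment: float, five_year_total: float) -> bool:
    """True if an award exceeds $3 million in any project year
    or $15 million over five years, per the thresholds stated in the report."""
    return max_annual_commitment > 3_000_000 or five_year_total > 15_000_000

print(requires_nsb_approval(2_500_000, 12_000_000))  # False: under both thresholds
print(requires_nsb_approval(3_500_000, 10_000_000))  # True: exceeds the annual cap
```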
Mail-only, Panel-only and Mail-plus-Panel
NSF programs obtain external peer review by two principal methods, mail and panel. In addition to mail and panel reviews, site visits by NSF staff and external peers are often used to review large facility and center proposals. NSF program officers are given discretion in the specific use of review methods, subject to supervisory approval.
In mail reviews, peers are sent proposals and asked to submit written comments to NSF by postal mail, electronic mail, or facsimile. These mail reviews may either be used by the NSF program officer directly to support a funding recommendation ("mail-only" review), or presented to a panel to inform discussion of the proposal.
Many programs use a combination of mail and panel methods to obtain peer reviews. Major variations of such mail-plus-panel reviews are:
The mail-plus-panel method was used for 59 percent of proposals reviewed during FY 1996; 23 percent of proposals were reviewed by mail-only, and 18 percent by panel-only. The percentage of NSF proposals reviewed by mail-plus-panel has increased during the past decade, with a corresponding drop in the percentage of proposals reviewed by mail-only. These trends are shown in Text Figure 3. [See Appendix Table 4 for details on NSF trends.]
Directorates vary widely in their use of proposal review methods as summarized in Text Figure 4, and detailed in Appendix Table 5.
Diversity of the reviewer pool is an important feature of the NSF merit review system. Reviewers from diverse backgrounds help ensure that a wide range of perspectives is taken into consideration in the process of funding research and education proposals. NSF emphasizes reviewer diversity through a variety of processes, including a large and expanding Foundation-wide reviewer database, explicit policy guidance, mandatory training for all program officers, and directorate-level initiatives. NSF maintains a central electronic database of over 220,000 reviewers, from which over 50,000 are selected annually. This database is continuously updated; over 40,000 new potential reviewers have been added since 1992. Potential reviewers are identified from a variety of sources including applicant suggestions, references attached to proposals and published papers, and input from visiting scientists.
NSF policy states that each program funding recommendation must be accompanied by at least three external reviews (exceptions are described in Section F below). On average, proposals during FY 1996 were considered by 8.4 reviewers. The average number of reviews for proposals reviewed by mail-only, panel-only, and mail-plus-panels is shown in Text Figure 5. There is considerable variation among directorates as shown in Appendix Table 6.
Participation in the peer review process is voluntary. Panelists are reimbursed for expenses; mail reviewers receive no financial compensation. Nevertheless, 70 percent of mail-only review requests in FY 1996 produced reviews, a rate that has been steady since 1990.
Reviewer Proposal Ratings
The NSF merit review system emphasizes reviewer narratives over summary ratings. Summary ratings are but one indicator of reviewer judgment of the proposal quality; written narratives, panel discussions, and portfolio management considerations make important contributions to program officer funding recommendations. The distribution of average summary ratings of mail reviews for awarded and declined proposals is provided in Appendix Table 7.
Every proposer receives a description of the context in which the proposal was reviewed from the NSF program officer, along with a copy of each review considered in making the funding decision. A declined PI may ask for additional clarification of the decision. If the PI is not satisfied that the proposal was fairly handled and reasonably reviewed, a formal reconsideration may be requested from the cognizant Assistant Director (AD). If the AD upholds the original action, the applicant's institution may request a second reconsideration from the Foundation's Deputy Director (O/DD). The objective of the reconsideration process is to ensure that NSF's review has been fair and reasonable, both substantively and procedurally.
On average, NSF annually declines over 20,000 proposals but receives only 35 requests for formal reconsideration. Most program-level decisions are upheld in the reconsideration process. The number of requests for formal reconsideration and resulting decisions at both the AD and O/DD levels from FY 1992 through FY 1996 are displayed in Appendix Table 8.
Since the beginning of FY 1990, the Small Grants for Exploratory Research (SGER) option has permitted program officers throughout the Foundation to make short-term (one to two years), small-scale (less than $50,000) grants without formal external review. Characteristics of activities which can be supported by an SGER award include:
NSF received 220 SGER proposals in FY 1996 and made 164 awards, a funding rate of 75 percent. The SGER funding rate is much higher than for regular, competitively reviewed proposals in large part because potential SGER applicants are encouraged to contact an NSF program officer before submitting an SGER proposal to determine its appropriateness for the SGER funding option. As potential SGER applicants have become familiar with this practice, the SGER funding rate has increased from 55 percent in the first year to 75 percent in FY 1996, as shown in Text Figure 7. [See Appendix Table 9 for SGER proposal and award trends by directorate.]
Program officers may commit up to five percent of their operating budget to SGER awards, but the NSF-wide average is less than one percent. The average SGER award amount in FY 1996 was $33,869, well under the maximum authorized SGER award amount of $50,000. The average award amount has been relatively stable over the seven-year existence of the SGER funding option. SGER funding trends by directorate are provided in Appendix Table 10.
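The five percent SGER cap described above is a straightforward proportional limit. A hedged sketch (the function name is an assumption for illustration):

```python
def max_sger_commitment(operating_budget: float) -> float:
    """Maximum dollars a program officer may commit to SGER awards:
    5 percent of the program's operating budget, per the report."""
    return 0.05 * operating_budget

# For a hypothetical $10M program budget:
print(f"${max_sger_commitment(10_000_000):,.0f}")  # $500,000
```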
A Committee of Visitors (COV) is a panel of external experts convened to review the technical and managerial stewardship of a specific NSF program or cluster of programs. Each program that awards grants or cooperative agreements is reviewed on a three-year cycle. There are currently 178 such programs at NSF; 63 programs were reviewed during FY 1996. A list of all programs subject to review by a Committee of Visitors and the fiscal year of the most recent review is provided in Appendix Table 11.
Each COV must operate in accordance with the Federal Advisory Committee Act (FACA) of 1972. In compliance with FACA regulations, virtually all COVs are established as subcommittees of an existing chartered advisory committee, and the COV report is reviewed and approved by the parent advisory committee. The cognizant Assistant Director (AD) provides the parent advisory committee with a written response to each COV report. The COV's report and the AD's response are public documents; some have been publicized in the professional literature.
Authorized exemptions to the peer review process are listed in NSF Manual 10, Section 122 (Attachment I) and include routine award actions such as continuing grant increments and no-cost extensions. In special circumstances, the Director or designee may waive peer review requirements. Such waivers of peer review were granted twice during FY 1996; one for the Education and Human Resources (EHR) directorate and one for the Office of Policy Support (OPS).
No changes requiring Board approval were made to the peer review system during FY 1996. However, several significant activities involving the merit review system have taken place during the year, as described in the following section.
In 1994, an applicant for an NSF grant filed suit against NSF in Federal Court asserting that the law required the agency to disclose the names of reviewers who wrote reviews of the applicant's declined proposal. In May, 1996, the U.S. Court of Appeals for the District of Columbia unanimously ruled that NSF may withhold the names of reviewers in connection with their reviews of specific proposals.
During FY 1996, NSF continued work on a major examination of its merit review system. One of the drivers of this effort was a Fall, 1994, General Accounting Office (GAO) report on peer review at three government agencies. Partly in response to the GAO report, NSF established a senior-level Peer Review Study Group (PRSG) to examine relevant issues associated with merit review. Subsequently, several task groups of NSF staff examined the efficacy and implications of specific options and made recommendations for action. Most of the task groups' recommendations have been acted upon; a few are still under consideration.
In the process of obtaining external comments about the merit review system, several stresses and strains on the system were identified that deserved further attention. To address these concerns, an external Proposal Review Advisory Team (PRAT) advisory committee was chartered in late FY 1996 to:
The PRAT met for two days in December, 1996, and is expected to present its report to the Deputy Director by the Summer of 1997.
A joint NSF and NSB Task Force on Merit Review was charged at the October, 1996, Board meeting with examining the Board's generic review criteria and making recommendations on retaining or changing them, along with accompanying guidance on their use. The Task Force carried out an assessment of how the criteria have been applied in the review process, and recommended that the criteria be simplified and the language be harmonized with the NSF Strategic Plan. The Task Force released a Discussion Report in November, 1996 and asked NSF to publicize it widely and solicit comments from the science and engineering community. The Task Force will study the comments and make recommendations to the Board in the Spring of 1997.
If you have any questions or comments concerning this report, please send them to Dr. Robert Webber at the following email address: firstname.lastname@example.org