Dr. Warren M. Washington, Chair
Dr. Mary K. Gaillard
Dr. Shirley M. Malcom
Dr. Eamon M. Kelly
National Science Foundation Staff
Dr. Mary E. Clutter
Dr. John B. Hunt
Mr. Paul J. Herer
Merit review is the cornerstone of the NSF's work. Through the use of merit review, NSF seeks to maintain the high standards of excellence and accountability for which it is known around the world. NSF's current criteria were adopted by the National Science Board in 1981.
At the November 1996 meeting of the National Science Board, the NSB-NSF Staff Merit Review Task Force recommended that the current merit review criteria be simplified and that the language be harmonized with the NSF strategic plan. These recommendations are contained in the report: NSB/MR-96-15, NSB-NSF Staff Merit Review Task Force Discussion Report, November 20, 1996 (Appendix I).
The Board received this report and asked the NSF Director to share the proposed revisions to the merit review criteria with the science and engineering community in order to solicit its input. To encourage the broadest possible comment and discussion, NSF solicited the comments through press coverage and through direct contacts among staff, universities, and professional associations. The proposed criteria were also posted on the World Wide Web, with a response form to facilitate suggestions and reactions.
Via the feedback mechanisms provided on these Web pages, NSF received over 300 responses, most from tenured faculty who had experience with the NSF merit review process. A number of comments also arrived in the form of written letters. Of the respondents who expressed an overall positive or negative opinion, a majority favored the change, although many offered suggestions for further improvement or clarification. Overall, the community responses, although not a representative sample of the community, were informative and useful in helping the task force to draft improved criteria.
The Task Force recommends that the following two criteria be adopted in place of the four criteria that are currently used.
1. What is the intellectual merit of the proposed activity?
2. What are the broader impacts of the proposed activity?
The following are suggested questions to consider in assessing how well the proposal meets this criterion: How well does the activity advance discovery and understanding while promoting teaching, training, and learning? How well does the proposed activity broaden the participation of underrepresented groups (e.g., gender, ethnicity, geographic, etc.)? To what extent will it enhance the infrastructure for research and education, such as facilities, instrumentation, networks, and partnerships? Will the results be disseminated broadly to enhance scientific and technological understanding? What may be the benefits of the proposed activity to society?
When assigning your summary rating, remember that the two criteria need not be weighted equally. Emphasis should depend upon either (1) additional guidance you have received from NSF or (2) your own judgment of the relative importance of the criteria to the proposed work. Finally, you are requested to write a summary statement that explains the rating that you assigned to the proposal. This statement should address the relative importance of the criteria and the extent to which the proposal actually meets both criteria.
In implementing the new criteria, the Task Force believes that NSF should address such issues as: (1) designing proposal review forms (both paper and electronic) that are clear and easy to use, (2) training NSF staff, and (3) revising NSF's proposal preparation guidelines. The Task Force recommends that NSF proceed without delay to full implementation of the proposed changes.
Analysis of Public Comment and Rationale for Task Force Recommendations
A brief analysis of public responses to the Task Force recommendations was prepared by the NSF Office of Policy Support (OPS). This report (Appendix III), which proved very useful to the Task Force, attempts to characterize the individuals who responded and summarize their views about the changes in the proposed criteria.
The Task Force members also read each of the individual responses received by NSF and prepared an analysis of the issues that were raised. The Task Force then met on February 19, 1997, to discuss and resolve these issues and prepare its final recommendations. Its analysis of these issues is presented below.
#1 A central issue is the "weighting or threshold" issue, which was raised by perhaps a third of the respondents. Many respondents expressed concern that adopting the new criteria will lead to a decline in NSF's standards of excellence; i.e., "excellent research with ok relevance" will be equated with "ok research with excellent relevance." Others stated that, for research proposals, Criterion #1 is much more important than Criterion #2, and should be weighted accordingly (some suggested 90/10). Still others criticized Criterion #2 as irrelevant, ambiguous, or poorly worded.
Several options for responding to this issue were identified and discussed:
b) For most, if not all, proposals, NSF should present the first criterion as a "threshold" criterion. In other words, NSF will not fund anything that does not pass muster on Criterion #1. Criterion #2 should be used to select among those proposals that exceed the Criterion #1 threshold.
c) Differentiate the criteria for basic research, applied research, and education proposals. This can be done with language introducing and explaining the criteria, as is now done for the four existing criteria. The extreme implementation would be to have different sets of criteria for these different categories of activities.
d) Resolve the imbalance between the two criteria by having Criterion #2 address BOTH the intellectual impact and the "broader" impacts, which it currently does not do. This would be accomplished by adding something like the following question to Criterion #2: "How important is the proposed activity to advancing knowledge and learning within its own field and across different fields?"
#2 Another issue raised by the community is the "presentation issue"; i.e., how is NSF going to get reviewers to pay attention to the new criteria, which will be printed on the back of a form?
Recommendation: The Task Force recommends that NSF prepare a sample reviewer form, both for regular mailing and for e-mail reviews. A half-page tear-off cover sheet presenting the context for using the criteria can be attached to the review form. For example, at the top of the page, it could state: PLEASE READ THIS BEFORE BEGINNING YOUR REVIEW!
#3 A third issue, raised by perhaps 20% of the respondents, can be summed up as follows: In the review of the quality of the proposed research, NSF should give greater prominence to research performer competence. Many individuals who made this recommendation suggested that NSF have a separate criterion for this.
Recommendation: The Task Force believes this could be handled with some editing rather than creating a third criterion. It also thinks that, in giving prominence to proposer competence, it is important not to create a bias against new investigators entering the field. Hence, in Criterion #1 the following question should be changed to: "How well qualified is the proposer (individual or team) to conduct the project?" Also, this question should be moved up from third to second position.
#4 A substantial fraction of the respondents indicated that the question under Criterion #2 dealing with "diversity" was ambiguous.
Recommendation: This can be addressed with some revised wording.
To: "How well does the proposed activity broaden the participation of underrepresented groups (e.g., gender, ethnicity, geographic, etc.)?"
#5 A number of respondents pointed out that the criteria need to encourage greater innovation, risk, and creativity in NSF-supported activities.
Recommendation: The Task Force believes that creativity and originality are among the most important characteristics of an NSF-supported activity; hence it recommends the following revised wording.
To: "To what extent does the proposed activity suggest and explore creative and original concepts?"
Recommendation: The Task Force believes that respondents may be interpreting this question too narrowly. While it may not be possible to predict specific potential applications for one's research, one should be able to discuss the value or applicability of the line of inquiry or research area. The following revised wording was recommended:
To: "And, what may be the benefits of the proposed activity to society?"
Recommendation: The Task Force believes that reviewers would not be in a very good position to assess budgets and timetables. However, it recommends the following revision in order to clarify the issue:
To: "How well-conceived and organized is the proposed activity?"
Recommendation: The Task Force recommends that questions under each criterion be reworded, using such phrases as: "To what degree does ---?" or "What is the potential ---?" or "How well does ---?"
#9 There were a number of suggestions for placing greater emphasis on dissemination of results. Also, respondents had problems interpreting the question concerning scientific literacy.
Recommendation: The Task Force recommends the following revised wording.
To: "Will the results be disseminated broadly to enhance scientific and technological understanding?"
Recommendation: The Task Force recommends that the NSF "generic" reviewer form provide for the following:
The Task Force believes that the proposed new criteria are flexible enough, both in their design and proposed implementation, to be useful and relevant across NSF's many different programs. Furthermore, it is expected that NSF will continue to employ special criteria to respond to the specific objectives of certain programs and activities. Hence, the Task Force recommends that NSF proceed without delay to full implementation of the proposed changes. Adoption of the new criteria will facilitate, clarify, and simplify the proposal evaluation process. Excellence will continue to be the hallmark of all NSF-sponsored activities.
I. NSB/MR-96-15, NSB-NSF Staff Merit Review Task Force Discussion Report, November 20, 1996.
II. Sample NSF Proposal Review Form
III. Analysis of Responses to the NSB/NSF Report on Merit Review Criteria, Office of Policy Support, March 6, 1997.