NSB/MR-97-05

National Science Board and
National Science Foundation Staff
Task Force on Merit Review

Final Recommendations

March 1997


NATIONAL SCIENCE BOARD



Members of the Task Force

National Science Board Members

Dr. Warren M. Washington, Chair

Dr. Mary K. Gaillard

Dr. Shirley M. Malcom

Dr. Eamon M. Kelly

National Science Foundation Staff

Dr. Mary E. Clutter

Dr. John B. Hunt

Mr. Paul J. Herer, Executive Secretary


NSB/MR-97-05
March 18, 1997

NSB-NSF Staff Merit Review Task Force
Final Recommendations

I. Introduction

Merit review is the cornerstone of NSF's work. Through merit review, NSF seeks to maintain the high standards of excellence and accountability for which it is known around the world. NSF's current criteria were adopted by the National Science Board in 1981.

At the November 1996 meeting of the National Science Board, the NSB-NSF Staff Merit Review Task Force recommended that the current merit review criteria be simplified and that the language be harmonized with the NSF strategic plan. These recommendations are contained in the report: NSB/MR-96-15, NSB-NSF Staff Merit Review Task Force Discussion Report, November 20, 1996 (Appendix I).

The Board received this report and asked the NSF Director to share the proposed revisions to the merit review criteria with the science and engineering community in order to solicit its input. To encourage the broadest possible comment and discussion, NSF solicited the comments through press coverage and through direct contacts among staff, universities, and professional associations. The proposed criteria were also posted on the World Wide Web, with a response form to facilitate suggestions and reactions.

Via the feedback mechanisms provided on these Web pages, NSF received over 300 responses, most from tenured faculty who had experience with the NSF merit review process. A number of comments also arrived in the form of written letters. Of the respondents who expressed an overall positive or negative opinion, a majority favored the change, although many offered suggestions for further improvement or clarification. Overall, the community responses, although not a representative sample of the community, were informative and useful in helping the Task Force draft improved criteria.

Final Recommendations

The Task Force recommends that the following two criteria be adopted in place of the four criteria that are currently used.

The Task Force further recommends that a cover sheet be attached to the proposal review form, which presents the context for using the criteria. The suggested language for this cover sheet is as follows:

Regarding the "ratings" issue, which was highlighted in the Discussion Report, the Task Force recommends that the NSF "generic" proposal review form provide for the following:

Note: The Task Force's recommendations are exemplified in the attached sample NSF Proposal Review Form (Appendix II).

In implementing the new criteria, the Task Force believes that NSF should address such issues as: (1) designing proposal review forms (both paper and electronic) that are clear and easy to use; (2) training NSF staff; and (3) revising NSF's proposal preparation guidelines. The Task Force recommends that NSF proceed without delay to full implementation of the proposed changes.

Analysis of Public Comment and Rationale for Task Force Recommendations

A brief analysis of public responses to the Task Force recommendations was prepared by the NSF Office of Policy Support (OPS). This report (Appendix III), which proved very useful to the Task Force, attempts to characterize the individuals who responded and summarize their views about the changes in the proposed criteria.

The Task Force members also read each of the individual responses received by NSF and prepared an analysis of the issues that were raised. The Task Force then met on February 19, 1997, to discuss and resolve these issues and prepare its final recommendations. Its analysis of these issues is presented below.

#1 A central issue is the "weighting or threshold" issue, which was raised by perhaps a third of the respondents. Many respondents expressed concern that adopting the new criteria will lead to a decline in NSF's standards of excellence; i.e., "excellent research with OK relevance" will be equated with "OK research with excellent relevance." Others stated that, for research proposals, Criterion #1 is much more important than Criterion #2 and should be weighted accordingly (some suggested 90/10). Still others criticized Criterion #2 as irrelevant, ambiguous, or poorly worded.

Several options for responding to this issue were identified and discussed:

Recommendation: The Task Force believes that option (a) is the best one because it does not polarize the research and education communities and can be applied very flexibly. For example, when the reviewer assigns an overall rating, the two criteria need not be weighted equally; the weighting should depend upon (1) additional guidance received from NSF and/or (2) the reviewer's judgment of the relative importance of the criteria to the proposed work.

#2 Another issue raised by the community is the "presentation" issue: i.e., how will NSF get reviewers to pay attention to the new criteria, which will be printed on the back of a form?

Recommendation: The Task Force recommends that NSF prepare a sample reviewer form, both for regular mailing and for e-mail reviews. A half-page tear-off cover sheet can be attached to the review form, presenting the context for using the criteria. For example, at the top of the page, it could state: PLEASE READ THIS BEFORE BEGINNING YOUR REVIEW!

#3 A third issue, raised by perhaps 20% of the respondents, can be summed up as follows: In the review of the quality of the proposed research, NSF should give greater prominence to the competence of the research performer. Many individuals who made this recommendation suggested that NSF adopt a separate criterion for this.

Recommendation: The Task Force believes this could be handled with some editing rather than creating a third criterion. It also thinks that, in giving prominence to proposer competence, it is important not to create a bias against new investigators entering the field. Hence, in Criterion #1 the following question should be changed to: "How well qualified is the proposer (individual or team) to conduct the project?" Also, this question should be moved up from third to second position.

#4 A substantial fraction of the respondents indicated that the question under Criterion #2 dealing with "diversity" was ambiguous.

Recommendation: This can be addressed with some revised wording.

The Task Force also felt that some language should be included that addresses the need "to avoid undue concentration of resources." However, it is difficult to see how reviewers could respond to this issue.

#5 A number of respondents pointed out that the criteria need to encourage greater innovation, risk, and creativity in NSF-supported activities.

Recommendation: The Task Force believes that creativity and originality are among the most important characteristics of an NSF-supported activity; hence it recommends the following revised wording.

#6 Some respondents stated that, for much of basic research, it is not possible to make a meaningful statement about the potential usefulness of the research.

Recommendation: The Task Force believes that respondents may be interpreting this question too narrowly. While it may not be possible to predict specific potential applications for one's research, one should be able to discuss the value or applicability of the line of inquiry or research area. The following revised wording was recommended:

#7 A number of respondents suggested that a question under Criterion #1 should have reviewers take into account the benefits of the proposed research activity in relation to its risks and costs. This might involve asking a question such as: "Is the project well designed, with a reasonable budget and achievable timetables?"

Recommendation: The Task Force believes that reviewers would not be in a good position to assess budgets and timetables. However, it recommended the following revision in order to clarify the issue:

#8 The current wording of the questions under each criterion tends to encourage "yes-no" responses instead of explanations from the reviewers.

Recommendation: The Task Force recommends that questions under each criterion be reworded, using such phrases as: "To what degree does ---?" or "What is the potential ---?" or "How well does ---?"

#9 There were a number of suggestions for placing greater emphasis on dissemination of results. Also, respondents had problems interpreting the question concerning scientific literacy.

Recommendation: The Task Force recommends the following revised wording.

#10 The "ratings" issue, which was highlighted in the Task Force Discussion Report, remains very difficult to resolve. The community is divided between a preference for a single composite rating and separate ratings for each of the two criteria. Also, a number of people have suggested that NSF discontinue basing its ratings on hypothetical distributions, e.g., "among the top 5%."

Recommendation: The Task Force recommends that the NSF "generic" reviewer form provide for the following:

It is also recommended that NSF discontinue describing its ratings by referring to hypothetical distributions.

Conclusion

The Task Force believes that the proposed new criteria are flexible enough, both in their design and proposed implementation, to be useful and relevant across NSF's many different programs. Furthermore, it is expected that NSF will continue to employ special criteria to respond to the specific objectives of certain programs and activities. Hence, the Task Force recommends that NSF proceed without delay to full implementation of the proposed changes. Adoption of the new criteria will facilitate, clarify and simplify the proposal evaluation process. Excellence will continue to be the hallmark of all NSF-sponsored activities.


APPENDICES


I. NSB/MR-96-15, Discussion Report, NSB and NSF Staff Task Force on Merit Review, November 20, 1996

II. Sample NSF Proposal Review Form

III. Analysis of Responses to the NSB/NSF Report on Merit Review Criteria, Office of Policy Support, March 6, 1997