Chapter 7. Post-Program Activities
7.1 Require final written technical reports
Clearly explain to students the reporting requirements associated with program participation, and distribute reporting instructions at the outset (Section 5.1.5). Quite separate from other evaluation and assessment reports, the final written technical report summarizes the student's scientific and scholarly accomplishments. Because the final days of the on-site program are occupied with completing student projects and preparing for the student symposium or other presentation, reports often cannot be finished before departure; in that case, set a reasonable deadline for submission after the formal close of the on-site program. As with evaluation questionnaires, these reports are difficult to retrieve after the students have dispersed.
Provide clear instructions and format requirements for the final technical reports; include a template for information such as host scholar and laboratory assignment, and individual site visits arranged through the host scholar/engineer. If a published record of the international REU program is envisioned, strict adherence to formatting requirements yields significant timesaving in assembly and publication. In addition to typical technical report elements, include components such as a listing of site visits or special activities.
Student reports have additional uses for REU program directors. The technical reports provide perspective on the effectiveness of host scholars in guiding undergraduates in research. Use summaries and extracted “nuggets” of student accomplishments in annual reports to funding agencies as evidence of international REU site program performance. Compile the technical reports into a “yearbook” as a formal record of the REU program; send it to participants as a memento of their summer experience and for inclusion in their personal academic portfolios, and to university administrators to promote the program locally. After appropriate editing and reformatting, post technical reports on the program’s Web site as part of the advertising and recruitment campaign (Sections 5.1.1 and 5.1.2).
Enforcing student compliance with reporting requirements is difficult in some cases, especially when students return to a distant home campus. There are few remedies to non-compliance after the students have left the REU program. Follow-up reminders noting the importance of these reports sometimes elicit responses from students. If an installment plan for delivery of the participant allowance is used, consider holding the final payment until the technical report and any other reporting and evaluation documents are submitted.
7.2 Conduct a program evaluation
The program evaluation offers participants and host mentors the opportunity to report their satisfaction with the program. It also gives REU directors as robust a measure of program success as possible, and can lead to important improvements in the program’s structure and management.
The evaluation plan for an international REU site program includes: (1) measures to gauge program success in meeting its overall stated objectives and goals, including any integrated academic and research components; (2) mechanisms for assessment from the perspectives of student participants, host scientists/engineers, and program directors; (3) plans for tracking and longitudinal follow-up of student participants with regard to their continued interest and involvement in research and in global issues and collaborations.
In formulating the evaluation plan, include appropriate measures to address the program’s success in meeting the broader NSF objectives. Consult the NSF Strategic Plan regarding the Foundation’s outcome goals for investments in people, ideas and tools. The NSF Strategic Plan is available on-line at:
Careful selection of control groups for comparison is important for a robust evaluation. In this broader context, outcome measures for international REU programs might include:
- program success in introducing students to the profession;
- research results and publications;
- student persistence to the baccalaureate and/or Ph.D. degrees;
- student recruitment into science and engineering careers; and
- long-term effects on career decisions.
As with other program components, a well-developed evaluation plan will strengthen the international REU site proposal during review.
7.2.1 Use evaluation questionnaires
Evaluation questionnaires for student and host researchers reflect the specific REU program for which they are developed; for reciprocal programs, parallel evaluation instruments should be designed for domestic and foreign students and hosts. Nevertheless, several general categories of evaluation questions apply to student participants in most international REU programs, and samples of evaluation questions for students appear in Appendix 4. Student questionnaires address:
- organizational quality and overall experience;
- value of the scientific research component;
- quality of research and/or laboratory facilities;
- quality of individual research guidance and mentorship;
- influence/effect on long-term career planning;
- administration and helpfulness of program administrators;
- usefulness of the pre-trip and on-site orientations;
- quality and value of formal program activities (language instruction, seminars and lectures, site visits);
- quality and value of cultural program; and
- quality of support services (housing, food, transportation).
Both quantitative and qualitative data are useful in assessing the productivity and success of the research component of the student experience. This is a difficult area for short-term assessment, since typical measures of accomplishment, such as research presentations at professional meetings and published reports, are realized only in the months and years following the conclusion of the actual research work. These metrics can be collected in longer-term follow-up surveys. More realistic (but more qualitative) short-term measures of participant productivity include whether data presentations were made to laboratories or work groups at the host site, whether the participant plans to continue collaborating with the host lab or scholar, and whether the research will continue as a senior thesis or independent study project at the student’s home institution in the U.S.
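As an illustration only, the tabulation of questionnaire results can be scripted; the following Python sketch uses hypothetical question labels, indicator names, and responses, not items from any actual REU instrument.

```python
from statistics import mean

# Hypothetical Likert-scale responses (1 = poor, 5 = excellent),
# one dict per returned student questionnaire.
responses = [
    {"research_value": 5, "mentorship": 4, "orientation": 3},
    {"research_value": 4, "mentorship": 5, "orientation": 4},
    {"research_value": 5, "mentorship": 3, "orientation": 4},
]

# Hypothetical short-term qualitative indicators (yes/no), such as whether
# the student presented results to the host lab or plans to continue.
indicators = [
    {"presented_to_host_lab": True, "plans_to_continue": True},
    {"presented_to_host_lab": True, "plans_to_continue": False},
    {"presented_to_host_lab": False, "plans_to_continue": True},
]

# Mean score per questionnaire category.
means = {cat: mean(r[cat] for r in responses) for cat in responses[0]}

# Number of "yes" answers per short-term indicator.
yes_counts = {key: sum(1 for i in indicators if i[key]) for key in indicators[0]}

for cat, m in means.items():
    print(f"{cat}: mean {m:.2f} (n={len(responses)})")
for key, yes in yes_counts.items():
    print(f"{key}: {yes}/{len(indicators)} yes")
```

A simple summary of this kind can feed directly into annual reports, with the qualitative indicators reported alongside the mean scores.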
Evaluation questionnaires designed for host researchers and mentors address complementary issues relating to host satisfaction; samples of evaluation questions for host scholars appear in Appendix 4. For these instruments, the relevant categories are:
- organizational quality and overall experience;
- value of undergraduate student researcher in the lab generally;
- quality of individual undergraduate scholar;
- helpfulness of program administrators;
- usefulness of the host scientist orientation; and
- willingness to participate in the future.
The development of useful evaluation questions is a multi-step process involving: (1) specification of the goals and objectives of the evaluation, (2) identification of the audiences for the evaluation, (3) consideration of the resources available to conduct the evaluation, and (4) formulation and refinement of evaluation questions.
Follow-up questionnaires to those students who were accepted to the program but declined to participate are informative to program directors in diagnosing and correcting such problems as non-competitive allowances (i.e., an applicant accepted a better offer), late date for notification of acceptance (i.e., the applicant accepted another offer before receiving acceptance to this program), or poor match to a host scholar or research project.
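To illustrate, responses from such a follow-up questionnaire can be coded into categories and tallied; this Python sketch uses hypothetical reason codes, not categories from any actual instrument.

```python
from collections import Counter

# Hypothetical decline reasons, coded from free-text questionnaire responses.
decline_reasons = [
    "better offer elsewhere",
    "late notification of acceptance",
    "better offer elsewhere",
    "poor match to host or project",
    "late notification of acceptance",
    "better offer elsewhere",
]

# Tally how often each reason appears; a frequently cited reason points to
# a correctable weakness (e.g., a non-competitive allowance or late offers).
reason_counts = Counter(decline_reasons)
for reason, count in reason_counts.most_common():
    print(f"{reason}: {count}")
```

Even a small tally like this can show at a glance which of the three diagnoses (allowance, timing, or match) dominates.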
7.2.2 Use available resources for designing the program evaluation plan and instruments
Guidelines for the development of a project evaluation plan are available in two NSF publications, both of which are available on the World Wide Web:
Frechtling, J.A. (Ed.) 1993. User-Friendly Handbook for Project Evaluation: Science, Mathematics, Engineering and Technology Education. National Science Foundation Publication NSF 93-152.
Available on-line at: http://www.nsf.gov/cgi-bin/getpub?nsf93152
Frechtling, J., and L. Sharp (Eds.) 1997. User-Friendly Handbook for Mixed Methods Evaluations. National Science Foundation Publication NSF 97-153.
Available on-line at:
This pair of documents recognizes that quantitative and qualitative techniques can be combined in a mixed method evaluation to produce the greatest utility in outcome measurement and in program improvement. Although decision makers often demand quantitative measures of results, qualitative evidence complements the quantitative data and completes the evaluation story. Mixed method evaluation incorporates student and faculty questionnaires, interviews, and the reports of observers. Both volumes include annotated bibliographies of technical references.
Another useful handbook of clear guidelines for effective program evaluation is:
Joint Committee on Standards for Educational Evaluation (J.R. Sanders, Chair) 1994. The Program Evaluation Standards: How to Assess Evaluations of Educational Programs, 2nd Edition. Sage Publications: Thousand Oaks, CA. 222 pp. ISBN 0-8039-5732-7
This reference manual was prepared by the Joint Committee on Standards for Educational Evaluation, comprising educators and evaluation specialists, and addresses program evaluation in a variety of settings. Readable and well organized, it offers a hands-on approach to educational evaluation, leading the reader through a series of steps from designing the evaluation, to collecting and analyzing the data, to reporting the results.
The International Education of Students (IES) Model Assessment Practice is available on-line (together with additional information and materials) at:
This assessment model, developed by the Institute for the International Education of Students, is a conceptual framework for defining quality in study abroad programs. Four areas are addressed:
- the student learning environment;
- student learning and development of intercultural competence;
- resources for academic student support; and
- program administration and development.
The following article discusses the development of the IES Model Assessment Practice:
Gillespie, J., L.A. Braskamp and D.C. Braskamp 1999. Evaluation and study abroad: developing assessment criteria and practices to promote excellence. Frontiers: The Interdisciplinary Journal of Study Abroad, Fall 1999. Available on-line at:
7.2.3 Include a budget for evaluation
Engage the services of an expert in program and project evaluation, and include him or her on the management team. Do not neglect to include this important component of a successful, enduring REU program in the initial budget request.
7.3 Encourage alumni communication
Provide the means for program alumni to stay “connected” with the program and with each other, and use this in program evaluation and recruitment.
7.3.1 Develop an alumni Web site
Consider development of a “program alumni Web site” or a listserv. Make it a useful resource; post participant reports at this site, and provide links to domestic and international research and employment opportunities. Include a directory of alumni, and encourage students to update their contact information after graduation and at intervals thereafter. Use the alumni Web site as one part of the recruiting program and as a resource for new participants preparing to depart (Sections 5.1.1, 5.1.2 and 5.1.4). Enlist a student assistant (perhaps a program alumna or alumnus) to design and maintain the Web site.
7.3.2 Organize reunion meetings
Organize reunion meetings, especially if the program follows a single-institution model and alumni are more likely to stay in residence nearby. Run alumni reunions in parallel with pre-departure orientation meetings and host a mixer where new participants can meet alumni and share practical advice.
Chapter 8. Overall International REU Site Project Evaluation
Last updated July 2002