National Science Foundation
 
NSF & Congress
Testimony

Dr. Julia Lane

Julia I. Lane
Program Director
Science of Science and Innovation Policy

National Science Foundation

Statement
To the Committee on Science and Technology
Subcommittee on Research and Science Education
U.S. House of Representatives

The Science of Science and Innovation Policy

September 23, 2010

Chairman Lipinski, Ranking Member Ehlers, and Members of the Subcommittee, it is my distinct privilege to be here with you today to discuss NSF's Science of Science and Innovation Policy (SciSIP) program, the activities of the Science of Science Policy Interagency Group, and the STAR METRICS program--a new federal effort designed to create a scientific, quantifiable measurement of the economic and social impacts of federal research spending. I am Dr. Julia Lane, Program Director of the SciSIP program at the National Science Foundation, and co-chair of the NSTC working group on the Science of Science Policy (SoSP).

At the outset, I would like to express my appreciation to all the Members of the House Committee on Science and Technology for their unstinting support in advancing both the cause and the frontiers of science. This Committee has long held steadfast in the knowledge that America's present and future strength, prosperity and global preeminence depend directly on fundamental research.

The National Science Foundation has always believed that optimal use of limited Federal funds relies on two conditions: ensuring that research is aimed--and continuously re-aimed--at the frontiers of understanding; and certifying that every dollar goes to competitive, merit-reviewed, and time-limited awards with clear criteria for success. When these two conditions are met, the nation gets the most intellectual and economic leverage from its research investments.

Yet our portfolio keeps changing. We have a minimal vested interest in maintaining the status quo, and pride ourselves on an ability to shift resources quickly to the most exciting subjects and most ingenious researchers.

Moreover, we regard it as an essential part of our mission to constantly re-think old categories and traditional perspectives. This ability is crucial now, because conventional boundaries are disappearing--boundaries between nations, boundaries between disciplines, boundaries between science and engineering, and boundaries between what is fundamental and what is application. At the border where research meets the unknown, the knowledge structures and techniques of life science, physical science, and information science are merging.

Additionally, our scope is extremely wide, extending across all the traditional mathematics, science and engineering disciplines. That is a major advantage in today's research climate, where advances in one field frequently have immediate and important applications to another. The same mathematics used to describe the physics of turbulent air masses may suddenly explain a phenomenon in ecology or in the stock market, or the changes in brain waves preceding an epileptic seizure. The same algorithms used by astronomers to discern patterns in the distant heavens can aid radiologists to understand a mammogram, or intelligence systems to identify a threat. Only an agency that sees the "big picture" can assure this kind of interdisciplinary synergy.

For all of these reasons, the National Science Foundation is fostering the development of the knowledge, theories, data, tools and human capital needed to cultivate a Science of Science and Innovation Policy program. The program has three major aims: advancing evidence-based science and innovation policy decision making; developing and building a scientific community to study science and innovation policy; and developing new and improved datasets.

The overarching goal in this effort, however, is to conduct basic research that creates new objective models, analytic tools, and datasets to inform our nation's public and private sectors about the processes through which investments in science and engineering research may be transformed into scientific, social and economic outcomes.

We need to better understand the contexts, structures, and processes of scientific and engineering research, to evaluate reliably the tangible and intangible returns from investments in research and development (R&D), and to predict the likely returns from future R&D investments.

It is not only leaders in the scientific and engineering fields, but policymakers as well in the Executive and Legislative Branches who recognize that we need better approaches for developing science policy, which is why the National Science and Technology Council established the Science of Science Policy Interagency Task Group. That task group's roadmap characterized our current systems of measurement of the science and engineering enterprise as inadequate. There is enormous potential to do better.

To begin to create a scientific, quantifiable measurement of the economic and social impacts of our federal research investments, this Administration has initiated an innovative new program, STAR METRICS (Science and Technology for America's Reinvestment: Measuring the EffecT of Research on Innovation, Competitiveness and Science). This initiative is led by the National Institutes of Health and the National Science Foundation under the auspices of the White House Office of Science and Technology Policy. The goal is to develop a system that can be used to track the impact of federal science investments. I will return to the topic of STAR METRICS later in my testimony.

    1. The overall vision and strategy for research and education in the Science of Science and Innovation Policy.

Federally funded basic and applied scientific research has had a significant impact on innovation, economic growth and America's social well-being. We know this in the broad sense from numerous economic analyses, but it is difficult to disentangle the impact of Federal investment from that of private, state and industrial investments. We have little information about the impact of individual projects and programs, whether federally or privately funded, and little information about the impact of science agencies. Thus, although determining which federally funded research projects yield solid results and which do not is a subject of high national interest, since American taxpayers invest more than $140 billion annually in research and development (R&D), there is little evidence to support such analysis. In short, Congressional and Executive Branch policy decisions are strongly influenced by past practices or by data trends that may be dated or have limited relevance to today's economic situation. A deeper understanding of the changing framework in which scientific and technical innovation occurs would help policymakers decide how best to make and manage limited public R&D investments to exploit the most promising and important opportunities.

The lack of analytical capacity in science policy is in sharp contrast to other policy fields that focus on workforce, health and education. Debate in these fields is informed by the rich availability of data, high-quality analysis of the relative impact of different interventions, and computational models that often allow for forward-looking analyses with impressive results. For example, in workforce policy, the evaluation of the impact of distinct education and training programs has been transformed by careful attention to issues such as selection bias and the development of appropriate comparison groups. The analysis of data about geographic differences in health care costs and health care outcomes has featured prominently in guiding health policy debates. And education policy has moved from "invest more money" and "launch a thousand pilot projects" imperatives to a more systematic analysis of programs that actually work and that promote local and national reform efforts.

Each of those efforts, however, has benefited from an understanding of the systems that are being analyzed. In the case of science policy, no such agreement currently exists. NSF's Science of Science & Innovation Policy (SciSIP) program is designed to advance the scientific basis of science and innovation policy.

Vision

The principal goal is to advance the scientific basis of making science policy decisions, particularly those involving budgets, through the development of improved data collection, theoretical frameworks, computational models and new analytic tools.

A major component of the SciSIP program is the funding of investigator-initiated research. Through direct engagement of the federal policy community with the research community, it is hoped that future policy decisions can be informed by empirically validated hypotheses and informed judgment. Our aim, as noted in the program's description, is to "engage researchers from all of the social, behavioral and economic sciences as well as those working in domain-specific applications such as chemistry, biology, physics, or nanotechnology in the study of science and innovation policy. The program welcomes proposals for individual or multi-investigator research projects, doctoral dissertation improvement awards, conferences, workshops, symposia, experimental research, data collection and dissemination, computer equipment and other instrumentation, and research experience for undergraduates. The program places a high priority on interdisciplinary research as well as international collaboration."

The program explicitly fosters a multi-level science (in addition to more obviously being an interdisciplinary science) that spans from the study of cognitive phenomena in individual scientists (e.g., the study of fixation, insight, reasoning, and decision making) to the study of whole industries and industry-level policies. What makes the overall effort potentially transformative is the support of research at multiple levels: industry-level policies are only successful if they have individual-level effects (i.e., if engineers and scientists change), and individual-level effects are only important if they scale to produce industry-level differences.

Another focus of the SciSIP program is the redesign of the surveys undertaken by NSF's Division of Science Resources Statistics, the federal statistical agency responsible for collecting and disseminating data on the U.S. science and engineering enterprise. The most visible activity has been the redesign of the Business R&D and Innovation Survey (BRDIS), which collects information from a nationally representative sample of about 40,000 companies, including companies in both manufacturing and nonmanufacturing industries. This survey is the primary source of information on businesses' domestic and global R&D expenditures and R&D workforce. The new structure enables respondents to provide detailed data on the following:

  • How much is a company investing in its domestic and worldwide R&D relationships, including R&D agreements, R&D "outsourcing," and R&D paid for by others?
  • What is the strategic purpose of a company's worldwide R&D activities and what are their technology applications?
  • What are the details of a company's patenting, licensing, and technology transfer activities, and companies' innovative activities?

In addition, a limited number of questions are asked about activities related to new or improved products or processes. These are intended to serve as a basis for collecting an expanded set of innovation metrics in the future. The results of this data collection are now being published as part of SRS's ongoing reporting activity.

Strategy

The focus of the program's strategy has been to convince the academic community that the study of science policy is a worthwhile academic endeavor. This has taken three main forms. The first has been to engage in a substantial amount of outreach through presenting at professional workshops and conferences (an average of five or six a year), through supporting specific workshops on various science policy topics (two or three a year), through establishing a very active listserv (which has grown to over 720 members in less than two years) and through supporting a Science of Science Policy Web site (http://scienceofsciencepolicy.net).

The second part of the strategy has been to invest in high quality research datasets. Good bricks need straw, and good research in an empirical field like science and innovation policy requires good data. Fields as disparate as biotechnology, geosciences, and astronomy have been transformed by both data and knowledge access. NSF hopes to similarly transform the study of science policy by improving science data. Such a transformation will occur in three ways. First, the scientific challenge is compelling: the way in which scientists create, disseminate and adopt knowledge in cyberspace is changing in new and exciting ways. Collaborations between computer scientists and social scientists, fostered by SciSIP, can capture these activities using new cybertools. Second, new and exciting data attract new researchers to the field. This in turn attracts new graduate students, who see new ground being broken and exciting opportunities for research. Finally, we aim to actively engage the federal science policy community through a variety of workshops, as well as direct engagement through the Science of Science Policy Interagency Group.

The program has made a total of 99 awards: 19 in 2007, 23 in 2008, 31 in 2009, and 26 in 2010. The program began to accept doctoral dissertation proposals in 2010; five of those were funded. The success rate for standard proposals is currently about 25%; it is higher for doctoral dissertation proposals. A total of 182 principal investigators have been supported--of those, 147 are scientists from the Social, Behavioral and Economic Science domains and the balance are from areas as diverse as Computer and Information Sciences, Education, Physics, Biology and Law.

Results

The program is beginning to achieve some of its ambitious goals. A SciSIP Principal Investigator (PI), who is a Business School Dean at a university with a strong focus on publicly-funded research, has noted, "I know full well that this new program provides unique grant opportunities for faculty members in management, information systems, and other fields of business administration." He cites the following from his personal experience: "… in the field of business research and business management, the Science of Science & Innovation Policy papers are featured in some of the best sessions at the Academy of Management Meetings. This innovative program has sparked considerable interest in public policy among management scholars, and particularly in business schools. The impact of the research you are funding struck home when I read the latest issue of BizEd, the magazine of the Association to Advance Collegiate Schools of Business (AACSB), an association of educational institutions, businesses, and other organizations devoted to the advancement of higher education in management education. It is also the premier accrediting agency of collegiate business schools and accounting programs worldwide. The research you have funded was prominently featured in their magazine, which is circulated to thousands of business schools worldwide."

Additionally, the SciSIP program has influenced several National Research Council studies, and has thus impacted public policy with respect to technology commercialization and academic and public-sector entrepreneurship. One is the Congressionally-mandated evaluation of the Small Business Innovation Research Program. Another is the committee on "Best Practices in National Innovation Programs for Flexible Electronics," and a third is "Management of University Intellectual Property: Lessons from a Generation of Experience, Research, and Dialogue."

In another example, a major part of the science and innovation policy debate has been the role of R&D and research tax credits, whose budgetary cost is about $15 billion each year.1 The obvious policy question is: how effective are these tax credits in stimulating innovation? SciSIP-funded PIs have examined changes in R&D tax credit generosity across countries and US states over time to evaluate business firms' responses. They estimate that for every $1 of tax credits, firms spend about $1 more on R&D. However, the research also extends to firms' response to uncertainty about the duration of the federal Research and Experimentation (R&E) tax credit, which is not permanent and in fact expired at the end of 2009. The uncertainty about renewal has offsetting effects--one is to increase short-term expenditures, because firms think they need to do R&D now to get the credit. The other is to reduce overall R&D expenditures, since uncertainty is detrimental to the expected payoffs from long-term investments such as fundamental R&D. The sign of the net effect is an empirical question, and again something a SciSIP PI has been working on--he finds a strong negative effect of uncertainty on general investment and employment, and is currently extending this work to R&D. The same PI presented in September 2009 to the Federal Reserve Board of Governors, including Chairman Bernanke; the Fed was trying to understand why the IT "productivity miracle," which was a major driver of US economic growth in the late 1990s, had slowed by the late 2000s. One possible reason is that better use of IT is associated with organizational change, and the rate of organizational change has potentially slowed; a major SciSIP-funded grant supports a large national survey to try to examine why and how that change has occurred.

We can also learn from history. Another SciSIP PI has looked at two case studies in depth: the invention of the airplane and Edison's invention of the electric light. In both cases, the invention took a long period of time--110 years and 80 years, respectively. In both cases, even the earliest attempts were based on many years of work on mathematics and technology and hundreds of years of work in science. To illustrate, Sir Humphry Davy first demonstrated incandescence of materials in 1808. His work drew on the Voltaic pile (battery) invented in 1800, the Leyden jar developed in 1744, and carbon produced as charcoal during the Roman Empire no later than 25 A.D. Leyden jars depended on work by the ancient Greeks in 600 B.C. Thus, the foundation of the science behind electric light dates back 2,400 years before incandescence, after which it took 80 more years of R&D to develop an effective electric light. The airplane has a similarly long foundational period and duration of invention. In looking at various inventions, this research has shown that there are several different weak methods but also some powerful strategies that vastly speed things along. Edison succeeded simply because he had enormous resources (the Edison Electric Light Company was capitalized at $300,000--about $30 million today). The Wright Brothers were far more efficient at developing the airplane than Edison was in developing the electric light.

How is the NSF fostering collaboration between social and behavioral scientists and researchers from other disciplines, including computer scientists, engineers and physical scientists in science and technology policy research?

This is being done in a number of ways: through the program call, through workshops, and through successful and visible interdisciplinary projects.

Program Description

The SciSIP program explicitly encourages interdisciplinary cooperation in the program description. In particular, the program states:

"The SciSIP program invites the participation of researchers from all of the social, behavioral and economic sciences as well as those working in domain-specific applications such as chemistry, biology, physics, or nanotechnology. The program welcomes proposals for individual or multi-investigator research projects, doctoral dissertation improvement awards, conferences, workshops, symposia, experimental research, data collection and dissemination, computer equipment and other instrumentation, and research experience for undergraduates. The program places a high priority on interdisciplinary research as well as international collaboration."

Workshops

Most of the workshops that have been hosted have been explicitly interdisciplinary in nature, bringing together domain scientists and social, behavioral and economic scientists, and have resulted in calls for proposals (called Dear Colleague Letters) supported by multiple NSF programs.

Examples include:

  • A two-day workshop to advance the scientific study of federally funded centers and institutes as key elements in the innovation ecosystem. The workshop brought together engineers and natural, physical, and social scientists to address central questions relating to the role of NSF-funded centers and institutes in science and innovation policy.
  • Two separate workshops studying innovation in organizations. One of these, hosted by the Conference Board and supported by four Social, Behavioral and Economic (SBE) Sciences and three Computer and Information Science and Engineering (CISE) programs, was attended by computer scientists, SBE scientists and representatives from the business community to examine the potential for cyber data to better inform our understanding of innovation. A second conference brought together 20 leading computer scientists (from the fields of data management, data mining, security/privacy, and social networks) and social/organizational scientists (including economists, sociologists, psychologists, and anthropologists) to identify emerging major challenges in the collection and use of confidential data for the study of innovation in organizations. SciSIP led the resulting development of a Dear Colleague Letter whose purpose was to gather and create new cyber-enabled data on innovation in organizations, supported by six SBE and four CISE programs as well as the Office of Cyberinfrastructure.
  • A workshop in conjunction with the NSF's Chemistry Division that examined the impact of science R&D in the United States, focusing on chemical sciences and related industries. This led to a Dear Colleague Letter from SciSIP and the Chemistry Division reaching out to the chemistry and the social science communities advising them of funding opportunities related to assessing and enhancing the impact of R&D in the chemical sciences in the United States.
  • An interdisciplinary workshop which examined the potential for new visualization tools to track the impact of investments in science. These possibilities include tracing the impact of basic research on innovation, examining the changing structure of scientific disciplines, studying the role of social networks in the dispersion of scientific innovations as well as making comparisons of how the U.S. compares internationally in science. That workshop brought together researchers from a broad range of disciplines to examine such key questions, and to engage the federal science community in a discussion about whether and how the tools could be used in the federal context.
  • Three workshops have directly engaged CISE and SBE researchers in enhancing NSF's ability to describe its research portfolio. The SciSIP program worked with the CISE directorate to form an advisory subcommittee to provide advice on approaches to improving the way NSF interacts with its proposal and award portfolio. Although NSF staff still rely on traditional methods to do their jobs, such methods are becoming less practical given the rapidly changing nature of science, the increased recognition of the importance of funding interdisciplinary and potentially transformative research, and the significant increase in the number of proposals submitted. Individuals with research expertise in machine learning, data mining, information visualization, human-centered computing, science policy, and visual analytics were recruited for this effort. Nine teams were put together and charged with providing advice to NSF by identifying and demonstrating techniques and tools that characterize a specific set of proposal and award portfolios. Their report, in turn, will advise NSF on how to better structure existing data and make better use of existing machine learning, analysis, and visualization techniques to complement human expertise and better characterize its programmatic data. The results should help NSF identify tools that will help fulfill its mission, including identifying broader impacts and funding transformative and interdisciplinary research. NSF has also engaged program managers across the federal government so that our collective approaches can inform not only us, but other science agencies as well.
  • A workshop responding to Congressman Holt's request for better ways to measure the economic impact of federal research investments. SciSIP, together with NIH and other agencies, is supporting the National Academy of Sciences' Board on Science, Technology, and Economic Policy (STEP) and Committee on Science, Engineering, and Public Policy (COSEPUP) 2011 workshop on science measurement. This workshop is aimed at discussing new methodologies and metrics that can be developed and utilized for assessing returns on research across a wide range of fields (biomedical, information technology, energy, and other biological and physical sciences, etc.), while using background papers that review the methodologies used to date as a starting point.

As one SciSIP PI has noted, "SciSIP … creates a domain around which researchers from a variety of disciplines--biology and physics and economics as well as information science and public policy--can coalesce to pursue research topics in this domain for their own sake, rather than in the interstices of other projects in their home disciplines. As such, it acts as an attractor for top researchers across the natural and social sciences, allowing them to pursue their interests in SciSIP topics."

Successful Examples

There are a number of examples of the fruits of these activities. For example, SciSIP funding supports a University of Michigan research team consisting of a sociologist, a bioethicist specializing in informed consent and stem cell regulation, a bioethicist trained as a molecular biologist who is working on cell banking, and a post-doc in stem cell biology. The combination is a powerful one, as it matches expertise in social scientific data and analysis methods with deep knowledge of both the policy and the science.

Similarly, the interdisciplinary work of two SciSIP PIs has helped develop new metrics of the transmission of knowledge. These metrics go beyond citation metrics to usage metrics and help us better understand the impact that federal investment in research is having on research results. By mapping the structure of science and looking at how this structure changes over time, we can see the shifting landscape of scientific collaboration and understand newly emerging disciplines. That will enable us to anticipate these changes and properly target research funding to new and vibrant areas. For instance, their work provides a striking example of the emergence of neuroscience over the past decade--changing from an interdisciplinary specialty to a large and influential stand-alone discipline on a par with physics, chemistry, or molecular biology.

How is NSF fostering the development of science and technology policy degree programs and courses of study at colleges and universities? What is the current scope and level of support for such programs?

As with many NSF programs, the SciSIP program explicitly encourages submissions that support graduate student development. While there is no direct targeting of funds to policy programs, SciSIP has supported 28 researchers from science and technology policy programs. In an example of the type of support that has been provided to expand the course of study, over 250 undergraduate students from Economics (behavioral economics), Cognitive Science, Electrical and Computer Engineering, and Industrial Engineering have participated in a project at Purdue University that is an interdisciplinary collaboration linking social scientists with computer scientists and engineers.

A further example is the work done by Marcus Ynalvez at Texas A&M International University, which has the explicit goal of mentoring TAMIU graduate students from historically underrepresented populations. The hands-on training and mentoring of TAMIU graduate students represents an attempt to engage Hispanic students in international scientific research activities, with the intention of introducing them to the possibilities of developing professional careers in science and technology. These students are currently gathering, synthesizing and reviewing literature materials for the project's manuscripts, publications, and reports. With the data from the Japan, Singapore, and Taiwan surveys, these students will analyze data using statistical software packages such as the Statistical Package for the Social Sciences (SPSS), Stata, and the Statistical Analysis System (SAS). They will learn how to interpret statistical results associated with the family of generalized linear regression models, namely linear, logistic, and negative binomial regression models, as well as analysis of variance and path analysis. Not only have the TAMIU graduate students gained actual research experience, they have also developed professional relationships with students and professors from the prestigious National University of Singapore.

How is NSF encouraging a community of practice in the science of science policy and the dissemination of research to policymakers?

A major avenue has been the linkage with the Science of Science Policy Interagency group, which is discussed in more detail below. In addition, the listserv and the website have been very important dissemination vehicles.

However, the most important vehicles have been two PI workshops with the explicit goal of fostering further collaboration among the PIs actively engaged in the study of the Science of Science & Innovation Policy and strengthening the link to the federal community. The 2009 workshop had three overarching goals:

  • To provide NSF with an early opportunity to organize a collegial discussion of work in progress under SciSIP's two rounds of awards well before this work will begin to appear in professional forums and publications;
  • To begin to develop from among the purposefully diverse set of disciplinary perspectives reflected in SciSIP's two solicitations and subsequent awards, a "community of experts across academic institutions and disciplines focused on SciSIP;" and
  • To identify new areas of emphasis for support in future SciSIP solicitations.

The 2010 workshop, scheduled for October 19, 2010, seeks to focus on two objectives that flow from the National Science and Technology Council's 2008 report, The Science of Science Policy: A Federal Research Roadmap. The first task, as called for in the Roadmap report, is "to advance the scientific basis of science policy so that limited Federal resources are invested wisely." The second is to build a "community of practice" between Federal science and technology policymakers and researchers engaged in the development of new theories, tools of analysis, and methods for collecting and analyzing data.

This October 2010 workshop will consist of brief presentations by SciSIP grantees selected through competitive peer review of abstracts describing their ongoing research. These presentations will be followed by roundtable discussions led by federal policymakers, who will comment on the relevance of the research, followed by open discussion among all participants. A networking session at the close of the formal sessions will allow for continued discussion.

    2. As a Co-chair of the Science of Science Policy Interagency Group under the NSTC, please briefly describe the work of that group and how the various federal science agencies are collaborating on the development and implementation of science of science policy tools to improve the management and efficiency of their R&D portfolios and other science and technology related programs.

In 2006, the National Science and Technology Council's Subcommittee on Social, Behavioral and Economic Sciences (SBE) established an Interagency Task Group on Science of Science Policy (ITG) to serve as part of the internal deliberative process of the Subcommittee. In 2008, this group developed and published The Science of Science Policy: A Federal Research Roadmap, which outlined the Federal efforts necessary for the long-term development of a science of science policy, and presented this Roadmap to the SoSP community in a workshop held in December 2008. The ITG's subsequent work has been guided by the questions outlined in the Roadmap and the action steps developed at the workshop.

The development of the STAR METRICS (Science and Technology for America's Reinvestment: Measuring the EffecT of Research on Innovation, Competitiveness and Science) program is the number one priority of the interagency group. The initiative is a multi-agency venture led by the National Institutes of Health, the National Science Foundation (NSF), and the White House Office of Science and Technology Policy (OSTP).

Another major activity is sponsoring a series of workshops to bring the science agencies together to share what is already established in the field, identify gap areas and outline steps forward for the creation of better tools, methods, and data infrastructure.

The first of these workshops was held in October 2009 to delve into the issues surrounding performance management of federal research and development portfolios. The focus was on sharing current practices in federal R&D prioritization, management, and evaluation. Over 200 agency representatives attended. The conference featured 27 speakers and panelists, representing 20 federal agencies, offices, and institutions, and over 30 poster presenters, representing more than 25 agencies and institutions. Topics that were discussed included:

  • Methods to set federal research priorities and strategic directions;
  • The use of metrics to improve federal R&D efficiency; and
  • Ways in which research evaluations can inform current and future R&D decisions.

It addressed the following key questions:

  • How do federal science and technology agencies systematically identify and prioritize research and development alternatives? How can these processes be strengthened?
  • How can research-performance metrics be used to improve research efficiency? How can these metrics be improved?
  • How do research-performance evaluations inform and improve R&D investment decisions? How can these feedback loops be reinforced?

While the 2009 workshop developed a dialogue within the federal science policy community, the ITG has planned a workshop for December 2010 that engages the federal community with the academic community in advancing the "Science of Science Measurement." The first goal is to create a dialogue between the Federal S&T agencies and the research community about relevant models, tools, and data that advance scientific measurement in key areas of national S&T interest. The second is to identify a joint Science of Science Policy (SoSP) research agenda for the Federal S&T agencies and the research community. The workshop has four modules intended to advance measurement in: 1) economic benefits; 2) social, health, and environmental benefits; 3) S&T workforce development; and 4) technology development and deployment. Four academic researchers will present in each module, with a rapporteur synthesizing the presentations at the end of each module.

The audience will be primarily science policy practitioners from the Federal agencies who are interested in very practical issues, such as: getting new ideas about how to manage their portfolios in a more scientific manner; developing performance and outcomes metrics; measuring the return on investment; and using science to identify emerging trends in the U.S. scientific enterprise.

Another activity has been the establishment of a website to provide information on best practices to Federal and non-Federal agencies. The website (http://scienceofsciencepolicy.net) was launched in January 2010 and has become a model for other interagency groups (including the Forensic Science interagency group). It serves as a repository for data, documents, research papers, and communication tools for its communities of users, and receives over 2,000 hits a month. The associated listserv is the highest-visibility listserv in science policy, with over 720 members.

The interagency group meets monthly, and has active participation by over 15 agencies. It is actively providing input to the Center of Excellence on Science Policy being established by the State Department in the Middle East.

    3. Please provide a brief description and update on the status of the OSTP led project on science metrics, known as STAR METRICS, including a description of international engagement and interest in this effort.

The STAR METRICS project is a federal and university partnership to document for the public the outcomes of science investments. Its benefit is a common empirical infrastructure that will allow all recipients of federal funding, and the science agencies themselves, to respond quickly to State, Congressional, and OMB requests. It is critical that this effort take a bottom-up approach that is domain-specific, generalizable, and replicable.

Currently, the project is structured in two phases:

  • Phase I: The development of uniform, auditable, and standardized measures of the initial impact of ARRA and base-budget science spending on job creation.
  • Phase II: The development of broader measures of the impact of federal science investment, grouped in four broad categories:
    • Scientific knowledge (such as publications and citations), and later:
    • Social outcomes (such as health and environment);
    • Economic growth (through patents, firm start-ups, and other measures); and
    • Workforce outcomes (through student mobility and employment).

Phase I of the STAR METRICS project began in earnest in March 2010, when funds were formally designated for the project. The participation agreement was signed in May 2010, and a press release was issued by the three lead agencies: NIH, NSF, and OSTP.2 As noted in that press release:

"A new initiative promises to monitor the impact of federal science investments on employment, knowledge generation, and health outcomes. The initiative--Science and Technology for America's Reinvestment: Measuring the EffecT of Research on Innovation, Competitiveness and Science, or STAR METRICS--is a multi-agency venture led by the National Institutes of Health, the National Science Foundation (NSF), and the White House Office of Science and Technology Policy (OSTP)."

In Phase I, through a highly automated process that imposes essentially no burden on scientists and minimal burden on administrators, STAR METRICS collects longitudinal employment data from participating institutions to assess the number of jobs created, retained, or lost through federal funding support. The system is set up to capture all jobs, not just those of principal investigators and co-principal investigators. In addition, in Phase I, STAR METRICS can estimate jobs supported through facilities and administration (F&A) costs and through the institutions' various procurement activities.
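The core of this Phase I process is a simple roll-up: institutional payroll records are aggregated into jobs supported per federal award. The sketch below illustrates that idea; the record layout, award identifiers, and FTE logic are illustrative assumptions, not the actual STAR METRICS schema.

```python
# Hypothetical sketch of the kind of aggregation Phase I performs: sum the
# fraction of each employee's salary charged to an award into FTE-equivalents.
# All award IDs, employee IDs, and fractions below are invented for illustration.
from collections import defaultdict

payroll = [
    # (award_id, employee_id, occupation, fraction_of_salary_charged)
    ("NSF-0000001", "e01", "principal_investigator", 0.20),
    ("NSF-0000001", "e02", "graduate_student",       1.00),
    ("NSF-0000001", "e03", "research_staff",         0.50),
    ("NIH-0000002", "e03", "research_staff",         0.50),
    ("NIH-0000002", "e04", "postdoc",                1.00),
]

def jobs_by_award(records):
    """Sum salary fractions into FTE-equivalents per award, counting all
    job categories, not just PIs and co-PIs."""
    fte = defaultdict(float)
    for award, _emp, _occ, frac in records:
        fte[award] += frac
    return dict(fte)

print(jobs_by_award(payroll))
```

Because the inputs are records institutions already keep for payroll, this kind of roll-up can run automatically, which is what keeps the burden on scientists and administrators low.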

STAR METRICS will also help the Federal government document the value of its investments in research and development, to a degree not previously possible. Together, NSF and NIH have agreed to provide $1 million in funding a year for the next five years.

More agencies are joining the STAR METRICS consortium. While meetings of the Consortium are convened by OSTP, the lead agency is NIH, which is hosting the data infrastructure. The official STAR METRICS website will be available September 30, 2010. NSF is providing key leadership in engaging the scientific community, particularly through the SciSIP program.

Phase II of the project expands the data infrastructure to incorporate the broader impact of science investments on scientific, social, economic, and workforce outcomes. In keeping with the bottom-up approach of the program, STAR METRICS is beginning a formal set of consultations with the scientific community to understand what data elements and metrics the community would find useful in STAR METRICS. The first of these consultations will occur October 22, 2010, at a meeting with Vice Presidents for Research of interested institutions. Other meetings will follow with research agencies and other interested groups.

In the short time since the project was formalized, over 100 research-intensive universities, mostly from the Federal Demonstration Partnership (FDP), have expressed interest in participating in STAR METRICS; about 20 are already contributing data. Universities have been enthusiastic and supportive of the project.

Science is fundamentally an international endeavor. And so must be its evaluation. In fact, there has been substantial international interest. Members of the STAR METRICS team have provided information or directly briefed Brazilian and Japanese science and technology agencies. The State Department is actively interested in learning about the program to advance the science of science policy in the Middle East.

Our most active international counterpart, however, is the European Union. A major presentation was given to the European Parliament in April.3 A joint EU/US conference has been proposed for March 2011 at the Rockefeller Foundation's Bellagio Center. The goal is to produce a roadmap outlining a path toward a US/European collaboration to develop a common theoretical and empirical infrastructure for describing and assessing the outcomes of science investments. To achieve this, the conference will bring together key European and US science policy experts, policymakers, administrators, and academic researchers. The US participants are carefully chosen for their experience in developing such an infrastructure in the US; the European attendees will be individuals who have both a deep understanding of the issues and the ability to effect change in Europe within a collaborative framework with the US.

The outcomes will include a roadmap representing a combined effort to build on and extend existing efforts in both regions: notably the US investment in the STAR METRICS program and the European efforts to build better assessments of their investments. It is hoped that the roadmap will have the same success that the Science of Science Policy Interagency Roadmap had in the United States, and that in the EU it will become the basis for including assessment measures in future legislation implementing science programs.

Conclusion

The NSF's Science of Science and Innovation Policy program and the NSTC's SoSP Interagency Task Group, along with STAR METRICS, represent the first efforts to construct a scientific framework supported by multiple agencies and multiple institutions, all jointly engaged. Together they constitute a true bottom-up approach to providing an evidence basis for U.S. science policy. Their success is important for decision-makers: in a nutshell, you can't manage what you can't measure, and what you measure is what you get.

NSF's innovative Science of Science and Innovation Policy program, and STAR METRICS, can help all of us do a better job in explaining this essential symbiosis.

This concludes my testimony, Mr. Chairman. I look forward to answering any questions you or Members may have.

 

NOTES

1. www.ncseonline.org/NLE/CRSreports/08Aug/RL31181.pdf, CRS-3

2. whitehouse.gov/sites/default/files/microsites/ostp/STAR%20METRICS%20FINAL.pdf

3. www.euractiv.com/en/science/eu-looks-to-us-model-for-measuring-rd-impact-news-448950

 

The National Science Foundation, 4201 Wilson Boulevard, Arlington, Virginia 22230, USA
Tel: (703) 292-5111 | FIRS: (800) 877-8339 | TDD: (800) 281-8749
Last Updated: Jun 15, 2010