National Science Board 9
The quest to understand the nature of the atom in the early twentieth century led to a scientific revolution. In overturning conventional thinking, quantum mechanics was discovered. To test this theory, microscopes were devised to study phenomena at atomic scales and smaller -- much smaller than what is readily observable.10
In this age of S&T, the challenge to scientists, engineers, and policymakers alike is to generate new ideas and tools that harness research-based knowledge in the fight against human disease and disability, preserve the natural environment, enhance the quality of life, and assure a robust public investment in science and technology. As fields of science, mathematics, and engineering continue to specialize, overarching research themes cross disciplines and, moreover, captivate the public as issues of health, environment, energy, and space.
Areas such as genetics/biotechnology and information technology/telecommunications hold special promise. They are ripe for exploitation because of scientific discoveries made in past centuries and 20th-century tools that enable discovery and advance knowledge. For example, the Human Genome Project, launched in 1990 as a distributed "big science" initiative, has historical roots in the Watson-Crick discovery in 1953 of the double-helical structure of the DNA molecule and in the restriction-enzyme work of Hamilton Smith and Daniel Nathans, which opened the way to the first recombinant DNA techniques (or gene splicing) in the 1970s. A quarter-century later, Ian Wilmut and Keith Campbell cloned a sheep from adult cells.11
Today, the research environment includes a host of normative ("should we do it?") questions that surround technical capability ("we can do it") with public controversy.12 For all the inevitable apprehension about the unknown, a newer age of life-enhancing innovations is not far off -- biomonitoring devices that provide accurate readouts of our health, and sensory prosthetics that synthesize speech, computerize vision, and feed electronic waves directly into the brain.13
A separate stream of inquiry in information technology has borne remarkable fruit. The U.S. Department of Defense's Advanced Research Projects Agency created ARPAnet, the precursor to the Internet, to facilitate communication among researchers. The National Science Foundation provided the critical sustenance that delivered this tool to university researchers. Subsequently, the World Wide Web was created to promote rapid data sharing among large collaborations of high energy physicists, a development that both simplified and popularized navigation on the Net. "The idea that anyone in the world can publish information and have it instantly available to anyone else in the world created a revolution that will rank with Gutenberg's . . . ."14
The need to store, share, and interpret vast amounts of data15 has engendered whole new subfields, such as bioinformatics, which is dedicated to applying information technology (IT) to the understanding of biological systems. Exploring the interdependencies among the elements of specific environmental systems will require advances in software, human-computer interaction, information management, and high-end computing. Companies use information technology to compete in today's global marketplace by tailoring their products and services to the needs of individual customers, forging closer relationships with their suppliers, and delivering just-in-time training to their employees. If we are to understand and deal with the socioeconomic, ethical, legal, and workforce implications of these systems, we will need to support a research agenda that crosses disciplines, languages, and cultures.
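To make the idea of bioinformatics concrete, the sketch below shows the kind of elementary computation the field builds on: treating a DNA sequence as data and extracting a simple statistic from it. This is an illustrative example of my own, not drawn from the report; the sequence shown is hypothetical, and real bioinformatics pipelines operate on vastly larger datasets with specialized tools.

```python
# Minimal illustration of "applying IT to biological systems":
# compute the GC content (fraction of G and C bases) of a DNA sequence,
# one of the simplest statistics used in genome analysis.

def gc_content(sequence: str) -> float:
    """Return the fraction of G and C bases in a DNA sequence."""
    seq = sequence.upper()
    if not seq:
        raise ValueError("empty sequence")
    gc = sum(base in "GC" for base in seq)
    return gc / len(seq)

if __name__ == "__main__":
    fragment = "ATGCGCGTATTA"  # hypothetical 12-base DNA fragment
    print(f"GC content: {gc_content(fragment):.2f}")
```

Even this toy computation hints at why the report stresses high-end computing: the same counting operation, applied across a three-billion-base genome or millions of sequenced samples, becomes a serious data-management and processing problem.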
Highlighting technologies that are likely to blossom in the 21st century recalls yesterday's basic research. History instructs that we cannot predict which discoveries or technologies will change the lives of future generations. Rather, fundamental science and engineering research presents long-term opportunities -- a high-risk investment with high payoffs.