The revolution in information technology (IT) has been likened to the industrial revolution in terms of its potential scope and impact on society (Alberts and Papp 1997; Castells 1996; Freeman, Soete, and Efendioglu 1995; Kranzberg 1989). Few other modern advances in technology have had the capacity to affect so fundamentally the way people work, live, learn, play, communicate, and govern themselves. As IT extends human capabilities and takes over other functions previously performed by humans, it can even affect what it means to be human.
It is far from clear what the total effects of IT on society will be. As Vannevar Bush (1945) noted more than 50 years ago, "The world has arrived at an age of cheap complex devices of great reliability; and something is bound to come of it." The question is, What has become of it? As with automobiles and television earlier in the 20th century, information technologies can be expected to have diverse and far-reaching effects on society: some good, some bad, and many unanticipated.
The IT revolution raises many policy issues: How will IT affect the development and safety of children and the privacy of adults? How will IT affect the distribution of knowledge, wealth, and power among different groups in the United States and around the world? Will there be a "digital divide" between IT-rich and IT-poor groups that increases current inequalities? How will IT affect national sovereignty and international law? How will IT affect education and the future of libraries, universities, and scholarly communication? What measures are needed to make electronic commerce markets operate efficiently and fairly? Which issues can be handled adequately in the private sector, and which require the involvement of the public sector? Although many of these questions are beginning to come into focus, data and research to answer them are lagging behind the changes that are occurring.
The information revolution is not new. The United States began moving toward an information-based economy in the 1960s, as information-intensive services began to grow. At that time, computers were used mostly in the research and development (R&D) community and in the offices of large companies and agencies. In the past 20 years, however, IT has become increasingly pervasive in society. It has spread to the point that nearly everyone uses some form of IT every day. It has become common in schools, libraries, homes, offices, and shops. Corner grocery stores use IT for sales and electronic transactions; automobile repair shops use IT to diagnose failures and search for parts. In the past few years, the Internet and the World Wide Web in particular have contributed to the rapid expansion of IT. Innovations in IT now directly affect nearly everyone, not just the few in computer-intensive jobs.
As the market for IT has expanded, private investment in new technologies and manufacturing has increased, which in turn has led to new, better, and cheaper technologies. Costs have come down dramatically, and many new applications have been developed. Many of these advances provide return benefits to the science and engineering enterprise. For example, more powerful workstations, improved networking, and better databases all aid in research.
A discussion of IT in a collection of science and engineering indicators is important for two reasons. First, IT constitutes an important part of science and engineering's effect on society and the economy. It embodies advances in numerous fields, including computer science, computer engineering, electrical engineering, materials science, mathematics, and physics. IT illustrates the effects of federal and private investment in R&D. Much IT has been developed by and for the R&D community, and the R&D community is an early user of many information technologies. Many of the effects of IT, such as the use of e-mail for communication or the World Wide Web for publication, take place first in the R&D community.
Second, IT is a major force affecting the U.S. and global science and engineering system. IT producers employ scientists and engineers, implement the results of academic research, and conduct significant amounts of applied research and development. IT affects the pipeline for science and engineering through its effects on the demand for people with technical skills and through its use in education at all levels. IT also affects the conduct of R&D in all disciplines. For example, the physical sciences make extensive use of computer modeling and simulation, and many aspects of biology (notably genomics) have become more information intensive. Advances in networking, meanwhile, facilitate the global nature of research collaboration.
This chapter provides an overview of the significance of IT for society and the economy; it focuses especially on the effects of IT on education and research. A complete discussion of the impact of information technology on society and the economy, however, is beyond the scope of this (or perhaps any) chapter. Other federal agencies and other organizations are addressing some areas. This chapter provides references and Web citations to direct the reader to more detailed and frequently updated information.
One major obstacle to analyzing the effect of IT on society is the difficulty of obtaining reliable national and internationally comparable data (CSTB 1998). There is little reliable, accepted, long-term data on either the diffusion of IT or its effects on society. The rate of technological change since the early 1980s has often outpaced our ability to define what we want to know and what data ought to be collected. Metrics are confounded by the changing nature of IT as a concept and the interactive effects of many social variables, including age, ethnicity, income, learning processes, individual attitudes, organizational structures, culture, and management styles. In many cases, the effects of IT depend largely on how it is used. Positive effects often depend on appropriate organizational structures and managerial style, as well as the adequacy of training and the attitudes of individuals using IT.
Quantitative indicators of IT diffusion are relatively abundant but not standardized. Much of the available data is in the form of quickly developed, easily obtained information rather than long-term studies. Studies in many areas of interest often are not regularly repeated with the same methods. This lack of comparable data partly reflects the complexity and dynamism of IT: The most interesting things to measure change rapidly.
Indicators of the effects of IT (as opposed to the use of IT) on individuals, institutions, and markets are especially difficult to establish. This difficulty inhibits our ability to draw any definitive conclusions about the impacts of IT on society. Experts have had difficulty measuring productivity in service industries, in education, and in research and development. Consequently, determining the effects of IT on productivity in these areas is even more difficult. Moreover, IT often has effects in conflicting directions. There is evidence, for example, that IT can both increase and decrease productivity and contribute to both lowering and upgrading of skills in the labor force. Computer-aided instruction may enhance some forms of student learning, but extensive use of some computing environments may impede other aspects of child development.
This chapter attempts to compile relevant existing data and indicators; it also identifies the limitations of existing data and suggests how improvements to the data would be helpful. Data and measurement issues are identified throughout this chapter and are further discussed in the conclusion.
Information technology, as defined in this chapter, reflects the combination of three key technologies: digital computing, data storage, and the ability to transmit digital signals through telecommunications networks. The foundation of modern information technologies and products is the ability to represent text, data, sound, and visual information digitally. By integrating computing and telecommunications equipment, IT offers the ability to access stored (or real-time) information and perform an extraordinary variety of tasks.
IT is not a single technology; it is a system of technologies in combination. There are literally hundreds of commercial products, ranging from telephones to supercomputers, that can be used singly or, increasingly, in various combinations in an information processing system. The different functions of many of these products contribute to a sense of fuzziness about IT's technological boundaries.
One approach is to group IT into four technological elements: human interface devices, communication links (including networks), information processing hardware and software, and storage media. (See figure 9-1.) There are substantial overlaps among the categories. For example, most human interface devices also have some information processing and storage capabilities.
The rapid social and economic diffusion of IT since 1980 has been stimulated by rapid changes in computing power, applications, telecommunications, and networks, as well as concurrent reductions in the cost of technology and, in some cases, improvements in ease of use. The most dramatic manifestations are enormous improvements in performance and reductions in cost of integrated circuits brought about by rapid miniaturization. (See sidebar, "Moore's Law.") Similar but less dramatic improvements in cost and performance have occurred in disk drives and other computer hardware.
In addition, new capabilities are being added to chips. For example, microelectromechanical systems such as sensors and actuators and digital signal processors are being put on chips, enabling cost reductions in these technologies and extending information technologies into new types of devices.
Another key development in IT is the growing connectivity of computers and information. Computers are increasingly connected in networks, including local area networks (LANs) and wide area networks (WANs). Many early commercial computer networks, such as those used by automated teller machines (ATMs) and airline reservation systems, used proprietary systems that required specialized software or hardware (or both). Increasingly, organizations are using open-standard, Internet-based systems for networks. Almost three-fourths of the personal computers in the United States are networked (WITSA 1999, 55). Worldwide, there were more than 56 million Internet hosts (computers connected to the Internet) in July 1999, up from about 30 million at the beginning of 1998. (See figure 9-4 and appendix table 9-2.)
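The two host counts above imply a steep compound growth rate. A rough back-of-the-envelope check in Python (the 1.5-year interval between the two data points is an approximation, not a figure given in the chapter):

```python
# Internet hosts: ~30 million in January 1998, ~56 million in July 1999.
start_hosts = 30e6
end_hosts = 56e6
years = 1.5  # approximate interval between the two counts

# Compound annual growth rate implied by the two data points.
annual_growth = (end_hosts / start_hosts) ** (1 / years) - 1
print(f"implied annual growth: {annual_growth:.0%}")  # roughly 52% per year
```

At that pace the host count would double in well under two years, consistent with the rapid diffusion the chapter describes.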
The number of transistors on a chip has doubled approximately every 12 to 18 months for the past 30 years, a trend referred to as Moore's Law. (See figure 9-2.) This trend is named for Gordon Moore of Intel, who first observed it. As Moore (1999) noted:
I first observed the "doubling of transistor density on a manufactured die every year" in 1965, just four years after the first planar integrated circuit was discovered. The press called this "Moore's Law" and the name has stuck. To be honest, I did not expect this law to still be true some 30 years later, but I am now confident that it will be true for another 20 years.
Performance has increased along with the number of transistors per chip, while the cost of chips has remained fairly stable. These factors have driven enormous improvements in the performance/cost ratio. (See figure 9-3.)
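The arithmetic behind this doubling is simple exponential growth. A minimal sketch in Python (the starting count of one million transistors and the 18-month doubling period are illustrative round numbers, not figures from the chapter):

```python
def projected_transistors(years_elapsed, start_count, doubling_period_years):
    """Project a transistor count that doubles every doubling_period_years."""
    return start_count * 2 ** (years_elapsed / doubling_period_years)

# Starting from a hypothetical 1 million transistors, doubling every 1.5 years:
for years in (0, 3, 6, 9):
    count = projected_transistors(years, 1_000_000, 1.5)
    print(f"after {years} years: {count:,.0f} transistors")
```

Over 9 years the count grows 64-fold (2 to the 6th power), which is why even a modest doubling period compounds into the enormous performance gains described above.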
The complexity and cost of developing new chips and new semiconductor manufacturing equipment also have increased. As a result, the industry has been driven toward greater economies of scale and industry-wide collaboration. Moore's Law, which began as the observation of an individual in a single company, has become a self-fulfilling prediction that drives industry-wide planning. Since 1992, the U.S. Semiconductor Industry Association (SIA) has developed a National Technology Roadmap for Semiconductors, which charts the steps the industry must take to maintain its rate of improvement. In 1998, this effort evolved into the International Technology Roadmap for Semiconductors, with participation by the Japanese, European, and South Korean semiconductor industries. The 1998 update projects the number of transistors per chip increasing to 3.6 billion in 2014 (SIA 1998).