Remarks

Dr. Arden L. Bement, Jr.
Director
National Science Foundation

"Un-Common Sense: Recipe for a Cyber Planet"
NCSA 20th Anniversary Distinguished Lecture

University of Illinois at Urbana-Champaign
Urbana, IL
January 25, 2006

(As Presented)

Thank you, and good evening everyone. I'm honored to be your speaker tonight, on the occasion of the 20th Anniversary of the National Center for Supercomputing Applications. Twenty years may not seem long compared to the vast reaches of time. But twenty years in the history of supercomputing is just about all the history there is!

All of you who have had a hand in making NCSA a global center of excellence can be justly proud of your achievements. I heartily congratulate you.

I have titled my remarks "Un-Common Sense: Recipe for a Cyber Planet" because I plan to offer some thoughts about the changing context -- national and global -- that our computer and communications technologies have helped to bring about. We already inhabit a cyber planet -- although one still in its infancy. We will need a good measure of "un-common sense" to discern the challenges and seize the opportunities that these transformations have introduced.

As every schoolchild knows, "Common Sense" is the title that Thomas Paine gave to his manifesto for the American Revolution. I mean by "un-common sense" a different sort of revolution in our thinking about the future -- one that may be every bit as significant for the growth and future prosperity of our nation as "Common Sense" was for its birth.

When the world is being changed in extraordinary ways, our common response is to reach into imagination for a fresh perspective. Un-common sense is the innovative mind-set that draws on imagination and creativity, but with eyes wide open and feet firmly planted on the ground.

In science and engineering, un-common sense operates at the furthest frontier of discovery to reveal startling, new concepts. In the private sector, un-common sense drives technological innovation that disrupts markets and entire industries.

Innovation has become the catchall watchword for a future with a markedly different path. It is both a rallying cry and a challenge. In recent testimony before Congress, Norman Augustine, former Chairman and CEO of Lockheed Martin, put the challenge this way:

"In addressing the future quality of life in America, one cannot help but notice warnings of what appears to be an impending perfect storm." 1

Augustine's "perfect storm" is a confluence of changing circumstances that now threaten America's economic and global leadership. Among these, he cites "the pervading importance of education and research in science and technology to America's standard of living, and the disrepair in which we find many of our efforts."

Augustine's warning is one among many that have surfaced as America faces increasing competition for technological leadership in the global knowledge economy. Warnings are useful; panic is not! With some un-common sense, America can navigate its way through these circumstances.

Thomas Friedman, in his book, "The World is Flat," has written that the landscape of opportunities among nations is leveling out as our economies and societies become increasingly interconnected. Innovation is no longer the purview of one scientist, one institution, or one nation. It is increasingly the result of connections that criss-cross the globe. Flat can also connote "dull." The flat world environment that Friedman describes is anything but dull. It is both dynamic and daunting.

The nations that are pulling ahead are those that quickly embrace new knowledge, regardless of its source, and that propel their citizens along new economic and technological pathways.

In this "knowledge economy," intellectual and human capital, infrastructure, and R&D investment are the three fundamental components -- perhaps better stated they are "survival components." This is a lesson that nations like China and India are quickly learning. They are building powerful economic momentum through a burgeoning science and engineering workforce and research capacity.

We all know that these changes have come about in large measure as a result of revolutionary advances in computer, information and communication technologies. These powerful forces continue to reshape science and engineering, the private sector, and nearly every aspect of our daily lives. My aim tonight is to revisit several of these transformations and then explore with you how un-common sense can inform our forward path.

Information and communications technologies have enabled us to scan research frontiers at velocities that are orders of magnitude faster than ever before.

These tools are not simply faster -- they are also fundamentally superior. They have raised the level of complexity we can understand and harness. That capability is growing, at a breathtaking pace. Just consider two revolutionary innovations in our toolkit: computer simulation and modeling. Soon it will be possible to model the entire U.S. electric power grid with all of its chaotic and emergent behavior.

This is just the beginning of our great transformation. We know how to share and gather information, but we have not yet tapped the full power of these technologies by integrating them fully into a new system for research, education, and innovation.

Cyberinfrastructure will take research and education to an entirely new plane of discovery.

If we combine new capabilities in information and communications with sensors and satellites, and improved visualization and simulation tools, databases and networks, we will leave our familiar landscapes in the dust. In the words of the pop song, "you ain't seen nothing yet." 2 The demand for cyberinfrastructure is already deafening in all sectors, to meet a wide variety of needs.

Scientific breakthroughs that are just over the horizon will require speeds that even today's supercomputers cannot produce. We are already setting strategies to "super-size" supercomputers by plotting the transition from terascale to petascale computing.

The petascale capability will itself lead to other challenges: coping with the flood of data from our vastly improved observational tools, assimilating different data formats and ontologies -- from the atomic to the cosmic -- and finding ways to store and archive petabyte-sized databases.

We will need to transfer data more rapidly and securely, setting the stage for the next generation Internet. I am certain NCSA will continue to take a lead in meeting these challenges.

In industry, virtual prototyping could reduce design and testing costs and shorten product development cycles. Perhaps more importantly, new modeling, simulation and visualization techniques can provide tools to pioneer entirely new products and services, manufacturing processes, and business practices. That is innovation at the cutting-edge.

There was a time, in the 60s and early 70s, when the norm was 20 years for the results of fundamental research to find their way to the marketplace. This timeframe has now collapsed in many fields, often to 20 months or less.

High tech "hunter-gatherer" enterprises now shop the world for bits and pieces of cutting-edge science and scan patents for new technology. They create innovative production, supply, distribution, marketing and service networks that they can change on a dime. And voila! They destabilize the marketplace with new "killer" products and applications, and they do so with ever decreasing lead times.

Both "old economy" and "new economy" enterprises now confront the need for multi-sectoral enterprise integration. Advanced information technology is critical to make these new integrated business systems function. A case in point: it will not be possible to enter the hydrogen economy without establishing interlocking partnerships among the energy, transportation, manufacturing, and financial sectors.

The accelerating pace of scientific discovery and technological change, fired by more robust computing and networking, is the locomotive driving the relentless pressure of global competition.

In this changing environment, we have legitimate concerns about our nation's ability to sustain its long history of leadership in science and technology. You will recall other times of unease about competition -- anxiety over Sputnik, then concerns about competition from Japan in the eighties. Recently, we have seen "the competitiveness issue" clarify and crystallize into something we understand as globalization. This is shorthand for a complex, permanent, and challenging environment that calls for sustainable, long-term responses, not just short-term fixes.

We can continue to rely on the innovative and entrepreneurial American spirit. But we can no longer rely on research in isolated laboratories, or prolonged lead times to get from discovery to analysis, from experimentation to innovation.

In all of these developments and dilemmas, un-common sense can trigger innovative solutions. I'm going to offer a piece of un-common sense for you to consider: We should pursue more global involvement, not less. The rapid spread of computer and information tools compels us to join hands across borders and disciplines -- if we want to stay in the race.

We have just completed the centennial celebration of Albert Einstein's "Miraculous Year" -- the year during which he published six papers that remapped forever our understanding of the physical world.

The network theorist Albert-László Barabási 3 has observed that Einstein may be the last scientist to accomplish a conceptual revolution almost entirely in isolation. Einstein and other scientific giants -- Galileo, Newton, and Darwin, for example -- are like stars in a constellation, with origins that stretch across time.

In 1953, when James Watson and Francis Crick cracked the DNA code, a handful of their contemporaries had been working on the same problem. By the time scientists published the sequence of the human genome in 2001, a new phenomenon was in full swing. Scientists from around the globe, linked in a vast network, completed the task.

Today, collaboration is the only game in town. For one thing, "not collaborating" exposes a nation to losing out on the critical beginning of the next revolution. Global research networks are proliferating at a phenomenal pace. Not only can researchers communicate instantaneously, they can make observations, access databases, conduct experiments, and analyze results using instruments -- supercomputers included -- located anywhere on the globe. Some research agendas are simply too gargantuan for small groups, others too costly, and others, like climate change or biodiversity, are simply global in scope. International co-authorship of research papers is skyrocketing. Research teams are increasingly interdisciplinary and international. We can expect such network partnerships to increase in the future, and new forms of collaboration, as yet unimagined, to arise.

It is not that individual scientists matter less today. Rather, individuals of different stripes, with diverse perspectives and skills, now work together to accommodate the extraordinary complexity of today's science and engineering challenges.

Einstein, always profoundly aware of the larger context in which discovery occurs, once observed: "Without creative, independent thinking and judging personalities, the upward development of society is as unthinkable as the development of the individual personality without the nourishing soil of the community." 4

Thanks to our enhanced communications technologies, the "nourishing soil" of the science and engineering community is now global in scope. New ideas can burst forth from anywhere in the world, sometimes simultaneously in different labs. Thanks to the Internet, researchers worldwide learn about these sooner, and the time span between bursts grows shorter and shorter.

There are still barriers to communication, but these will soon be substantially lowered as we enable our national grid, and as networks develop in every nation and among all continents. We will need to encourage common ground and establish compatibility through a standard set of protocols. In time, a full complement of cyberinfrastructure will knit the international community into an even tighter net.

In reality, the science and engineering community is so far down the global track that there is really no way to stop the train. And we would be unwise to try.

In a sense, all nations cooperate in advancing frontier research and technology in order to compete in the marketplace. We need international collaboration on a global basis to identify exactly where the frontier is at any moment. We need fresh intelligence from the frontier to avoid being blindsided by new discovery and technologies. These necessities ensure an inevitable tension between collaboration and competition.

Partnerships among universities, the private sector and government are one of the stalwarts of the U.S. innovation system. Is there a danger, as corporations increasingly form international alliances, that we will lose the benefits of this collaboration? Perhaps -- if U.S. industry begins to disinvest in U.S. universities. But this would be a mistake.

In September, The Economist published a survey on higher education titled "The Brains Business." A sidebar reads: "America Rules." It lists the annual global ranking of universities by the Institute of Higher Education at Shanghai's Jiao Tong University. Seventeen of the top twenty universities on the list are American; so are thirty-five of the top fifty. 5

I mentioned Thomas Friedman's book, The World is Flat. In reality, the world is spiky -- and the outstanding performance of U.S. universities is one of the peaks.

There are others. Richard Florida, author of The Creative Class, has mined geographical databases to map the peaks, hills and valleys of innovation. His conclusion: "globalization has changed the economic playing field, but hasn't leveled it." 6 "The world's most prolific and influential scientific researchers overwhelmingly reside in U.S. and European cities."

Granted, this is a snapshot in time -- the real worriers see gloom in future trends. But the lesson we can carry away is simple: maintaining the excellence of U.S. higher education is paramount to future prosperity.

As we consider our options for fostering innovation, we need to leave ample room for experimentation and exploration. That is a hallmark of innovation, and a key to our future.

Leonard Lynn and Hal Salzman, writing in Issues in Science and Technology, 7 suggest that pursuing "collaborative advantage" as an alternative to cut-throat competition may be the best strategy for maintaining America's momentum in the global knowledge economy.

Being "the firstest with the mostest" in every field of science and technology may no longer be feasible. The authors envision "a stronger U.S. system revitalized by accelerated flows of ideas from around the world."

An environment that fosters collaborative technology advantage could even attract and retain global talent. Developing such an advantage would require that the U.S. "look aggressively for partnership opportunities -- mutual gain situations -- around the world." That is a bit of un-common sense that provides a fresh perspective on accommodating the tension between collaboration and competition.

There is a striking assumption underlying such a strategy. Roughly stated, face-to-face contact still counts. So does leadership in cutting-edge cyberinfrastructure, and a policy framework that supports partnerships among universities, the private sector, and government.

And yet, there are other trends that appear to take us in a different direction. We no longer think of our technology as if it were like the twigs chimpanzees use to probe anthills.

Surely, some of the most remarkable wizardry of the computer and communications revolution is the extent to which these technologies have become an extension of human cognition.

To paraphrase Descartes, the catchphrase of our new era might be: "I think, I compute, I communicate -- therefore, I am!"

These changes are most evident -- and startling -- when we look at the Web -- the device that seems to remove the significance of the local and substitute the global.

In a recent article titled "We Are the Web," Kevin Kelly, "senior maverick" at Wired Magazine, has this to say about the destiny of the Web over the next decade:

The Web will become "The OS [operating system] for a megacomputer that encompasses the Internet, all its services, all peripheral chips and affiliated devices from scanners to satellites, and the billions of human minds entangled in this global network. This gargantuan Machine already exists in a primitive form. In the coming decade, it will evolve into an integral extension not only of our senses and our bodies but our minds...This planet-sized computer is comparable in complexity to a human brain." 8

The Web has been a font of remarkable innovation, from entirely new business models, such as Amazon and eBay, to new forms of entertainment and social interaction -- think about the estimated 50 million blogs launched worldwide in the space of a few years. It has propelled us, ready or not, into a new period of creative transformation.

No one could have predicted the Web's impact on human and social dynamics. Although many, like Kelly, try to read the tea leaves, we still do not know the full consequences for society. Just last week, China's state news reported that the number of Web users in China -- the world's second-largest Internet market -- reached 111 million in 2005, an increase of 18 percent. About half of China's Web population -- or about 64 million people -- accessed the Web using broadband connections. 9

As the eyes of the world scrutinize China's rapid emergence in the global economy, we can only guess what the consequences will be for Chinese society. The Web is sure to generate "inevitable surprises," to use author Peter Schwartz's phrase. 10

Living in the midst of this rapidly changing context, we can scarcely fathom just how revolutionary the changes enabled by the Web may be.

Would it make sense to model possible future paths in order to anticipate the direction of change? Yes -- un-common sense. Social scientists view the Web as a complex system -- self-organizing, adaptive, and evolving -- with all the intricacies of emergent behavior. Better intelligence about human and social dynamics could give us a way to scope the future.

At a recent demonstration in DC of NEES -- the Network for Earthquake Engineering Simulation -- I was impressed to discover that one of the team members was a social scientist. The role of the social scientist is to ensure that the virtual network actually works for the researchers and students who are using it.

The social sciences are currently an underutilized -- and perhaps undervalued -- knowledge resource. Equipped with sophisticated tools, social scientists are particularly adept at finding ways to mine new knowledge from data on human behavior and cognition.

The social sciences can also illuminate complex societal institutions. An example is the NSF-funded Center for Nanotechnology in Society at Arizona State University 11, where investigators are experimenting with real-time technology assessment.

The un-common sense at work here is to ask whether we can use the tools of the social sciences to explore the potential consequences for society of new technologies, while those technologies are emerging. Total foresight would never be possible, but we could gain a kind of limited, evolving prescience. The idea is that such assessments would generate more choices along the technology trajectory, and allow us to make course corrections along the way. Investigators at the Center are exploring these new methods in the rapidly developing field of nanotechnology.

The late Peter Drucker once commented on our scant attention to all things human: "In a few hundred years," he said, "when the history of our time will be written from a long-term perspective, it is likely that the most important event historians will see is not technology, not the Internet, not e-commerce. It is an unprecedented change in the human condition. For the first time -- literally -- substantial and rapidly growing numbers of people have choices. For the first time, they will have to manage themselves. And society is totally unprepared for it." 12

Today, managing ourselves also means managing our technology. It is no longer meaningful to consider technology apart from humankind and vice versa. Our innovation systems, in particular, inextricably mesh the human and the machine. This concept must be embedded in emerging academic programs in computational science and engineering.

We need to mobilize the social and cognitive sciences to help us unravel the complexities of human thought and behavior, to provide new insights into societal institutions, and to elucidate how we learn.

The ultimate reason for the science and engineering enterprise is to put knowledge to work for the growth of the economy and the well-being of society. Education is the font from which every such vital transformation will flow. Surely we need to teach our youngsters as if they will be the ones to turn the familiar world upside down. We want to prepare them to change the world.

Fantastic? Possibly. Science fiction movies and novels supply a wide range of even more imaginative scenarios. Is it worth thinking about? You bet. Does it make sense? Well, my answer would be, "yes -- un-common sense."


1 Norman Augustine, "Challenges to America's Competitiveness in Math and Science," testimony before the U.S. House of Representatives Committee on Education and the Workforce, May 19, 2005.

2 Bachman-Turner Overdrive, "You Ain't Seen Nothing Yet."

3 Albert-László Barabási, "Network Theory -- The Emergence of the Creative Enterprise," Science, Vol. 308, 29 April 2005.

4 Albert Einstein, The World As I See It; http://lib.ru/FILOSOF/EJNSHTEJN/theworld_engl.txt; last accessed January 23, 2006.

5 "The Brains Business," The Economist, September 8, 2005.

6 Richard Florida, "The World is Spiky," The Atlantic Monthly, October 2005, pp. 48-51.

7 Leonard Lynn and Hal Salzman, "Collaborative Advantage," Issues in Science and Technology, Winter 2006.

8 Kevin Kelly, "We Are the Web," Wired Magazine, etc.

9 CNET News.com, "In China, 111 million Net users counted," Reuters, January 18, 2006.

10 Peter Schwartz, Inevitable Surprises: Thinking Ahead in a Time of Turbulence, Gotham Books, 2003.

11 The Center is located at ASU; partners include the University of Wisconsin-Madison, Georgia Institute of Technology, North Carolina State University, Rutgers, The State University of New Jersey, the University of Colorado-Boulder, and others, as well as private-sector partners.

12 Peter F. Drucker, "Managing Knowledge Means Managing Oneself," Leader to Leader, No. 16, Spring 2000.
