Remarks


Dr. Arden L. Bement, Jr.
Director
National Science Foundation

Remarks, NSF Plans for Computational Science and Cyberinfrastructure
President's Information Technology Advisory Committee

June 17, 2004

See also slide presentation.

If you're interested in reproducing any of the slides, please contact the Office of Legislative and Public Affairs: (703) 292-8070.

[Slide #1: Title]
(Use "back" to return to the text.)

Thank you, and good afternoon to all of you. I am delighted to join you to discuss the National Science Foundation's plans for investment in computational science and cyberinfrastructure.

[Slide #2: Computational science drives discovery]
(Use "back" to return to the text.)

I'll go right to the heart of the matter. Computational science is driving discovery and innovation across all fields of science and engineering today. The extraordinary advances in information technology of the past several decades have ushered in a new era of scientific exploration—an era that promises not simply incremental advances, but a revolution in the way we conduct scientific investigation and in the complexity and depth of the new knowledge we can generate. NSF is committed to a continuing leadership role in realizing this enormous potential.

Increasing the effectiveness of researchers—crossing new frontiers with greater confidence and more rapidly—carries an ever higher premium in today's world. Globalization has raised the stakes on competition, and our national circumstances demand our best efforts to ensure continuing economic prosperity and national security.

In every field of science and engineering today, research paradigms that involve and often depend on computing and communications technology are rapidly becoming the norm and the only way to advance the frontiers. Whether we call this computational science or cyberscience or e-science, it is essential.

Let me briefly highlight a few of the current projects and activities that illustrate just how computing and communications technology is transforming scientific research.

[Slide #3: Not available]

Here is a remarkable image of the War and Peace Nebula produced through the National Virtual Observatory. This montage is literally cobbled together from many observations and has a resolution of 144 million pixels. Data is flooding in from today's advanced telescopes, requiring an integrated suite of data management, analysis, mining, storage and visualization capabilities for data-driven investigations.

[Slide #4: Not available]

The Grid Physics Network (GriPhyN) and the International Virtual Data Grid Laboratory (iVDGL) are two NSF-funded projects that together will provide a computational grid resource for major scientific experiments in physics, astronomy, biology and engineering in the U.S., Europe and Asia. iVDGL will test new computational paradigms at the petabyte scale and beyond, while GriPhyN will provide the basic software toolkits.

In addition to computational and data resources for the Sloan Digital Sky Survey, and for leading-edge experiments at the Large Hadron Collider and LIGO (the Laser Interferometer Gravitational-Wave Observatory), GriPhyN and iVDGL will provide the needed grid capability to support international collaboration in these fundamental areas.

[Slide #5: Storm collage]
(Use "back" to return to the text.)

The Center for Analysis and Prediction of Storms (known as CAPS) is developing techniques for the practical numerical prediction of high-impact local weather, particularly thunderstorms, using high-resolution observations from Doppler weather radars.

[Slide #6: Not available]

Here you see the higher resolution achieved through refinements in simulation and modeling. The predictive use of these techniques demands real-time, on-demand access to cyberinfrastructure resources, as well as sophisticated modeling, analysis and visualization tools. CAPS is leading a new NSF Large Information Technology Research (ITR) grant, Linked Environments for Atmospheric Discovery (LEAD), that is creating a cyberinfrastructure for mesoscale meteorology research and education.

[Slide #7: NEES collage]
(Use "back" to return to the text.)

The George E. Brown Jr. Network for Earthquake Engineering Simulation—NEES, for short—is a distributed, virtual laboratory for earthquake experimentation and simulation. NEES shifts the emphasis of earthquake engineering research from current reliance on physical testing to integrated experimentation, computation, theory, databases, and model-based simulation.

Using NEESgrid—a high-performance Internet network—researchers and students will have a powerful collaborative space for remotely operating equipment, and for modeling and simulation to study how building design, advanced materials and other measures can minimize earthquake damage and loss of life.

[Slide #8: Not available]

Other distributed, large-scale, collaborative systems take advantage of advances in sensor and other data-gathering technology. EarthScope will monitor movements in the earth's crust along the San Andreas Fault. NEON, the National Ecological Observatory Network, and the Ocean Observatories Initiative will deploy extensive sensor networks for real-time data collection and environmental monitoring.

These scientific opportunities only hint at the huge potential for discovery that cyberinfrastructure can facilitate. The trick is striking the right balance: finding the correct blend of investments in a plethora of cyberinfrastructure resources and services that serve a broad base, while providing the most powerful resources for those that require them. NSF intends to remain a leader in providing both capability and capacity for widespread use by the open science community.

And we will have to provide balance as well in our support for fundamental research in computer, networking and computational science, and support for domain specific applications. This will not be an easy task. However, finding the right balance is essential if we are to succeed in addressing opportunities for high-risk, high-reward research, invest in next generation science, and support emerging areas of high promise.

[Slide #9: Computer evolution time line]
(Use "back" to return to the text.)

NSF has been in the business of supporting frontier research and education in computational science since the 1960s, with support for the first Academic Computing Centers and the first university computer science education programs. These investments helped to foster the talent and expertise that have transformed a fledgling field into a revolutionary force.

Our current capability and capacity are the result of an array of NSF investments, illustrated in this slide. One of NSF's most recent investments has been the terascale computing project, which has produced the TeraGrid—a next step in an evolutionary process that makes a rich mix of cyberinfrastructure resources broadly available to the science and engineering community.

This evolving computing and networking capability is likely to produce some remarkable waves along the path of its development. Researchers are pioneering entirely new ways of conducting investigations, from computationally intensive to large data-driven applications. The power and flexibility of these tools, combined with progress in data collection and observational tools, from sensors to satellites, continues to draw new communities of researchers into the computational fold.

The scientific opportunities I've described today illustrate that "one size does not fit all"—at different times and for different tasks, researchers and educators will need data-centric, sensor-centric, computationally-centric, or network-centric applications. Large-scale modeling and simulation, real-time computing, collaboration spaces, and huge data archives illustrate these varying needs.

[Slide #10: Integrated Cyberinfrastructure System]
(Use "back" to return to the text.)

"Cyberinfrastructure" describes the melding of tools and capabilities—hardware, middleware, software applications, algorithms, and networking—that are now transforming research and education, and are likely to do so for decades to come. There is a gathering avalanche of demand for cyberinfrastructure to suit a wide variety of needs in the open science community.

At the same time, cyberinfrastructure is an integrating force that will take research and education to an entirely new plane of discovery. It has already altered our familiar research, education and innovation landscapes—and we can expect much more to come.

But, my primary purpose this afternoon is not to talk about the exciting scientific advances that can be made or the details of the cyberinfrastructure that will enable them, but to share with you the outlines of NSF's plans for the future.

[Slide #11: The Atkins Report Cover]
(Use "back" to return to the text.)

In 2001, NSF commissioned a "blue ribbon panel" headed by Prof. Dan Atkins to advise NSF on new directions for cyberinfrastructure investments. The panel concluded that "the National Science Foundation should establish and lead a large-scale, interagency, and internationally coordinated Advanced Cyberinfrastructure Program" to radically empower all scientific and engineering research and education. This is an extraordinary challenge. I will now address how NSF plans to meet this challenge...and go beyond.

Our intention is to grow our current cyberinfrastructure into a rich system that will serve the expanding research and education needs of the science and engineering community. Here are some indications of where we are and where we want to go.

[Slide #12: Activities and Plans]

  • NSF currently supports 3 of the top 15 supercomputers on the Top 500 list for the benefit of the open scientific community.


  • In FY 2005, we propose to invest over $60 million in the operation of supercomputing facilities, including those at the National Center for Atmospheric Research (NCAR).


  • This year alone we added 25 teraflops of capacity at the San Diego Supercomputer Center and the National Center for Supercomputing Applications. Additional resources have been integrated into the TeraGrid.


  • We have announced our intention to upgrade computing resources at the Pittsburgh Supercomputing Center over the next two years. Our sights are set on providing a system of up to 50 teraflops, responding directly to needs expressed by the community.

[Slide #13: Activities and Plans, continued]
(Use "back" to return to the text.)

  • We will soon release a $10 million solicitation for national efforts aimed at educating, training, advancing and mentoring current and future generations of computational scientists and engineers.


  • We intend to provide as many leadership cyberinfrastructure resources as possible—including data, software, sensor and educational resources. Let me emphasize that we have purposely adopted a flexible strategy that will allow us to take advantage of the latest technical advances. So, while we are not announcing a specific spending target for "out years," we are confident that we will be able to deploy resources intelligently and effectively.


  • NSF has been a key participant in the High End Computing Revitalization Task Force. In fact, NSF has been instrumental in jumpstarting the research component—the High End Computing-University Research Activity (HEC-URA). A solicitation, focused on research in languages, compilers and libraries, sponsored jointly by NSF and DARPA, has already been issued, and another, addressing other advanced topics, will follow. DOE has also initiated a coordinated research program.


  • NSF is already spending tens of millions of dollars each year in direct support for domain-specific as well as domain-independent computational science. We plan to consolidate responsibility for investments in domain-independent computational science in the CISE directorate, to make it more accessible to all communities. At the same time, direct support for domain-specific computational science, across all disciplines, will likely expand.

[Slide #14: Activities and Plans, continued]
(Use "back" to return to the text.)

  • To tap into the needs and inventiveness of the community, NSF has recently convened workshops to explore emerging developments in computational science and cyberinfrastructure technology, as well as the links between cyberinfrastructure and education.


  • We continue to coordinate and seek further opportunities for collaboration with U.S. government agencies, and with international counterparts. NSF is working closely with DOE, NIH, DARPA and others. We have had discussions with many agencies, both here and abroad, about sharing resources and coordinating our development activities. In fact, we will hold an all-day meeting in this room next Monday with a high-level delegation from the European Community.


  • Indeed, collaboration among U.S. agencies is increasing rapidly. To ensure coordination, we propose a new subcommittee of the National Science and Technology Council, composed of agency heads, to oversee working groups in cyberinfrastructure provisioning, research and development, high end computing, cyberinfrastructure education, and in other areas as needed.

[Slide #15: Vision for the Future]
(Use "back" to return to the text.)

In concluding, I want to emphasize NSF's two overarching objectives. Both are vital to science and engineering, and NSF is well suited to tackle them. As we have done in the past, we intend to support frontier research that will lead to the as-yet-unimagined cutting-edge systems of the future. At the same time, we want to foster a comprehensive and integrated cyberinfrastructure—high-end computing, middleware, software applications, computational tools, data services, high-volume, high-response networking, and educational tools. Scientists and engineers need these tools to exploit the exciting promise of the new paradigms of research and education that computational science is spawning.

Now, I'm happy to answer your questions.

 

 
