Press Release 12-060
NSF Leads Federal Efforts In Big Data

At White House event, NSF Director announces new Big Data solicitation, $10 million Expeditions in Computing award, and awards in cyberinfrastructure, geosciences, training


Hurricane Ike visualization created on the Texas Advanced Computing Center's Ranger supercomputer.

Throughout the 2008 hurricane season, the Texas Advanced Computing Center (TACC) was an active participant in a NOAA research effort to develop next-generation hurricane models. Teams of scientists relied on TACC's Ranger supercomputer to test high-resolution ensemble hurricane models and to track evacuation routes using data streams from the ground and from space. Using up to 40,000 processing cores at once, researchers simulated both global and regional weather models with on-demand access to some of the most powerful hardware in the world, enabling real-time, high-resolution ensemble simulations of the storm. This visualization of Hurricane Ike shows the storm developing in the Gulf of Mexico and making landfall on the Texas coast.

Credit: Fuqing Zhang and Yonghui Weng, Pennsylvania State University; Frank Marks, NOAA; Gregory P. Johnson, Romy Schneider, John Cazes, Karl Schulz, Bill Barth, The University of Texas at Austin



Broadcast of the OSTP-led federal government big data rollout, held on March 29, 2012, in the AAAS Auditorium in Washington, DC, and featuring: John Holdren, assistant to the President and director, White House Office of Science and Technology Policy; Subra Suresh, director, National Science Foundation; Francis Collins, director, National Institutes of Health; Marcia McNutt, director, United States Geological Survey; Zach Lemnios, assistant secretary of defense for research and engineering, U.S. Department of Defense; Ken Gabriel, acting director, Defense Advanced Research Projects Agency; and William Brinkman, director, Department of Energy Office of Science. Each official announced one or more initiatives that his or her agency was undertaking to embrace the opportunities and address the challenges afforded by the Big Data revolution.

The announcements were followed by a panel discussion with industry and academic thought leaders, moderated by Steve Lohr of The New York Times. Panelists were: Daphne Koller, Stanford University (machine learning and applications in biology and education); James Manyika, McKinsey & Company (co-author of a major McKinsey report on Big Data); Lucila Ohno-Machado, UC San Diego (NIH's "Integrating Data for Analysis, Anonymization, and Sharing" initiative); and Alex Szalay, Johns Hopkins University (big data for astronomy).

About Big Data: Researchers in a growing number of fields are generating extremely large and complicated data sets, commonly referred to as "big data." A wealth of information may be found within these sets, with enormous potential to shed light on some of the toughest and most pressing challenges facing the nation. To capitalize on this unprecedented opportunity--to extract insights, discover new patterns and make new connections across disciplines--we need better tools to access, store, search, visualize and analyze these data.

Credit: National Science Foundation

 

Farnam Jahanian serves as assistant director for NSF's Computer and Information Science and Engineering Directorate.

Credit: National Science Foundation

 

Jose Marie Griffiths, vice president for academic affairs at Bryant University, has served on the U.S. National Science Board since 2006.

Credit: National Science Foundation

 

Alan Blatecky has served as director of NSF's Office of Cyberinfrastructure since 2010.

Credit: National Science Foundation

 

Michael Franklin of the University of California, Berkeley, is the principal investigator for a team that just won a $10 million NSF Expeditions in Computing Award.

Credit: National Science Foundation

 

Photo of UC Irvine's HIPerWall system, which measures 23 x 9 feet and comprises 50 flat-panel tiles.

Collaboration and concurrent visualization of 20 simulation runs performed by the Intergovernmental Panel on Climate Change (IPCC) using the HIPerWall (Highly Interactive Parallelized Display Wall) system. Located at the University of California, Irvine, the HIPerWall system is a facility aimed at advancing earth science modeling and visualization by providing unprecedented, high-capacity visualization capabilities for experimental and theoretical researchers. It is being used to analyze IPCC datasets. The room-sized HIPerWall display measures nearly 23 x 9 feet and consists of 50 flat-panel tiles that provide a total resolution of over 200 million pixels, bringing terabyte-sized datasets to life.

Credit: Falko Kuester, California Institute for Telecommunications and Information Technology (Calit2), University of California, San Diego


