November 18, 2002
SC2002, the annual conference on high-performance
computing and networking, is being held Nov. 16-22
at the Baltimore Convention Center. NSF-supported
activities in high-performance information technology
tools, applications, and infrastructure -- including
those described here -- are being featured throughout
the SC2002 Research Exhibits.
For more information on these science news and feature
story tips, please contact David Hart
at (703) 292-8070.
Tomorrow's astronomers are more likely to start their
research by mining data than by sitting at a telescope.
High-performance information technology is giving
astronomers a new window into the night sky, as captured
in digital collections by automated telescopes. In
one such project, the 2-Micron All-Sky Survey (2MASS),
funded by NSF and NASA and led by the University of
Massachusetts, has amassed 10 terabytes (10,000 gigabytes)
of infrared imagery covering the entire sky.
However, each 2MASS image covers a region in the sky
only one-tenth of a degree by two-tenths of a degree.
For comparison, the moon is half a degree across,
and large celestial structures can span several degrees.
The Large Magellanic Cloud stretches across nearly
five degrees -- 50 2MASS images across.
Simply piecing raw 2MASS images together produces patchy
and unusable mosaics. The Jet Propulsion Laboratory's
Infrared Processing and Analysis Center (IPAC), which
is processing the 2MASS data, has developed software
that recalibrates, pixel by pixel, the orientation,
background, and alignment of each image. Each 2-megabyte
image requires about four minutes to process. A 300-image
mosaic, therefore, requires about 20 hours of computer
processing over 600 megabytes of data.
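As a rough check of those figures, the arithmetic
can be sketched in a few lines of Python (the
processor counts are hypothetical, used only to
illustrate the parallel speedup):

    # Back-of-envelope estimate from the figures above.
    MB_PER_IMAGE = 2          # each 2MASS image is about 2 megabytes
    MINUTES_PER_IMAGE = 4     # recalibration time per image
    IMAGES = 300              # images in the example mosaic

    total_mb = IMAGES * MB_PER_IMAGE                # 600 megabytes of input
    serial_hours = IMAGES * MINUTES_PER_IMAGE / 60  # 20 hours on one processor
    print(f"{total_mb} MB of data, ~{serial_hours:.0f} hours on one processor")

    # Images can be recalibrated largely independently, so
    # wall-clock time shrinks roughly with processor count
    # (this ignores I/O and the final co-addition step).
    for procs in (64, 256):
        print(f"{procs} processors: ~{serial_hours * 60 / procs:.0f} minutes")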
However, with a parallel supercomputer like Blue Horizon,
operated by the NSF-supported National Partnership
for Advanced Computational Infrastructure (NPACI),
large mosaics can be assembled in minutes. Eventually,
an all-sky mosaic will be assembled from the entire
2MASS collection. Data-mining tools can then be applied
to discover objects of interest for further study.
The use of NPACI's data grid technology and Blue Horizon
to create large mosaics will be demonstrated in the
NPACI research exhibit (R1134) at SC2002. The demo
will show mosaics of the Virgo galaxy cluster and
a dust-cloud region in the galactic plane.
Only an earthquake
engineer would want to be closer to an
earthquake, simulated or otherwise, as
it rocks the structure of a bridge, building
or highway. But the George E. Brown, Jr.
Network for Earthquake Engineering Simulation
(NEES), a national major research equipment
program supported by NSF, is constructing
the information technology infrastructure
to make that possible.
Engineering experiments vital to reducing
the nation's vulnerability to catastrophic
earthquakes rely on shake tables, tsunami
wave tanks, geotechnical centrifuges and
other massive pieces of laboratory equipment
being constructed under NEES. Traditionally,
experiments with such shared equipment
have had to be conducted on site, but NEES
is deploying technology to overcome that limitation.
A Nov. 14 demonstration at the University
of Nevada, Reno, provided the first glimpse
of the infrastructure, called NEESgrid,
that will eventually allow earthquake
engineers to remotely view and control
experiments on scarce physical simulation
systems, collaborate with colleagues across
the country and analyze vast amounts of data.
In the demonstration, an earthquake simulated
by UNR's shake tables rocked a model bridge
that was equipped with sensors to measure
and display displacements. Data from the
bridge were streamed live to an early
version of the NEESgrid, a project led
by the National Center for Supercomputing
Applications at the University of Illinois,
Urbana-Champaign. The captured data were
stored and transferred across the grid
to another workstation, where a wide range
of tools were available for in-depth analysis.
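The pattern described here -- sensor readings
streamed live to a networked repository, then
pulled down elsewhere for analysis -- can be
sketched in a few lines of Python. The host,
port, and record format below are hypothetical;
this is an illustration, not NEESgrid's actual
interface.

    # Toy sketch of live sensor streaming; names are
    # hypothetical, not NEESgrid's real protocol.
    import json
    import socket
    import time

    def stream_readings(host, port, readings):
        """Send timestamped displacement samples as
        newline-delimited JSON to a repository."""
        with socket.create_connection((host, port)) as sock:
            for sample in readings:
                record = {"t": time.time(), "displacement_mm": sample}
                sock.sendall((json.dumps(record) + "\n").encode())

    # Example (hypothetical repository and sample values):
    # stream_readings("repository.example.org", 9000, [0.0, 1.2, -0.8])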
The NEESgrid team will present the results
of the Reno demonstration at the National
Computational Science Alliance research
exhibit (R1249) at SC2002 in Baltimore,
MD, on Tuesday, Nov. 19, through Thursday, Nov. 21.
For more details, see http://www.neesgrid.org/.
Photo of three shake tables at University
of Nevada, Reno, used to demonstrate the
capabilities of the NEESgrid infrastructure
at the NSF NEES awardees meeting, Nov. 14.
Photo Credit: David Gehrig, NCSA
Demo video available (streaming):
This clip shows a shake table experiment
at the University of Nevada, Reno, replicating
the 1940 El Centro, Calif., "Imperial
Valley" earthquake, which measured 7.1
on the Richter scale. A large beam is
mounted across three separate tables,
which are shaken by large piston arms
mounted alongside. The beam, representing
a model bridge, wobbles loudly in response
to the initial temblor and continues to
shake for the duration of the quake. No
damage is visible to the bridge structure,
but data from sensors have been captured
and transmitted to the NEESgrid infrastructure.
Video Credit: University of Nevada, Reno
The data -- lots of data -- may be out there,
but for the average scientist, businessperson
or teacher, they might as well not be.
Freely available data, whether for studying
the Earth, designing drug treatments or
making business decisions, may be almost
impossible to use because of different
data formats, sheer quantity, distant
storage sites or other issues.
Researchers from the National Center for
Data Mining at the University of Illinois,
Chicago, want to make it easier to use
data you've never seen before. Led by
Robert Grossman and supported in part
by NSF, NCDM researchers are developing
"data webs" to make sharing data sets
as easy as the World Wide Web makes sharing
digital photos. Data webs automatically
convert data into relevant formats and
let users analyze the data on the fly.
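That idea can be sketched in Python: fetch a remote
data set and normalize it into one in-memory form
on the fly, so the consumer never handles the source
format directly. The function and URL below are
assumptions for illustration, not NCDM's actual
data web software.

    # Illustrative only -- not NCDM's data web implementation.
    import csv
    import io
    import json
    import urllib.request

    def fetch_records(url):
        """Fetch a remote data set and return a list of dicts,
        converting from JSON or CSV as the server provides."""
        with urllib.request.urlopen(url) as resp:
            ctype = resp.headers.get("Content-Type", "")
            body = resp.read().decode("utf-8")
        if "json" in ctype:
            return json.loads(body)
        # Fall back to CSV, treating the first row as headers.
        return list(csv.DictReader(io.StringIO(body)))

    # records = fetch_records("http://data.example.org/earth.csv")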
But what if the information you need is
buried in a huge data collection on the
other side of an ocean? In September at
the iGrid 2002 meeting in Amsterdam, the
NCDM team moved data across the Atlantic
at 2.8 gigabits per second -- more than
500 times faster than the typical trans-Atlantic
Internet transfer. This achievement foreshadows
the day when the average scientist or
businessperson can routinely include gigabytes
of online data within an application.
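The arithmetic behind that comparison, using only
the figures quoted above, implies a typical
trans-Atlantic rate of roughly 5.6 megabits per
second:

    # Rough arithmetic behind the "500 times faster" claim.
    igrid_bps = 2.8e9              # 2.8 gigabits per second at iGrid 2002
    typical_bps = igrid_bps / 500  # implies ~5.6 megabits per second

    gigabyte_bits = 8e9            # one gigabyte of data, in bits
    print(f"1 GB at 2.8 Gb/s: ~{gigabyte_bits / igrid_bps:.0f} seconds")
    print(f"1 GB at typical rate: ~{gigabyte_bits / typical_bps / 60:.0f} minutes")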
In their research exhibit (R1145) at SC2002,
NCDM will demonstrate the power of data
webs using Earth science data sets, collections
of protein structures and drug-candidate
compounds, and U.S. Census data for demographic
analysis in business applications.
For the demonstrations, they have created
a global test bed with storage sites in
Ottawa, Amsterdam, Chicago, and Baltimore.
The data web demonstrations will be presented
for the HPC Challenge on Wednesday, Nov. 20.
For more details, see http://www.ncdm.uic.edu/.
A TeraScope visualization of remote Earth
science data using a data web.
Photo Credit: The Laboratory for Advanced
Computing and the Electronic Visualization
Laboratory at the University of Illinois
In the past year,
civil engineers have begun to change the
way they look at buildings. A team of
civil engineers and computer scientists
at Purdue University has developed a new
high-performance computing tool that will
help to improve the design of critical
buildings, such as hospitals and fire
stations, which may save lives in the
event of a disaster.
The research team, as part of several NSF
computer science and engineering projects,
adapted software commonly used in automobile
crash testing to create highly realistic
simulations of the September 11, 2001,
attack on the Pentagon, in which an airliner
was crashed into the building. The results
were then used to create a vivid re-enactment
of the moment of impact.
The most detailed version of the simulation
used a mesh of 1 million nodes to represent
the airliner and the steel-reinforced
columns of the building. Simulating a
quarter-second of real time required close
to 68 hours on a Purdue supercomputer.
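Put another way, the ratio of computing time to
simulated time follows directly from those two
figures:

    # Ratio of computing time to simulated time.
    sim_seconds = 0.25    # quarter-second of real time simulated
    compute_hours = 68    # wall-clock time on the Purdue machine
    ratio = compute_hours * 3600 / sim_seconds
    print(f"~{ratio:,.0f} seconds of computing per simulated second")
    # prints: ~979,200 seconds of computing per simulated second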
"Using this simulation I can do the so-called
'what-if' study, testing hypothetical
scenarios before actually building a structure,"
said project leader Mete Sozen, the Kettelhut
Distinguished Professor of Structural Engineering.
The team's greatest challenges were generating
detailed, accurate models and combining commercial
software with the special models needed to simulate
an airliner hitting a building.
The results of the simulation showed that
the reinforced concrete support columns
in the Pentagon probably saved many lives.
Team members Christoph Hoffmann, professor
of computer science, and Sami Kilic, a
civil engineering research associate,
will present the team's work in the Research
in Indiana exhibit (R1951) at SC2002 on
Monday, Nov. 18, and Tuesday, Nov. 19.
For more details, see http://www.cs.purdue.edu/homes/cmh/simulation/.
This animation models what may have happened
to the Pentagon when it was struck by
an aircraft on September 11, 2001. The
aircraft model, with a load of fuel in
the wing tanks, is shown hitting a grid
of posts, representing the Pentagon’s
steel-reinforced concrete columns. The
simulation shows the onboard fuel crashing
into the building like a massive river of
fluid. The million-node model
required close to 68 hours on Purdue’s
IBM Regatta to simulate 0.25 seconds of
real time. Three observations in the simulation
stand out. (1) The columns destroyed in
the building facade do not have to correspond
to the wingspan; the columns cut the lightweight
tips of the wings, rather than the wings
cutting the columns. (2) Each reinforced
concrete column cuts into the fuselage
until the column is destroyed by the river
of fuel. (3) The plane rapidly decelerates
after impact, as evidenced by the buckling
of the fuselage near the tail structure.
Video available (streaming).
Credit: Purdue University School of Civil
Engineering, Departments of Computer Science
and Computer Graphics, and Information
Technology at Purdue