A scientist at the Institute for Computational Engineering and Sciences (ICES) at The University of Texas at Austin is one step closer to understanding what might happen if a hurricane hits the Gulf of Mexico oil spill.
Clint Dawson, head of the Computational Hydraulics Group at ICES and a professor of aerospace engineering and engineering mechanics, is working with collaborators Joannes Westerink at the University of Notre Dame and Rick Luettich at the University of North Carolina at Chapel Hill. The team is using the ADCIRC (ADvanced CIRCulation Model for Coastal Ocean Hydrodynamics) code on the Ranger supercomputer at the Texas Advanced Computing Center (TACC) to model past hurricanes Katrina, Rita, Gustav and Ike and to see how a hurricane could affect the region.
The models have been validated through comparisons with real-time measurements to ensure that what the scientists compute resembles reality.
“We compare water levels at buoys and other types of measurement stations,” Dawson said. “For the oil, we do a visual inspection based on satellite imagery.”
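The kind of comparison Dawson describes can be illustrated with a minimal sketch. The water-level values and error measures below are hypothetical and are not the team's actual validation pipeline; they simply show how modeled and observed levels at a single station might be compared.

```python
import numpy as np

# Hypothetical hourly water levels (meters) at one measurement station:
# one series from the model, one from the buoy/gauge record.
modeled_eta = np.array([0.42, 0.55, 0.71, 0.88, 1.02, 0.95, 0.80])
observed_eta = np.array([0.40, 0.58, 0.69, 0.91, 1.05, 0.92, 0.78])

# Two common agreement measures: root-mean-square error and mean bias.
rmse = np.sqrt(np.mean((modeled_eta - observed_eta) ** 2))
bias = np.mean(modeled_eta - observed_eta)

print(f"RMSE: {rmse:.3f} m, bias: {bias:+.3f} m")
```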
The ADCIRC code simulates storm surge over a large computational domain while resolving coastal areas with complex shorelines at high resolution. The model includes variables such as coastline geometry, water depths, land elevations and obstructions, land types, and hurricane winds and pressures.
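One way to picture those variables is as the inputs to a single simulation run. The grouping below is purely illustrative; the field names are hypothetical and do not correspond to ADCIRC's actual input files or mesh format.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SurgeModelInputs:
    """Illustrative grouping of the inputs named in the article.
    Field names are hypothetical, not ADCIRC's real input format."""
    node_lon: np.ndarray        # unstructured-mesh node longitudes (deg)
    node_lat: np.ndarray        # node latitudes (deg)
    depth: np.ndarray           # water depth or land elevation at each node (m)
    land_roughness: np.ndarray  # friction/obstruction coefficient per node
    wind_u: np.ndarray          # hurricane wind field, east component (m/s)
    wind_v: np.ndarray          # hurricane wind field, north component (m/s)
    pressure: np.ndarray        # surface atmospheric pressure (Pa)
```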
The ADCIRC code has been validated for tides and wind, and the researchers are getting good matches with the data.
“We want to validate the oil particle tracking next,” Dawson said. “We’ve made tremendous strides in the last few weeks. Since the hydrodynamics code is working correctly, I feel almost certain that we are tracking the surface oil.”
Dawson and his colleagues see this as a three-dimensional oil spill problem, but right now they can only model the surface oil in two dimensions.
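Two-dimensional surface tracking of this kind is often done by representing the oil as particles advected by the modeled surface current. The sketch below is a stripped-down illustration under that assumption; the velocity field, time step and forward-Euler update are simplifications for clarity, not the group's production code.

```python
import numpy as np

def advect_particles(x, y, surface_velocity, dt, n_steps):
    """Advance surface-oil particle positions with a forward Euler step.

    x, y             : 1-D arrays of particle coordinates (m)
    surface_velocity : function (x, y) -> (u, v) giving the modeled
                       surface current (m/s) at those points
    dt               : time step (s)
    n_steps          : number of steps to take
    """
    for _ in range(n_steps):
        u, v = surface_velocity(x, y)
        x = x + u * dt
        y = y + v * dt
    return x, y

# Toy example: a uniform 0.3 m/s eastward drift acting on a small slick.
def uniform_current(x, y):
    return np.full_like(x, 0.3), np.zeros_like(y)

x0 = np.random.uniform(0.0, 1000.0, size=500)   # initial particle positions (m)
y0 = np.random.uniform(0.0, 1000.0, size=500)
xf, yf = advect_particles(x0, y0, uniform_current, dt=600.0, n_steps=144)  # 24 hours
```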
“Nobody knows the extent of the oil plume underwater,” Dawson said. “We just don’t have the data for it. We’re focusing on tracking significant plumes like the one off the coast of Louisiana in Barataria Bay, which is a very sensitive ecological area.
“This is the reality of research. It’s a lot of data collection, a lot of missing links and a lot of coordination. We’re generating huge amounts of data and it has to be automated and validated.”
Dawson said none of this research would be possible without TACC and the center's expert staff and advanced computing resources.
“We rely on our partnership with TACC because without them we wouldn’t be able to do real-time forecasting,” Dawson said. “We have to be able to compute a forecast within a few hours, and then visualize the data into something that makes sense. We now have interest from the National Oceanic and Atmospheric Administration and the U.S. Geological Survey for our results, so they have to be posted in a timely manner.”
Advanced computing allows scientists to simulate different scenarios and build the numerical infrastructure to understand which response measures will work in a disaster and which could make the situation worse. This saves time, lives and millions of taxpayer dollars.
“Advanced computing plays an increasingly crucial role in many aspects of society, even though its role is often behind the scenes,” said Jay Boisseau, TACC’s director. “We’re proud that TACC is able to help model and analyze emergency situations such as oil spills and hurricanes, which have great impact on human lives and the environment.”
A Snapshot of the Advanced Computing Resources Used for the Gulf Oil Spill
Ranger: Powerful Computational Capabilities: At 579.4 teraflops, Ranger is one of the most powerful academic supercomputers in the world — up to 50,000 times more powerful than today’s PCs. Dawson is using Ranger to run all of the past hurricane and oil spill simulations. Once the simulations are complete, they are immediately archived on Corral. Thousands of simulations have been run to date.
Corral: Data-Intensive Computing and Storage: Corral is a storage system that supports data-centric science. Since the start of the Gulf oil spill, The University of Texas at Austin Center for Space Research (CSR) has produced the satellite images that Dawson’s team uses daily to gauge the movement of the oil spill. The images constitute 23 terabytes of data on Corral, which is equivalent to more than 4,500 DVDs. The images are used by disaster response personnel across the Gulf region to help assess current and future threats from weather and oil. More than 17,000 Web requests for data from CSR have been received to date.
Spur: Data Processing and Visualization: Spur is a first-of-its-kind visualization resource that is tightly integrated with Ranger. Scientists use Spur for remote visualization and data analysis, as well as to improve models and code. Dawson’s team uses Spur to create images from the simulations that run on Ranger. With the assistance of TACC’s visualization specialists, the team’s visualization run time was reduced from three hours to 15 minutes.
Longhorn: Visualization and Data Analysis for Massive Scale Datasets: Longhorn is a unique visualization cluster designed for remote interactive visualization and data analysis. The system supports serial and parallel visualization and analysis applications that take advantage of large memory, multiple computing cores, and multiple graphics processors that allow researchers to create visualizations quickly. While Dawson’s team is using Spur to visualize forecast and simulation data such as wind velocity, water elevation and oil particle position, TACC’s visualization specialists are using Longhorn to create visualizations that overlay oil particle movement and satellite data. The file system accessible from Ranger, Spur and Longhorn makes it easy to access and visualize the data.
###
Funding for this work is provided by the National Science Foundation Office of Cyberinfrastructure and the Department of Homeland Security Science and Technology Directorate through the Center of Excellence for Natural Disasters.
Ranger, Spur and Longhorn are funded through the National Science Foundation (NSF) Office of Cyberinfrastructure. They are key systems of the NSF TeraGrid (www.teragrid.org), a nationwide network of academic HPC centers that provides scientists and researchers access to large-scale computing, networking, data-analysis and visualization resources and expertise.