
With Hurricane Ike, researchers develop next-generation forecasts to safeguard coastal communities

Science at the Center of the Storm


It was Sept. 7, 2008, six days before Hurricane Ike, the third most destructive hurricane in U.S. history, crashed into the Texas coast. Ike had already made landfall in Cuba as a Category 4 hurricane, and in the coming days would gather strength over the warm waters of the Gulf of Mexico.

On land, teams of scientists, technologists and emergency coordinators were feverishly working on what would be one of the greatest scientific experiments of their lives. These researchers at the cutting edge of storm forecasting were creating and testing the methods that will ultimately make our current understanding of, and response to, tropical storms and hurricanes seem like a quaint antiquity.

Clint Dawson and Gordon Wells

Clint Dawson, professor of engineering mechanics in the Center for Subsurface Modeling at the Institute for Computational Engineering and Sciences (left), and Gordon Wells, of the Center for Space Research, played an integral role in predicting the storm surge and orchestrating the evacuation of the Texas coast during Hurricane Ike. In the background are high-resolution visualizations of the storm created by the Texas Advanced Computing Center. Photo: Christina Murrey

And in the middle of it all, coordinating and aiding that effort, was The University of Texas at Austin–its scientists, its computing systems and its leadership–charting a path to the improved predictive methods of tomorrow.

Knowing Nature

It would take infinite intelligence to understand how the interaction of each wind-gust and wave-crest creates the devastating confluence of events known as a hurricane. Scientists shine light into obscurity, create mathematical formulas to approximate nature and use this knowledge to save lives.

But Hurricane Katrina in 2005 taught Americans a tragic lesson: We didn’t know enough. Our predictive capabilities, preventative measures and emergency responses were inadequate, and this lack of readiness cost thousands of lives and the destruction of one of America’s great cities.

So, in the wake of Katrina, scientific agencies–in particular the National Oceanic and Atmospheric Administration (NOAA), which oversees the National Hurricane Center and the National Weather Service–redoubled their efforts and drew up ambitious new goals for the coming decades.

University of Texas at Austin researcher Clint Dawson and his colleagues, Joannes Westerink, Rick Luettich and Randall Kolar, had created the most effective storm surge model available. Before and after Katrina, it had shown why a Category 5 storm would trigger such damage.

The researchers were preparing to turn their after-the-fact modeling method into a forecasting tool. In hurricanes Gustav and Ike, the model would face a crucial test, one that could mean the difference between destruction and safety for coastal communities.

Meanwhile, Gordon Wells of the Center for Space Research at The University of Texas at Austin was working with the Governor’s Division of Emergency Management, synthesizing satellite imagery, global positioning tracking signals, and the best hurricane and storm surge models available, to orchestrate the widespread evacuation of the Texas coast.

It was a heady time for hurricane forecasting and readiness, and the fact that these responsibilities converged on the university was no coincidence. It stemmed from the presence of Ranger, the most powerful supercomputer for open research in the world, which was perfectly suited to test next-generation models and provide the answers that researchers and emergency managers desperately needed.

Where would the storm hit? How dangerous would it be? What kind of damage would it cause? And how could people get out of harm’s way?

Storm surge destruction in Gilchrist area of Bolivar Peninsula

Total destruction and severe erosion caused by the storm surge in the Gilchrist area of Bolivar Peninsula. Photo: Texas Civil Air Patrol

These questions galvanized the researchers. If they could answer them in near real-time, with significantly increased fidelity, their research would have ramifications for decades.

Ever-Greater Expectations

Hurricane forecasting, storm surge prediction and evacuation coordination all require massive computing power, information technology infrastructure and computational know-how to be effective.

In the wake of Katrina, NOAA created the Hurricane Forecast Improvement Project (or HFIP), and assembled its best minds to achieve three ambitious goals:

  1. Improve track and intensity forecasts by 50 percent in 10 years;
  2. Better predict rapidly intensifying hurricanes; and,
  3. Create an accurate seven-day hurricane forecast.

Success involved significantly increasing the resolution of both the global and local hurricane models, incorporating the dense Doppler radar information from planes flying into the eye of the storm and developing ensemble methods, a statistical way of treating hurricane tracks and intensities to account for their unpredictability.

With the steady increase in computing capability, researchers believed some of these improvements were finally within reach. What they didn’t know was how quickly they could realize those goals.

NOAA turned to the National Science Foundation (NSF)–whose network of supercomputers known as the TeraGrid had taken a great leap forward in the last few years–to help achieve their objective. With on-demand access to the most powerful computing resources in the world, NOAA might begin to test new methods without disrupting the operational workflow that was still crucial.

“We got a call from the NSF informing us that the NOAA folks had a real scientific need with societal impact,” said Karl Schulz, associate director of the Texas Advanced Computing Center (TACC), “and we agreed that running on Ranger was a great opportunity to demonstrate this technology.”

At the beginning of the 2008 hurricane season, the goals of the hurricane improvement group were modest. They would use Ranger to test next-generation methods on historical storms, just to see if it could be done.

Frank Marks and Jun Zhang

Frank Marks (right) and Jun Zhang, both of the Hurricane Research Division at NOAA’s Atlantic Oceanographic and Meteorological Laboratory (AOML), during a NOAA P-3 flight into Hurricane Gustav. Photo: Erica Rule

But when the hurricane season intensified, the scientists decided to improve not just one or two of the aspects of their forecasting techniques, but all of them at once, and to test their methods and mettle in the pressure-filled environment of real-time modeling.

“When we were asked if we could do it, it was always, ‘If, if, if,’” NOAA-HFIP Director Frank Marks recalled. “And we kept ticking off the ifs, going relentlessly toward the real-time forecasting we did on Gustav and Ike.”

NOAA wasn’t working alone. They had recruited top hurricane-modeling experts for the project, like Fuqing Zhang of Penn State University, as well as computing experts from TACC who manage Ranger.

The multi-tiered effort used global models with twice the resolution of the best operational simulations, regional models with six times the resolution of those currently in use, and, for the first time, Doppler radar data streamed directly from NOAA’s planes to TACC’s servers.

“In weather forecasting, it has been shown that having really good empirical data–for example, based on the Doppler data coming from NOAA research planes–can have a dramatic impact on the quality of the forecast,” Schulz said.

Instead of a single forecast, researchers ran 30 forecasts for each prediction to accomplish the ensemble blending that provides important uncertainty measures. Taken together, the accuracy improvements would be immense.
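To make the ensemble idea concrete, here is a minimal sketch in Python of how running many forecasts yields both a best estimate and a measure of uncertainty. This is not NOAA’s operational code; the landfall positions and wind speeds below are invented for illustration, and only the member count of 30 comes from the project described here.

```python
# Illustrative sketch (not NOAA's operational system): given 30 ensemble
# members' predicted landfall positions and peak intensities, compute the
# ensemble mean (the forecast) and the spread (the uncertainty measure).
import numpy as np

# Hypothetical data: one (latitude, longitude) landfall point and one peak
# wind speed (mph) per ensemble member.
rng = np.random.default_rng(seed=0)
landfalls = rng.normal(loc=[29.3, -94.8], scale=0.4, size=(30, 2))  # near Galveston
peak_winds = rng.normal(loc=110.0, scale=12.0, size=30)

mean_landfall = landfalls.mean(axis=0)            # best-guess landfall location
spread_landfall = landfalls.std(axis=0)           # positional uncertainty (degrees)
mean_wind = peak_winds.mean()                     # best-guess peak intensity
wind_range = np.percentile(peak_winds, [10, 90])  # plausible intensity band

print(f"Mean landfall: {mean_landfall}, spread: {spread_landfall}")
print(f"Mean peak wind: {mean_wind:.0f} mph, 10-90% range: {wind_range}")
```

A single deterministic run gives one answer; the spread across the 30 members is what tells forecasters how much to trust it.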

TACC provided the high-speed capability to communicate with and ingest data from planes in flight, to crunch incredibly complex equations and produce new predictions in a matter of hours. It also provided the systems engineering talent and expertise to pull the technical accomplishment off in only a few months.

For researchers like Zhang, the ability to use a system like Ranger was eye-opening.

“Its capability is just unprecedented,” he said. “We have never been able to do 1.5 kilometer resolution ensemble forecasts before. The operational mode is nine kilometers and they only do one forecast. We did 30. This leap is all because TACC has so much computing power.”

For Marks, the revolutionary aspect of the project was not the data flow or the ensemble high-resolution modeling, but the blending of all these talents and technologies together into a coordinated whole, a task which he compared to directing an orchestra.

Hurricane Ike

Predicted storm surge levels for Hurricane Ike along the northwestern Gulf of Mexico coastal region. Dry areas are gray and the color bar indicates maximum surface water elevation in feet. The maximum levels were recorded over the duration of the five-day simulation beginning Sept. 11, 2008. Credit: Clint Dawson and Jennifer Proft, Center for Subsurface Modeling, ICES

“Trying to make it all work in a real-time environment where the data’s coming in and the models have to run and the computers have to work and everything has to sing, that’s never been done at this scale,” Marks said. “Certainly not for weather.”

The results were astonishing.

“For Hurricane Ike, the predictions by our ensemble analysis and forecast system appeared to be significantly better than operational runs,” Zhang said. “We basically hit Houston right on, four days in advance.”

Both Zhang and Marks are wary of claiming success after only a few predictions. But perhaps more important than the qualitative results was the fact that the project represented a rare and important collaboration among NOAA, academic researchers and the computational resources of the NSF TeraGrid. It’s this kind of collaboration that will be necessary for NOAA to reach its long-term goals.

“The University of Texas at Austin and NSF had a real key role in helping NOAA take the first steps toward developing a new way of doing forecasting,” Marks said. “If we didn’t have TACC, we couldn’t have done it. That’s the bottom line.”

Landfall

A hurricane is something of an abstraction for most people. It is the hurricane’s interaction with the sea surface and terrain that affects individuals and communities.

Clint Dawson, professor of engineering mechanics in the Center for Subsurface Modeling at the Institute for Computational Engineering and Sciences (ICES), has been studying the impact of tropical storms for almost two decades, helping the Army Corps of Engineers in Louisiana, and more recently the state of Texas, model the storm surge of a hurricane as it makes landfall.

Both before and after Katrina hit, it was Dawson and his colleagues’ ADCIRC model that demonstrated why the storm caused such devastation.

“Our model showed that we could predict and simulate Katrina and match the actual data that was measured after the storm to about a foot of error or less over the whole Louisiana and Mississippi coast,” he said.
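As a rough illustration of what “about a foot of error” means in practice, the sketch below compares predicted peak surge values against measured high-water marks and reports the error statistics a hindcast validation would typically quote. The gauge values are invented, not Katrina data.

```python
# Illustrative only: compare predicted peak surge to observed high-water
# marks at a handful of invented gauge locations and summarize the error.
import numpy as np

predicted_ft = np.array([11.2, 14.8, 9.6, 17.1, 12.4])   # modeled peak surge (ft)
measured_ft  = np.array([10.9, 15.5, 9.1, 17.8, 12.0])   # observed high-water marks (ft)

errors = predicted_ft - measured_ft
print(f"Mean absolute error: {np.abs(errors).mean():.2f} ft")
print(f"Max absolute error:  {np.abs(errors).max():.2f} ft")
print(f"RMS error:           {np.sqrt((errors ** 2).mean()):.2f} ft")
```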

The advantage of the ADCIRC model was not only the physics it incorporated, which went beyond previous models, nor the fact that, like the NOAA models, it treated its forecasts as an ensemble, crashing dozens of simulated hurricanes into the coast to account for a storm’s uncertainty.

What made ADCIRC particularly effective was its use of LIDAR (Light Detection and Ranging) elevation data and other data sources to represent local topographies down to the 60-meter level, including levees, jetties, highways and other critical features. This allowed a more accurate accounting of precisely where flooding and surge damage could occur and the harm it would cause.
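The sketch below is a toy “bathtub” calculation, not the ADCIRC solver itself, meant only to show why resolving features such as levees at the 60-meter level matters: inundation depth at a point is roughly the surge water level minus the ground elevation, and a barrier the grid actually resolves can keep the low ground behind it dry. All elevations and the surge level here are invented.

```python
# Toy calculation, not ADCIRC: estimate inundation depth on a gridded
# elevation model as (surge water level - ground elevation), floored at zero.
# A resolved levee (the high ridge in row 2) keeps the cells behind it dry.
import numpy as np

ground_elev_ft = np.array([
    [2.0, 3.0, 4.0, 5.0],
    [2.5, 3.5, 4.5, 6.0],
    [18.0, 18.0, 18.0, 18.0],   # levee crest captured by the fine grid
    [1.0, 1.5, 2.0, 2.5],       # low-lying area behind the levee
])
surge_level_ft = 12.0           # hypothetical peak surge elevation

inundation_ft = np.maximum(surge_level_ft - ground_elev_ft, 0.0)

# In this simple picture, the cells behind an unovertopped levee stay dry;
# a real surge model also routes water around barriers and through time.
inundation_ft[3, :] = np.where(inundation_ft[2, :] > 0, inundation_ft[3, :], 0.0)
print(inundation_ft)
```

Smear that levee across a coarse grid cell and the model floods the neighborhood behind it; resolve it, and the flooding estimate changes completely, which is the point of the high-resolution LIDAR data.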

Depth of inundation in the City of Galveston

Depth of inundation in the City of Galveston estimated from debris line elevations and tide gauge records using elevation data collected by the university’s aerial LiDAR system. 

“A massive amount of data are available to us,” Dawson said. “And the question is, how do we use that data in combination with better physics and better mathematical models?”

The new ADCIRC code had only been used to study storms after the fact. To save lives, the researchers would need to turn an interpretive method into a forecasting tool, and this, above all, would require powerful parallel processing.

For Hurricane Gustav, the results suggest ADCIRC was able to predict the storm surge within a foot, though this won’t be confirmed until after a full assessment is made from data collected in the field.

“That’s an amazing result for running in prediction mode,” Dawson said. “And of course, the state of Louisiana used this forecast to order the evacuations for New Orleans and the coastal areas of southern Louisiana, so it was useful from an emergency management perspective.”

During Hurricane Ike, as the team worked to predict the storm surge on the Texas coast, the test at first seemed less successful. But when the model was rerun after the fact, ADCIRC produced very accurate predictions, especially in areas adjacent to the coast.

“If Ike hadn’t happened and we hadn’t been scrambling around to forecast it, I’m not sure we would’ve made such progress,” Dawson said. “It really got us to a place where we’re in good shape for Texas if this happens again.”

Evacuation

At the center of the storm readiness effort was another university researcher, Gordon Wells of the Center for Space Research, who watched Hurricane Ike track toward Texas from the State Operations Center in North Austin.

At 144 hours before landfall, when Hurricane Ike was north of Haiti, Wells was already combining data from radar and satellites with models of the hurricane and storm surge to orchestrate the largest evacuation effort in Texas history.

“A storm out in the Caribbean doesn’t mean much in the minds of Texans,” Wells said. “But the state knows we need to have everything in motion or we won’t be able to accomplish the things that need to be done.”

Gordon Wells shows Gov. Rick Perry and Jack Colley the latest projections for Hurricane Gustav

Keeping state leaders informed at the state Emergency Operations Center in North Austin, Gordon Wells (left), of the Center for Space Research, shows Gov. Rick Perry (right) and his emergency management chief, Jack Colley, the latest projections for Hurricane Gustav’s path. Photo: Larry Kolvoord/Austin American-Statesman

The lessons of Katrina, as well as Hurricane Rita, with its clogged highways and fuel shortages, showed Wells that greater organization and coordination were necessary during hurricane events. Using special tracking satellites and a career’s worth of experience with geospatial analysis, Wells advised the state’s leadership and the Texas Governor’s Division of Emergency Management (GDEM) as they choreographed the hurricane response and evacuation.

Long-distance commercial coaches, brought in from as far away as Wyoming, were tagged with global positioning systems that streamed data to a dedicated server at TACC. Some 43,000 special-needs evacuees were tracked along evacuation routes from coastal transportation hubs to sheltering cities, allowing Wells and GDEM officials to get the right people evacuated from the right places.
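The sketch below suggests the kind of bookkeeping such a tracking server might perform, ingesting position reports and flagging coaches that have gone silent or strayed from their assigned route. The report format, route waypoints, function names and thresholds are all hypothetical; this is not the actual GDEM or TACC software.

```python
# Hypothetical sketch of GPS-tracking bookkeeping for evacuation coaches.
# Everything here (fields, route, thresholds) is invented for illustration.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class PositionReport:
    bus_id: str
    lat: float
    lon: float
    minutes_since_report: float

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 3956 * 2 * asin(sqrt(a))

# Assigned route waypoints (invented): Galveston -> Houston -> Austin shelters.
route = [(29.30, -94.80), (29.76, -95.37), (30.27, -97.74)]

def check_bus(report: PositionReport, max_offroute_miles=15, max_silent_minutes=30):
    """Return human-readable alerts for one position report."""
    alerts = []
    if report.minutes_since_report > max_silent_minutes:
        alerts.append(f"{report.bus_id}: no report for {report.minutes_since_report:.0f} min")
    nearest = min(haversine_miles(report.lat, report.lon, wlat, wlon) for wlat, wlon in route)
    if nearest > max_offroute_miles:
        alerts.append(f"{report.bus_id}: {nearest:.0f} miles from assigned route")
    return alerts

print(check_bus(PositionReport("bus-041", 29.95, -95.60, 12.0)))
```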

Fuel distribution was managed to avoid shortages, and road traffic was controlled to keep the flow moving, all with the help of the state-of-the-art processing and streaming servers at TACC, which monitored the mass evacuation.

“The capabilities that we have now to do real-time command and control of these different state assets makes a huge difference,” Wells said. “We certainly didn’t have the kinds of problems that we had during Rita for any of the evacuation. And the fact that very, very few people drowned or were struck by debris, given the size of the storm and the impact area, is probably a pretty good measure of the state’s ability to evacuate large areas.”

Predicting the Next Storm

The 2008 hurricane season packed many years’ worth of effort into a few manic months of collaborative experimentation and discovery and left a powerful impression on the scientists involved.

“I was trained as a mathematician,” Dawson said, “but I was always interested in the application of mathematics to real world problems. And this is the perfect example of a problem where mathematics and computational science had a huge social impact.”

Schulz said, “We’ve been in production on Ranger since February and we’ve had applications that have run across the whole system, doing really big science. But when you have an opportunity to work on something like this, which has more of an immediacy, you have to say that’s gratifying.”

The research projects proved the importance of nationwide investment in computing infrastructure, confirmed the value of academic and inter-agency collaboration and showed how The University of Texas at Austin is leading the nation in developing scientific and computing solutions.

“We’ve demonstrated that we can do real-time forecasting on-demand,” Marks said. “You always have to start somewhere and then you say, how can we do it better, how can we streamline the process? Those things take time, but this project demonstrated it can be done.”