News: Model Predictions of Surface Oil Transport

2013/07/10 – Gulf of Mexico Research Initiative
Study Evaluates Accuracy of 2010 Model Predictions for Oil Spill Movement


During the 2010 oil spill, emergency responders needed forecasts of where the oil would go under different scenarios and weather conditions, not just information about its current location. Forecasts had to account for the unprecedented situation of oil rising from a deepwater source, and thus for "the important processes that contribute to oil movement in deep water, on the continental shelf, and within the complex nearshore environment." Forecasters used the coupled SWAN (Simulating Waves Nearshore) and ADCIRC (Advanced CIRCulation) models to simulate currents that accounted for tides, rivers, winds, and waves, and then applied Lagrangian particle tracking to those currents to simulate "short-term (less than 1 week) oil movement."
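
Lagrangian particle tracking of this kind advects virtual drifters through the simulated surface-current field. The sketch below is a minimal illustration of the idea in Python, not the forecasters' actual workflow; the `surface_velocity` function is a hypothetical stand-in for interpolated SWAN+ADCIRC output.

```python
import numpy as np

def surface_velocity(x, y, t):
    """Hypothetical stand-in for interpolated surface currents (m/s).
    In a real forecast this would sample gridded SWAN+ADCIRC output."""
    u = 0.30 + 0.10 * np.sin(2.0 * np.pi * t / 86400.0)  # eastward component
    v = 0.05 * np.cos(2.0 * np.pi * t / 43200.0)         # northward component
    return u + 0.0 * x, v + 0.0 * y                       # broadcast to particle arrays

def track_particles(x0, y0, t0, hours, dt=600.0):
    """Advect particles with a midpoint (RK2) step through the current field.
    Positions are meters in a local planar frame; t and dt are in seconds."""
    x = np.asarray(x0, dtype=float).copy()
    y = np.asarray(y0, dtype=float).copy()
    t = float(t0)
    for _ in range(int(hours * 3600.0 / dt)):
        u1, v1 = surface_velocity(x, y, t)
        xm, ym = x + 0.5 * dt * u1, y + 0.5 * dt * v1
        u2, v2 = surface_velocity(xm, ym, t + 0.5 * dt)
        x, y = x + dt * u2, y + dt * v2
        t += dt
    return x, y

# Release a short line of particles and track them for three days.
x0 = np.zeros(10)
y0 = np.linspace(0.0, 1000.0, 10)
xf, yf = track_particles(x0, y0, t0=0.0, hours=72)
print(xf[0], yf[0])
```

In an operational setting the velocity field would presumably be interpolated from the unstructured SWAN+ADCIRC mesh, and the particles would be seeded along the observed extent of the oil.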

News: Large-Scale Simulations of Coastal Flooding

2013/04/04 – The Daily Texan
UT research group uses math, simulations to analyze hurricanes
by Mark Carrion


Casey Dietrich is one of 14 researchers besides Dawson who work in the research group. He said having access to Stampede, UT’s new and powerful supercomputer, is important for the simulations the group runs.

“We’re very lucky we get access to one of the largest supercomputers in the world,” Dietrich said. “That opens the door for us to run larger, more interesting problems.”

Dietrich said the models they run on Stampede allow them to analyze changes in areas as small as 20 meters across.

“We can really see how the flooding is affecting the environment,” Dietrich said.

News: Forecasting of Hurricane Isaac

2012/08/29 – Computerworld
Supercomputers help New Orleans prepare for Hurricane Isaac
Computing advances since Katrina have helped the city plan better for the storm surge, for one


Around the time of Katrina, the computer models “were much coarser and had minimum resolutions of only 100-200 meters,” said Casey Dietrich, a post-doctoral researcher at the Institute for Computational Engineering and Sciences at the University of Texas at Austin.

Dietrich has been running computer models at the Texas Advanced Computing Center at the University of Texas to assess the impact of the storm surge on Texas.

Emergency planners in both states take the data generated by the university researchers and incorporate it into geographic information systems.

“They can look down at neighborhood scale and say ‘on this street along the levee we’re going to have water this high,’ and plan accordingly,” Dietrich said.

Comparing the capability today with that at the time of Katrina, Dietrich said: “I think we have a very strong understanding of how hurricane waves and storm surge develop and how they can threaten a coastal environment.”

Also see local coverage by the Institute for Computational Engineering and Sciences.

How Far Have We Come?

I came across this figure in a Weather Bureau report from 1963. (D.L. Harris, “Characteristics of the Hurricane Storm Surge,” Technical Paper No. 48, U.S. Weather Bureau, Washington, DC, 1963, p. 10.) You can predict the hurricane storm surge using nothing but this figure! All you do is find the shape factor on the map, match it and the minimum pressure from your hurricane on the graph, and then read off the storm surge.
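
In effect, the figure is a two-parameter lookup: peak surge as a function of a basin shape factor and the storm's minimum central pressure. The snippet below sketches that kind of lookup with bilinear interpolation; the table values are placeholders for illustration, not Harris's numbers.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Placeholder surge heights (ft) indexed by shape factor (rows) and
# minimum central pressure in mb (columns). NOT the values from Harris (1963).
shape_factors = np.array([1.0, 2.0, 3.0, 4.0])
min_pressures = np.array([920.0, 940.0, 960.0, 980.0])
surge_table = np.array([
    [18.0, 15.0, 12.0, 8.0],
    [16.0, 13.0, 10.0, 7.0],
    [14.0, 11.0,  9.0, 6.0],
    [12.0, 10.0,  8.0, 5.0],
])

lookup = RegularGridInterpolator((shape_factors, min_pressures), surge_table)

# "Find the shape factor on the map, match it with the minimum pressure,
# and read off the storm surge."
print(lookup([[2.5, 945.0]]))  # interpolated surge height in feet
```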

It makes me wonder why we’re wasting our time with math and equations and computers and what-not.

[Figure: storm surge chart from Harris (1963), relating basin shape factor and minimum central pressure to storm surge height.]