With hundreds of thousands of gallons of oil threatening some of America's richest wetlands and fisheries, crews grappling with the massive spill in the Gulf of Mexico have at their disposal new technologies never before available in such a disaster.
To be sure, oil spill response is a largely low-tech business that hasn't changed that much over the years. But as the aftermath of the explosion and subsequent sinking of the semi-submersible offshore oil rig Deepwater Horizon looks increasingly like it may result in the country's worst environmental catastrophe in decades, there are some ways that technology is helping find solutions.
On Thursday, U.S. Department of Homeland Security Secretary Janet Napolitano announced that the incident has been designated of "national significance," meaning that the government has taken control of cleanup operations and that a wide range of new federal resources are now available to those involved in the efforts. No one knows yet the full extent of the amount of oil that has been released from a rupture 5,000 feet below the surface of the Gulf, or how the leak will be plugged. The rig was owned by Transocean and operated by BP.
One major challenge, according to U.S. Coast Guard incident operations coordinator Doug Helton, is that crews are constantly dealing with uncertainties. Essentially, while there are best guesses, no one knows how much oil is flowing from the leak. Initially, authorities thought that 1,000 barrels of oil per day were entering the Gulf. Now that number is up to 5,000 barrels a day.
With that lack of precision, said Helton, it doesn't make sense to use sophisticated supercomputers to model the spread of the giant oil slick; there are simply too many unknown variables.
But Helton did say that there are some tools and techniques in the hands of the Coast Guard's Gulf Coast crew that have never before been available to personnel working on big oil spills.
One of the most important, Helton said, is the ability to get almost real-time access to National Oceanic and Atmospheric Administration (NOAA) satellite imagery of a spill incident area. In the past, he explained, crews wouldn't get satellite images until much later, so by the time they had their hands on the pictures, the conditions they depicted were out of date.
Now, he said, NOAA's National Environmental Satellite, Data, and Information Service (NESDIS) satellites are providing almost real-time data. "To get sensors and real-time analysis back to us in a matter of two hours of the data being acquired is pretty amazing to us," Helton said. "We always see the satellite images on the news a week later. [But] we need to plan for this afternoon."
Day by day, the Coast Guard is using the imagery to re-initialize its oil spill models, and to do a "reality check of what's happening out there."
But one of the most important uses of the imagery is to direct aerial flyovers of the oil slick, in order to determine where the worst areas are, how fast the slick is spreading, and in which direction.
A problem in a major spill like this, Helton said, is that the slick has become so large that helicopters doing fly-overs don't have the range to cover the entire spill. But with near-real-time satellite imagery, Helton is able to help direct much more focused aerial observation and cleanup efforts.
Clearly, Helton continued, the use of satellites for overhead observation is nothing new. But for oil spill crews, it is a crucial new tool that until now has been used only in some very experimental settings. The only question is whether such technology will be available to crews cleaning up spills that are not considered national events.
And there's also been talk that in the future, crews might have access to unmanned aerial vehicles (UAVs) for oil spill observation and analysis, Helton said, though it's not known when or if such technology will be available.
Dialing up weather buoys
Another fairly recent addition to the spill cleanup team's arsenal is a series of data-gathering buoys and other ocean-based instruments that allow Helton and his team to get a real-time sense of the conditions at sea.
"We used to get forecasted winds and observations from ships," he said. "Now there's buoys out there and I can go [on] my computer and find out what the wind speed and direction [is] and what the wave height is."
That's a crucial development because the older method of using forecasts meant that anyone trying to make decisions based on that data was already dealing with a level of uncertainty. Now, teams can see what the conditions are within the last few minutes and make determinations on where oil will likely be heading. "So it gives us a lot more confidence and reduces the uncertainty for us," Helton said.
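The real-time buoy observations Helton describes are published by NOAA's National Data Buoy Center as plain text files. The sketch below parses a sample in the style of that feed; the sample rows and station values are invented for illustration, and the column layout is a simplified assumption rather than the full NDBC format.

```python
# Minimal sketch: reading wind and wave observations in the style of
# NOAA NDBC's plain-text buoy feed. The sample below is invented for
# illustration; a real feed would be fetched from ndbc.noaa.gov.

SAMPLE = """\
#YY  MM DD hh mm WDIR WSPD GST  WVHT
#yr  mo dy hr mn degT m/s  m/s  m
2010 04 30 16 50 160  7.2  9.0  1.4
2010 04 30 15 50 155  6.8  8.5  1.3
"""

def latest_observation(text):
    """Return the most recent (first non-comment) row as a dict."""
    for line in text.splitlines():
        if line.startswith("#"):
            continue  # skip the two header lines
        yr, mo, dy, hr, mn, wdir, wspd, gst, wvht = line.split()
        return {
            "time": f"{yr}-{mo}-{dy} {hr}:{mn}",
            "wind_dir_deg": int(wdir),
            "wind_speed_ms": float(wspd),
            "wave_height_m": float(wvht),
        }

obs = latest_observation(SAMPLE)
print(obs["wind_speed_ms"], obs["wave_height_m"])
```

Because the newest observation sits at the top of the file, taking the first data row gives conditions from within the last hour, which is exactly the low-latency picture Helton says spill responders now rely on.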
Similarly, he said, data provided by the multi-agency Integrated Ocean Observing System (IOOS) has allowed spill response teams to get a wide range of real-time information about currents, tidal heights, and other measurements that previously were based only on predictions or were otherwise very slow to gather. Ironically, many of the sensors included in IOOS are installed on oil platforms like the now-sunk Deepwater Horizon.
Despite the newer high-tech systems being used in the recovery effort, Helton said there are areas where fairly standard equipment is still the norm. For example, while it might seem natural to use sophisticated supercomputers to model the slick's spread, Helton said that because there is so much uncertainty about even how much oil is leaking from the Deepwater Horizon well, it doesn't make much sense to employ such powerful machines.
Instead, modeling of the spread of oil is done on off-the-shelf computers, largely because what the Coast Guard is doing is "response-oriented," meaning that the agency wants to be able to get going within an hour of any new reports of spills or oil spread. And for that, supercomputers would be "overkill," Helton said.
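The kind of quick, response-oriented estimate that runs comfortably on an off-the-shelf computer can be sketched with a common rule of thumb from spill trajectory work: surface oil drifts roughly with the current plus a few percent of the wind speed. This is a back-of-the-envelope illustration, not NOAA's actual model, and all the numbers are made up.

```python
# Back-of-the-envelope drift estimate, not the Coast Guard's model:
# a common rule of thumb has surface oil moving with the current
# plus roughly 3 percent of the wind speed.

WIND_FACTOR = 0.03  # assumed fraction of wind speed imparted to surface oil

def drift_km(current_ms, wind_ms, hours):
    """Distance drifted (km) given current and wind speeds in m/s."""
    speed_ms = current_ms + WIND_FACTOR * wind_ms
    return speed_ms * hours * 3600 / 1000.0

# e.g. a 0.2 m/s current and a 10 m/s wind over 24 hours
print(round(drift_km(0.2, 10.0, 24), 1))  # → 43.2
```

A calculation this cheap can be rerun within minutes of each new buoy reading or spill report, which is the "response-oriented" turnaround Helton describes; a supercomputer adds nothing when the inputs themselves are this uncertain.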
One type of equipment that's not off-the-shelf is the fleet of remotely operated vehicles (ROVs), essentially small remote-controlled deep-sea submarines, complete with robotic arms, that BP is employing to try to work on the leak itself. That work has been unsuccessful so far, but the company is nonetheless operating the ROVs a mile below the surface, controlling them from offices in Houston.
In the meantime, as the oil continues to flood into the Gulf of Mexico, the Coast Guard is sending crews equipped with cameras and GPS along the shoreline to do surveys. The idea there, Helton explained, is to gather data on conditions on the ground and send that information back to a central database within minutes. In the past, he said, that information took much longer to gather and deliver because it required downloading files and e-mailing them to others, with all the delays inherent in such manual steps.
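The shoreline surveys described above amount to sending GPS-tagged observations to a central database. The sketch below shows one plausible shape for such a record; the field names, oiling categories, and coordinates are all hypothetical, not the Coast Guard's actual schema.

```python
import json
from datetime import datetime, timezone

# Hypothetical sketch of a GPS-tagged shoreline survey record of the
# kind a field crew might send to a central database. The schema and
# values below are invented for illustration.

def survey_record(lat, lon, oiling, photo_file):
    """Serialize one shoreline observation as a JSON record."""
    return json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "lat": lat,
        "lon": lon,
        "oiling_level": oiling,   # e.g. "none", "light", "heavy"
        "photo": photo_file,
    })

record = survey_record(29.2366, -89.0379, "light", "IMG_0042.jpg")
print(record)
```

Serializing each observation as it is made, rather than batching photos and notes for later e-mail, is what collapses the reporting delay from days to minutes.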
Helton said that this kind of data gathering is fairly new to biological and shoreline survey work.
The holy grail for oil spill responders has always been technology that can detect exactly where the oil is and how thick it is.
"The rule of thumb is that 90 percent of the oil is in 10 percent of the area," Helton said. "But if you're flying over, [the oil is] patchy, but it's hard to find where the heaviest patches are. But that's where you want to direct responders and dispersants."
Today, that job is done visually, forcing observers to infer how thick the oil is in any given area. So for people like Helton, the next great technological advance will be sensors that can measure, one way or another, the thickness of the oil in any given patch. For now, though, such sensors exist only in experimental projects; none is operational yet.
In the end, to someone like Helton, it's important to point out that while plenty of technology is being employed in the spill cleanup, it is still relatively low-tech work that will require an enormous amount of manual effort throughout the Gulf Coast region as oil from the Deepwater Horizon reaches and collects on the shore.
"There are improvements and efficiencies on the edges," Helton said. "Still, it's going to be a lot of labor and a lot of messy, dirty activity to clean this up."