Geophysicists, who are quick to point out that they're not soothsayers, won't answer the question directly. Instead, they rely on statistics to predict the odds of a major quake--not unlike casino bookies in Las Vegas, but with plenty more at stake.
While they're no more sure of the next big earthquake than a gambler is of the outcome of a football game, scientists at the U.S. Geological Survey are getting better at setting the odds and getting the word out thanks to new technologies and the near-ubiquity of the Web.
Dramatic improvements to computing power in the last 15 years have put real-time information at researchers' fingertips and helped advance warning systems around the globe. Calculating the size, scope and probability of earthquakes commands a lot of computing power, but with the growing capabilities of PCs, what used to take hours to compute now takes seconds.
"Some of the calculations we do are incredibly CPU-intensive. Calculations that took 20 minutes to run a decade ago now take a minute," said Dave Oppenheimer, a seismologist with the USGS.
The technology used to detect earthquakes and measure the straining of the ground has also advanced dramatically during the digital revolution. Global positioning systems, satellites, radio and the Web have changed the techniques used to measure and disseminate data on the Earth's changes. A happy side effect is that with this new access to information, seismologists and researchers are developing new analysis tools.
For example, GPS networks give scientists new ways to think about how earthquakes interact. A new instrument called a broadband seismometer can measure how the ground moves up and down over several minutes, while a conventional seismometer can only detect quick motions over seconds. The tool can assist researchers by providing more data on the length and effects of different types of quakes.
"The confluence of the Web and more capable machines has propelled the seismology community," said Oppenheimer.
A spate of earthquakes along the California coastline has had seismologists hopping over the last two months. On June 15, a magnitude 7.2 offshore earthquake raised fears of a tsunami all the way from the Canadian border to Mexico. Two days later, a magnitude 6.7 earthquake also struck offshore.
Both occurred along the San Andreas fault, which had its last significant rupture in 1906 in Northern California--a devastating 7.8 quake that damaged San Francisco and its surrounding area.
"Do they foretell a big one? No, but we know something about the big faults that have ruptured," said Ross Stein, a geophysicist at the USGS.

Watching for waves
Here is what they know: Because the San Andreas ruptures about every 200 to 250 years and is only halfway through its current cycle, the USGS believes the fault has between a 5 percent and 10 percent chance of triggering a major earthquake in the next 30 years. The recent quakes exert more pressure on the fault, which boosts the odds slightly for a short period of time, Stein said.
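The flavor of that arithmetic can be sketched in a few lines. This is a toy illustration using a memoryless Poisson model, not the USGS method--real forecasts use time-dependent renewal models that also account for how far a fault is into its cycle, which is why the agency's numbers differ:

```python
import math

def chance_in_window(mean_recurrence_years, window_years=30):
    """Probability of at least one rupture in the window under a
    memoryless Poisson model: p = 1 - exp(-window / recurrence).
    A simplification; real forecasts condition on elapsed time."""
    return 1.0 - math.exp(-window_years / mean_recurrence_years)

# A fault that ruptures on average every 225 years:
p = chance_in_window(225)
print(f"{p:.0%}")  # about 12 percent over 30 years
```

The 225-year figure here is just the midpoint of the article's 200-to-250-year range, chosen for illustration.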
At least one Northern California earthquake threatened to create killer tsunamis because of its proximity to an adjacent fault called the Cascadia subduction zone. It's a 680-mile fault line running from Northern California to southern British Columbia, which last produced a disastrous tsunami in the 1700s that swept over Japan. If the earthquake had ruptured the fault with vertical motion, its magnitude could have been enough to produce a massive oceanic shift.
Scientists quickly tried to gauge how the fault broke in order to warn nearby coastal cities.
To do this, they computed what's called a "moment tensor," a mathematical representation of how stress is released in an earthquake, to model whether the fault broke vertically or horizontally--the difference between a killer wave and large waves.
"That calculation used to take 10 minutes. Now we routinely do it in 30 seconds," Oppenheimer said. As it happened, the quake's sideways movement did not produce a monstrous wave, but it did put additional stress on the fault.
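The distinction that calculation draws can be caricatured in a few lines of code. This is a toy sketch, not the USGS algorithm: it assumes a 3x3 symmetric moment tensor in hypothetical (north, east, up) coordinates and simply compares how much of the moment involves the vertical axis:

```python
def dominant_motion(M):
    """Toy classifier for a 3x3 symmetric moment tensor, axes
    (north, east, up): did the fault slip mostly vertically
    (dip-slip, the tsunami-capable case) or mostly horizontally
    (strike-slip)? Illustrative only, not a production algorithm."""
    vertical = abs(M[0][2]) + abs(M[1][2]) + abs(M[2][2])
    horizontal = abs(M[0][0]) + abs(M[0][1]) + abs(M[1][1])
    return "dip-slip" if vertical > horizontal else "strike-slip"

# Pure strike-slip on a vertical fault: the moment sits in the
# horizontal (north-east) components, so the seabed barely moves
# up or down and no great wave is launched.
strike_slip = [[0.0, 1.0, 0.0],
               [1.0, 0.0, 0.0],
               [0.0, 0.0, 0.0]]
print(dominant_motion(strike_slip))  # strike-slip
```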
The scientific community is also testing an early warning system that relies on sensors placed in areas surrounding faults, connected back to seismic headquarters in Colorado. With the sensors, scientists can locate an earthquake and compute its magnitude before the shaking reaches a given area. If a wave travels at a typical speed of 4 miles per second and the sensors send data at the speed of light, then for quakes offshore, the USGS could offer something like a 10-second warning.
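The arithmetic behind that 10-second figure is simple enough to sketch. The 4-miles-per-second wave speed is the article's figure; the processing-delay parameter is a hypothetical addition:

```python
def warning_seconds(distance_miles, wave_speed_mps=4.0, processing_delay_s=0.0):
    """Seconds of warning before shaking arrives: the seismic wave
    crawls at a few miles per second while the alert travels at
    network speed, so lead time is roughly the wave's travel time
    minus whatever time detection and processing consume."""
    return max(0.0, distance_miles / wave_speed_mps - processing_delay_s)

# An offshore epicenter 40 miles from a coastal city:
print(warning_seconds(40))  # 10.0 seconds of lead time
```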
In the last month, Oppenheimer and others at the USGS have been sizing up the California earthquakes for potential links. Since June, two magnitude 5 earthquakes have hit Southern California, followed by the two larger offshore quakes in Northern California. On June 26, another quake, of magnitude 4.8, shook up the Lake Tahoe region.
In some sense, all of the earthquakes are linked because they're part of the San Andreas system that runs the length of the state of California. Stein said the first quake in Northern California likely caused the other, like dominos falling, and triggered smaller shifts felt around San Francisco. But those likely did not cause the ruptures in Southern California.
One of the two Southern California quakes was right on the Yucaipa fault, and the other was on the San Jacinto fault. Though the southern earthquakes were roughly 1,000 times weaker, they were close to the San Andreas fault and hit densely populated areas.
Still, that portion of the San Andreas is in greater danger of rupturing soon. Stein said the last magnitude 8 earthquake in that area happened roughly 325 years ago, but the average span between such ruptures is 200 to 215 years. That means there's a 30 percent chance that it will rupture in the next 30 years, Stein said.
"The little ones probably don't do much, but the message is: We have a hazardous fault cheek by jowl in a very populated place," he said. "It's a one-in-three shot."
As for the odds of a tsunami from the Cascadia subduction zone, Oppenheimer said there's less than a 5 percent chance over the next 30 years that killer waves will hit California. That's because tsunamis produced by magnitude 9.0 quakes on the Cascadia subduction zone occur every 500 to 600 years, and the last one, which sent a tsunami washing over Japan, happened in the 1700s.
"The latest quakes ratchet up the probability a bit for months to a couple of years," said Oppenheimer.
In the long term, the challenge is to wire cities and rural areas to respond appropriately. For example, a warning system could automatically turn traffic lights red so that people wouldn't drive onto bridges, or it could alert trains to avoid underground pathways. Such technology could have saved lives in Thailand and Sri Lanka, but no warning system was in place to alert people on the beaches.
"The effort over the past decade has been to get any information we would typically compute in a research environment through automatic means," Stein added.
The USGS is also now developing software that harnesses short text messaging to alert people of earthquakes via cellular phones. It is testing a system that will push information to local emergency managers, but in the future it could be used as a public service system that could blanket cell phones with warnings.
Still, technology of the digital age is as yet untested in a major earthquake.
"We don't know how brittle the systems will be," said Oppenheimer.