Geophysicists, who are quick to point out that they're not soothsayers, won't answer the question directly. Instead, they rely on statistical models to predict the odds of a major quake--not unlike casino bookies in Las Vegas, but with plenty more at stake.
While they're no more sure of the next big earthquake than a gambler is of the outcome of a football game, scientists at the U.S. Geological Survey are getting better at setting the odds and getting the word out thanks to new technologies and the near-ubiquity of the Web.
While the technology of the digital age has helped scientists get a significantly better handle on setting earthquake odds, the tools are as yet untested in a major tremor.
Dramatic improvements to computing power in the last 15 years have put real-time information at researchers' fingertips and helped advance warning systems around the globe. Calculating the size, scope and probability of earthquakes commands a lot of computing power, but with the growing capabilities of PCs, what used to take hours to compute now takes seconds.
"Some of the calculations we do are incredibly CPU-intensive. Calculations that took 20 minutes to run a decade ago now take a minute," said Dave Oppenheimer, a seismologist with the USGS.
The technology used to detect earthquakes and measure the straining of the ground has also advanced dramatically during the digital revolution. Global positioning systems, satellites, radio and the Web have changed the techniques used to measure and disseminate data on the Earth's changes. A happy side effect is that, with this new access to information, seismologists and researchers are developing new analysis tools.
For example, GPS networks give scientists new ways to think about how earthquakes interact. A new instrument called a broadband seismometer can measure how the ground moves up and down over several minutes, while a conventional seismometer can only detect quick motions over seconds. The tool can assist researchers by providing more data on the length and effects of different types of quakes.
"The confluence of the Web and more capable machines has propelled the seismology community," said Oppenheimer.
A spate of earthquakes along the California coastline has had seismologists hopping over the last two months. On June 15, a magnitude 7.2 offshore earthquake raised fears of a tsunami all the way from the Canadian border to Mexico. Two days later, a magnitude 6.7 earthquake also struck offshore.
Both occurred along the San Andreas fault, which had its last significant rupture in 1906 in Northern California--a devastating 7.8 quake that damaged San Francisco and its surrounding area.
"Do they foretell a big one? No, but we know something about the big faults that have ruptured," said Ross Stein, a geophysicist at the USGS.

Watching for waves
Here is what they know: Because the San Andreas ruptures about every 200 to 250 years and is only about halfway through its current cycle, the USGS believes the fault has between a 5 percent and 10 percent chance of triggering a major earthquake in the next 30 years. The recent quakes exert more pressure on the fault, which boosts the odds slightly for a short period of time, Stein said.
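To see roughly where numbers like these come from, here is a minimal sketch of the simplest possible model: a memoryless Poisson process with a 200-to-250-year mean recurrence interval. This is an illustration only, not the USGS's actual method--the agency's published odds come from time-dependent renewal models that account for where the fault sits in its cycle, which is why the simple version below lands somewhat above the quoted 5-to-10-percent range.

```python
import math

def poisson_rupture_probability(recurrence_years: float, window_years: float) -> float:
    """Chance of at least one rupture in the window, assuming ruptures
    arrive as a memoryless Poisson process with the given mean
    recurrence interval (a deliberate simplification)."""
    rate = 1.0 / recurrence_years          # expected ruptures per year
    return 1.0 - math.exp(-rate * window_years)

# Illustrative run: a ~200-250 year cycle and a 30-year forecast window.
for recurrence in (200, 250):
    p = poisson_rupture_probability(recurrence, 30)
    print(f"{recurrence}-year cycle: {p:.1%} chance of rupture in 30 years")
```

The gap between this back-of-the-envelope answer and the official estimate is the point: forecasters gain real information from knowing how much time has elapsed since the last rupture, which a memoryless model throws away.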
At least one Northern California earthquake threatened to create killer tsunamis because of its proximity to an adjacent fault called the Cascadia subduction zone. It's a 680-mile fault line running from Northern California to southern British Columbia, which last produced a disastrous tsunami in the 1700s that swept over Japan. If the earthquake had occurred on the fault at a vertical slant, its magnitude could have been enough to produce a massive oceanic shift.
Scientists quickly tried to gauge