
Conoco hopes to hit oil with slick supercomputer

The oil company builds a Linux-based machine for finding oil and gas beneath the Earth's surface, saying it costs a tenth of the average price of a conventional supercomputer.

Stephen Shankland
Conoco has built a Linux-based supercomputer for finding oil and gas beneath the Earth's surface, saying the machine costs a tenth of the average price of a conventional supercomputer.

The computer comprises dozens of single- and dual-processor Intel-based computers connected by a 1-gigabit-per-second network, along with 10 terabytes of hard disk storage and a tape library. The company declined to provide further details on the hardware.

The machine can perform 500 billion calculations per second, said Alan Huffman, manager of Conoco's seismic imaging technology center.

Linking collections of relatively inexpensive Linux computers has become a popular way to build low-budget supercomputers called "Beowulf clusters." Conoco used a method similar to the Beowulf approach, including its own modifications to the Linux kernel, Huffman said.

Not all computing tasks work well on Beowulf systems, which can be hobbled by comparatively slow communication among the nodes that make up the system. In addition, software written for traditional supercomputers must be rewritten to work on a Beowulf cluster.
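To illustrate the kind of restructuring that rewrite involves, here is a minimal sketch, not Conoco's code, of the message-passing style typically used on Beowulf clusters: a master node scatters a batch of seismic traces to worker nodes, each node crunches its share locally, and results are gathered back over the network. The trace counts and the "process_trace" step are hypothetical placeholders.

```c
/* Hypothetical MPI sketch of Beowulf-style seismic processing.
 * Not Conoco's software; sizes and the per-trace math are made up. */
#include <mpi.h>
#include <stdlib.h>

#define TRACES    1024   /* hypothetical number of seismic traces */
#define TRACE_LEN  512   /* hypothetical samples per trace        */

static void process_trace(double *trace, int len) {
    for (int i = 0; i < len; i++)
        trace[i] *= 2.0;  /* placeholder for the real imaging math */
}

int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int per_node = TRACES / size;   /* assume traces divide evenly among nodes */
    double *all = NULL;
    if (rank == 0)                  /* only the master holds the full data set */
        all = calloc((size_t)TRACES * TRACE_LEN, sizeof(double));

    double *mine = malloc((size_t)per_node * TRACE_LEN * sizeof(double));

    /* Scatter traces to the nodes. This network hop is the communication
       cost that can hobble a Beowulf system on tightly coupled problems. */
    MPI_Scatter(all, per_node * TRACE_LEN, MPI_DOUBLE,
                mine, per_node * TRACE_LEN, MPI_DOUBLE,
                0, MPI_COMM_WORLD);

    for (int t = 0; t < per_node; t++)
        process_trace(mine + (size_t)t * TRACE_LEN, TRACE_LEN);

    /* Gather the processed traces back on the master node. */
    MPI_Gather(mine, per_node * TRACE_LEN, MPI_DOUBLE,
               all, per_node * TRACE_LEN, MPI_DOUBLE,
               0, MPI_COMM_WORLD);

    free(mine);
    free(all);
    MPI_Finalize();
    return 0;
}
```

Built and launched with standard MPI tooling (for example, `mpicc` and `mpirun -np 32`), a program in this style spreads its work across however many cluster nodes are available, with the cluster's network carrying the scatter and gather traffic.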

Conoco decided its Linux machine was a good idea, even accounting for the $4 million cost of rewriting its software. The hardware cost an additional $1.75 million, Huffman said, and the company will spend another $1.5 million to double the computer's performance by the end of 2001.

Conoco started watching Beowulf cluster research at Sandia National Laboratories, Los Alamos National Laboratory and Lawrence Livermore National Laboratory, Huffman said, and soon was convinced of the economic merits of the approach.

"Even with the software re-engineering included, we were able to build the system for about 10 percent of what an equivalent Cray or other supercomputer architecture would have cost us," Huffman said. "We very quickly realized the cost-performance difference...between the Intel and Cray systems was going to get very big very quickly."

Beowulf systems are popular not only at Linux start-ups such as Atipa and Linux Networx but also at big companies such as IBM, Compaq Computer and Dell Computer. Seismic analysis, which maps the Earth's interior to find likely oil and gas reserves, is one lucrative job amenable to the method.

Dell leased a 64-computer system to oil company Amerada Hess. Linux Networx, which has opened a Houston office to accommodate oil and gas industry customers, sold a 32-processor cluster to seismic analysis company 3DGeo Development.

Another advantage of Conoco's system is that it can be split into several independent pieces that can be taken to different parts of the globe where seismic research is being conducted, Huffman said. That matters in countries where Conoco partners with nationally owned oil companies that require the first crack at data analysis to take place on their own soil. In addition, local analysis is necessary for remote locations such as central Asia or the middle of the ocean.

With the new system, Conoco can break off units of 32, 64 or 128 CPUs and send them along with seismic researchers, he said. He estimated that one subcluster would be out in the field at any given time.

"We currently have an eight-node system on a ship in Indonesia," he said, a system too small for full data analysis but large enough to see if the information is of high enough quality.