
Linux company to work on supercomputer

Linux NetworX will collaborate with SGI and Lawrence Livermore National Laboratory to develop software for storing data on "clusters" of Linux computers.

Stephen Shankland
SAN FRANCISCO--Linux NetworX, a company that focuses on joining collections of Linux computers into a supercomputer, will collaborate with SGI and Lawrence Livermore National Laboratory to develop software for storing data on these "clusters" of computers, the organizations said Tuesday.

The groups will work on a parallel file system, software that controls how files are stored across a collection of servers. The work will be geared to meet the demands of the Energy Department's Accelerated Strategic Computing Initiative (ASCI), an effort under way since the mid-1990s to simulate nuclear weapons tests.
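To illustrate the basic idea behind a parallel file system (this is not the project's actual design), consecutive blocks of a single file are spread across many storage servers so they can be read and written in parallel. A minimal sketch in Python, with hypothetical server names and an arbitrary block size:

# Sketch of file striping, the core idea behind a parallel file system:
# consecutive blocks of one file are assigned round-robin to several
# storage servers so clients can transfer them in parallel. The server
# names and block size are illustrative only.

BLOCK_SIZE = 64 * 1024  # 64KB blocks, an arbitrary choice for this sketch
SERVERS = ["server0", "server1", "server2", "server3"]

def stripe(data: bytes) -> dict[str, list[bytes]]:
    """Split data into blocks and assign them round-robin to servers."""
    placement: dict[str, list[bytes]] = {name: [] for name in SERVERS}
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i : i + BLOCK_SIZE]
        server = SERVERS[(i // BLOCK_SIZE) % len(SERVERS)]
        placement[server].append(block)
    return placement

# A 1MB file ends up spread evenly across the four servers, so all
# four can serve pieces of it at once.
layout = stripe(b"x" * (1024 * 1024))
for name, blocks in layout.items():
    print(name, len(blocks), "blocks")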

In addition, SGI Federal, the SGI subsidiary that sells computers to the government, will team with Linux NetworX to build three Linux clusters for the ASCI program that will have a total of 472 Pentium 4 CPUs at the Livermore lab. The companies announced the news Tuesday at the LinuxWorld Conference and Expo.

The file system must be able to span 1,024 computers, read or write 32GB of data a second, keep running even when individual servers fail, and lock files so two computers can't access a given file at the same time.
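The locking requirement is the familiar exclusive-access guarantee, extended across the whole cluster. The sketch below shows the single-machine version of the concept using POSIX advisory locks from Python's standard fcntl module; a cluster file system must enforce the same exclusion among all 1,024 nodes, typically through a distributed lock manager. The file name here is hypothetical:

# Single-machine illustration of exclusive file locking; a parallel
# file system must provide the equivalent guarantee cluster-wide.
import fcntl

with open("shared.dat", "a") as f:
    fcntl.flock(f, fcntl.LOCK_EX)  # block until we hold the exclusive lock
    try:
        f.write("only one writer at a time\n")
    finally:
        fcntl.flock(f, fcntl.LOCK_UN)  # release so other processes can proceed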

Because the software is being developed as an open-source project, it could see broader use than IBM's proprietary General Parallel File System, said Giga Information Group analyst Stacey Quandt.