The demonstration was done in part to convince the industry that a standardized PC architecture can run corporate networks.
At Comdex Enterprise in San Francisco, Dell showed 16 servers based on Intel's Pentium II Xeon processors and Windows NT working together in an extended "clustered" configuration.
Clustering refers to the practice of connecting multiple servers together to provide better data backup and higher performance and is a hallmark of high-end Unix systems running at large corporations.
Currently, using standard technology, Microsoft's Windows NT operating system can only be used to create a two-server cluster. Clustering gives users the benefit of data protection, a fail-safe arrangement in which one server takes over the work of another in the event of a crash.
Unix-based servers, by contrast, can handle many servers in a cluster.
Progress in clustering has come in the form of proprietary technology from third-party vendors: Companies such as Sequent and Compaq's Tandem division extend that limit to 16 servers and also allow processing tasks to be shared across the different servers.
Moving up the corporate computing ladder has been a goal for a number of PC-centric companies. Intel has been attempting to position Xeon as the first in a series of chips designed to move the Intel chip platform into the high-end "enterprise" corporate computing market, currently dominated by computers based on a different chip architecture running variants of the Unix operating system.
Gaining a foothold in this market is seen as key to maintaining Intel's historically high profit margins, as well as Dell's torrid revenue growth.
The first step, however, has already been beset by several bugs in the associated chipsets, and other roadblocks still lie in the path of Intel and its associated vendors.
"The goal is to offer more bang for buck. But as you try to run one general purpose operating system on this very distributed structure, you run into performance roadblocks," said Carl Howe, director of computing strategies at Forrester Research.
The Windows NT operating system is widely regarded by analysts and IS professionals as one of the sources of those performance roadblocks. And Howe said Intel's Xeon chip, while capable of accessing 64 gigabytes of memory, still can't match the memory capacities offered by mainframe systems. Systems sometimes need vast amounts of memory reaching into the thousands of gigabytes, or terabytes, in order to process complex calculations.
To get around some of the Windows NT performance issues, switching technology from GigaNet will be used to connect and direct information between the servers.
GigaNet's technology employs the Virtual Interface Architecture specification (an initiative supported by Microsoft and Intel) to improve performance between "cluster-aware" applications such as DB2. These applications bypass the operating system and have direct access to the network, resulting in higher performance to go along with the fail-safe data protection to which NT's clustering technology is currently limited.
While some do not consider this "true" clustering, analysts say the enterprise market is looking for these kinds of technologies to tie together its NT-based systems.
"People are just frustrated. Microsoft has not been moving fast enough for what the industry needs," said Dan Dolan, an analyst with Dataquest. Dolan expects IBM and Compaq to become more aggressive in bringing their own clustering technologies to bear in a commoditized server market as a means to differentiate their products.