Commentary: Distributed computing is a potential boon for research

SETI@home is an ideal distributed computing project because the data is very easily parsed into chunks and scheduling is fairly easy.

Gaining benefits from distributed computing has been a holy grail for years. One problem is that it is much harder to do across multiple, different operating systems, because the software has to run on each kind of computer.

See news story:
Companies working on standard for distributed computing

Technical computing applications frequently are characterized by complex operations applied to fairly simple data easily parsed into discrete chunks.
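This pattern--a complex operation applied independently to simple, discrete chunks--can be illustrated with a minimal sketch. All names here are hypothetical, and the "complex operation" is a stand-in, not any particular application's computation:

```python
def make_work_units(samples, chunk_size):
    """Split a flat list of samples into fixed-size work units
    that independent machines could process in any order."""
    return [samples[i:i + chunk_size]
            for i in range(0, len(samples), chunk_size)]

def process_chunk(chunk):
    """Stand-in for a compute-heavy operation on one simple chunk."""
    return sum(x * x for x in chunk)

units = make_work_units(list(range(10)), chunk_size=4)
# Three independent work units: [0..3], [4..7], [8..9]
results = [process_chunk(u) for u in units]
```

Because each work unit depends on nothing outside itself, the units can be farmed out to as many machines as are available.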

For years, engineers and scientists have been devising distributed system software schemes for exploiting unused processing power on super-fast, networked technical computing workstations. However, the software used is invariably proprietary and complex, making ongoing support and upgrades difficult.

Moreover, this type of highly networked, parallel processing environment is rarely suitable for traditional, general-purpose computing.

SETI@home is the largest, most public example of a highly networked, parallel processing computing environment. Home computers of thousands of volunteers spend their spare CPU cycles analyzing pieces of radio telescope data, looking for possible signals from intelligent life on other planets--essentially doing a large amount of important research at no real cost to anyone.

SETI@home is an ideal distributed computing project: the data is very easily parsed into chunks; scheduling is fairly easy, since results do not have to be compiled in any specific order; and it runs on a homogeneous computer population: Windows PCs.
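A short sketch shows why scheduling gets easy when results need no specific order: clients can return chunks whenever they finish, and the server simply files each result under its chunk ID. This is purely illustrative--it is not SETI@home's actual protocol, and all names are invented:

```python
import random

def distribute_and_collect(chunks):
    """Hand chunks to clients and accept results in arbitrary
    completion order, keyed by chunk ID so order never matters."""
    pending = list(enumerate(chunks))
    random.shuffle(pending)            # simulate clients finishing at random times
    results = {}
    for chunk_id, data in pending:
        results[chunk_id] = max(data)  # stand-in for the real signal analysis
    # Reassemble by ID only at the end, if an ordered view is needed at all.
    return [results[i] for i in range(len(chunks))]

print(distribute_and_collect([[1, 5], [9, 2], [4, 4]]))  # -> [5, 9, 4]
```

No matter what order the shuffled "clients" report in, keying by chunk ID makes the final output identical, which is what frees the scheduler from tracking completion order.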

A little closer to home are weather prediction and research projects such as genome mapping. However, corporate financials, ERP and CRM would be nearly impossible to run over a distributed computing environment.

This is because distributed computing is a very good way to solve only certain computing problems--those that break into independent chunks--and only a subset of companies have such problems.

The recently announced effort by companies including Hewlett-Packard, Compaq Computer and SGI, in conjunction with distributed computing software seller Platform Computing, will face sizable challenges. It attempts to provide this type of broadly networked, parallel processing technology across a heterogeneous group of computing platforms, which adds considerably to the complexity of the problem.

However, the platforms involved are logical choices since most--like SGI and HP--are strong in research and design applications, where this kind of parallel processing has most of its potential application.

This is one area of computing where Microsoft does not dominate, so the lack of Microsoft involvement will not be a determining factor in the success of Platform Computing--at least not for now.

For companies that have computing problems that lend themselves to parallel processing, distributed computing can provide a low-cost substitute for time on a supercomputer. However, this is mainly a solution for research departments, which will probably install their distributed computing software themselves.

Corporate IT will probably have no involvement in or control of the project. IT should not consider this for mainstream enterprise business applications, which do not lend themselves to the kind of parsing required for distributed computing.

Meta Group analysts William Zachmann, Dale Kutnick, Val Sribar, and Peter Burris contributed to this article.

Entire contents, Copyright © 2000 Meta Group, Inc. All rights reserved.