A grand scheme introduced in the early 1990s as a way to link disparate computer systems is being supplanted by Web technologies popularized by the Internet's ubiquity.
Once the only game in town, this technological lingua franca--known as Distributed Computing Environment, or DCE--is being abandoned even by some of its most ardent supporters of years past. And DCE's one success story, its security system, is losing key backers as newer, better, and more easily distributed technologies emerge.
"What attracted us to DCE in the first place was the infrastructure and security services," said Ted Hanss, former IT director at the University of Michigan and now director of application development for the Internet2 project. Hanss and the university embraced DCE long ago, hoping that the technology could serve as a universal bridge for the massive jumble of mainframe, workstation, and Unix server systems that made up the computing world.
The school recently ditched that idea. The final straw, according to Hanss, was that the technology couldn't be easily adapted to help link packaged software applications.
"We were deploying Oracle and PeopleSoft, and there was no way to get DCE integrated in," he said. "We weren't getting help from the software vendors, so we would have to spend an inordinate amount of time porting DCE services to specific systems--and that's what we were trying to avoid in the first place."
DCE grew out of the late 1980s, when large companies found themselves needing a way to connect a growing number of computer systems from different manufacturers. In the 1960s, '70s, and '80s, big companies still relied almost exclusively on big iron--self-contained mainframe systems that either weren't connected to other computer systems or shared only rudimentary file exchange.
That all changed with the advent of Unix. So-called "minicomputers" began to invade corporate IS shops, as corporations turned to Unix-based systems from Hewlett-Packard, Sun Microsystems, Data General, and IBM. Those systems were fast and relatively cheap compared with multimillion-dollar mainframes.
Despite a common Unix ancestry, each system spoke its own language and ran its own flavor of Unix. As the number of systems in a single company went from a solitary mainframe to multiple dissimilar Unix systems, the need for a common way to link systems became imperative.
That led to the formation of the Open Software Foundation, now the Open Group, a vendor-sponsored consortium that launched a far-reaching scheme for linking systems, called DCE.
Like many techno-religions, DCE still maintains a small band of die-hard followers intent on pursuing its development. Among them, Hanss notes, are Penn State University, the Energy Department, and the United Kingdom's postal service, which uses the technology to link its mainframes and Unix systems.
But their ranks have thinned as analysts like Giga Information Group's Mike Gilpin advise companies to look elsewhere for distributed technology. "Now when people want to bridge systems, they think of CORBA or Java," he said.
The hope was that DCE would be a vendor-neutral technology for linking systems from multiple vendors, so companies could exchange data and people could use a single logon for all systems.
That theory soon clashed with reality. Early in its life, DCE was branded as too complex by analysts and large companies that labored to install the framework, leaving many companies reluctant to commit to it.
The one bright spot has been DCE's security service, based on an MIT-developed technology called Kerberos, which has been adopted by a handful of IS departments as a convenient way to cover multiple systems under a single security umbrella.
On a broader scale, however, much of what DCE promised can now be delivered by far less complex technologies. One-time loyal supporter IBM, for one, is shifting away from the technology in its software, looking to move its Component Broker middleware toward a public key-based approach instead.
The security mechanisms of Distributed Computing Environment have largely been surpassed by new Web-based technologies, such as digital certificates and public-key infrastructure (PKI) encryption, which offer stronger protection that's easily distributed through the Internet, according to a report written by Gilpin.
And despite efforts to link DCE to Windows NT, Java, and other newer forms of computing, analysts are recommending that companies look elsewhere to connect different systems--namely, toward technologies popularized by the Net, especially those that support Web browsers.
DCE's other services--threads, remote procedure calls, directory, time, distributed file service, and distributed management environment--have all been equaled or surpassed by newer technologies, many in the industry say.
"Some of the problems that DCE set out to solve are slowly being addressed," Hanss said. "Things like LDAP [the Lightweight Directory Access Protocol] and other technologies are making it easier to distribute systems."
Because the newer technologies are still evolving, he added, DCE retains a head start in some key areas. "Public key encryption still needs some work, and some things like key revocation are missing. With Kerberos, if you want to get rid of someone's privileges, you just delete them from the database."
For companies that have already started down the path toward Distributed Computing Environment, the good news is that The Open Group, the industry consortium that promotes the technology and is responsible for the specification, has announced new ways to link it to public key encryption.
Perhaps most important to DCE's survival is the fact that large companies have poured substantial resources into its development, giving them a vested interest in keeping the technology alive.
"DCE isn't dying," Gilpin said. "It's just much less prevalent, and there are better ways to do the same thing."