
Here to there and back?

A sea change may be under way with diskless PCs called network computers. If it catches on, the NC will mean at least a partial return to centralized, network-based computing, and a lot of IS managers can't wait.

In 1984, Apple Computer portrayed corporate computer users as drones, numbed and brainwashed into mute compliance with a Big Brother-like ruler who shouted imprecations from on high. The aim was for the ruler to symbolize, in the computer hierarchy, the all-powerful mainframe. In Apple's narrative, the wretched souls could only be rescued by an athlete whose power, grace, and speed represented the PC, and, more specifically, the Macintosh. In any case, the image stuck. The PC wasn't a computer; it was a revolution.

Since then, if as an IS manager you weren't into PCs, you were a passive corporate slave and that was all there was to it. Client-server computing, where the PC under the direct control of the user held all the cards, was where things were at.

Another revolution, however, may be under way with the emergence of a breed of diskless PC alternatives called network computers. While the Internet has given a whole new purpose to these boxes, in many ways they look a lot like the so-called dumb terminals those corporate slaves used. Is it a nostalgic trip back to the future, or just a big marketing stunt to wrest away control from the new Big Brother in Redmond?


Whatever it is, if it catches on it will mean at least a partial return to the kind of centralized, network-based computing that became passé as soon as that Apple commercial aired. It turns out that a lot of IS managers just can't wait.

It's a tribute to the marketing might of Oracle and its allies that NCs are being portrayed as a breakthrough technology. In terms of basic schematics, a computing architecture built around NCs is just a flashier version of the old dumb terminals and mainframes, where all the data and most of the processing power live on a computer that the user may never see and certainly can't touch.

"This is a new variation on an old concept, as far as having most of your resources centralized where they can be used efficiently," said John Seeley, a database administrator for the Air Force Wargaming Institute at Maxwell Air Force Base in Alabama. "I wouldn't call the NC a dumb terminal. You might want to call it a brilliant terminal because it's going to have a lot more computing power."

Nelson Petracek, a consultant with DBCorp, an information systems integration and consulting firm, said the NC is so improved it really is a new idea: "I see the NC as a completely new technology. It contains a lot more than a dumb terminal, in terms of the ability to access a lot more information worldwide. It [also] has a better user interface."

The thin-client model isn't really centralized like the old mainframe architecture. With mainframes, there was only one ingress and egress point, kept literally under lock and key, but NCs will connect to multiple servers scattered throughout a company.

It's really a combination of the centralized and distributed models, with data residing on distributed servers but with a lot of the high-maintenance features removed from the client. But, as with dumb terminals, users will have to adapt to the machine more than the machine will adapt to them.

This is part of why Intel and Microsoft hate NCs. Network computers really do undermine all that "power of the people" PC panache. But a sampling of IS managers, programmers, and consultants interviewed by CNET says that this notion of a revolution is overrated. A new cost-conscious realism that's willing to make room for the NC is back in style. After all, empowering everybody is really expensive.

"Some employees like to control what goes on [their] computer and how to access it. But as time goes on, people will see that the NC is easier to maintain and that the NC is the way to go," said Tijuana Glover, an engineer and scientist with TRW's systems integration group.

"I think it's a great idea to have server-based applications," said Roger Johnson, a developer with Ericsson Data. He said developers at the Swedish telecommunications giant are already talking about implementing network computers for at least some of the company's 80,000 employees worldwide within the next two years.

The NC's main attraction is its ability to lower support costs. "Right now, we have several support people spending a lot of time running around playing Dr. Feelgood to all of our PCs," said Seeley. "The nice thing about the NC is that theoretically you will have most of the work done on servers and most of the applications on servers, which will make upgrading a lot easier."

True, nobody is ready to give the PC its last rites. NCs are barely out of the prototype stage, and few big corporations have even started to evaluate them. But the combined promises of cheap and simple management and the ability to develop centralized applications without having to face the enormous complications of distributed computing are looking more and more attractive to the IS community.

"The network computing device will offer more for the same money," said Mohd Shah, a consultant with Information Services. "The NC will make a low-cost alternate to a very expensive desktop machine."

When this community talks, people listen. Suddenly, Microsoft and Intel are willing to help cut the cost of those desktop machines with an initiative they call the NetPC, a machine they say will be simpler and cheaper to manage but will retain the processing power that made the PC such a potent symbol of individual power.

Still, that may not be enough. Analysts say the truth is that many big companies are frustrated by the technical challenges involved in splitting an application into client and server segments that live all over the network. Plain old common sense dictates that keeping something in one place makes it a whole lot easier to find when you need it. These companies are ready to scale down their client-server efforts to fit in a model where the client is thin and the server is king.

"In a company doing a software implementation today, one of the issues is how to make remote users accessible to the network," said Shah. "A lot of work is going into Web applications, which involve using a PC with a heavy investment. To get more access for the money, many people will look at NCs as a low-cost alternative. My main objective is to get information across to as many people as cheaply as I can."

NCs don't include hard disk drives, so applications must be maintained on corporate servers, giving IS the upper hand in maintaining application and data integrity. They also lack floppy drives, so data-eating viruses are much more difficult to introduce to servers and networks. If you need a new application, you don't have to run to the software store or to your IS department. You may simply download it from the company server or from the Internet.
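In modern, generic terms, that download-and-run model (applications kept on central servers, fetched and executed in memory by a diskless client) can be sketched roughly as follows. This is a schematic illustration only, not any vendor's actual protocol; the catalog, function names, and sample "order_entry" application are all invented for the example, and a real NC would fetch its code over the network rather than from an in-memory dictionary.

```python
# Sketch of the thin-client model: applications live on the server,
# so IS upgrades them in one place; the diskless client fetches one
# on demand and runs it entirely in memory.

SERVER_APPS = {
    # Application code stored centrally (hypothetical example app).
    "order_entry": "result = f'order {order_id} accepted'",
}

def download_app(name):
    """Client side: fetch application code from the central server."""
    return SERVER_APPS[name]   # in reality, a fetch over the LAN or Internet

def run_app(code, **context):
    """Run the downloaded code in memory; nothing touches a local disk."""
    scope = dict(context)
    exec(code, {}, scope)
    return scope["result"]

app = download_app("order_entry")
print(run_app(app, order_id=42))   # prints: order 42 accepted
```

The point of the design is visible even in this toy: fixing a bug in `SERVER_APPS` upgrades every client at once, which is exactly the maintenance win the IS managers quoted here are after.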

"What is neat about this thing is total scalability," said William Phelan, vice president of industry strategy and member relations for floral delivery giant FTD.

FTD is likely to adopt an NC architecture in the form of Sun's new JavaStation to connect its 23,000 members to central applications. Phelan said FTD is also moving development to Java and recently built a Java-based custom order entry application in eight weeks. He estimates that a comparable client-server system would take up to 18 months to develop.

But whether the vendors now hyping the NC see an opportunity to get back at Microsoft or feel genuine remorse for unnecessary client-server projects commissioned over the past ten years, a steady parade of them is now building NCs posthaste. Even IBM, which built its fortune pushing a centralized computing concept based on multimillion-dollar mainframes, has established an NC division. A dozen just-hatched NC designs from Big Blue, Sun Microsystems, and other hardware makers are on display this week at the annual Comdex trade show in Las Vegas.

But the fact remains that once something has been given, it's hard to take away. The NC does rob users of their custom-tailored PCs (just think about handing over your own hard disk), and that all but guarantees a certain amount of backlash. "The NC certainly takes up a segment of the market that has been ignored," said Jeff Markham, a database developer with Science Applications International. "I'm not sure people will want to give up the PC, though."

IS managers attending this week's Comdex may get an opportunity to mix with the masses, assessing the odds that users will give up their PCs, voluntarily or otherwise.

By show time next year, a lot of the still open questions will have been answered: Will NCs really save money? Will the NetPC outmaneuver the NC and its own proponents? Will users rise up in protest? Will the adoption of the NC change how IS departments are managed?

Regardless of the final outcome, the discussion surrounding this newest trend in computing architectures is having a profound and irreversible effect: It lets IS dispense with all the rhetoric associated with the PC. It lets them acknowledge that corporate computing is not an ideology; it's just a job.

"For most of us in the trenches, it's like, 'Just don't screw up our paychecks,'" said Markham. 

Three models for making computers work

Computing requires both processing power (brains) and data (files). Each new architecture is defined by where the processing happens and where the data resides.

Centralized: dumb terminals and mainframes
Pros
- Easy to manage, maintain
- Strong security
- Centralized application development
 
Cons
- All terminals are alike
- Mainframes cost a fortune
- Heavy traffic can slow performance
- Basically paperweights when network goes down
 
 
Distributed: clients and workgroup servers
Pros
- Users get to do their own thing
- Users can still work when network is down
- Most flexible design
 
Cons
- Users get to do their own thing
- Client applications take a lot of work
- High maintenance costs
- Vulnerable, not too secure
 
 
Thin client: network computers and servers
Pros
- Easy to manage, maintain
- Strong security
- Centralized application development
- Client can be customized via applets
 
Cons
- Heavy traffic can slow performance
- Limited application options
- Power users not satisfied
- Basically paperweights when network goes down
 
