He is the driving force behind Windows NT Server, the operating system that will eventually become the software base for all future Microsoft operating systems as well as the company's Windows client franchise. He also has to sell an amalgamation of software tools--now known as Distributed Net Applications (for DNA)--that are intended to spur a new era of applications that take advantage of the processing power of the network, rather than specific computers. CNET's NEWS.COM sat down with the white-haired industry veteran to get his views on Windows NT growth, what DNA means for developers, how Windows will look in the future, and how Microsoft sees itself in an increasingly network-centric world.
NEWS.COM: In your talk at this conference, you made the statement that you are "betting the company" on Windows NT. Could you expand on that?
ALLCHIN: We've made it pretty clear that we're committed to moving to an NT technology base for future versions of consumer and business products after Win 98, so that's the first bet. Second, our strategy is heavily client and server. In fact, I don't think you can really have a good solution today if you're not thinking: "the network." All our products are banking on taking advantage of that--the whole BackOffice wave behind it is banking on the directory, banking on the security system. If you tie all those together, it's a very strategic decision that we're making. We could go create another operating system just in case. We don't have that plan. We're banking on it. It means we can't falter in execution, so that's what I meant by that.
How do you describe Windows DNA and how does it benefit developers?
Windows DNA is an applications architecture. You can think about it as a set of rules for how you go about building applications so you get the best of client/server, PC, and the Internet together. It means that over time, we're going to merge the concepts of the Web page and the classical Windows applications interface, and we will do a similar thing at the processing level and at the storage level.
So what does integrating these things together mean, and how does it make things better for developers? It means that we're going to provide the services in the operating system so they can basically write less code to get more out of the system. A good example is IIS 4.0 and what was talked about in the Option Pack: you don't really have to think about scheduling when a request comes in on the server side; your code just runs. That's very nice. You don't have to think about transactions; you can just say "I want to run in a transaction context" and it'll just run in a transaction context. Typically, that's tremendously complicated for a developer, but now they don't have to do it.
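The declarative model Allchin describes can be sketched as follows. This is a toy analogy in Python, not the actual MTS/IIS API: the `Transaction`, `transactional`, and `update_order` names are all hypothetical illustrations of the idea that the runtime, rather than the developer, handles beginning, committing, and rolling back the transaction.

```python
# Toy sketch of a "declarative transaction" model: the component declares
# that it runs in a transaction context, and the runtime supplies it.
# All names here are illustrative, not from any Microsoft API.

class Transaction:
    """Hypothetical transaction object managed by the runtime."""
    def __init__(self):
        self.committed = False
        self.rolled_back = False

    def commit(self):
        self.committed = True

    def rollback(self):
        self.rolled_back = True


def transactional(func):
    """Decorator standing in for the runtime: it creates the transaction,
    commits on success, and rolls back on error, so the business logic
    contains no transaction plumbing."""
    def wrapper(*args, **kwargs):
        tx = Transaction()
        try:
            result = func(tx, *args, **kwargs)
        except Exception:
            tx.rollback()
            raise
        tx.commit()
        return result, tx
    return wrapper


@transactional
def update_order(tx, order_id, quantity):
    # Business logic only -- no explicit begin/commit/rollback calls.
    return {"order": order_id, "quantity": quantity}


result, tx = update_order(42, 3)
```

The developer writes only `update_order`; everything the decorator does is what a product like Transaction Server would otherwise force the developer to hand-code, which is the "write less code" point being made above.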
How is this different from what Microsoft offers today?
To some degree, you could have done this before; you could say, I could've written a Windows DNA application back in NT 3.51 days. You could've. The problem is you would have written reams of code, monster code. You'd have to write a Web server, a transaction server, a message queuing system, security, and on and on. Every step of the way we're just dropping more into the system so you have to write less code. The thing that became clear to us is that there is an actual architecture for how you write these applications that makes them much more powerful, so you can run it all on one machine, break it across three machines, or break it across a thousand machines.
We're not trying to say DNA is a technology. We're trying to say that DNA is a way that you can approach writing applications and that we have the infrastructure underneath it that supports it. So we're not, in particular, trying to re-label our technology. We have Windows and, given all the technology, we're asking, "How should you write an app in the future?" That's what we're asking. We're saying this is the way you should approach writing your applications.
How would you position this set of application development tools relative to Java?
To some degree you could say it might not be related. I say that because I see Java trying to build this mini-operating system that's layered everywhere. Many of the services we're talking about don't exist in Java today. So when this observation [concerning DNA] hit us, we weren't thinking, "What's our answer to Java?" Our answer to Java is called Windows--our answer to Java the platform, not the language or the virtual machine.
The honest truth is Java didn't come up during the discussion. Java comes up when we talk about Windows, but it didn't come up when we talked about this applications architecture. Instead, we talked about: What's wrong with the Component Object Model [COM]? How can we improve it? Why is it hard for developers? That's what we've been asking for months and months.
NT is now incorporating several enterprise elements. Where does work stand on the multiuser version of NT--code-named Hydra--that supports a variety of clients?
We will ship the NT 4.0 version of Hydra. It will come with NT 4.0 because we have to change the system for it to work. When we do it in NT 5.0, it'll just be an add-on service on top of NT 5.0, so it will not be a new kernel. The NT 4.0 version will be, but that's an anomaly of timing. Our whole focus is to have NT Server and NT Enterprise Server and, on that, you can layer a new service. We're done with NT 4.0, so we're not going to go back and fix the kernel there, but we will with NT 5.0. However, I can't guarantee we won't have to put out a service pack with NT 5.0 in order to fix it. There are too many schedules here for us to know exactly what's happening. We'll ship the 4.0 version of Hydra and then see how close things are to NT 5.0 so we can figure out whether we can get everything into NT 5.0 or not.
What is Microsoft's role in networking?
Our role today is anything dealing with edges--anything on the edge. So it could be remote access coming in, it could be edge routing, branch routing, whatever. Anything on the edge, we see ourselves having a role. We also see our role as integrating with the other network elements that are in the middle--making sure that Cisco Systems, Bay Networks, 3Com, and everybody ties into the directory, making sure that they also work with WBEM [Web-based enterprise management] so we can get a single management view and the like. There are a number of companies who believe NT is a general-purpose system and, as such, could be used in a commodity way for a switch or a router.
They will use NT and we will OEM NT to them, so then it's going to be up to them. There's going to be a lot of companies OEMing NT and using it as an embedded system in that way. That's something we're going to let them do. We're not right now focused on that ourselves. We have our hands full trying to do great things at the edge. It's sort of just up to the networking vendors to leverage our technology.
Could you end up competing with networking vendors, such as your current partner, Cisco Systems?
Probably not. They're pretty sure they're not in the operating-system business. They're pretty sure they want to run their stuff on NT and, in fact, we have a program underway to do that. It's hard to say what could change in the marketplace, but that's not the collision course we're on--I know the industry wants to set us up that way. There's still going to be quite a battle out there between the networking vendors--they're going to use NT, but we're going to be slightly removed from that battleground, and hopefully NT can be the underpinnings of many of the alternatives that are out there. Period.
How will a Windows operating system look in the future, given the reliance on the NT base?
We are doing a lot of experiments now. Whether you're a consumer or a business user, you want a system that you don't have to reboot, one that's very resilient to failures, one that's intelligent and secure, and one that's easy to use at work. So the underpinnings--fixing stupid error messages, making the network transparent--are true across the board. Then, on top of that, there are two things we need to think about. The first is, how can we make this UI scalable? That is, scale from very novice users up to people who want to customize the system to their heart's content, without taking away the flexibility that the PC has given them.
We're doing experimentation to decide how the interface can start with simple things and then have the system, through the end user, select more features or have the system automatically make those functions appear. The other dimension is what auxiliary software we should ship with each package. That is, what sort of software does the business user want vs. the consumer? The key thing for us is a common API [application programming interface] everywhere. That's why we're going for this common technology base.
A video was shown during your presentation showing the latest code of NT being tested every day. How do you view the way NT is being developed and the rate at which you are adding new functions to the operating system?
No one has done that as fast as we're doing it. On the server side, with the rate at which we're adding technology, I would be hard-pressed to find anything like it. Frankly, it's fascinating, because it's not as if we inside Microsoft are entirely happy with it. On one level it's very inefficient; it's a militaristic machine to get this thing done. But it works. I feel very confident that we will be able to keep the quality up because we have such a focus on code coverage and automatic stress-testing.