
Sending a wake-up call to the W3C

HP technology strategist Rich DeMillo says the industry standards body must fix vulnerabilities in the network infrastructure underpinning the Net if it is to ever establish a "Web of trust."

When it was founded in 1994, the World Wide Web Consortium was entrusted with developing standards and policies that would enable the Internet to reach its full potential.

An explicit goal was to facilitate what was described as a "Web of trust" that would help guide the Internet's development "with careful consideration for the novel legal, commercial, and social issues raised by this technology." However important Internet trust seemed in 1994, it is even more critical in a post-WTC world.

Nearly everyone I talk to agrees that the Web must become inherently trustworthy if it is to reach its full potential. I use the word "trust" deliberately because it means more than security.

Trust is also about dependability, privacy, data integrity, and authentication. Trust is a system-class concept that cannot be adequately addressed by point technologies. If there is a coming golden age for the Web, then trust has to be stitched into its fabric. I want to suggest a path for achieving this.

Let's begin with what needs to be fixed.

First, the open nature of the Web has changed everything. The Web is a deliberately blurred set of abstractions in which computer hardware, communications networks, system software, middleware and applications connect, cooperate and disconnect.

All of the trusted components in my local environment can be easily subverted by untrustworthy components I may need to rely upon to complete a task. Building, for example, a highly trusted desktop in a Web-connected world is like building steel doors in a house with paper walls. The headlong rush to Web services is going to make things worse.

Second, the current "break and fix" approach toward trust and security cannot sustain the improvements I am talking about. On top of the new vulnerabilities sometimes created by the fixes, nobody has real confidence in ad hoc trust features and functions. Without precisely defined trust models, users cannot know what exactly is being claimed relative to threats that have not yet occurred.

The discussion also needs to shift from "cost benefit" to "table stakes." Without system-class trust models, investing in even simple trust technologies is--at best--insurance against events of unknown probabilities. At worst, it's another set of steel doors in paper walls.

In reality, we don't yet know what levels of trust we will need before committing to Web services, utility computing or mobility.

I believe that since the terrorist attacks on Sept. 11, the table stakes have been raised.

Web resources are only barely manageable today. Adding trust as a quality-of-service guarantee will push things over the edge. The number of devices connecting and disconnecting to our shared infrastructure will surely keep doubling for some time to come. Advances in nanotechnology could well result in Avogadro's number of hosts all assigned to a specific life-critical task. There is simply no way to scale existing performance management technology to such an environment.

One solution for addressing trust in today's world is to build a chain of trust where each link is strong but also connects to its neighbor by verifying its trustworthiness. In particular, beginning with a priori grounding in a physically trustworthy base, each link of the chain checks signature-based trust tokens of its neighbor before the chain is allowed to carry any weight. Such a chain begins with processors and extends to operating systems to applications to interconnect protocols, and ultimately, to end users.
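The verification walk described above can be sketched in a few lines of code. This is an illustrative toy, not a real TCPA or platform implementation: the component names, the HMAC-based "trust tokens" and the key-derivation step are all hypothetical stand-ins for the signature checks each link would perform on its neighbor.

```python
import hmac
import hashlib

def sign(key: bytes, identity: str) -> bytes:
    """Issue a trust token for a component, signed with the key of the link below it."""
    return hmac.new(key, identity.encode(), hashlib.sha256).digest()

def next_key(key: bytes, identity: str) -> bytes:
    """Derive the signing key for the next link from a verified link."""
    return hashlib.sha256(key + identity.encode()).digest()

def verify_chain(root_key: bytes, chain: list) -> bool:
    """Walk the chain upward from the trusted base; every link must present
    a token signed by its predecessor before the chain carries any weight."""
    key = root_key
    for identity, token in chain:
        if not hmac.compare_digest(sign(key, identity), token):
            return False  # one broken link invalidates the whole chain
        key = next_key(key, identity)
    return True

# Build a chain grounded in a physically trustworthy base (the root key),
# running from processor to OS to application to interconnect protocol.
root = b"hardware-root-of-trust"
links, key = [], root
for name in ["processor", "os", "application", "protocol"]:
    links.append((name, sign(key, name)))
    key = next_key(key, name)

print(verify_chain(root, links))            # intact chain verifies
links[1] = ("os", b"forged-token")          # subvert one link
print(verify_chain(root, links))            # trust collapses
```

The point of the sketch is the failure mode: tampering with any single link, however strong the others are, causes the walk from the trusted base to fail, which is exactly why the chain must begin in hardware rather than in any software layer.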

Fix it!
I believe a first link in this chain will come from addressing the core vulnerabilities deep in computing and network infrastructure--the underpinnings of the Web. Such a program, if adopted by the major technology companies, would lead to a fundamental shift in trust for the Internet.

The Trusted Computing Platform Alliance architecture represents an industrywide effort to provide secure hardware platforms--the essential first link in any IT trust chain. Using Itanium's inherent security features (for example, an additional two layers of privilege protection at the microprocessor level), Hewlett-Packard is building a secure platform architecture (SPA) on top of the Itanium architecture.

We think it makes great sense to do this in the town square by calling on the trust-enhancing ability of the open-source community with its rigorous peer review, open publishing and testing methodologies. That is one reason why Linux will be the first operating system we'll port to SPA.

The W3C may not be directly involved in designing and regulating these industry-standard building blocks and open architectures. But it is quite deeply involved in creating the standards for how they all interoperate.

Extending the chain of trust beyond a secure platform into Web-based platforms is where the W3C can contribute most significantly--architecting trust into new links through its influence over next-generation protocols. Ultimately, this will give the Web the opportunity to escape the break-and-fix cycle it is currently caught in, paving the way for it to thrive and prosper.