New standard for a faster Web finished by year end? Maybe not

HTTP 2.0 is designed to deliver Web pages to browsers faster. But some in the standards world think finishing the technology in 2014 is unlikely.

A new version of the HTTP standard could deliver Web pages to browsers faster -- but delivering the standard itself rapidly is proving to be difficult.

On Friday, standard effort leader Mark Nottingham laid out a plan for delivering HTTP 2.0 and said if developers can stick to it, "we have every chance of having [a standard] well before the end of the calendar year."

"So far, I've heard strong preferences across the board for keeping the schedule tight," Nottingham said. "Every change we make and especially every feature we add has the potential to delay us."

No doubt some of that preference comes from Google, whose SPDY technology formed the basis for HTTP 2.0. Google introduced SPDY in 2009, and the technology has spread to its own Chrome browser, Mozilla's Firefox, Microsoft's Internet Explorer, many of the Web sites those browsers reach, and some of the server software that delivers Web pages to browsers.

Everybody wants a faster Web -- people read more pages, buy more things, and perform more searches when Web pages load faster. But given how many entities are involved in building and operating the Web today, actually changing the rules that govern it has proved difficult.

The core feature of SPDY and HTTP 2.0 is "multiplexing," which lets many data-transfer requests share a single underlying network connection between a Web browser and a Web server across the Internet. Those connections are costly to set up, and Web pages have been demanding more and more of them over the years as the Web has grown more complex.
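A toy latency model makes the arithmetic concrete. All the numbers here are hypothetical (round-trip time, handshake cost, and the browser's connection limit are illustrative, not measured), but the shape of the result is the point: HTTP 1.1 pays connection-setup costs and then queues requests, while a multiplexed connection pays setup once and puts every request in flight together.

```python
import math

RTT_MS = 50          # hypothetical one-way-and-back network round trip
HANDSHAKE_RTTS = 2   # assumed TCP + TLS setup round trips per new connection

def http1_cost(n_requests, max_connections=6):
    """Rough HTTP/1.1 model: a few parallel connections, requests queued on them."""
    conns = min(n_requests, max_connections)
    setup = HANDSHAKE_RTTS * RTT_MS           # connections open in parallel
    waves = math.ceil(n_requests / conns)     # each connection serves requests serially
    return setup + waves * RTT_MS

def http2_cost(n_requests):
    """Rough multiplexing model: one connection, one setup, all requests at once."""
    return HANDSHAKE_RTTS * RTT_MS + RTT_MS

print(http1_cost(60))  # 600 ms in this toy model
print(http2_cost(60))  # 150 ms
```

In this sketch a page with 60 sub-requests finishes several times faster over one multiplexed connection, which is the effect SPDY's designers were after; real pages see smaller but still meaningful gains, since bandwidth and server time also matter.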

Google developed SPDY but is working with the Internet Engineering Task Force (IETF) to standardize HTTP 2.0. Its predecessor, HTTP 1.1, dates from 1999, and Google has a direct financial interest in a faster Web.

HTTP 2.0 skeptics

But Nottingham's hope for a year-end completion triggered some voices of skepticism.

"I do not see a draft that is anywhere near to being ready for LC [last call]," said Greg Wilkins, a software developer with business software maker Intalio, referring to a late stage of feedback solicitation before a standard is final. "At the very least in the WG there currently exists a level [of] confusion on fundamental matters that should not result from a clear specification."

"Rushing to last call in the spec's current state is folly," said James Snell, an engineer at IBM, and Apple's Mike Sweet concurred.

A longtime SPDY critic, Poul-Henning Kamp, piled on by saying the current effort should "admit defeat" and start over. That went too far for Mike Belshe, a SPDY co-developer and former Google employee who responded to Kamp with a "-1."

It's not easy to hammer out standards among a group of people with different opinions, priorities, and interests to defend in the technology world. The IETF works through "rough consensus and running code," though, and Nottingham countered some criticisms with calls for specific, technical proposals.

Already in use in the real world

Ultimately, though, standards only matter to the extent that people implement them -- and there SPDY has strong traction today, with HTTP 2.0 likely to follow.

"This is well-worn ground," said Mozilla's Patrick McManus, evidently weary of seeing issues resurface after being settled in earlier discussions. "Forgive me if I don't weigh in on every episode rerun while we do the implementation work to actually get this tested and deployed."

Nottingham allowed that it's quite possible HTTP 2.0 standardization will slip into 2015. But the old days, when HTTP 1.1 went unchanged for what amounts to an eternity in the computing industry, won't return. He seemed inclined to follow the general trend in the software industry toward more frequent, less disruptive changes.

"While there was an about 15-year gap between HTTP/1.1 and HTTP/2, it's very likely that the next revision will come sooner," Nottingham said. "While we don't want to needlessly rev the protocol, we also don't want to turn this into a five+ year effort to design the perfect protocol -- if we get it wrong, we can learn from those mistakes and correct them."


About the author

Stephen Shankland has been a reporter at CNET since 1998 and covers browsers, Web development, digital photography and new technology. In the past he has been CNET's beat reporter for Google, Yahoo, Linux, open-source software, servers and supercomputers. He has a soft spot in his heart for standards groups and I/O interfaces.

