A new version of the HTTP standard that promises to deliver Web pages to browsers faster has been formally approved, the Internet protocol's first revision in 16 years.
The specifications for HTTP 2.0 have been formally approved, according to a blog post by Mark Nottingham, who as chairman of the IETF HTTPBIS Working Group serves as the standard effort's leader. The specifications will go through a last formality -- the Request for Comments documentation and editorial process -- then be published, Nottingham wrote.
HTTP, short for Hypertext Transfer Protocol, is one of the seminal standards of the Web. It governs how a browser communicates with a Web server to load a Web page. HTTP 2.0, the protocol's first major revision since HTTP 1.1 in 1999, is designed to load Web pages faster, allowing consumers to read more pages, buy more things and perform more and faster Internet searches.
The new standard is based on SPDY, a protocol Google developed to speed up Web browsing. The technology spread to Google's own Chrome browser, Mozilla's Firefox, Microsoft's Internet Explorer, and many of the websites they reach.
The core feature of SPDY and HTTP 2.0 is "multiplexing," which lets many data-transfer requests share a single underlying network connection between a Web browser and the Web server across the Internet. Network connections are costly to set up in terms of computing resources, and Web pages have demanded more and more of them over the years as the Web has grown more complex.
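To see how one connection can carry many streams, it helps to look at HTTP 2.0's wire format: every frame starts with a 9-byte header whose 31-bit stream identifier tags which request the frame belongs to, so frames from different requests can interleave freely. The sketch below packs and unpacks that header per RFC 7540; it is an illustration of the frame layout, not a full protocol implementation.

```python
import struct

def pack_frame_header(length, frame_type, flags, stream_id):
    """Build the 9-byte HTTP/2 frame header (RFC 7540, section 4.1):
    a 24-bit payload length, 8-bit frame type, 8-bit flags, and a
    31-bit stream identifier (the high bit is reserved and kept zero).
    The stream identifier is what lets many requests multiplex over
    one connection: each frame says which stream it belongs to."""
    return struct.pack("!BHBBI",
                       length >> 16, length & 0xFFFF,  # 24-bit length
                       frame_type, flags,
                       stream_id & 0x7FFFFFFF)

def unpack_frame_header(header):
    """Inverse of pack_frame_header: parse a 9-byte header back into
    (length, frame_type, flags, stream_id)."""
    hi, lo, frame_type, flags, stream_id = struct.unpack("!BHBBI", header)
    return (hi << 16) | lo, frame_type, flags, stream_id & 0x7FFFFFFF

# Two frames for two different streams, ready to interleave on one socket:
frame_for_stream_1 = pack_frame_header(16, 0x0, 0x0, 1)  # DATA, stream 1
frame_for_stream_3 = pack_frame_header(64, 0x0, 0x0, 3)  # DATA, stream 3
```

Because the receiver sorts frames by stream identifier, neither request has to wait for the other's connection setup, which is the saving the paragraph above describes.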
In practice, HTTP 2.0 also brings another big change: encryption. To protect privacy and cut down on hacking vulnerabilities, SPDY requires an encryption technology called TLS (Transport Layer Security), formerly called SSL, for Secure Sockets Layer. That encryption push grew a lot stronger after former National Security Agency contractor Edward Snowden revealed extensive government surveillance, and SPDY's creators, along with some IETF members, saw the performance benefits of HTTP 2.0 as a good way to coax more of the Web toward encryption.
There's also a practical reason for encryption in HTTP 2.0: it makes it easier to adopt a new version of HTTP. That's because it sets up a direct connection between the Web server origin and the Web browser destination, and that direct connection sidesteps problems from intermediate network equipment that might not yet support HTTP 2.0.
However, some IETF members -- notably some of those who make or operate that intermediate equipment -- didn't like the encryption requirement. Thus, the IETF didn't require it as part of the HTTP 2.0 standard. In practice, though, encryption is very likely, because Firefox and Chrome won't support HTTP 2.0 without it.
"For the common Web browsing case, HTTP/2 servers will need to use TLS if they want to interoperate with the broadest selection of browsers," Nottingham said in an earlier blog post summarizing the encryption debate. So in practice, it's likely that HTTP 2.0 will function like the secure version of earlier HTTP, called HTTPS.
Moving too fast?
Nottingham had previously expressed confidence that the standard could be completed by the end of last year. But given the number of entities involved in building and operating the Web today, actually changing the rules that govern it proved difficult.
Nottingham's hope for a year-end completion triggered some criticism in the developer community, whose members voiced skepticism that the draft was ready for last call -- the late stage of feedback solicitation before a standard is final.
"Rushing to last call in the spec's current state is folly," James Snell, an engineer at IBM, wrote last May, and Apple's Mike Sweet concurred, saying, "I do not see a draft that is anywhere near to being ready for LC."
Despite SPDY's Google roots, Nottingham dismissed the notion that the Web giant strong-armed the Internet Engineering Task Force into using its protocol for the standard revision.
"While a few have painted Google as forcing the protocol upon us, anyone who actually interacted with Mike and Roberto [who brought SPDY to the group for standardization] in the group knows that they came with the best of intent, patiently explaining the reasoning behind their design, taking in criticism, and working with everyone to evolve the protocol," he wrote.
[Via The Next Web]