CAMBRIDGE, Mass.--Comcast has confessed to slowing down certain peer-to-peer file-sharing traffic, but is it being clear enough about what it's doing?
That's perhaps the key question that emerged by the end of a hearing convened by the Federal Communications Commission on Monday here at Harvard Law School.
While none of the FCC commissioners was willing to commit to an answer just yet, two MIT computer scientists on an afternoon panel accused the cable company of behaving badly on multiple levels.
Each drew on his experience with fundamental Internet standards-setting bodies. And each charged that Comcast's admitted practice of delaying uploads to peer-to-peer networks at times of "peak" network congestion--ostensibly to optimize the surfing experiences for all other customers in that geographic area--presents at least two major concerns. First, it leaves users and application developers in the dark about when traffic interference will be triggered, and second, it goes aggressively beyond existing techniques for managing Internet traffic congestion.
David Reed, an MIT professor who's considered to be an Internet engineering pioneer, said Comcast's behavior makes him especially "uncomfortable" because it involves both deep packet inspection, which poses privacy concerns, and injection of forged reset packets, a disruptive tactic that makes a message appear to be coming from someone it's not.
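To illustrate the mechanism Reed objected to, here is a minimal, purely illustrative sketch of what a forged TCP reset looks like at the byte level. This is an assumption-laden toy, not Comcast's actual implementation: a real middlebox would also spoof the IP header so the reset appears to come from the peer, and would compute a valid checksum, neither of which is done here.

```python
import struct

def forge_tcp_rst(src_port: int, dst_port: int, seq: int) -> bytes:
    """Build a bare 20-byte TCP header with the RST flag set.

    Illustrative sketch only: no IP header, and the checksum is
    left at zero, so this is not a sendable packet.
    """
    data_offset = 5 << 4   # header length = 5 32-bit words, no options
    flags = 0x04           # the RST bit -- tells the receiver to tear down
                           # the connection as if the peer had aborted it
    window = 0
    checksum = 0           # left zero in this sketch
    urgent = 0
    return struct.pack("!HHIIBBHHH",
                       src_port, dst_port,
                       seq,   # sequence number the receiver expects
                       0,     # acknowledgment number (unused here)
                       data_offset, flags,
                       window, checksum, urgent)

# 6881 is a port commonly associated with BitTorrent; chosen arbitrarily here.
header = forge_tcp_rst(6881, 51413, 123456)
print(len(header), header[13] & 0x04)  # prints "20 4": 20-byte header, RST bit set
```

A host receiving such a segment with a plausible sequence number cannot distinguish it from a genuine abort by the peer, which is why Reed characterized the tactic as making "a message appear to be coming from someone it's not."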
Comcast has been under scrutiny ever since a handful of networking-traffic tests last fall forced it to go public about delaying uploads of files to peer-to-peer file-sharing applications such as BitTorrent. Monday's hearing was meant to inform the FCC's thinking as it decides whether to grant two petitions, which ask it to declare Comcast's behavior outside the scope of reasonable network management and to set "Net neutrality" ground rules for all Internet service providers to follow.
Comcast Executive Vice President David Cohen repeatedly defended his company's practices during his appearance before the commission on Monday. He argued that Comcast engages in extremely limited management of "excessive" peer-to-peer file-sharing traffic at peak hours of network congestion and that such activities produce "imperceptible" effects for most of its customers. In fact, he added, more customers have come to the cable operator with compliments than complaints about its traffic handling.
BitTorrent Chief Technology Officer Eric Klinker, who spoke on the panel alongside the computer scientists, said network operators have been mischaracterizing the role of the file-sharing protocol. It's not intended to swallow up all available bandwidth, he said. Rather, it was hatched back in 2001 to respond to the question of "how do we efficiently move large files on the Internet," he told the commissioners.
But commissioners were clearly finding it difficult to draw the line between what's "reasonable" network management and what's not, or as Democratic Commissioner Jonathan Adelstein put it in a final question to the MIT computer scientists: "Where is the line between good discrimination and bad discrimination?"
One problem is the absence of any quantifiable way to tell whether peer-to-peer use is causing consumers to exceed their seemingly unspoken bandwidth allotments--or what those allotments even are. FCC Republican Chairman Kevin Martin and Republican Commissioner Robert McDowell each pressed Comcast to help supply that information, but in a way, they may have been asking the wrong question.
"We don't say for x dollars you get to use x amount of bandwidth, that's not the way we market our service," Comcast's Cohen replied. "We market...that we provide a service up to a certain amount of speed, subject to the condition that the customer does not use a service in a way that would degrade other customers' services."
That ambiguous threshold makes it difficult for application developers and users alike to know what exactly they're getting from their Internet access provider, said David Clark, chief protocol architect of the Internet during its nascent stages and a senior research scientist at MIT.
Clark said he doesn't understand why ISPs are reluctant to be more specific about such policies, although he did acknowledge it's difficult to say "how much is too much." "If I had to quantify what constituted unacceptable congestion, it becomes a very contentious space," he said.
Martin indicated he was similarly perplexed. "If the contract doesn't say that there's any limitations, then how can there be limitations on (subscribers)?" he asked.
The computer scientists agreed that network operators must accept that congestion will occur from time to time on the Internet, just as it does on physical streets, and they readily acknowledged network management is necessary to a degree. But when network operators need more innovative ways to manage heavy loads, the panelists suggested, they should abide by methods already accepted in Internet standards communities, or work with standards-setting bodies to develop new ones, rather than unilaterally adopting techniques favored by hackers, as Reed argued Comcast did.
Clark also urged that Internet users, rather than their ISPs, should have the power to assign priority to applications as they please. For instance, if they choose to give Internet phone traffic precedence over gaming, or vice versa, they should be allowed to do so.
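The article does not say what mechanism Clark had in mind, but one way an application could already express such a preference is by marking its own packets with a DiffServ code point. The sketch below assumes a Linux-style socket API; whether any router along the path honors the marking is entirely up to the network operator.

```python
import socket

# DiffServ code points, shifted into the upper six bits of the IP TOS byte.
EF_VOICE = 0x2E << 2     # Expedited Forwarding (DSCP 46): low-latency voice
AF21_GAMING = 0x12 << 2  # Assured Forwarding 21 (DSCP 18): a lower-priority
                         # class, used here to stand in for gaming traffic

def marked_udp_socket(tos: int) -> socket.socket:
    """Open a UDP socket whose outgoing packets carry the given TOS byte.

    This only states the application's preference; networks are free
    to ignore or rewrite the marking.
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, tos)
    return s

voice = marked_udp_socket(EF_VOICE)
tos = voice.getsockopt(socket.IPPROTO_IP, socket.IP_TOS)
print(tos)  # prints 184, i.e. DSCP 46 in the TOS byte
voice.close()
```

The design point Clark was making is that the priority choice here sits with the endpoint, not with equipment in the middle of the network inspecting and reordering traffic on the user's behalf.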
Martin said he hadn't decided by the end of Monday's hearing where to come out on the Comcast-BitTorrent dispute, but he said he would weigh carefully what the MIT professors and other panelists recommended during the hours of public discussion.
"One of the main concerns I have," he told reporters after the hearing had adjourned, "is that there wasn't a transparency to some of the network management practices (Comcast) engaged in."