
Whatever else it is, P2P is inefficient

Gordon Haff

I assume that Mark Cuban is deliberately being contentious about peer-to-peer networking in his "An Open Letter to Comcast and Every cable/Telco on P2P" when he writes:

"BLOCK P2P TRAFFIC, PLEASE"

I'm not going to get into the political and other considerations here, but he has an economic and technical basis for his argument.

In September, I attended Technology Review's EmTech07 Emerging Technologies Conference at MIT. I've written previously about some of my general takes. However, one of the panels I attended, "P2P: The Future of Networking?", is germane to the question at hand. The panelists were Klaus Mochalski, CEO of Ipoque (which does P2P traffic management and analysis); Roger Dingledine, President and Cofounder of The Tor Project (an online anonymity project based on P2P); and Robert Morris, Associate Professor of Computer Science at MIT (who helped develop Roofnet and Chord/DHash). Without delving further into the panelists' backgrounds, suffice it to say that all have been involved with P2P networks on the technical side. None were coming at P2P from a content provider perspective--which tends to be anti-P2P, given that these networks are often used to pirate copyrighted material. Again, not today's discussion.

With that as preamble, I found it noteworthy that none of the panelists came across as particular fans of P2P.

For instance, Tor's Roger Dingledine said that "P2P is not good for anything you can do in a centralized way with the same properties." Now, to be sure, one of the things you can't do in a centralized way and still have the same properties is anonymity as implemented by the Tor Project. Nonetheless, I think it's a notable statement from someone who's clearly no foe of P2P. (Roger also commented that P2P/decentralization helps anonymity but isn't perfect, and that anonymity for Web browsing and other small things can be achieved in other ways.)

Robert Morris discussed some of the reliability issues associated with P2P: "One way to think about this is: 'Would you want Skype to be the only way to get 911?'" He went on to note: "Distributed in a server room vs. distributed servers around the world is a big difference because of latencies. You can get partial failures, which almost never happen in a centralized system, and partial failures are very hard to design for. That's one of the main limits of P2P."

Finally, Ipoque's Klaus Mochalski noted that "P2P is taking load off servers but actually adds load to the Internet backbone because you copy stuff around more often than necessary."

This last comment is really the heart of Mark's missive. P2P places more load on the aggregated systems and networks of the Internet, taken as a whole, than if the same content were distributed in a centralized manner. Using P2P may make sense, and perhaps ISPs should support P2P traffic, for any of a number of reasons. But efficiency can't be the argument.
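
To make that intuition concrete, here is a rough back-of-envelope sketch. Everything in it -- the file size, the subscriber count, the number of CDN edge caches, and the 20 percent peer-locality figure -- is my own illustrative assumption, not a number from the panel. The model simply contrasts a CDN that seeds a few well-placed edge caches and then serves subscribers locally against a topology-blind P2P swarm that pulls most pieces from peers in other ISPs:

    # Illustrative back-of-envelope model; every number here is an assumption.
    FILE_SIZE_GB = 1.0       # size of the file being distributed
    SUBSCRIBERS = 10_000     # downloaders spread across many ISPs
    EDGE_CACHES = 50         # assumed CDN edge locations seeded with the file
    LOCAL_PEER_RATIO = 0.2   # assumed share of P2P transfers that stay inside
                             # the downloader's own ISP

    # CDN case: the backbone carries one copy per edge cache; subscriber
    # downloads are then served from a nearby edge and stay off the backbone.
    cdn_backbone_gb = FILE_SIZE_GB * EDGE_CACHES

    # P2P case: peers are mostly chosen without regard to topology, so the
    # bulk of each downloaded copy crosses ISP boundaries (and rides over
    # subscribers' scarce upstream links as well).
    p2p_backbone_gb = FILE_SIZE_GB * SUBSCRIBERS * (1 - LOCAL_PEER_RATIO)

    print(f"CDN backbone traffic: {cdn_backbone_gb:,.0f} GB")   # 50 GB
    print(f"P2P backbone traffic: {p2p_backbone_gb:,.0f} GB")   # 8,000 GB

Adjust the assumptions however you like; unless peer selection is aggressively localized, the P2P figure stays orders of magnitude larger. That's Mochalski's point in miniature: the content provider's servers are relieved, but the network in aggregate is not.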