[LINK] Huston: 'The Future Internet and Peer to Peer'

Roger Clarke Roger.Clarke at xamax.com.au
Mon Jun 25 15:47:02 AEST 2007


The Future Internet and Peer to Peer
Geoff Huston, Chief Scientist, APNIC
AARNet News
Issue 8, June 2007
http://www.aarnet.edu.au/publications/aarnews/AARNews_07_06.pdf

These days the Internet is a massive content store, and we appear to have moved rapidly through text, then audio, and now video content.

But can our conventional models of content dissemination keep pace? So far we've replicated the library model by constructing massive data centres that feed content out to clients on demand.
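
A back-of-envelope sketch makes the scaling pressure concrete. The file size, audience and seed count below are illustrative assumptions, not figures from the article:

    # Rough comparison of origin egress: library model vs P2P.
    # All figures here are illustrative assumptions.
    FILE_SIZE_GB = 1.0        # one video file
    CLIENTS = 1_000_000       # audience size

    # Library model: the data centre ships every byte itself,
    # so egress grows linearly with the audience.
    origin_egress_tb = FILE_SIZE_GB * CLIENTS / 1000
    print("data centre egress: %.0f TB" % origin_egress_tb)   # 1000 TB

    # P2P model: the origin seeds a few copies and the swarm
    # redistributes them, so origin egress stays roughly flat.
    SEED_COPIES = 10
    print("P2P origin egress: %.0f GB" % (FILE_SIZE_GB * SEED_COPIES))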

But as data volumes increase across the entire network, can the data centres keep pace? Already we are seeing some of the data centre giants building their data centres beside power generation facilities. Will such a model of concentrated data, concentrated power and concentrated heat generation continue to scale?

It seems that traditional concentrated data centre models of networking are closing in on an inflexion point where ever-larger centres imply higher unit costs rather than lower.

Maybe we're reaching a limit on the data centre model, and in the search for alternatives it's here that peer-to-peer (P2P) models look very attractive. It's therefore not surprising to see new entrants working on P2P delivery of video and other content, Joost and Vudu for example.

But is the Internet really ready for a P2P explosion? It's not at all clear that we have the type of network we need for widespread P2P, and this extends all the way down into the basic architecture. As Van Jacobson has pointed out, much of the network research today treats the Internet as a medium of "channels" from A to B, and as a result Internet research is largely about topology optimization, network layers, reliability, redundancy and service profiles. The nature of P2P casts this architectural view in an entirely new light.
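
One way to see the shift is in what a request names. This toy Python sketch illustrates the general idea (it is not Jacobson's actual design, and the hostname and path are hypothetical): a host-centric request names where the data lives, while a content-centric one names what the data is, so any peer holding a valid copy can answer it:

    import hashlib

    # Host-centric: the request names WHERE the data lives
    # (hostname and path are hypothetical).
    host_request = ("videos.example.com", "/lectures/intro.mp4")

    # Content-centric: the request names WHAT the data is.
    data = b"the content itself"
    content_name = hashlib.sha256(data).hexdigest()

    def verify(reply, name):
        # Integrity is checked against the name, not the sender,
        # so it no longer matters which peer supplied the bytes.
        return hashlib.sha256(reply).hexdigest() == name

    print(verify(data, content_name))   # True, whoever sent it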

Should we really be restricting P2P traffic on research and academic networks? Or should we be encouraging it as a way of assisting the research sector to gain significant insights into the service profile and architecture of tomorrow's Internet?

The fastest and most efficient data transport we've deployed on the Internet so far is the P2P model, and BitTorrent and its variants remain one of the few ways we know to completely fill a network with data! After all, of what use to anyone is idle network capacity? In networks speed and volume always win, and this points inevitably to the view that P2P is the future of the Internet.
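
Why a swarm fills the pipe is easy to sketch: a downloader pulls different pieces from many peers at once, aggregating their upload capacity. The Python fragment below shows the shape of it; the peer names and piece counts are invented, and real BitTorrent adds rarest-first piece selection, choking and per-piece hashing on top:

    import concurrent.futures

    PIECES = list(range(32))                  # the file, split into pieces
    PEERS = ["peer%d" % i for i in range(8)]  # hypothetical swarm members

    def fetch(piece):
        # Stand-in for downloading one piece from one peer.
        peer = PEERS[piece % len(PEERS)]
        return piece, ("%s:data%d" % (peer, piece)).encode()

    # Eight concurrent sources instead of one origin server: the
    # download rate is the sum of the peers' upload rates.
    with concurrent.futures.ThreadPoolExecutor(max_workers=len(PEERS)) as pool:
        blocks = dict(pool.map(fetch, PIECES))

    assert sorted(blocks) == PIECES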

It's time we opened P2P up and started to conduct research into ways of architecting networks that really support P2P, and into ways of ensuring the integrity of content in a dense P2P world.
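
On the integrity point, the established technique is to accompany content with cryptographic hashes so that each piece can be verified as it arrives from an untrusted peer. A minimal Python sketch follows; the piece size is illustrative, and BitTorrent, for instance, ships SHA-1 piece hashes in its metainfo file:

    import hashlib

    PIECE_LEN = 4   # tiny, for illustration; real systems use 256 KB or more

    def piece_hashes(blob):
        # Published by the content owner through a trusted channel.
        pieces = [blob[i:i + PIECE_LEN] for i in range(0, len(blob), PIECE_LEN)]
        return [hashlib.sha1(p).hexdigest() for p in pieces]

    def verify_piece(index, data, expected):
        # A corrupt or malicious peer can waste bandwidth,
        # but cannot alter the file we finally assemble.
        return hashlib.sha1(data).hexdigest() == expected[index]

    hashes = piece_hashes(b"the content of the file")
    print(verify_piece(0, b"the ", hashes))   # True
    print(verify_piece(0, b"THE ", hashes))   # False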


[I ran some similar lines in
Clarke R. (2006) 'P2P's Significance for eBusiness: Towards a
Research Agenda', Journal of Theoretical and Applied Electronic
Commerce Research 1, 3 (December 2006), at
http://www.jtaer.com/portada.php?agno=2006&numero=3#
in particular at:
http://www.anu.edu.au/people/Roger.Clarke/EC/P2PRes.html#Infra

[But Geoff speaks with a lot more authority and a great deal more depth than I ever could.]

-- 
Roger Clarke                  http://www.anu.edu.au/people/Roger.Clarke/
Xamax Consultancy Pty Ltd      78 Sidaway St, Chapman ACT 2611 AUSTRALIA
                    Tel: +61 2 6288 1472, and 6288 6916
mailto:Roger.Clarke at xamax.com.au                http://www.xamax.com.au/

Visiting Professor in Info Science & Eng  Australian National University
Visiting Professor in the eCommerce Program      University of Hong Kong
Visiting Professor in the Cyberspace Law & Policy Centre      Uni of NSW


