[LINK] Latency in indexing robots

Roger Clarke Roger.Clarke at xamax.com.au
Wed Dec 6 19:53:00 AEDT 2006

At 17:39 +1100 6/12/06, Eric Scheid wrote:
>I've got a wiki with 1,045 pages, and robots routinely revisit every darn
>one of those pages to check for changes. I'm implementing the sitemaps
>protocol so that google, msn, and yahoo only need to revisit the one
>document regularly (the sitemap), and then just the pages which *have*
>changed. Should cut down on a colossal waste of bandwidth :-)
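[For reference, the sitemaps.org 0.9 protocol mentioned above is an XML file listing each URL with an optional last-modification date; a minimal sketch, with placeholder URLs and dates, would look like this:]

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/wiki/FrontPage</loc>
    <!-- crawlers can skip pages whose lastmod hasn't advanced -->
    <lastmod>2006-12-06</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```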

Surely they'd use the HTTP HEAD method to ask for last date of change?
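[A sketch of what that would look like in practice: rather than re-fetching each page, a crawler can send a HEAD request, or a conditional GET with an If-Modified-Since header, and act on a 304 Not Modified response. The URL and timestamp below are placeholders, not taken from the thread.]

```python
from email.utils import formatdate
import urllib.request

def build_conditional_request(url, last_seen_epoch):
    """Build a GET that only transfers the body if the page changed.

    A server that honours conditional requests replies 304 Not Modified
    (no body) when the page is unchanged since last_seen_epoch.
    """
    req = urllib.request.Request(url)
    # RFC 1123 date format, e.g. 'Wed, 06 Dec 2006 08:53:00 GMT'
    req.add_header("If-Modified-Since",
                   formatdate(last_seen_epoch, usegmt=True))
    return req

# A plain HEAD request (headers only, no body) is the other option:
head_req = urllib.request.Request("http://example.com/wiki/SomePage",
                                  method="HEAD")

req = build_conditional_request("http://example.com/wiki/SomePage",
                                1165395180)
print(req.get_header("If-modified-since"))
```

[Either way the crawler still makes one request per page; the sitemap approach wins because one fetch of the sitemap covers all 1,045 pages.]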

In fact, you'd expect them to use UDP, in which case the overheads of
session creation and tear-down aren't involved, and the packet sizes
are very small.

Roger Clarke                  http://www.anu.edu.au/people/Roger.Clarke/

Xamax Consultancy Pty Ltd      78 Sidaway St, Chapman ACT 2611 AUSTRALIA
                    Tel: +61 2 6288 1472, and 6288 6916
mailto:Roger.Clarke at xamax.com.au                http://www.xamax.com.au/

Visiting Professor in Info Science & Eng  Australian National University
Visiting Professor in the eCommerce Program      University of Hong Kong
Visiting Professor in the Cyberspace Law & Policy Centre      Uni of NSW
