[LINK] Latency in indexing robots

Eric Scheid eric.scheid at ironclad.net.au
Wed Dec 6 17:39:49 AEDT 2006

On 6/12/06 4:47 PM, "Antony Barry" <tony at tony-barry.emu.id.au> wrote:

>> also, consider using google's sitemap thingy
>>     http://sitemaps.org/
> Now that's a good idea. Reminds me of the recursive file listing
> archie used to pick up for public ftp sites back in the late '80s.

I've got a wiki with 1,045 pages, and robots routinely revisit every darn
one of those pages to check for changes. I'm implementing the sitemaps
protocol so that google, msn, and yahoo only need to revisit the one
document regularly (the sitemap), and then just the pages which *have*
changed.

Should cut down on a colossal waste of bandwidth :-)
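For anyone curious, a minimal sitemap looks something like this (a sketch
per sitemaps.org; the URL and dates are made up for illustration -- the
<lastmod> element is what lets crawlers skip pages that haven't changed):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per wiki page; crawlers compare lastmod
       against their last fetch and skip unchanged pages -->
  <url>
    <loc>http://example.com/wiki/SomePage</loc>
    <lastmod>2006-12-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```
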


More information about the Link mailing list