[LINK] making the index transparent (was Re: Police raid home of Wikileaks.de domain owner over censorship lists)
stil at stilgherrian.com
Sat Mar 28 13:12:31 EST 2009
On 28/03/2009, at 12:43 PM, Marghanita da Cruz wrote:
> ISPs host multiple virtual websites on single hosts - my
> understanding is that
> they would all share the same IP address.
Correct. There can be hundreds, even thousands of individual and
completely unrelated websites on one IP address, and in turn they can
each consist of many, many pages. Conversely, one high-traffic
"website" might be served from multiple IP addresses to spread the load.
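As a rough sketch of how name-based virtual hosting works (the hostnames below are made up for illustration): every site shares one IP address, and the web server picks which site to serve purely from the HTTP Host header.

```python
# Minimal sketch of name-based virtual hosting. All of these unrelated
# sites live behind one shared IP; only the Host header distinguishes
# them. Hostnames are hypothetical.

VHOSTS = {
    "alice-knitting.example": "Alice's knitting blog",
    "bob-accounting.example": "Bob's accounting firm",
    "verybadplace.example":   "the one site a blacklist targets",
}

def serve(host_header: str) -> str:
    """Requests all arrive at the same IP; dispatch on Host header."""
    return VHOSTS.get(host_header, "404 Not Found")
```

Block the shared IP and every entry in that table disappears, not just the one you were aiming at.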
The granularity of "bad content" is at the URL level. URLs change over time. For example, the "bad thing" might be in a forum at http://verybadplace.com/comments?page=2, but as newer material is added at the front the "bad thing" will move to http://verybadplace.com/comments?page=3 and http://verybadplace.com/comments?page=4 and so on.
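The pagination drift is easy to simulate (a toy model, not any real forum's logic): newest posts go at the front, so older material slides to later page URLs as posts arrive.

```python
# Toy model of forum pagination: posts[0] is newest, and each page URL
# shows a fixed-size slice. A blacklist entry naming a page number goes
# stale as soon as new posts arrive.

PAGE_SIZE = 2

def page(posts, n):
    """Contents of /comments?page=n (1-based), newest-first."""
    return posts[(n - 1) * PAGE_SIZE : n * PAGE_SIZE]

posts = ["bad thing", "old post"]      # "bad thing" sits on page 1
assert "bad thing" in page(posts, 1)

posts = ["newer", "newest"] + posts    # two new posts are added
assert "bad thing" not in page(posts, 1)  # the listed URL is now clean
assert "bad thing" in page(posts, 2)      # the content moved to page 2
```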
Hashes of the content at a "bad URL" are irrelevant because the "page" might well contain dynamic content such as advertising or "related links" or whatever, which changes over time. Or a URL like http://verybadplace.com/latest will have different content every day, even without advertising.
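A quick sketch of why content hashing fails here: change one byte of ad copy and the hash of the whole page changes, even though the "bad thing" itself is identical on both fetches.

```python
import hashlib

# Two fetches of the same "bad page" with different rotating ads.
# The offending content is byte-identical; the page hashes are not.

def render(bad_thing: str, ad: str) -> bytes:
    """Hypothetical page template with a dynamic ad slot."""
    return f"<html>{bad_thing}<aside>{ad}</aside></html>".encode()

h1 = hashlib.sha256(render("bad thing", "ad for shoes")).hexdigest()
h2 = hashlib.sha256(render("bad thing", "ad for cars")).hexdigest()
assert h1 != h2  # same "bad thing", two different page hashes
```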
Lists are static. The web is dynamic. The "page" metaphor is,
essentially, dead. Game over.
> Thus blocking the IP of a prohibited site will result in the
> blocking of sites
> belonging to other customers of that ISP.
Yes, but IP addresses can be used as a "first cut" of the process, to shunt the request to The Magic Filter Box for a more detailed decision (i.e. looking into the packet to see what URL is being requested).
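That two-stage scheme can be sketched as follows (the addresses and URLs are made up, and real deployments route at the network layer rather than in application code):

```python
# Sketch of two-stage filtering: a cheap IP check at the router diverts
# only suspect traffic to a "Magic Filter Box" that inspects the
# requested URL against the blacklist. Everything else passes untouched.

SUSPECT_IPS = {"203.0.113.7"}  # IPs hosting at least one listed URL
BLOCKED_URLS = {"http://verybadplace.com/comments?page=2"}

def filter_request(dest_ip: str, url: str) -> str:
    if dest_ip not in SUSPECT_IPS:
        return "pass"  # bulk of traffic never reaches the filter box
    # Only suspect IPs pay the cost of URL-level inspection.
    return "block" if url in BLOCKED_URLS else "pass"
```

Note the side effect the thread describes: every other virtual host on a suspect IP also gets shunted through the filter box, paying the inspection cost for one neighbour's listed URL.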
Internet, IT and Media Consulting, Sydney, Australia
mobile +61 407 623 600
fax +61 2 9516 5630
ABN 25 231 641 421