[LINK] making the index transparent (was Re: Police raid home of Wikileaks.de domain owner over censorship lists)
Marghanita da Cruz
marghanita at ramin.com.au
Sat Mar 28 12:43:03 AEDT 2009
Kim Holburn wrote:
> I like this idea. I think with a few additions it might be workable.
<snip>
>> A Public Database that lists the:
>> - DATE of entry into the database,
>> - classification it would receive (because the URLs are NOT
>> classified officially through application, they are just 'presumed
>> to receive' a specific classification)
>> - Title of the page (or site)
>> - Description of the page or content. "Bestiality" or "Child
>> Pornography"
>> - The Geographical location (may be assumption too) of the site
>
> The IP or at least a hashed or encrypted version of the IP and the
> whois entry for the IP.
> If it's just one page in an otherwise ordinary site with a large
> number of pages, like, say, Wikipedia, then an encrypted form of the
> URL and the host name.
> A note as to whether the URL refers to one page, to the whole site,
> or to a major portion of the site.
> A hash of the page you get from the URL.
> A button that can test whether the current page is the same as the
> banned page, whether the IP is the same, and whether the whois info
> is the same.
>
<snip>
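The hashed-entry scheme Kim describes could be sketched roughly like this (a minimal Python sketch; the URLs, page bodies, and field names are invented for illustration, and a real list would presumably add the date, classification, and location fields proposed above):

```python
import hashlib

def fingerprint(value: str) -> str:
    """Hex SHA-256 digest of a URL, hostname, or page body."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()

# Hypothetical blacklist entry: it publishes only digests, so the
# prohibited URL itself is never disclosed by the public database.
entry = {
    "url_digest": fingerprint("http://example.com/banned-page"),
    "page_digest": fingerprint("<html>banned content</html>"),
}

def matches_entry(url: str, page_body: str, entry: dict) -> bool:
    """The proposed 'test button': is the page currently at `url`
    the same page that was listed?"""
    return (fingerprint(url) == entry["url_digest"]
            and fingerprint(page_body) == entry["page_digest"])

# Same URL, unchanged content: still the listed page.
print(matches_entry("http://example.com/banned-page",
                    "<html>banned content</html>", entry))  # True
# Content has changed since listing: digests no longer match.
print(matches_entry("http://example.com/banned-page",
                    "<html>new content</html>", entry))     # False
```

The second case is the interesting one for transparency: if the page has changed since it was listed, the hash mismatch is detectable without the database ever revealing the URL.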
ISPs host multiple virtual websites on a single host - my understanding is that
they all share that host's IP address.
Thus blocking the IP of a prohibited site will also block sites belonging to
the ISP's other customers. Hence, I guess, the need for takedowns.
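The overblocking problem can be sketched in a few lines (hostnames and addresses below are invented; 203.0.113.0/24 is a reserved documentation range):

```python
# Name-based virtual hosting: many customer sites, one shared IP address.
virtual_hosts = {
    "banned.example.org":  "203.0.113.10",
    "florist.example.com": "203.0.113.10",  # same shared host
    "school.example.net":  "203.0.113.10",  # same shared host
}

# Blocking by IP address, as a filter acting below HTTP would...
blocked_ips = {virtual_hosts["banned.example.org"]}

# ...also catches every other customer on that host.
collateral = [host for host, ip in virtual_hosts.items()
              if ip in blocked_ips and host != "banned.example.org"]
print(collateral)  # ['florist.example.com', 'school.example.net']
```

A URL- or hostname-level block avoids this, but only a takedown removes the content itself, which is presumably why takedowns are preferred for shared hosting.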
Marghanita
--
Marghanita da Cruz
http://www.ramin.com.au
Phone: (+61)0414 869202