[LINK] Link Digest, Vol 208, Issue 52

David Vaile d.vaile at unsw.edu.au
Thu Apr 1 10:24:56 AEDT 2010


Date: Wed, 31 Mar 2010 11:58:37 +1100
From: "Birch, Jim" <Jim.Birch at dhhs.tas.gov.au>
Subject: Re: [LINK] Conroy vs Google
-----Original Message-----

From: Stilgherrian
>>
>> I am not at all convinced that the policy is Conroy's per se, or if it
>> is, that it was his idea as opposed to someone else's idea agreed to in
>> exchange for something else.
> 
> It's not Conroy's policy, it's government policy that they took to the
> electorate last election, as I recall.

It is a changed variant of the Cybersafety policy
<http://www.alp.org.au/download/now/labors_plan_for_cyber_safety.pdf>
outlined in ambiguous terms shortly before the last election, without any
formal policy development work such as a law reform report, a major
departmental review, or extensive public or expert consultation about the
specific problems to be addressed, the costs, risks and benefits of
potential alternative solutions, or issues of transparency, accountability
and effectiveness.

(I note this page has been taken down since about August 2009, so it is
unfortunately no longer available for voters to compare against the current
proposal. But wait! The miracle of the Wayback Machine gets us:
<http://web.archive.org/web/*/http://www.alp.org.au/download/now/labors_plan
_for_cyber_safety.pdf>. Perhaps someone should preserve it by uploading it to
an offshore repository in case the Wayback cache is flushed. See
<http://cyberlawcentre.org/censorship/references.htm> for other docs.)
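For anyone wanting to follow the suggestion above, the Wayback Machine's URL
conventions are simple enough to script. A minimal sketch, assuming only the
public "web/*/" snapshot-listing and "save/" capture-request URL patterns
(the helper function name is mine, for illustration):

```python
# Sketch: construct Wayback Machine URLs for a page worth preserving.
# The URL patterns (web.archive.org/web/*/<url> to list snapshots,
# web.archive.org/save/<url> to request a fresh capture) are the public
# Wayback conventions; the target here is the ALP policy PDF cited above.

def wayback_urls(url: str) -> dict:
    """Return the Wayback snapshot-listing and save-request URLs for `url`."""
    return {
        # Lists every archived snapshot of the page.
        "snapshots": "http://web.archive.org/web/*/" + url,
        # Asks the Wayback Machine to take a new snapshot now.
        "save_now": "https://web.archive.org/save/" + url,
    }

policy_pdf = ("http://www.alp.org.au/download/now/"
              "labors_plan_for_cyber_safety.pdf")
urls = wayback_urls(policy_pdf)
print(urls["snapshots"])
print(urls["save_now"])
```

Fetching the save-request URL from several independent machines, and keeping
local copies, is the cheap insurance the paragraph above suggests.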

The current version is different in some respects from the policy presented
to the electorate, although there are also some internal inconsistencies
which give conflicting interpretations.

  (1) The policy said, expanding on what 'mandatory ISP filtering' meant,
that ISPs would be obliged to 'offer' filtering, rather than it being
mandatory for everyone to have. So the principle of consent, a choice to
accept the risks and costs in exchange for whatever benefit, present in that
part of the policy, is missing.

  (2) The policy said the list criteria would be 'prohibited' content
(classified by the Classification Board), which includes 'potentially
prohibited content' (deemed classified by ACMA, apparently without review).
This 'prohibited' concept in the Broadcasting Services Act Schedules 5 and 7
includes classifications RC and X18+, plus R18+ and MA15+ not behind age
verification (with the MA15+ also restricted in type and profit motive). The
latter two categories were inserted into 'prohibited' before the election by
the Howard government with Labor support.

The currently proposed test now seems to be, for content hosted inside
Australia, as above; for offshore content, whatever is on the RC list, which
potentially includes child abuse material and much else. Thus there are now
apparently two standards: a much stricter one for Australian content, and
the RC standard for the rest of the world.

Elsewhere it said, "Labor's ISP policy will prevent Australian children from
accessing any content that has been identified as prohibited by ACMA,
including sites such as those containing child pornography and X-rated
material." The current policy does not cover X-rated material, to the
disappointment of all those who wanted depictions of real sex (not
simulated, not connected with violence) banned.

The policy document they took to the election ('prohibited') would also
cover some of the R and MA content above, but this was not mentioned where
they gave examples.

 (3) The policy limited the filter scope by reference to protecting
children. Practical direct dangers are the theme everywhere: 'Provide
[elsewhere: 'ISPs to offer'] a mandatory 'clean feed' internet service for
all homes, schools and public computers that are used [elsewhere:
'accessible'] by Australian children.'

If you read the final 'by Australian children' clause as qualifying the
whole sentence, the filter applies only to home computers that are used by
Australian children, school computers that are used by Australian children,
and public computers that are used by Australian children. If you read it as
qualifying only the last of these (which is inconsistent with the direct
child protection theme underlying the policy), then the scope of the
unfiltered net would be more restricted, but it would still be available
outside the home: on wireless connections in other private premises perhaps,
or on campus, etc.

> Governments have a duty to
> institute policies they were elected on where circumstances permit.  I'd
> be pretty confident that Rudd would be a firm supporter of filtering
> being a Christian with conservative social values.  He won't be the only
> one.  We haven't heard the "Liberals" decrying the filter, have we?

In fact there appears to be considerable disquiet in the Liberals, as well
as in the Labor party. Civil liberties, freedom of speech, potential for
future abuse, questions about effectiveness, and about whether it actually
empowers parents or children or not, are apparently resonant issues in both
party rooms.  Whether this converts into Parliamentary opposition depends
more on party discipline and calculations of point scoring than personal
belief.
 
> The government is responding to public opinion (and fears).

This is true. It is also, to some extent, promoting that opinion in
traditional fashion. The question is whether its proposed solutions offer
real benefits for the supposed beneficiaries (children, i.e. anyone under
18), or merely make their parents feel better without good reason, boosting
government support. A government that cares primarily for appearances, and
for approval based on an unsubstantiated impression that it is doing
something, is not as justified as one which investigates all the factors
more openly and objectively. There is a danger that parents, who have both
real concerns and unwarranted ones, are misled by solutions that do not
solve the real problems.

> In general,
> the idea that any criminal/antisocial web site put up by anyone,
> anywhere, for any purpose, should be available in our childrens'
> bedrooms is wrong to most people.

The filter is not aimed at criminal/antisocial web sites. There are already
laws that criminalise the most commonly cited content, child abuse material,
in the most draconian terms, such that many children (anyone taking certain
images of anyone who appears to be under 18) are probably guilty of these
offences.

> We can't rely on authorities in other
> countries to shut down nefarious sites.

While we can't rely on authorities anywhere to be entirely effective,
experiments seem to indicate that direct action by ordinary individuals or
organisations, simply asking hosts to shut down compromised sites, is (a)
quite effective in many cases and (b) relatively rarely tried, possibly in
the interests of building prosecution cases rather than taking down content.
So this avenue is under-explored.

> Neither can we rely on
> reputation as a means of control as we can - to some degree - with
> conventional local media.

Depending on what you are trying to control, you can probably rely on
well-resourced law enforcement, to a considerable degree, to effectively
target child pornography networks, for instance, to the extent that they are
being driven underground and off the public net.
 
> Most people would want at least some sites taken out even if they don't
> know how.

Pretend that you know how. If there are a trillion items on the open web,
with 1-10 billion or more changing every month, how do you propose to detect
and classify a significant enough number (assuming you want this to be
effective, and not merely a moral gesture)?
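The arithmetic above can be made concrete. A back-of-envelope sketch, using
the lower-bound figure of a billion changed items per month from the
paragraph above, plus an assumed human review rate (the reviewer throughput
is illustrative, not measured data):

```python
# Back-of-envelope estimate of the classification workload implied above.
# The changed-items figure is the post's own lower bound; the reviewer
# throughput of 100 items/day is an assumption for illustration.

ITEMS_CHANGING_PER_MONTH = 1_000_000_000   # lower bound from the text
REVIEWER_ITEMS_PER_DAY = 100               # assumed human review rate
WORK_DAYS_PER_MONTH = 20

items_per_reviewer_month = REVIEWER_ITEMS_PER_DAY * WORK_DAYS_PER_MONTH
reviewers_needed = ITEMS_CHANGING_PER_MONTH / items_per_reviewer_month
print(f"Reviewers needed just to keep up: {reviewers_needed:,.0f}")
# Even at the lower bound, that is 500,000 full-time classifiers.
```

Any real scheme would therefore have to rely on automated classification or
on complaints sampling, which is exactly why the list can only ever cover a
tiny, arbitrary fraction of the material that would offend the standard.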

Viability aside, it is true that some people think some sites should be
taken down. This occurs at present, but only for a very limited set of sites
clearly demonstrated to be harmful.

> There's disagreement about what to block, how much of a
> performance hit you're willing to take to achieve this, process
> transparency, and whether it would work at all, but the basic idea of
> that some level of media censorship is pretty common.

There is of course disagreement about whether to block.

Media classification is common, and is used in ways that may or may not
involve censorship. Real and reliable classification of the infinitely
vaster and more dynamically changing content of the internet is potentially
very expensive. The current proposal does not even attempt this, so it does
not even offer to try to block all the material which would offend whatever
standard is chosen, merely anything anyone wants to complain about that
might fit the relevant category (RC offshore; RC, X, and some R and MA in
Australia).

Technical censorship on the internet is in its infancy. By trying to do
something this confused and extensive, and by awarding political points for
making a gesture regardless of its effectiveness, the scene is set for scope
creep and wider application of technical monitoring and blocking in the
future, with the supposed beneficiaries not necessarily any safer from real
harms. It is important, faced with this risk, to explore the real interests
we are trying to protect (those of young people and, to a lesser extent,
their parents) and, if we are serious, to identify the things that will
actually work to make these people's use of the internet tolerably safe with
the least collateral damage.
 
> This isn't meant to be a defence of Conroy's personality or methods.

Neither is this critique addressed at you, but rather at the popular
misconceptions you articulate!





More information about the Link mailing list