[LINK] Australian Government Trial of Generative AI for Law, Education, Health, and Aged Care
Roger Clarke
Roger.Clarke at xamax.com.au
Sat Mar 9 15:35:31 AEDT 2024
On 9/3/2024 13:33, David wrote:
> Personally, I first came across neural networks in the late 60's when
> my Supervisor at the time was experimenting with them on a very slow
> common-or-garden engineering computer. But we could still see the model
> learning...
Where 'the model learning' =
'model-parameters being adjusted by software on the basis of pre-defined
aspects of the data-inputs'
I don't want to play down the significance, because it was indeed a
generational change in the mode of software development.
But it helps to remain balanced about artefacts' capabilities when
anthropomorphic terms are avoided.
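For concreteness, here is a minimal sketch (in Python, purely illustrative,
and not drawn from any of the systems under discussion) of what
'model-parameters being adjusted by software' amounts to, using a single
perceptron-style unit trained on a toy data set:

import math
import random

def train_neuron(examples, epochs=100, learning_rate=0.1):
    """examples: list of (inputs, target) pairs, with target in {0, 1}."""
    n = len(examples[0][0])
    # The 'model' is nothing more than these numbers.
    weights = [random.uniform(-0.5, 0.5) for _ in range(n)]
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in examples:
            # Weighted sum of the inputs, squashed into (0, 1).
            z = sum(w * x for w, x in zip(weights, inputs)) + bias
            output = 1.0 / (1.0 + math.exp(-z))
            # The 'learning': nudge each parameter to reduce the error.
            error = target - output
            weights = [w + learning_rate * error * x
                       for w, x in zip(weights, inputs)]
            bias += learning_rate * error
    return weights, bias

# Toy task: approximate the logical OR of two binary inputs.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
print(train_neuron(data))

Nothing in that loop 'knows' or 'learns' anything in the human sense; the
anthropomorphic reading is entirely the observer's.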
I wrote in 1990-91, in 'A Contingency Approach to the Application
Software Generations', in s.8 (The Application Software Generations as
Levels of Abstraction), at:
http://www.rogerclarke.com/SOS/SwareGenns.html#ASGLA
> The shape of at least one further generation is emerging from the
> mists. Connectionist or neural machines, whether implemented in software
> or using massively parallel hardware architectures, involve a conception
> of knowledge yet more abstract than knowledge-bases containing
> production rules.
>
> In essence, such a knowledge-base contains empirical data expressed
> in some common language, but stored in a manner very close to its
> original form, rather than in a summary form such as rules.
[ My use of 'very close to its original form' was misleading. ]
> The application contains no pre-packaged solutions to pre-defined
> problems (as third generation technology requires), no explicit
> problem-definition (as is necessary when using 4GLs), and does not even
> contain an explicit domain-model (as is the case with knowledge-based
> [most commonly rule-based] technology).
>
> With sixth generation application software technology, the human
> 'software developer' abdicates the responsibility of understanding the
> domain, and merely pours experience into the machine. Rather than acting
> as teacher, the person becomes a maintenance operative, keeping the
> decision factory running.
30 years later, I say it a little differently. But that passage did
manage to build in the notions of (merely) empirical data, the abdication
of responsibility / the decision factory [i.e. a decision system, not a
decision support system], and the maintenance operative rather than the
teacher.
But in the late 60s, I was very prosaically writing a little Fortran
(before it even had version-numbers) and was shortly going to embark on
writing rather more code in that deeply intellectual language, COBOL. I
don't think I heard of neural networks until a *long* time after that.
> ...or at some risk of repeating myself, HAL in "2001: A Space
> Odyssey" by Arthur C. Clarke. HAL developed self-awareness and took it
> upon itself (if that's the right pronoun!) to kill the crew and run the
> mission according to HAL's own estimate of its importance.
For Christmas, my kids, ever-desperate to avoid resorting to socks or
handkerchiefs, gave me a T-shirt with these words emblazoned on it:
'I'm sorry Dave. I'm afraid I can't do that.'
I squeezed that [unrelated Clarke] idea into an article on Asimov, here:
http://www.rogerclarke.com/SOS/Asimov.html#RTFToC20
--
Roger Clarke mailto:Roger.Clarke at xamax.com.au
T: +61 2 6288 6916 http://www.xamax.com.au http://www.rogerclarke.com
Xamax Consultancy Pty Ltd 78 Sidaway St, Chapman ACT 2611 AUSTRALIA
Visiting Professorial Fellow UNSW Law & Justice
Visiting Professor in Computer Science Australian National University