[LINK] Cory Doctorow: What Kind of Bubble is AI?

Roger Clarke Roger.Clarke at xamax.com.au
Tue Dec 26 13:56:10 AEDT 2023


On 26/12/23 12:34, Kim Holburn wrote:
> https://locusmag.com/2023/12/commentary-cory-doctorow-what-kind-of-bubble-is-ai/ 
> Of course AI is a bubble.  ...

Damn Cory.  He makes me feel so inadequate.

I burble on about all the inadequacies, and propose what's needed in the 
way of safeguards and processes to make AI tenable, and how AI has to be 
reconceived to actually be useful.

Cory cuts through all that with direct speech and relevant examples, and 
gets the message through to an extent that I can never manage.


For boring details on some of what Cory's conveyed, see for example:

Guidelines for the Responsible Application of Data Analytics (2018):
http://www.rogerclarke.com/EC/GDA.html#G

The Threats Inherent in AI (2019):
http://www.rogerclarke.com/EC/AII.html#Th

Principles for Responsible AI (2019):
http://www.rogerclarke.com/EC/AIP.html#T4
http://www.rogerclarke.com/EC/AIP.html#App1

How to Reap [AI] Benefits but Mitigate Harms and Manage Risks (2023):
http://www.rogerclarke.com/EC/AITS.html#RB

The Necessary Reconception of AI (2023):
http://www.rogerclarke.com/EC/AITS.html#RAI

___________________________________________

> ... It has all the hallmarks of a classic tech 
> bubble. Pick up a rental car at SFO and drive in either direction on the 
> 101 – north to San Francisco, south to Palo Alto – and every single 
> billboard is advertising some kind of AI company. Every business plan 
> has the word “AI” in it, even if the business itself has no AI in it. 
> Even as two major, terrifying wars rage around the world, every 
> newspaper has an above-the-fold AI headline and half the stories on 
> Google News as I write this are about AI. I’ve had to make a rule for my 
> events: The first person to mention AI owes everyone else a drink.
> 
> It’s a bubble.
> 
> Tech bubbles come in two varieties: The ones that leave something 
> behind, and the ones that leave nothing behind. Sometimes, it can be 
> hard to guess what kind of bubble you’re living through until it pops 
> and you find out the hard way.
> 
> ...
> 
> When the dotcom bubble burst, it left a lot behind.
> 
> ...
> 
> But the most important residue after the bubble popped was the millions 
> of young people who’d been lured into dropping out of university in 
> order to take dotcom jobs where they got all-expenses paid crash courses 
> in HTML, Perl, and Python. This army of technologists was unique in that 
> they were drawn from all sorts of backgrounds – art-school dropouts, 
> humanities dropouts, dropouts from earth science and bioscience 
> programs and other disciplines that had historically been consumers of 
> technology, not producers of it.
> 
> ...
> 
> Contrast that bubble with, say, cryptocurrency/NFTs, or the complex 
> financial derivatives that led up to the 2008 financial crisis. These 
> crises left behind very little reusable residue. The expensively 
> retrained physicists whom the finance sector taught to generate wildly 
> defective risk-hedging algorithms were not able to apply that knowledge 
> to create successor algorithms that were useful. The fraud of the 
> cryptocurrency bubble was far more pervasive than the fraud in the 
> dotcom bubble, so much so that without the fraud, there’s almost nothing 
> left.
> 
> ...
> 
> AI is a bubble, and it’s full of fraud, but that doesn’t automatically 
> mean there’ll be nothing of value left behind when the bubble bursts.
> 
> ...
> 
> That’s unlike, say, the Enron scam or the Uber scam, both of which left 
> the world worse off than they found it in every way. Uber burned $31 
> billion in investor cash, mostly from the Saudi royal family, to create 
> the illusion of a viable business. Not only did that fraud end up 
> screwing over the retail investors who made the Saudis and the other 
> early investors a pile of money after the company’s IPO – but it also 
> destroyed the legitimate taxi business and convinced cities all over the 
> world to starve their transit systems of investment because Uber seemed 
> so much cheaper. Uber continues to hemorrhage money, resorting to cheap 
> accounting tricks to make it seem like they’re finally turning it 
> around, even as they double the price of rides and halve driver pay (and 
> still lose money on every ride). The market can remain irrational longer 
> than any of us can stay solvent, but when Uber runs out of suckers, it 
> will go the way of other pump-and-dumps like WeWork.
> 
> What kind of bubble is AI?
> 
> As with Uber, the massive investor subsidies for AI have produced a sugar 
> high of temporarily satisfied users. Fooling around feeding prompts to 
> an image generator or a large language model can be fun, and playful 
> communities have sprung up around these subsidized, free-to-use tools 
> (less savory communities have also come together to produce 
> nonconsensual pornography, fraud materials, and hoaxes).
> 
> The largest of these models are incredibly expensive. They’re expensive 
> to make, with billions spent acquiring training data, labelling it, and 
> running it through massive computing arrays to turn it into models.
> 
> Even more important, these models are expensive to run. Even if a 
> bankrupt AI company’s model and servers could be acquired for pennies on 
> the dollar, even if the new owners could be shorn of any overhanging 
> legal liability from looming copyright cases, even if the eye-watering 
> salaries commanded by AI engineers collapsed, the electricity bill for 
> each query – to power the servers and their chillers – would still make 
> running these giant models very expensive.
> 
> Do the potential paying customers for these large models add up to 
> enough money to keep the servers on? That’s the 13 trillion dollar 
> question, and the answer is the difference between WorldCom and Enron, 
> or dotcoms and cryptocurrency.
> 
> Though I don’t have a certain answer to this question, I am skeptical. 
> AI decision support is potentially valuable to practitioners. 
> Accountants might value an AI tool’s ability to draft a tax return. 
> Radiologists might value the AI’s guess about whether an X-ray suggests 
> a cancerous mass. But with AIs’ tendency to “hallucinate” and 
> confabulate, there’s an increasing recognition that these AI judgments 
> require a “human in the loop” to carefully review them.
> 
> In other words, an AI-supported radiologist should spend exactly the 
> same amount of time considering your X-ray, and then see if the AI 
> agrees with their judgment, and, if not, they should take a closer look. 
> AI should make radiology more expensive, in order to make it more accurate.
> 
> But that’s not the AI business model. AI pitchmen are explicit on this 
> score: The purpose of AI, the source of its value, is its capacity to 
> increase productivity, which is to say, it should allow workers to do 
> more, which will allow their bosses to fire some of them, or get each 
> one to do more work in the same time, or both. The entire investor case 
> for AI is “companies will buy our products so they can do more with 
> less.” It’s not “business customers will buy our products so their 
> products will cost more to make, but will be of higher quality.”
> 
> ...
> 
> Just take one step back and look at the hype through this lens. All the 
> big, exciting uses for AI are either low-dollar (helping kids cheat on 
> their homework, generating stock art for bottom-feeding publications) or 
> high-stakes and fault-intolerant (self-driving cars, radiology, hiring, 
> etc.).
> 
> ...
> 
> Cory Doctorow is the author of Walkaway, Little Brother, and Information 
> Doesn’t Want to Be Free (among many others); he is the co-owner of Boing 
> Boing, a special consultant to the Electronic Frontier Foundation, a 
> visiting professor of Computer Science at the Open University and an MIT 
> Media Lab Research Affiliate.
> 
> 


-- 
Roger Clarke                            mailto:Roger.Clarke at xamax.com.au
T: +61 2 6288 6916   http://www.xamax.com.au  http://www.rogerclarke.com

Xamax Consultancy Pty Ltd      78 Sidaway St, Chapman ACT 2611 AUSTRALIA 

Visiting Professor in the Faculty of Law            University of N.S.W.
Visiting Professor in Computer Science    Australian National University
