Friday 12 January, 2024
It’s a long, long road…
… that has no turning.
(Though it might have been built by the Romans.)
Quote of the Day
“The four stages of life are infancy, childhood, adolescence and obsolescence.”
Art Linkletter
Musical alternative to the morning’s radio news
Leo Kottke and Mike Gordon | Tiny Desk (Home) Concert
A jewel from the pandemic time. I’ve enjoyed Kottke’s playing ever since I heard him play Jesu, Joy of Man’s Desiring at the Cambridge Folk Festival sometime in the 1970s.
Long Read of the Day
ChatGPT is an engine of cultural transmission
By Henry Farrell.
Background: A while back, the cognitive scientist Alison Gopnik, one of the wisest people around, gave a terrific lecture and co-authored an academic paper in which she offered the first original perspective I’d seen on Large Language Models (LLMs) like GPT-4, Claude, et al. She thinks we should regard them not as oracles (or even “stochastic parrots”) but as “cultural technologies” — more specifically, as an information retrieval technology. That is, says one insightful summary of her view,
we should not think of an LLM as being something like a mind, but much more like a library catalog. Prompting it with text is something like searching over a library’s contents for passages that are close to the prompt, and sampling from what follows. “Something like” because of course it’s generating new text from its model, not reproducing its data. (LLMs do sometimes exactly memorize particular sequences … but they simply lack the capacity to memorize their full training corpora.) As many people have said, an LLM isn’t doing anything differently when it “hallucinates” as opposed to when it gets things right.
And that’s because it doesn’t actually know anything. It just knows about statistical correlations between different ‘tokens’ standing for words or parts thereof. Artificial intelligence programs that learn to write and speak can sound almost human—but they can’t think creatively like a small child can.
Her lecture is, sadly, behind a kind of academic paywall, though the academic paper she co-authored is not, as far as I can see.
All of which is by way of a long-winded introduction to a remarkable essay by Henry Farrell that has just appeared, in which he picks up Gopnik’s insight and runs with it.
What’s lovely about reading people like Farrell and Gopnik is that they think about this stuff the way Lewis Mumford and other sages once thought about technology and its role in society. Which is why I think this latest essay is worth your time.
My commonplace booklet
Richard Susskind, who has just stepped down as Technology Adviser to the Lord Chief Justice of England and Wales, had an interesting piece in The Times (behind a paywall). Here’s the bit that caught my eye:
Long term, the significance of AI in law will not lie in replacing tasks currently taken on by human lawyers. To suppose this is to imagine, by analogy, that the future of surgery is entirely about robots replacing the work of human surgeons. Instead, the key to future health care is in non-invasive therapies and preventative medicine.
So too in law. The future of law is not robotic lawyering. It will be using AI to deliver the legal outcomes that citizens and organisations need, but in entirely new ways — for instance, through online dispute resolution rather than physical courts.
More fundamentally, the huge promise of legal AI systems lies in enabling a shift from dispute resolution to dispute avoidance.
Linkblog
Something I noticed, while drinking from the Internet firehose.
The Seven Social Sins
Wealth without work.
Pleasure without conscience.
Knowledge without character.
Commerce without morality.
Science without humanity.
Religion without sacrifice.
Politics without principle.
Ironic that we have built societies that reward every one of them!