Wednesday 15 November, 2023
Neil MacGregor
Former Director of the British Museum, photographed in Cambridge in 2017 after a lecture.
Quote of the Day
“On some great and glorious day the plain folks of the land will reach their heart’s desire at last, and the White House will be adorned by a downright moron.”
H. L. Mencken
American voters, be careful what you wish for.
Musical alternative to the morning’s radio news
Ry Cooder | How Can A Poor Man Stand Such Times And Live
Long Read of the Day
We’re sorry we created the Torment Nexus
Wonderful, long and thoughtful piece (or rant, depending on your POV) by Charlie Stross on the role of science fiction in shaping the current generation of tech bosses, who are busily engaged in undermining democracy, turbocharging inequality and frying the planet.
The hype and boosterism of the AI marketers collided with the Rationalist obsession in the public perception a couple of weeks ago, in the Artificial Intelligence Safety Summit at Bletchley Park. This conference hatched the Bletchley Declaration, calling for international co-operation to manage the challenges and risks of artificial intelligence. It featured Elon Musk being interviewed by Rishi Sunak on stage, and was attended by Kamala Harris, vice-president of the United States, among other leading politicians. And the whole panicky agenda seems to be driven by an agenda that has emerged from science fiction stories written by popular entertainers like me, writers trying to earn a living.
Anyway, for what my opinion is worth: I think this is bullshit. There are very rich people trying to manipulate investment markets into giving them even more money, using shadow puppets they dreamed up on the basis of half-remembered fictions they read in their teens. They are inadvertently driving state-level policy making on subjects like privacy protection, data mining, face recognition, and generative language models, on the basis of assumptions about how society should be organized that are frankly misguided and crankish, because there’s no crank like a writer idly dreaming up fun thought experiments in fictional form. They’re building space programs—one of them is up front about wanting to colonize Mars, and he was briefly the world’s richest man, so we ought to take him as seriously as he deserves—and throwing medical resources at their own personal immortality rather than, say, a wide-spectrum sterilizing vaccine against COVID-19. Meanwhile our public infrastructure is rotting, national assets are being sold off and looted by private equity companies, their social networks are spreading hatred and lies in order to farm advertising clicks, and other billionaires are using those networks to either buy political clout or suck up ever more money from the savings of the poor.
Did you ever wonder why the 21st century feels like we’re living in a bad cyberpunk novel from the 1980s?
It’s because these guys read those cyberpunk novels and mistook a dystopia for a road map. They’re rich enough to bend reality to reflect their desires. But we’re not futurists, we’re entertainers! We like to spin yarns about the Torment Nexus because it’s a cool setting for a noir detective story, not because we think Mark Zuckerberg or Andreessen Horowitz should actually pump several billion dollars into creating it. And that’s why I think you should always be wary of SF writers bearing ideas.
Worth your time, right from his opening line:
“I’m Charlie Stross, and I tell lies for money. That is, I’m a science fiction writer: I have about thirty novels in print, translated into a dozen languages, I’ve won a few awards, and I’ve been around long enough that my Wikipedia page is a mess of mangled edits.
And rather than giving the usual cheerleader talk making predictions about technology and society, I’d like to explain why I—and other SF authors—are terrible guides to the future.”
Great stuff.
Errata
The other day, extolling a pair of jigs played by Martin Hayes and Dennis Cahill, I inadvertently renamed Dennis as ‘John’.
Apologies to him, and thanks to the reader who gently pointed out the error.