Thursday 27 October, 2022
Lunch break
This wonderful Van Gogh — Noon, rest from work — is in the Musée d’Orsay in Paris. I thought I knew most of his paintings, but this one is new to me — and a revelation.
Thanks to Andrew Curry, who used it to illustrate a post on his splendid blog.
Quote of the Day
“We inherited a bunch of formulas from the Labour Party that shoved all the funding into deprived urban areas. That needed to be undone. I started the work of undoing that.”
Rishi Sunak to Tory activists, Tunbridge Wells, 5 August 2022.
Selected just in case anyone had the idea that the UK’s latest PM might be some kind of liberal. He is, after all, an alumnus of Goldman Sachs. In fact, the person he most reminds me of is George Osborne, another fanatical believer in ‘fiscal rectitude’ who made ordinary people pay for the bailing out of the banks in 2008.
En passant… I wonder if the conspiracists of the DUP have tumbled to the fact that ‘Rishi’ is an anagram of ‘Irish’.
Musical alternative to the morning’s radio news
George Lewis “Burgundy Street Blues” with Mr. Acker Bilk & his Band (1965)
Should be played at everybody’s funeral.
Long Read of the Day
AI is changing scientists’ understanding of language learning – and raising questions about an innate grammar
Very interesting essay on what the large language models might be suggesting about how humans learn language.
New insights into language learning are coming from an unlikely source: artificial intelligence. A new breed of large AI language models can write newspaper articles, poetry and computer code and answer questions truthfully after being exposed to vast amounts of language input. And even more astonishingly, they all do it without the help of grammar.
Even if their choice of words is sometimes strange, nonsensical or contains racist, sexist and other harmful biases, one thing is very clear: the overwhelming majority of the output of these AI language models is grammatically correct. And yet, there are no grammar templates or rules hardwired into them – they rely on linguistic experience alone, messy as it may be…
Read on.
This is a challenge to conventional theories about language learning which postulate that language learners have a grammar template wired into their brains to help them overcome the limitations of their language experience. But large language models like GPT-3 can generate grammatical sentences — without knowing anything about the world — simply by being good at predicting what word comes next.
This essay brings to mind many earlier debates about the complex relationship between technology and scientific theory. Think about the telescope and astronomy, or the microscope and biology. Which is why it’s interesting.
Exit, Beijing style
Fascinating video of strange goings-on among the top brass of the Chinese Communist Party at the Congress, during which the former Chinese President Hu Jintao was led out of the hall in a moment of unexpected drama in an otherwise fastidiously choreographed event.
The 79-year-old Hu was sitting beside Xi Jinping when he was approached by a man in a suit and Covid mask who spoke to him and appeared to pull his right arm. With Xi looking on, the man then placed both hands under Hu’s armpits and attempted to lift him out of his seat. Xi appeared to talk to Hu before the man got between them and tried to lift Hu again. Then another guy in a mask arrived and Hu eventually stood up, exchanged a few words with Xi and placed a hand on the shoulder of Premier Li Keqiang, China’s number two official, before being led away. Weird.
My commonplace booklet
From Joe Dunne:
I think your quote this morning should read ‘Too many notes, dear Mozart, too many notes’ and it should be attributed to Emperor Joseph II. It was supposedly said after the first performance of Die Entführung aus dem Serail on 16th July 1782 in Vienna. But never let the truth get in the way of a good story!
I won’t, Joe, I won’t.