Friday 15 November, 2024
Sunset in a wing mirror
Seen while driving on a summer evening in West Cork, many years ago.
Quote of the Day
I have a foreboding of an America in my children’s or grandchildren’s time – when the United States is a service and information economy; when nearly all the manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what’s true, we slide, almost without noticing, back into superstition and darkness…
Carl Sagan, The Demon-Haunted World, 1995
Prescient, eh? Thanks to Sheila Hayman (Whom God Preserve) for spotting it.
Musical alternative to the morning’s radio news
Bud Powell Trio | Blues for Bessie
Long Read of the Day
The 3 AI Use Cases: Gods, Interns, and Cogs
A lovely, clear-sighted view of the technology by Drew Breunig, who cuts through the noise and extracts the signal.
We talk about so many things when we talk about AI. The conversation can roam from self-driving cars to dynamic video generation, from conversational chatbots to satellite imagery object detection, and from better search engines to dreamlike imagery generation. You get the point.
It gets confusing! For laypeople, it’s hard to nail down what AI actually does (and doesn’t do). For those in the field, we often have to break down and overspecify our terms before we can get to our desired conversations.
After plenty of discussions and tons of exploration, I think we can simplify the world of AI use cases into three simple, distinct buckets:

* Gods: Super-intelligent, artificial entities that do things autonomously.
* Interns: Supervised copilots that collaborate with experts, focusing on grunt work.
* Cogs: Functions optimized to perform a single task extremely well, usually as part of a pipeline or interface.
Let’s break these down, one by one…
Read on.
Thanks to Andrew Curry for pointing me to it.
My commonplace booklet
Doc Searls (Whom God Preserve) is an Elder of the Web and one of the most perceptive observers of the online world.
I’ve just read a lovely tribute he’s written to his long-term friend, Paul Marshall, who has passed away.
Paul also taught me to believe in myself.
I remember a day when a bunch of us were hanging in our dorm room, talking about SAT scores. Mine was the lowest of the bunch. (If you must know, the total was 1001: a 482 in verbal and a 519 in math. Those numbers will remain burned in my brain until I die.) Others, including Paul, had scores that verged on perfection—or so I recall. (Whatever, they were all better than mine.) But Paul defended me from potential accusations of relative stupidity by saying this: “But David has insight.” (I wasn’t Doc yet.) Then he gave examples, which I’ve forgotten. By saying I had insight, Paul kindly and forever removed another obstacle from my path forward in life. From that moment on, insight became my stock in trade. Is it measurable? Thankfully, no.
Linkblog
Something I noticed, while drinking from the Internet firehose.
AI Chatbot Added to Mushroom-Foraging Facebook Group Immediately Gives Tips for Cooking Dangerous Mushroom
An AI chatbot called “FungiFriend” was added to a popular mushroom identification Facebook group Tuesday. It then told users there how to “sauté in butter” a potentially dangerous mushroom, signaling again the high level of risk that AI chatbots and tools pose to people who forage for mushrooms.
One member of the Facebook group said that they asked the AI bot “how do you cook Sarcosphaera coronaria,” a type of mushroom that was once thought edible but is now known to hyperaccumulate arsenic and has caused a documented death. FungiFriend told the member that it is “edible but rare,” and said “cooking methods mentioned by some enthusiasts include sautéing in butter, adding to soups or stews, and pickling.” The situation is reminiscent of Google’s AI telling people to add glue to pizza or eat rocks on the advice of a Redditor named Fucksmith.
Time was when ‘magic mushrooms’ caused hippies to hallucinate. Now the ‘hallucinations’ of an AI can kill you, it seems.