by Ashutosh Jogalekar
Like most people, I have been baffled, mystified, unimpressed and fascinated by chatGPT, the new AI engine that has taken the world by storm over the last few months. I cannot remember any time that a new AI development got so much attention, led to so much mockery and caused so much alarm. As the writer Ted Chiang wrote, chatGPT – and what would inevitably be its subsequent versions and spinoffs – represents a blurry, lossy version of the Internet, containing all the unique strengths and gory flaws of that medium. There is little doubt that chatGPT is not intelligent the way humans are, given the complete lack of nuance and the elementary factual errors it still makes as it cobbles together patterns from data without high-level understanding. But there is also no doubt that it creates an illusion of human intelligence that is beguiling, endlessly fascinating, addictive and frankly disturbing and potentially dangerous. Dangerous not because the machine is intelligent but because humans will no longer be able to distinguish between intelligence and the illusion of intelligence.
Some people think that AI engines like chatGPT herald a global threat of artificial general intelligence (AGI) because of their unprecedented ability to generate misinformation and the illusion of intelligence. AGI is AI that escapes from its narrowly defined applications to become all-encompassing. We should grapple with the challenges that AGI would pose, even if the probability of a true AGI remains slim. But as the parent of a 2-year-old, I feel an even more urgent question looming on the horizon: what are chatGPT or AGI going to do to our children’s generation? How can we try to ensure that they can handle the challenges posed by these unprecedented technological developments?
We don’t know how to answer this question yet since the impact of technology on society is inherently unpredictable – who would have predicted that social media would drive us apart by cocooning us in our echo chambers instead of triggering a global open exchange of ideas – but some lessons from history combined with old-fashioned, boring common sense might help us make sense of the brave new AI world that chatGPT represents. When the new becomes unpredictable and risky, the old-fashioned and boring may not look too bad.



1. In nature, the act of listening is primarily a survival strategy. More intense than hearing, listening is a proactive tool, affording prey animals a skill with which to detect nearby predators (a defense mechanism), and affording predators a way to detect the presence and location of prey (an offense mechanism).
Njideka Akunyili Crosby. Still You Bloom in This Land of No Gardens, 2021.

A metal bucket with a snowman on it; a plastic faux-neon Christmas tree; a letter from Alexandra; an unsent letter to Alexandra; a small statuette of a world traveler missing his little plastic map; a snow globe showcasing a large white skull, with black sand floating around it.
I liked to play with chalk when I was little. Little kids did then. As far as I can tell, they still do now. I walk and jog and drive around town for every other reason. Inevitably, I end up spotting many (maybe not




In The Art of Revision: The Last Word, Peter Ho Davies notes that writers often have multiple ways to approach the revision of a story. “The main thing,” he writes, “is not to get hung up on the choice; try one and find out. … Sometimes the only way to choose the right option is to choose the wrong one first.” I’m easily hung up on choices of all kinds, and I read those words with a sense of relief.
A friend just sent me a copy of materials that the Cornwall Alliance is sending to its supporters. Here is an extract [fair use claimed]: