by Sarah Firisen
During the height of lockdown, I was stressed. We were all stressed. We were scared of getting sick and terrified that the most vulnerable among our friends and family would get sick. We were anxious and bored, but many of us, more than anything, were lonely. Very, very lonely. My husband worked out of the house at night and slept during the day. So even though I was one of the lucky ones and did have another human presence in the house, that presence mainly manifested as a lump under the bedclothes. Many people had no one, and there was a surge in pet adoptions as people looked for anything to help them cope with the overwhelming, day-after-day loneliness.
Before COVID, I regularly attended classes at a kickboxing studio. A month or two before lockdown, a random group of students gathered to celebrate one woman’s birthday. I knew some of the women well, some not at all. Many I’d smiled at over the years I’d been attending the class but had never talked with. We had so much fun at that dinner that we decided to do it again and agreed to use the Facebook chat from the original dinner to coordinate. Then COVID happened, and we started chatting as a group. At first only occasionally, but as we all became lonelier and more desperate for company, it became a constant scrolling chat about parenting, marriage, TV shows, anything and everything. The one thing we all agreed on was that the connection we made with each other, through that chat, in those hard times was a lifesaver (we’re still friends, and we still use the chat even though we now meet regularly in person).
Loneliness is not just horrible; it can be bad for our health, sometimes even fatal. According to the Campaign to End Loneliness,
“Loneliness can increase the risk of early mortality by 26%…There appears to be an association between mental wellbeing and loneliness: research estimates that 60% of people experiencing chronic loneliness experience mental distress, compared to 15% of people who are not chronically lonely.”
While COVID lockdown may have created a loneliness emergency, the number of adults who say they’re profoundly lonely was already rising before lockdown, and the issue is still with us now that it’s over. According to the US Surgeon General,
“At any moment, about one out of every two Americans is experiencing measurable levels of loneliness. This includes introverts and extroverts, rich and poor, and younger and older Americans. Sometimes loneliness is set off by the loss of a loved one or a job, a move to a new city, or health or financial difficulties — or a once-in-a-century pandemic.”
Many of us live very differently than previous generations; we don’t belong to the kinds of religious or civic groups our parents and grandparents did, and so our community ties are much weaker. We’re online all the time and have many of our interactions virtually. The dramatic increase in remote work has brought many benefits, but for some people, it has loosened the connections of their social fabric even more.
I’ve written recently about the challenges and opportunities that generative AI, such as ChatGPT, might bring. Since I wrote that first piece, there has been a tidal wave of news pieces on what this new AI technology might mean for education, the workplace, the doctor’s office, and more. Is it a trustworthy source? Is it sentient? Will it ultimately destroy humanity? There are lots of opinions, almost as many as the answers ChatGPT will give when you keep asking it the same question; so no, you can’t trust its accuracy. But even if it can’t and shouldn’t be trusted to write your paper for class, might AI be a useful tool for combating loneliness? This recent NY Times piece discusses some of the early innovations in this space,
“Pi is a twist in today’s wave of A.I. technologies, where chatbots are being tuned to provide digital companionship. Generative A.I., which can produce text, images and sound, is currently too unreliable and full of inaccuracies to be used to automate many important tasks. But it is very good at engaging in conversations. That means that while many chatbots are now focused on answering queries or making people more productive, tech companies are increasingly infusing them with personality and conversational flair.”
Of course, as the piece goes on to discuss, the potential benefits of using the technology in this way should not blind us to the dangers; ChatGPT is often wrong, sometimes spectacularly so, but in a very plausible-sounding way. There are and should be serious concerns about the personal, identifying information that can be fed into these systems and about how it might be stored and used. There are claims that it has biases baked into its system: “ChatGPT has been shown to produce some terrible answers that discriminate against gender, race, and minority groups, which the company is trying to mitigate.” While many people may have the critical reasoning skills to use generative AI as a useful tool despite these issues, for people with mental health challenges, it’s easy to imagine the dangers.
But even before the rise of ChatGPT, there were efforts to use AI and robotics to provide companionship to the elderly.
“…the industry has been on the upside in the use of AI-equipped elderly care robots. Senior citizens may have difficulty keeping themselves busy and active, and companionships can encourage many seniors to participate in daily activities.”
There’s no way to close Pandora’s box; generative AI is here. All we can do is try to build in as many safeguards as possible,
“Mustafa Suleyman, Inflection’s chief executive, said his start-up, which is structured as a public benefit corporation, aims to build honest and trustworthy A.I. As a result, Pi must express uncertainty and “know what it does not know,” he said. “It shouldn’t try to pretend that it’s human or pretend that it is anything that it isn’t….Pi was designed to tell users to get professional help if they expressed wanting to harm themselves or others. He also said Pi did not use any personally identifiable information to train the algorithm that drives Inflection’s technology. And he stressed the technology’s limitations.”
I’m not suggesting technology can or should be a substitute for human interaction. But just as ChatGPT can be a valuable tool for work and education, perhaps this is another way it can be valuable for some people under some circumstances. Even if my Facebook Messenger friend chat hadn’t had real people on the other side, it might still have helped me during a challenging and lonely time. And maybe that would have been good enough.