by Sarah Firisen
I’m going to date myself in a significant way now: when I was in high school, we had to use books of trigonometric tables to look up sine and cosine values. I’m not so old that it wasn’t possible to get a calculator that could tell you the answer, but I’m assuming that the rationale at my school was that using one was cheating in some way and that we needed to understand how to actually look things up. I know that sounds quaint now. I also remember when I used an actual book as a dictionary to look up how to spell words. Yes, youth of today, there were actual books that were dictionaries, and you had to find your word in there, which could be challenging if you didn’t know how to spell the word to begin with.
These days, if you have turned in a paper without putting it through basic digital spellcheck, you deserve to fail the class. And most editing tools have at least some rudimentary grammar checking. In addition to those built-in tools, I use Grammarly and have encouraged my college-age daughters to use it. It used to be the case, at least when I was in school and college, that you lost marks for bad spelling and grammar. There is no good reason a piece of writing today shouldn’t have mostly correct spelling and basic grammar. But spelling and grammar checking doesn’t make you a good writer, and a scientific calculator doesn’t make me a better mathematician. They’re just tools. By the way, I also used to get marks deducted for my bad handwriting. Bad handwriting isn’t an issue for anyone over ten or so (or maybe younger these days) when almost all communication is electronic. So does bad handwriting matter? I can’t remember the last time I wrote anything longer than a greeting card. These days, it’s far more important to be computer literate than to be able to write in good cursive.
Which brings me to ChatGPT, a new AI chatbot created by OpenAI, an artificial intelligence research company.
ChatGPT can do anything from writing poetry and prose to generating computer code. The odds are that, unless you’ve been living under a rock for the last few weeks, you’ve read something about ChatGPT, even if it’s just people on social media raving about it. I’ve lost count of how many articles I’ve read about it that end with a paragraph saying something along the lines of, “Oh and by the way, paragraph 4 of this article was generated using ChatGPT”. This leads the reader to go back and reread it and marvel at how it is indistinguishable from the other human-written paragraphs.
I started playing with ChatGPT a couple of weeks ago, and it’s both astonishing and a cautionary tale. I used it to help me write a blog (not this one) that I was finding hard to start. It did a good job of creating 1000 words that made sense and were on topic. But it also generated entire sentences that were incorrect. I was loath to talk about ChatGPT with my daughters for fear that they’d try to use it to do all their papers. I finally mentioned it to one of them with the heavy caution that her college might consider it plagiarism (and I have no idea what would happen if two people asked ChatGPT to write about precisely the same topic with the same initial prompt.) Interestingly, after finals were over, my daughter told me that two of her friends had used ChatGPT to write a paper but that she hadn’t. My daughter’s paper had gotten a higher grade than her friends’ papers. This article supports the idea that ChatGPT is good, but not human good yet. The author used ChatGPT to create cover letters for job applications. Across the board, the recruiters said the letters were good, but they also said “the letters lacked personality and research about the companies.” They sounded stiff and formal.
Even with Grammarly, you shouldn’t blindly accept every recommendation it makes. It’s usually right, but every so often, it is really wrong and doesn’t understand the context of what you’re trying to write. It’s a tool that needs a thoughtful user to employ critical thinking to maximize its benefits. I also remember when I was a kid and my father created spreadsheets on pieces of paper with a pencil and eraser. No one, except perhaps the most dedicated Luddite, could think that Excel isn’t an improvement over this. It hasn’t made users lazy; instead, they’ve had to learn higher-level skills to use the application to its fullest. And while the user isn’t doing the bulk of the calculations, it is still possible to make errors in creating the formulas. So while an Excel user isn’t using a pencil and eraser and painstakingly doing every sum, just as with Grammarly or spellcheck, there has to be some critical thinking employed to ensure that the calculations are correct.
At least in the short to medium term, I think that an AI tool, such as ChatGPT, becomes just that, a tool. Just as a scientific calculator or an application such as Excel helps us to skip some of the more mundane steps and focus on higher-value activities, ChatGPT will become an aide, maybe a research assistant. An article in the New York Times discussing the big tech trends for 2023 posited,
“Imagine that you are writing a research paper and want to add some historical facts about World War II. You could share a 100-page document with the bot and ask it to sum up the highlights related to a certain aspect of the war. The bot will then read the document and generate a summary for you.
“If you want to enrich your writing with a historical fact, you won’t need to go and search the web and find it,” said Yoav Shoham, a professor emeritus at Stanford University who helps compile the AI Index, an annual report on the progress of artificial intelligence. “It’ll be right there with a click of a button.”
And yes, I realize that many people will say that understanding how to do research is part of the task of doing a research paper and that having a good grasp of grammar is part of being a good writer. But is it? Plenty of people are good at spelling and grammar and aren’t good writers. And once a task becomes automatable, does it matter anymore whether people can do it manually? Perhaps we do run the real risk of becoming so dependent on technology that we lose the ability to perform any lower-level tasks. If we’ve learned anything from The Walking Dead, it’s that the survivors are the people who have, or can quickly develop, basic skills. But, as a Western society, we’re already so far down this road. We’re already doing whatever research we’re doing on a computer, making heavy use of Google and other search tools. Most of us aren’t writing using a pen and paper. We’re already using spellcheck. Perhaps ChatGPT is just the next obvious evolution.
I’ve written before about the increasing automation of work. Despite a lot of hand-wringing about jobs disappearing, what seems to have mostly happened so far is that tedious, manual tasks are disappearing, moving humans into higher-level, more interesting work. There’s no doubt that some kinds of jobs are disappearing. But technology advancements are creating new, hitherto unimagined, jobs.
While early automation mostly displaced blue-collar, manual jobs, technology is definitely now coming for more white-collar, knowledge work. But as Paul Krugman recently pointed out when discussing ChatGPT:
“It is possible that in some cases, A.I. and automation may be able to perform certain knowledge-based tasks more efficiently than humans, potentially reducing the need for some knowledge workers. This could include tasks such as data analysis, research and report writing. However, it is also worth noting that A.I. and automation may also create new job opportunities for knowledge workers, particularly in fields related to A.I. development and implementation.”
And as with so many recent articles about ChatGPT, he goes on to say that ChatGPT wrote that paragraph for him! Krugman acknowledges that these seismic shifts in the labor market are rarely painless; jobs are displaced, and workers need to gain new skills to participate in the newer economy. The education system probably has to evolve and adapt faster than it currently is. To some extent, the internet has already driven some educators to adapt how they’re teaching, and ChatGPT and its ilk will (or should) only accelerate these changes:
“Rather than listen to a lecture in class and then go home to research and write an essay, students listen to recorded lectures and do research at home, then write essays in class, with supervision, even collaboration with peers and teachers. This approach is called flipping the classroom.
In flipped classrooms, students wouldn’t use ChatGPT to conjure up a whole essay. Instead, they’d use it as a tool to generate critically examined building blocks of essays. It would be similar to how students in advanced math classes are allowed to use calculators to solve complex equations without replicating tedious, previously mastered steps.”
Like so much that has been true of advancements in the classroom, there is a real challenge in ensuring that all students can access and benefit from the new technologies. Automation and AI will be part of the new knowledge worker economy, and all students need to understand how to interact with it and utilize it appropriately. The above article ends with, “The way forward is not to just lament supplanted skills, as Plato did, but also to recognize that as more complex skills become essential, our society must equitably educate people to develop them.”
I didn’t use ChatGPT to write any of these paragraphs. But I could have, and I will likely have it help me in the future. And I think it will quickly become as accepted a writing tool as spellcheck, and that’s OK.