Chicago Sun-Times (Sunday)

ChatGPT robs students of motivation to think and write for themselves

- BY NAOMI S. BARON Naomi S. Baron is professor emerita of linguistics at American University. This article originally appeared on theconversation.com.

When the company OpenAI launched its new artificial intelligence program, ChatGPT, in late 2022, educators began to worry. ChatGPT could generate text that seemed like a human wrote it. How could teachers detect whether students were using language generated by an AI chatbot to cheat on a writing assignment?

As a linguist who studies the effects of technology on how people read, write and think, I believe there are other, equally pressing concerns. These include whether AI, more generally, threatens student writing skills, the value of writing as a process, and the importance of seeing writing as a vehicle for thinking.

As part of the research for my new book on the effects of artificial intelligence on human writing, I surveyed young adults in the U.S. and Europe about issues related to those effects. They reported a litany of concerns about how AI tools can undermine what they do as writers. However, as I note in my book, these concerns have been a long time in the making.

Spellcheck and sophisticated grammar and style programs like Grammarly and Microsoft Editor are among the most widely known AI-driven editing tools that correct spelling and punctuation, identify grammar issues and offer alternative wording.

AI text-generation developments have included autocomplete for online searches and predictive texting. Enter “Was Rome” into a Google search and you’re given a list of choices like “Was Rome built in a day.” Type “ple” into a text message and you’re offered “please” and “plenty.” These tools inject themselves into our writing endeavors without being invited.

Young adults in my surveys appreciated AI assistance with spelling and word completion, but they also spoke of negative effects. One survey participant said, “At some point, if you depend on a predictive text [program], you’re going to lose your spelling abilities.” Another observed that “Spellcheck and AI software … can … be used by people who want to take an easier way out.”

Personal expression diminished

AI tools can also affect a person’s writing voice. One person in my survey said that with predictive texting, “[I] don’t feel I wrote it.”

A high school student in Britain echoed the same concern when describing Grammarly: “Grammarly can remove students’ artistic voice. … Rather than using their own unique style when writing, Grammarly can strip that away from students by suggesting severe changes to their work.”

In a similar vein, Evan Selinger, a philosopher, worried that predictive texting reduces the power of writing as a form of mental activity and personal expression.

“[B]y encouraging us not to think too deeply about our words, predictive technology may subtly change how we interact with each other,” Selinger wrote.

In literate societies, writing has long been recognized as a way to help people think. Many people have quoted author Flannery O’Connor’s comment that “I write because I don’t know what I think until I read what I say.” Other accomplished writers, from William Faulkner to Joan Didion, have also voiced this sentiment.

One eerie consequence of using programs like ChatGPT to generate language is that the text is grammatically perfect. It turns out that a lack of errors is a sign that AI, not a human, probably wrote the words, since even accomplished writers and editors make mistakes.

Figuring it out in schools

Ideally, when students undertake school writing assignments, there is ongoing dialogue between teacher and student. But this practice often doesn’t happen.

Conscientious students sometimes undertake aspects of the editing process themselves — as professional authors typically do. But the temptation to lean on editing and text generation tools makes it all too easy for people to substitute ready-made technology results for opportunities to think and learn.

Educators are brainstorming how to make good use of AI writing technology. Some point up AI’s potential to kick-start thinking or to collaborate. Before the appearance of ChatGPT, an earlier version of the same underlying program, GPT-3, was licensed by commercial ventures such as Sudowrite. Users can enter a phrase or sentence and then ask the software to fill in more words, potentially stimulating the human writer’s creative juices.

Yet there’s a slippery slope between collaboration and encroachment.

Students are less likely than seasoned writers to recognize where to draw the line between a writing assist and letting an AI text generator take over their content and style.

As the technology becomes more powerful and pervasive, I expect schools will strive to teach students about generative AI’s pros and cons.

Writing as a human process

I asked ChatGPT whether it was a threat to humans’ motivation to write. The bot’s response:

“There will always be a demand for creative, original content that requires the unique perspective and insight of a human writer.”

It continued: “[W]riting serves many purposes beyond just the creation of content, such as self-expression, communication, and personal growth, which can continue to motivate people to write even if certain types of writing can be automated.”

I was heartened to find the program seemingly acknowledged its own limitations.

My hope is that educators and students will as well.

PETER MORGAN/AP A ChatGPT prompt is shown on a device near a public school in Brooklyn, New York.
