No rhyme or reason to have a bot write poetry
Recently, a poster on Reddit admitted that he had set up an AI-powered chatbot similar to ChatGPT to help him out with his online dating. He taught the bot his dating preferences and set it to work scanning and choosing potential dates, and then the bot even chatted up the women he matched with, leaving him with nothing to do until the actual, in-person date. He posted on Reddit because he was worried (a bit belatedly) about the ethical and moral implications of his scheme — was he a bad person to enlist an artificial intelligence to break the ice and start conversations with human women?
The post was funny, and it reminded me of a short story by Kurt Vonnegut titled “EPICAC.” In this story, written in 1950, EPICAC is a fictional early electronic computer based on the real-life computer ENIAC. In the story, the unnamed narrator, who is EPICAC’s programmer, explains how he has fallen in love with his coworker, but he can’t get her to pay any attention to him. For her part, the coworker finds the narrator dull and lacking in poetic sensibilities. The narrator then inadvertently prompts EPICAC to write a lovely poem that causes the coworker to fall into a romantic swoon and agree to marry him.
When I taught this short story in my class recently, I asked my students to imagine that they were in a long-distance relationship and had discovered that their significant other had been sending them love letters written by an artificial intelligence. They were then to respond with a letter of their own. Most of the students were upset by the betrayal and decided to break up, and even those who stayed in the relationship were not terribly happy that the love letters were not really written by their partners.
All of these ideas have been rattling around in my head lately because one of my students is working on a major research paper about AI-generated poetry. He gave the computer a series of instructions about what kind of poem he wanted, making references to certain poets and themes, and then he presented the computer’s poem in my poetry writing class. The results were solidly sub-mediocre. There was nothing wrong with the poem. It rhymed and had a recognizable meter. The imagery was adequate if not interesting or inspiring. It was a safe, boring poem that did not take any chances or introduce anything new.
When the class critiqued the poem, we all noted how lifeless the words felt. The references to nature were the verbal equivalent of clip art; we understood that the poem was referring to trees, but we couldn’t really see them or smell them or do anything else with them that good poems written by talented poets let us do. No one loved the poem. The consensus among the students was that the computer poet made us feel vaguely uneasy in that uncanny way that food made with lots of artificial flavors does.
Many years ago I read about an athlete who was asked to explain step-by-step how he performed. He produced a fairly complicated list of movements — I bend my waist here, I swing my arm there, I step forward at this point, and so on — that added up to an algorithm of his athleticism. The sports scientists who asked him to do this then asked him to follow his instructions to the letter without adding any variations at all. He failed miserably and performed worse than the average amateur might. Humanity — and our poems and love letters — cannot be reduced to algorithms.