As a child, I sat with a palette of paint at my disposal and attempted to mix as many colors as I could manage. With eager eyes, I watched as vibrant colors combined on the page. As more and more began to meld into one, I stared at the resultant conglomeration, confused. The more shades I added to the mix, the less interesting it became. The identity of each was lost in a dull monotony.
“How can such a varied array of color result in something so bland?” I wondered.
Blend enough colors together and the result, many are surprised to learn, is typically little more than a muddy brown or gray. The reason is rooted in the way pigments absorb and reflect specific wavelengths of light. Paint mixes subtractively: in theory, combining the primary pigments should produce black, since together they absorb all incoming light.
In practice, though, real-world pigments are imperfect, so we usually end up with varied shades of brown instead. And when too many colors are melded together, each one becomes muted. Each individual contribution is less pronounced. They get lost in the noise.
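The mechanism can be sketched in a few lines of code. This is a toy model, not color science: each pigment is an RGB reflectance triple in [0, 1], and subtractive mixing is approximated by multiplying reflectances channel by channel (light absorbed by one pigment is gone for good). The specific reflectance values for the "imperfect" pigments are made up for illustration.

```python
# Toy model of pigment mixing. Each pigment is an (r, g, b) reflectance
# triple in [0, 1]. Paint mixes subtractively: light absorbed by one
# pigment can't be reflected by another, which we approximate by
# multiplying reflectances channel by channel.

def mix_pigments(pigments):
    """Multiply the per-channel reflectances of all pigments."""
    r, g, b = 1.0, 1.0, 1.0
    for pr, pg, pb in pigments:
        r *= pr
        g *= pg
        b *= pb
    return (r, g, b)

# Idealized subtractive primaries: cyan, magenta, yellow.
cyan    = (0.0, 1.0, 1.0)
magenta = (1.0, 0.0, 1.0)
yellow  = (1.0, 1.0, 0.0)

print(mix_pigments([cyan, magenta, yellow]))  # (0.0, 0.0, 0.0): black

# Real pigments are imperfect: each reflects some stray light in the
# channels it "should" absorb (values below are invented), so the same
# mix lands on a dark, muddy brown instead of true black.
cyan_real    = (0.3, 0.8, 0.9)
magenta_real = (0.9, 0.3, 0.7)
yellow_real  = (0.95, 0.85, 0.3)

print(mix_pigments([cyan_real, magenta_real, yellow_real]))
```

With the idealized primaries the product is exactly black; with the leaky ones, red survives a little better than green and blue, which is precisely the warm mud a child's palette converges to.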
The number of shifting variables at play in the world of AI is enormous. The models behind ChatGPT are built on hundreds of billions of parameters by most public estimates, essentially the knobs adjusted behind the scenes during training, and those parameters dictate the experience of every ChatGPT user. With each major update, more and more parameters have entered into play in an attempt to heighten the product's performance.
GPT-4 reportedly draws from a body of some 300 billion words in the text, advice, and essays it generates for users. Its data set encompasses thousands of authors, and it can draw from more knowledge than any of us will attain in a lifetime. And yet, there's a formulaic quality to what ChatGPT produces when prompted with nearly any writing assignment.
Regardless of the subject it's addressing, there's a stilted quality to AI writing that people are developing an eye for. Once you know what to expect, it can be almost a challenge to miss. Feed an alien a stack of YouTube videos and documentaries, then tell it to "act human!", and it won't be hard for most of us to spot the Martian in sheep's clothing.
Attempting to composite a million essays into one voice is almost bound to produce a certain blandness. Satire mixed with suspense, fiction, fantasy, poetry, and every conceivable style of prose won't culminate in a piece worth reading. Parody will cancel out emotion. The journalistic will dull the descriptive. And the descriptions that remain will be no more than copies of details others have crafted: downscaled images carried over from other writers and condensed into a compendium of data points and code.
So maybe it shouldn't come as a surprise that when large language models are prompted to write essays, there's an almost comical predictability to the tone and vocabulary at play. They open without soul and announce the subjects they will "dive" into. They type loftily about "weaving tapestries" and "exploring ideas." They wax a phony poetic about the topics they "delve into" and the "mosaics" they each represent.
ChatGPT might “juxtapose” concepts in ways that the untrained eye could believe are novel. But the models cannot invent new ways of expressing ideas. They’re restricted entirely to patterns they’ve seen in their training data. They choose safe, well-established routes of communication. They don’t carve new ones.
But they have their value. While ChatGPT can't yet craft a novel of its own design, it can speak meaningfully about a new one. It can converse about ideas outside of its arsenal. It can reflect on the contents of a memoir introduced to it for the very first time. It just can't draw from it in its future conversations. And if it tried to assimilate our language, our words would lose what makes them unique. They'd become a drop in a bucket of brown paint.
And yet, image creation software can craft concoctions of color that exude a certain beauty. Even if restricted to shades of gray, it can assemble pieces worth a second glance. It’s hard to deny that there’s a provocative sort of artistry to some of its creations.
But at its very best, it can still do no better than the vast reservoir of what's already been created. To simply call it drab wouldn't be fair, but to call it novel wouldn't be either. Its every stroke of artistry is owed to some anonymous other. To take credit for its work is as criminal as copying and pasting a PhD thesis.
The magic of generative AI largely falters under closer examination. While there's certainly cause for concern about what the future of the technology has in store, I'm not sure it will be quite the reckoning I feared a year ago. But it's hard to deny that a tidal wave of scams, deepfakes, and turmoil is already at our door. Only time will tell how our foundations will bear the blow.
There's no denying that AI can already replace large swaths of writers. For those of us who have little more to offer than listicles about "The 6 Biggest Mansions in LA" and "The 10 Celebrities With the Most Badly Botched Surgeries," our days as writers may be numbered. But those whose writing is centered around novel human thought will stand out from the noise.
We're more than a data point for AI to draw from. We have thoughts and experiences all our own. And until we grant AI access to our very minds, the memories we hold and the new stories we write will always carry a human quality that can't simply be mimicked. There will be a soul to our words that won't translate into a world of 1s and 0s.
To comprehend the definition of words isn’t to viscerally understand the feelings they elicit — the precise meaning they impart when placed together in certain patterns. Some feelings transcend vocabulary. Unless the AIs we create develop souls, there will be a feeling and flavor to our sentences that rise above the sum of their parts. They’ll mean more than a hollow series of definitions.
AI can't yet untangle the places where words soar to a level of artistry. It can't discern why a certain collection of words means more than its synonym counterparts. There's a human component to language that no collection of data points can impart.