AI must be the first new class of technology to instantly lower expectations to rock bottom. The bland, lazy, and often just plain wrong content now infesting online culture is quite rightly called AI slop.
No, this isn’t really about the “slop factor” itself. What people aren’t yet doing properly is defining “slop” in value terms: it actively degrades the value of information to users. If you can’t be bothered listening to it or looking at it, it ceases to be information. It might as well not exist at all.
AI slop speaks in a monotone and delivers more or less the same thing, regardless of platform, subject, or author. It condenses information into a prefabricated, scripted style. You think you’ve seen it or read it before, and you have.
At the back end, it’s called “AI inbreeding”, and even that is a much-too-kind euphemism. The origins of this mindless mediocrity aren’t hard to find. In marketing, it’s a product of mediocre human culture, transposed clumsily onto AI operations.
That human culture tends to create bland, lifeless forms of commercial information. Lack of depth is aggravated by lack of talent and lack of perception. This was a pre-existing, human-made problem, made worse by an extremely undemanding culture.
A decade or so ago, I was routinely hired to fix this garbage. I had to rewrite it and turn it into something differentiated from the market standard. The market standard was already slop: inexcusably minimal, useless information that was supposed to sell products.
In some cases, I was literally rewriting from scratch. The information was neither interesting nor informative. As a sales pitch, it was nothing. Every subject had the same flat, uninteresting content and general lack of value built in.
Imagine my total lack of surprise to find AI doing the same thing. It’s pretty obvious that the LLMs are using the same styles to deliver the same utterly worthless content.
There’s a massive own goal built in here. One of the most basic facts in advertising and marketing is that 95% of information is instantly ignored. The critical ideas of a unique selling point, standout information, and above all high-value content are nowhere to be seen.
I’m not at all against AI. It will, eventually, be useful. I just have to turn off Copilot before I write a word. It simply can’t do what I need it to do.
Yes, that’s a problem because I write about a million words per year, manually or by dictation. I could use a trustworthy assistant, and AI isn’t anything like trustworthy for writing purposes, certainly not for creative writing. (I use Grammarly for proofing and nitpicking purposes, but it doesn’t write the content.)
In the broader sense of the word culture, AI is making waves for all the wrong reasons. The UNESCO report on how AI is transforming culture is alarming at best, however politely phrased.
This IS a monoculture. It can’t really be anything else, at least, not yet.
Now read the highly unsatisfactory definition of culture. It’s poles apart from a monoculture. Culture is extremely hard to define. A monoculture is all too easy to define.
That’s the problem. The true value of information is very much in how it’s expressed, and why it’s expressed in a particular way. The bottom line is that it either evokes insight and provokes interest, or it doesn’t.
“I wandered lonely as a clod trying to decipher the toxins in this food from the two-point font on the packet” may not be the most poetic thing you’ve ever read. On the other hand, you may feel some empathy with the other clods, if nothing else. It has no sales value at all, quite the opposite, but at least you’ll read it.
You are definitely NOT going to see “Reading the ingredients made me a better and more spiritual being”. Try fitting that into a style guide. Relevant to sales? No. Risk of keeping the reader awake and focused, yes.
This is where the toothless bear trap of AI culture fails predictably. The way information is expressed directly affects its content value, and for marketing purposes, its perceived value.
Political culture, contradiction in terms that it is, defines this. Everyone knows what political biases are. They’re infinitely predictable, instant on/off switches. They’re not even information as such, and there’s usually no room for facts or any substance at all. This is exactly what scripted AI is. Compulsory crud.
AI can write something like “the luxurious tenets of applied poverty as an inherent multi-generational lifestyle choice are many, rich, and varied.”
Now, try to care what that load of irretrievable garbage means. The expression is counterproductive, however it’s written. Never mind the inbuilt insults. It’s meaningless crap.
A human writer can’t really write that at all. Regardless of which culture the writer may come from, medieval to 21st-century American, the expression is so deeply flawed on so many levels that it’s not culturally viable.
So, are we enforcing AI slop at the expense of culture? Yes. It’s just that it’s done so badly that instant sales resistance is the inevitable result. AI may well create more jobs in damage control, just fixing the mistakes, than anything else.
Human culture has created this AI culture, and it’s nothing like good enough. The bar is set far too low, and the results are horrendous.
The world does not need another instantly disposable, inexcusably expensive, utterly useless monoculture.
______________________________________________________
Disclaimer
The opinions expressed in this Op-Ed are those of the author. They do not purport to reflect the opinions or views of the Digital Journal or its members.
