AI seems to be everywhere in the news these days. It’s taking jobs, it threatens to (someday, maybe) take over the world, etc. More to the point for this post, it might drive off real writers by replacing them with machine-generated content, letting businesses avoid the expense of hiring an actual person.
But we’ve started to see some serious pushback against this AI onslaught. People have noticed that the AI emperor has no clothes, and they are objecting both to the content they’re seeing and to how it is being generated.
Some businesses were quick to latch on to AI-generated content, thinking they could save the money they had been paying to writers and other creatives. But the content was basically boring dreck, and some companies that went down that rabbit hole ended up paying a real creative person a lot of money to clean up the mess the AI-generated content had made.
What was the problem? Where did AI go wrong?
First of all, AI learns what it knows by how it is trained—it slurps up reams and reams of data from outside sources. Some of those sources have objected to this “appropriation” (maybe “theft” would be a better term) as infringement of copyrights and taking of intellectual property. A major lawsuit by the New York Times against Microsoft and OpenAI, alleging theft of the NYT’s intellectual property, has been allowed to proceed.
In addition, AI has frequently been found to “hallucinate”—inventing “facts” and then inventing sources for those non-facts. While doing research for a recent article, I had to call BS on my AI assistant for making up sources for what it asserted was factual. (It didn’t apologize for the error, but it did back off from the bogus source claims.)
A related problem is training data bias: if the material the AI gobbles up is inaccurate or skewed, those flaws will show up in whatever it spits out. Garbage in, garbage out.
And the content AI generates? Even if it is factually correct, it tends to be bland and rather lifeless. There is little if any originality. How can there be when it is basically repeating something that some real person somewhere created? It’s not by chance that it feels like you’ve seen it somewhere before; you probably have.
AI can define the term “point of view,” but the content it generates has no real point of view—unlike real people who know how to write effectively and can craft a message so that it has the proper slant and tone. AI repeats whatever it has “learned” but is incapable of interpretation or original thought. I’ve seen some visual artwork that AI has generated. It looked good, but since it came about by following a formula, it wasn’t really an original work of art—nothing to move you, make you think, or even make you angry or sad.
AI may be able to replace poor writing with mediocre content, but it cannot beat the skills and experience of good writers and other creative types. Caveat emptor!