About a year ago, I wrote my first article on artificial intelligence, or AI. At the time I wrote “If I Only Had a Brain; The Future of AI,” the latest rendition of OpenAI’s ChatGPT platform had just burst onto the scene. This “new” technology was all the buzz in the tech industry. But I pointed out that, by comparison, it was nothing new. Grammarly, which had been around for a decade by then, is software that considers the context of what is written, thereby making it somewhat “intelligent.”
I also explained that AI was a tool, not a replacement for anything or anyone. I even showed my work, as it were, demonstrating how I used OpenAI’s program to write that particular post. It was a great help in getting my mental juices flowing, but it probably didn’t save me much time. There was still a lot of editing, correcting and rewriting to do. That’s because the devil is in the details, as it is with all creative works.
Anything worth reading needs a human touch. Otherwise, anything written by artificial intelligence will read like it was written by artificial intelligence: dull and contrived. No feelings, no cautionary tales drawn from the author’s own life; just facts. That is, unless those emotions and anecdotes are “borrowed” from other writers. It seems AI is not that artificial, nor is it intelligent.
Over the past year, several other AI platforms have been developed, while others simply piggyback on ChatGPT. Roughly a year after the software went public, The New York Times filed a lawsuit stating, “Millions of articles published by The Times were used to train automated chatbots that now compete with the news outlet as a source of reliable information.”
It seems there may not be anything original with today’s AI.
Several prominent authors, including John Grisham and George R. R. Martin, have also filed suits claiming copyright infringement. But it’s not just big-name artists who have a beef with the technology. A few months ago, I had lunch with a friend who, along with her husband, helps charities with administrative needs. They have maintained a blog for their clients for several years. On a lark, the husband decided to use an AI platform to write his next post. What he got back from the software was an article that included, word for word, a paragraph he had written in another post a few years earlier.
It seems much of this technology takes bits and pieces, or whole paragraphs, from here and there, leaving the user to believe what it spits out is an original work. And what it cannot find, it simply makes up.
In seeking a writing prompt, I recently asked a chatbot to write a paragraph or two on a particular topic and cite three sources. It did so in just a few seconds, but had I taken the bot’s word for it, I could have faced real embarrassment. One of the “sources” was Dr. Kyle Smith of Harvard. Harvard, impressive, right? I researched Dr. Smith, thinking there would be some great quotes for me to include in my article.
Only there was no Kyle Smith at Harvard, nor at Yale, Brown, Princeton or anywhere else in the Ivy League as far as I could tell. So I went back to the computer and asked, “Where is this Dr. Smith that you cited?” Only then did the interface admit that Smith was a construct invented to facilitate the narrative. So the automation was taking artistic license with a research source? Not good!
Visual AI is not immune to ridicule either. Gemini was an embarrassment to Google when it launched, delivering renderings of such things as black Vikings, Asian Nazis and other historical nonsense. As it turns out, the chief developer may have had certain biases against . . . truth.
Now, what was that old computer maxim I learned in the ’90s? Oh, yeah: “garbage in, garbage out.” That is to say, you only get out what was already put in, and that won’t necessarily be anything creatively authentic, or even accurate for that matter. Any end user who doesn’t proofread what they publish could be in for ridicule, lawsuits or, at minimum, an “F” on a term paper.
Still, we are perhaps decades away from a true form of sentient cognition on our desktops. Until then, we will have to do our due diligence in research and composition, as I intend to do. For now, what is called “AI” is nothing more than a glamorized search engine.