At the end of last year, an AI-based language app from OpenAI entered the periphery of our consciousness and quickly filled it with text, human-like conversations and a whole lot of buzz. In essence, it is a natural-language processing tool driven by AI technology, and, apparently, it is very skilled at writing essays, stories, articles, blog posts and much more. All we need to do is feed it a text prompt, and off it goes writing its composition, in any of several languages. It even recommends a title/headline for the write-up.
At the time, The Economist had this to say in an article titled “ChatGPT is a marvel of multilingualism”:
“OpenAI’s natural-language system creates recipes, writes computer code and parodies literary styles. Its latest iteration can even describe photographs. It has been hailed as a technological breakthrough on a par with the printing press. But it has not taken long for huge flaws to emerge, too. It sometimes “hallucinates” non-facts that it pronounces with perfect confidence, insisting on those falsehoods when queried. It also fails basic logic tests.”
As content creators, naturally, we were curious and put the app to the test. We prompted it to write articles and stories for us on topics as varied as Online Education, Crime Fiction, Indian Women in Tech, Afrofantasy, Justice Denied to an Indian Woman, a Corrupt Prime Minister, Cookies and Ice Cream, and a Recipe for Chili Con Carne. Each time, the ChatGPT app came forth with its compositions without hesitation and true to the topic at hand.
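For readers curious to try this kind of prompting programmatically rather than through the chat interface, here is a minimal sketch using OpenAI's Python SDK. The model name, the sample topic and the environment-variable setup are our assumptions for illustration, not the exact details of the experiment described above.

```python
# Minimal sketch: prompting an OpenAI model for a short composition.
# Assumes the `openai` Python package (v1+) is installed and the
# OPENAI_API_KEY environment variable is set. The model name and the
# sample topic below are illustrative only.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user",
         "content": "Write a short article on Indian Women in Tech, "
                    "and suggest a headline for it."},
    ],
)

print(response.choices[0].message.content)
```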
ChatGPT could do with some imagination
However, as content creators with a creative bent of mind, we were somewhat disappointed. We felt the ChatGPT compositions were mechanically written, like the content we find in college textbooks, study guides, encyclopedias or Wikipedia. They were true to the task but lacked inspiration and creative variation. For instance, when we prompted ChatGPT, “What is ChatGPT?”, we received the following response:
“I am ChatGPT, a large language model created by OpenAI based on the GPT-3.5 architecture. I have been trained on a vast amount of data and am capable of generating human-like responses to a wide range of prompts and questions. I can converse on a variety of topics and provide informative and helpful answers to your inquiries.”
Informative? Yes. Helpful? Well, not quite; only at an elementary level, as if it were written by a school student. The creative component was missing. Factually correct, but lacking in imagination and inspiration. That’s probably because, as ChatGPT itself says, “I have been trained on a vast amount of data and am capable of generating human-like responses to a wide range of prompts and questions.”
Using the ‘regenerate response’ feature, we received a very similar response, with little variation from the first composition. For scientific or fact-based writing, this may be acceptable; but for stories or creative writing, it’s not much fun to read. That’s because, “As a language model, ChatGPT has been trained on vast amounts of text data and can generate coherent and contextually relevant responses to a wide range of queries.”
What we mean to say is that, for every composition, it gives us output that is customised, well-structured content drawn from all that it has gathered and learnt from previously published works available on the internet. The compositions read as if written mechanically, by a robot rather than by a trained (human) writer. This may be acceptable to some of us, but it’s not quality writing. Not yet, anyway. Maybe it will improve over the years and, sometime in the near future, we can depend on it for quality writing, rich in creativity and (human) imagination.
ChatGPT doesn’t create new ideas
Marvin Krislov, President of Pace University, aptly summarises this in a recent article in Forbes:
“As more people spent more time playing with ChatGPT, they began to realize that it’s good — but ultimately not that good. What people slowly came to see was that generative AI doesn’t really generate anything new; rather, by the very nature of its technology, it simply spits back out what it has taken in. Even the best AI, like ChatGPT, doesn’t create new ideas but instead repeats old ones. It doesn’t create new artwork but generates derivative ones.”
In other words, as the article goes on to say, “AIs simply give back to us what we’ve put into them.” [https://www.forbes.com/sites/marvinkrislov/2023/03/13/why-chatgpt-makes-me-hopeful-not-worried-for-the-future-of-college-and-careers/]
This means that, although ChatGPT “can be used for a variety of applications, such as chatbots, language translation, and text generation, [making] it almost irresistibly easy to produce essays and papers,” it will struggle to find a place where high-quality, academically merit-worthy content is appreciated and required. It will also struggle to take over tasks that require creativity, imagination, social and emotional skills, and human connection and intervention.
If such is the case, are we giving ChatGPT too much credit?