OPINION —

Most of us are abundantly familiar with virtual assistants such as Siri and Alexa. And now, chatbots like ChatGPT are gathering interest as they simulate human conversation to assist customers. Such tools ended the barroom bets in which drunk patrons challenged each other to answer a sports question or a bit of TV trivia. Now, people just check their smartphones for the answer.

At least in the former barroom arguments, humans had to think. Their curiosity was already dulled by television; now it is cast aside by simply asking a virtual assistant a question. Evidence: How many people believed weapons of mass destruction were in Iraq?

“You can keep your own doctor” — did anyone besides me research that one? Or, “The Mexicans will pay for our border wall.” All were lies that, strangely enough, were believed by millions of citizens.

“German magazine Die Aktuelle is getting lots of pushback for running what they billed to be ‘Michael Schumacher, the first interview,’” noted Sportico. “The tabloid used the AI-generated conversation platform character.ai to mimic the former world champion.”

Schumacher, 54, is a former Formula One champion who has not been seen in public since suffering a serious brain injury in a skiing accident in December 2013.

In a recent New York Times guest essay, Noam Chomsky was joined by fellow linguist Ian Roberts and by Jeffrey Watumull, director of a science and technology company, in addressing “The False Promise of ChatGPT.” They explain that these programs excite people with the prospect of mechanical minds surpassing human brains on several levels, including intellectual insight, artistic creativity and every other distinctively human faculty.

“To be useful, ChatGPT must be empowered to generate novel-looking output,” the authors wrote. “To be acceptable to most of its users, it must steer clear of morally objectionable content. But the programmers of ChatGPT and other machine learning marvels have struggled — and will continue to struggle — to achieve this kind of balance.”

Farhad Manjoo, a New York Times columnist, notes that people seem to start out skeptical of ChatGPT but soon take it more seriously as a tool for work and personal life. They come to see it as indispensable, like iPhones and the internet. Most acknowledge that ChatGPT has potential for both good and evil.

Manjoo has been using ChatGPT in different ways that he said help him in his job as an editorial writer. Although he noted that ChatGPT turns out clichés he should not be using, he said the program helps him when he is at a loss for an appropriate word or phrase.

I am skeptical not just of ChatGPT as a word-finder, but of Manjoo, a full-time journalist at a significant newspaper, needing one. I have seen “how to write an essay” guides that said, “Look at a thesaurus to get help.” What? I imagine that anyone hired by the NYT would have a large vocabulary and not need to use ChatGPT much.

Incredibly, even post-secondary instructors tell students to grab a thesaurus instead of locating the right word in their own minds. A student may look up the word “power,” get “influence” as a synonym and use it instead, though the two are not interchangeable: any U.S. president has power, while the Dalai Lama, as a Tibetan spiritual leader, has influence. ChatGPT will not be much better when used as a thesaurus.

“Take the problem of transitions — you’ve written two sections of an article and you’re struggling to write a paragraph taking the reader from one part to another,” Manjoo said in discussing how ChatGPT removes stumbling blocks. “You can plug both sections into ChatGPT. Its proposed transition probably won’t be great, but even bad ideas help in overcoming a block.”

I recall the adage, “You cannot write with a slide rule.” It meant that writing by the numbers, counting out breaks in long copy and the like, is silly. I also remember a sports writer, the best columnist in the state when I was a copyboy in the early 1980s. He would take short breaks of 10 to 15 minutes, walking around the building to gather ideas and break a dry spell. He did not rely on a computer to read his mind and provide good copy; there were no computers in the newsroom then, just word processors.

I keep thinking about how brains go underused when we stop relying on our own memory, even for something as small as winning a barroom bet. Chatting online and surfing the web are fine, within limits. Someday you may not have a virtual assistant at hand when you really need one. Just don’t use ChatGPT as a crutch. One day you may find yourself in a barroom debate and have to win it with your wits.

Greg Markley first moved to Lee County in 1996. He has master’s degrees in education and history. He taught politics as an adjunct in Georgia and Alabama. An award-winning writer in the Army and civilian life, he has contributed to The Observer for 12 years. gm.markley@charter.net