The simple question that leaves ChatGPT stumped - and what to ask it

Not so super after all 😅
  • ChatGPT has developed and grown smarter since it launched.
  • However, it can still suffer from issues known as AI hallucinations - and the same is true for other chatbots.
  • And there is one simple question that can really bamboozle it.

Computers continue to get smarter by the minute, and this rapid progress is only going to carry on, according to the experts. Driven by the development of advanced AI and machine learning, our laptops, PCs, tablets and phones will continue to evolve by leaps and bounds.

But however intelligent artificial intelligence has become, that doesn’t stop it from occasionally making some rather hilarious mistakes. A big talking point at the World Congress on Innovation and Technology in Yerevan, Armenia this weekend was the problem of “hallucinations” - and no, I’m not talking about near-death experiences, it’s an issue with AI tools. Don’t worry, I’ll explain later.


One of the most humorous examples of “hallucinations” is the way asking ChatGPT a simple question about strawberries can completely stump it. For reasons to do with how these chatbots process text, it struggles with one particular query - and you can try it out for yourself.

What is the question that stumps ChatGPT?

Okay, so you’ve loaded up the ChatGPT app or opened it up on a web page and you are ready to go. The prompt you have to type into the AI chatbot is “how many Rs are in Strawberry”.
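If you would rather put the question to the model from code instead of the app, here is a minimal sketch using OpenAI’s official Python library - the model name below is just a placeholder, so swap in whichever one your account can access.

```python
# Minimal sketch: asking the strawberry question through OpenAI's Python library.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set in your environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name - use whichever model you have access to
    messages=[{"role": "user", "content": "How many Rs are in strawberry?"}],
)

# Print whatever the model claims - famously, there is a fair chance it will say "two".
print(response.choices[0].message.content)
```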

Person on a mobile phone. Photo: Adobe

A simple test, right? The answer is three, of course. But infamously (so much so that it was the subject of a panel segment during WCIT), ChatGPT will more than likely reply with the answer “two”. And if you try to tell it it’s wrong, it will continue to argue that strawberry does in fact only have two Rs in it.

I tried the question on both the ChatGPT app and the webpage on my phone, and it initially answered three both times. However, my fellow journalists tried it at the same moment and it told them “two”.


But when I asked a follow-up question, the AI tool seemed to get befuddled, started to claim “strawberry” had two Rs and became stubborn. I have added the screenshots from the full exchange below - but in summary, I asked it to prove that “strawberry” had three Rs and it proceeded to spell out the word before concluding that it only had two letter Rs.

The question that befuddles ChatGPT. Photo: ChatGPT screenshots

Eventually it admitted it was wrong - but it is quite the funny experience. Despite all the computing brain power behind ChatGPT, it can still be tripped up by such a simple question.

What is a hallucination?

You are probably familiar with the word already - but it can also be applied to AI. If you, a human, hallucinate, it means you are seeing or hearing things that aren’t there.

When it comes to artificial intelligence, the idea is along the same lines: it refers to the tool spitting out an incorrect or misleading answer. For example, claiming the word strawberry has two Rs instead of three.
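For comparison, a few lines of ordinary code have no trouble here, because they count the actual letters instead of predicting a plausible-sounding answer - a quick sketch in Python:

```python
# Count the letter R in "strawberry" directly - no hallucinations involved.
word = "strawberry"
r_count = word.lower().count("r")
print(f"'{word}' contains {r_count} letter Rs")  # prints: 'strawberry' contains 3 letter Rs
```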


Google explains on its website: “AI hallucinations are incorrect or misleading results that AI models generate. These errors can be caused by a variety of factors, including insufficient training data, incorrect assumptions made by the model, or biases in the data used to train the model. AI hallucinations can be a problem for AI systems that are used to make important decisions, such as medical diagnoses or financial trading.”

The issue of AI hallucinations also reared its head earlier in the summer, when Meta’s chatbot caused consternation after describing the assassination attempt on Donald Trump as “fictional”. Meta apologised and said it was the result of a hallucination. Oops.

I am at the World Congress on Innovation and Technology from 4 October to 7 October. You can find all my stories from the event here. You can get in touch with me by emailing: [email protected]
