OpenAI’s speech-to-text tool Whisper was found to have a hallucination problem

  • October 27, 2024

With the popularization of artificial intelligence, hallucination became one of the concepts that entered our lives. Language models, in particular, can present non-existent information as if it were real while answering questions; at times they can even add fabricated academic sources to a bibliography as if they were genuine.

The idea that artificial intelligence “makes things up” is a controversial topic, but it is at least technically understandable. Drawing faulty connections between different concepts, or lacking access to the right information, can cause problems in texts generated by artificial intelligence. On the other hand, it was unexpected for the problem to surface in an area such as transcription.

Hallucination problem in transcription

According to the Associated Press, OpenAI’s transcription tool Whisper adds a lot of irrelevant information as it converts conversations into text. This fabricated content causes problems in many different areas, from racist comments to invented medical treatments. Researchers warn that the situation could cause major problems, especially in hospitals and other medical settings.

A University of Michigan researcher who examined transcripts of public meetings found hallucinations in 8 out of 10 of the texts reviewed. A machine learning engineer who analyzed more than 100 hours of transcriptions stated that at least half of them contained hallucinations.

In a statement to The Guardian, OpenAI said it is working to increase the accuracy of its models and reduce hallucinations, and noted that its usage policies prohibit the use of Whisper “in certain high-stakes decision-making contexts.” The company also thanked the researchers for sharing their findings.

Source: Web Tekno
