AI Transcription Tool Used in Hospitals Is Fabricating Text! 😲

Summary:

  1. Whisper’s Promised Capabilities
    OpenAI’s Whisper, an AI-powered transcription tool, has been marketed as approaching “human level robustness and accuracy” (a minimal usage sketch follows this list).

  2. Major Flaw Identified
    However, researchers have found that Whisper frequently invents text, producing fabricated phrases or even entire sentences, a phenomenon known as hallucination.

  3. Nature of Hallucinations
    These fabrications can include problematic content, such as racial commentary, violent rhetoric, and invented medical treatments.

  4. Extent of the Issue
    The scale of the hallucination problem is concerning: one researcher reported errors in 80% of the transcriptions examined, while another found hallucinations in roughly half of the transcripts analyzed.

  5. Impact on Transcription Reliability
    Hallucinations persisted even in well-recorded audio: one study found 187 hallucinations across 13,000 clear audio snippets. At that rate, millions of recordings would yield tens of thousands of faulty transcriptions (see the rough arithmetic after this list).
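
For context, here is a minimal sketch of how the open-source Whisper package is typically invoked (assuming the pip-installable openai-whisper release; the audio file name is hypothetical):

```python
# Minimal sketch of transcribing audio with the open-source Whisper package.
# Requires: pip install openai-whisper, plus ffmpeg on the system path.
# "visit_recording.wav" is a hypothetical file name, used for illustration.
import whisper

model = whisper.load_model("base")                # smallest general-purpose model
result = model.transcribe("visit_recording.wav")  # returns a dict with "text"
print(result["text"])                             # transcript may contain
                                                  # hallucinated text, per the
                                                  # findings summarized above
```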

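As a rough check on that extrapolation, the arithmetic below applies the study's observed hallucination rate to a hypothetical volume of recordings (the 187 and 13,000 figures are from the study cited above; the 5 million recording count is purely illustrative, not a reported number):

```python
# Back-of-the-envelope extrapolation of the hallucination figures above.
hallucinations_found = 187       # hallucinations observed in the study
snippets_examined = 13_000       # clear audio snippets analyzed

rate = hallucinations_found / snippets_examined   # ~1.4% per snippet
print(f"Observed hallucination rate: {rate:.2%}")

hypothetical_recordings = 5_000_000  # assumed volume; purely illustrative
expected_faulty = rate * hypothetical_recordings
print(f"Expected faulty transcriptions at that rate: ~{expected_faulty:,.0f}")
# -> roughly 72,000 at 5 million recordings, i.e. "tens of thousands"
```
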
Read more at: AP News