AI Model Used By Hospitals Caught Making Up Details About Patients, Inventing Nonexistent Medications and Sexual Acts

Generative AI and medical documents. What could go wrong?

Health Scare

In a new investigation from The Associated Press, dozens of experts found that Whisper, an AI-powered transcription tool made by OpenAI, is plagued with frequent hallucinations and inaccuracies, with the AI model often inventing completely unrelated text.

What's even more concerning, though, is who's relying on the tech, according to the AP: despite OpenAI warning that its model shouldn't be used in "high-risk domains," over 30,000 medical workers and 40 health systems are using Nabla, a tool built on Whisper, to…
