Experts Warn Against Using OpenAI’s Transcription Tool In Hospitals: Here’s Why

by rajtamil

OpenAI's Whisper, an AI-powered transcription tool, has drawn criticism from researchers and experts who warn that it is too flawed for use in healthcare and other sensitive sectors. Whisper has been adopted across a variety of fields, including hospitals, where it is used to transcribe medical consultations. But recent studies have drawn attention to the tool's tendency to hallucinate – a term for its habit of fabricating text that was never spoken, with output ranging from merely inaccurate to actively harmful. Experts fear what such hallucinations could do in high-stakes environments like healthcare, where even a small transcription error can distort a patient's record or lead to dangerous medical decisions.

Why Whisper's "Hallucinations" Are Problematic

"AI hallucinations" is a poorly defined term that describes the situations when a program makes up a word or sentence that doesn't exist in the source audio. In healthcare settings, this simple defect could have catastrophic consequences. One Michigan study discovered that nearly 80% of Whisper's transcriptions of public meetings contained made-up sentences or phrases. Alondra Nelson, former head of the White House Office of Science and Technology Policy, expressed concern about the use of such technologies in hospitals, “Nobody wants a misdiagnosis. There should be a higher bar.” She believes it's a severe risk to the wellness of patients to use an unreliable tool for that industry.

Whisper’s Impact on Healthcare

Despite OpenAI's own guidance warning against using Whisper in high-risk domains, the software is already widely used in healthcare settings to transcribe patient consultations. Given its inaccuracies, the technology could introduce serious miscommunication between patients and providers, ultimately affecting patient outcomes. OpenAI has acknowledged the problem and says it is actively working to reduce hallucinations, but experts argue that, with the tool already so widely adopted, a fix is urgent.

Calls for Accountability and Transparency

William Saunders, a former OpenAI research engineer who resigned in February over concerns about the company's direction, has called for greater transparency and accountability. "It's problematic if you put this out there and people are overconfident about what it can do and integrate it into all these other systems," Saunders says, underscoring the danger of unchecked reliance on Whisper.

Experts urge OpenAI to prioritise these concerns, insisting that improvements to Whisper's accuracy must come before the tool is trusted in critical areas such as healthcare.
