News

Insiders tell CNN the FDA’s AI is “hallucinating” studies and can’t access key documents. Agency leaders insist the AI is ...
There’s no question that AI needs governance and guardrails. For all the good it can bring, it also heightens risk.
But three of those FDA employees told CNN that Elsa just makes up nonexistent studies, something commonly referred to in AI ...
With reports that FDA’s AI Elsa is “confidently hallucinating” studies that don’t exist, the use of AI to streamline drug ...
Insiders at the Food and Drug Administration are ringing alarm bells over the agency's use of an AI to fast-track drug ...
The federal agency introduced Elsa last month, boasting about the AI tool's ability to increase efficiency at the FDA.
FDA officials say the assistant is flawed, even as the Trump administration stresses AI adoption in healthcare.
Despite ambitions to streamline regulatory review, FDA’s Elsa platform has been prone to hallucinations, prompting internal scrutiny and questions about AI reliability and governance.
Hallucinations are a known problem with generative AI models—and Elsa is no different, according to Jeremy Walsh, the head of ...
The FDA's generative AI, Elsa, has a massive hallucination problem, according to the agency's employees themselves.
Following FDA’s first AI-assisted medical product review, the agency has committed to expanding the use of generative AI across all centers by June 30, 2025.
FDA’s Broader AI Goals

The launch of Elsa is part of a broader “AI-forward” strategy. In January 2025, the agency issued a draft guidance on considerations for the use of AI to support ...