“Better” AI has more “hallucinations”

Imagine you are a doctor with a patient whose hallucinations keep increasing. Of course, you would try to stabilize the patient so the hallucinations stop multiplying, and you would be working furiously to identify their cause so you could reduce or eliminate them. One thing you almost certainly would NOT do is suggest that other people rely on your patient as a source of information.

And yet, that is precisely what developers of a “better” version of artificial intelligence (AI) are telling us to do. According to the linked article, “On one test, the hallucination rates of newer A.I. systems were as high as 79 percent.” Contemplate that the next time you ask AI for help in answering a question.

Executives in the industry now admit that AI hallucinations—that is, information provided by AI tools that is simply made up—will always be part of AI. The reason: AI tools use mathematical probabilities to construct responses. They cannot and will never have access to lived human experience and the judgment that results from that experience.

    As the linked piece explains:

    Another issue is that reasoning models are designed to spend time “thinking” through complex problems before settling on an answer. As they try to tackle a problem step by step, they run the risk of hallucinating at each step. The errors can compound as they spend more time thinking.

    As AI struggles with its mental health, the owners of AI technology continue to require more and more data centers that require more and more electricity. States are now competing to subsidize the building of data centers. According to this piece:

    What the “winners” of this competition receive is the opportunity to spend hundreds of millions or even billions of dollars subsidizing large corporations, while receiving few jobs or other public goods in exchange. Dressing up subsidies in the language of “competition” does not make them any less costly or any more effective.

    The article adds that “at least 10 states already lose more than $100 million per year on data center subsidies, with Texas’ costs climbing to $1 billion.”

The writer also describes how previous efforts to attract film and television production to various states became such a costly competition that many states abolished their film and television commissions. The states were no longer getting their money’s worth because too many states had entered the contest and kept upping the ante.

    I leave you with another AI development that would strain credulity if rational minds were actually contemplating it. The U.S. Food and Drug Administration has been meeting with one of the largest players in the AI field, OpenAI, to discuss how AI could be used “to speed up the drug approval process.” What could possibly go wrong?

    That’s a question worth asking again and again whenever you rely on AI for information. It turns out that on most topics you’ll need to have some expertise in order to sort through the mishmash that AI answers provide. Far from replacing experts, AI will actually require experts to sort through the mountains of unreliable information. Hey, maybe we should just skip AI and go back to consulting experts in the first place.

    Hallucinations of Don Quixote (2005) by Dany duquefer via Wikimedia Commons https://commons.wikimedia.org/wiki/File:ALUCINACIONES_DEL_QUIJOTE.jpg

    With Gorilla Gone Will There Be Hope For Man?

    Ishmael: Chapter 13

    By Tom Murphy, Do the Math

    Modernity’s inevitable failure need not be humanity’s ultimate failure. Modernity never could have worked in the long term, and represents only a small sliver of human existence.


    May 8, 2025
