> The people working on this have got to realize this, but they’re doing it anyway.
This is the most horrific part of all of this, including the use of LLMs on everything, and it is industry-wide.
> They responded with “it happens but I would say it’s accurate 98% of the time.” They said that with a straight face. The number told me they don’t actually know the hallucination rate, and this is not the kind of work where you want to fuck it up any percent of the time. Hallucinations are incompatible with corporate finance.
Also incompatible with safety-critical systems, medical equipment, and space technology, where LLMs are completely off-limits and the mistakes are irreversible.