This marks the latest instance of AI confabulations (also called “hallucinations”) potentially causing business damage. Confabulations are a kind of “creative gap-filling” in which AI models invent plausible-sounding but false information. Instead of admitting uncertainty, AI models often prioritize confident-sounding responses, even when that means manufacturing information from scratch.
It amazes me how many different terms we’ve assigned to “an LLM we call artificial ‘intelligence’ just made some shit up because it sounded like the most logical thing to say based on probability.”
“Lies.” It’s an old-fashioned term, but a beautiful term.
We love the word don’t we folks? Very popular word. We love it.
It’s funny how “AI bros” will argue that we shouldn’t use “lies” because it implies intelligence, while simultaneously treating their LLM as intelligent.
Officer, I didn’t commit tax fraud; it was simply a hallucination, you see.
You joke, but the accounting industry is working on LLM tools for auditing and tax.
Seems like it’d be easier to say the piece of shit doesn’t work.
tech journalists stop assigning agency to the probability-based slop generator [Challenge: IMPOSSIBLE]