Sure, it’s not hallucinating in the sense that a human does. But that is the industry name, and it has been in use since 1995 in the study of neural networks.
https://en.wikipedia.org/wiki/Hallucination_(artificial_intelligence)
Terminology changes, and bad terminology can and should be changed.
Anthropomorphising software only helps the ones that are pushing it.
Yes, I am fully aware of the term’s origin, and it was clever at the time, but it is important to point out that it is not literally accurate, given how LLMs and other AI systems are currently being promoted. Calling their output hallucinations, saying they invent things, or using any other anthropomorphic language helps the companies selling these products deflect from the fact that what is being shoved down our throats is a set of unreliable bullshit generators.