Hallucination is a real problem, because you can't tell when the AI is hallucinating and when it isn't. But there's a vast number of grade- and high-school-level things it won't hallucinate about. Sure, you can't trust it to tell you how many footballs long a hockey rink is, but you can ask it how to go about solving that question yourself, and it will answer, which is what you want the AI to be doing anyway instead of trying to solve the problem for you.
AI is exceptionally good at spouting facts that look real but are actually bullshit.