Hallucination (in AI)
When AI generates incorrect or fictional information but presents it as fact.
Help me explain to...
K–5th
Sometimes a computer makes things up. It sounds real, but it’s not true, like when your friend tells a silly story.
6–8th
AI can sound confident but still be wrong. A hallucination happens when it gives a wrong answer as if it were correct. We have to fact-check it.
9–12th
Hallucination in AI refers to confidently generated false outputs. It’s a critical issue in language models and requires careful validation and user skepticism.
Expeditions
K–5th
Play “Two Truths and a Lie” and ask students to guess the false one. Talk about how AI sometimes does this by accident.
6–8th
Show students an AI-generated text containing one or two made-up facts. Ask them to spot the errors.
9–12th
Assign a fact-checking activity: give students AI responses to check for accuracy, then discuss sources and reliability.