What it is
A hallucination is when an AI confidently makes something up. It'll cite a study that doesn't exist, invent a legal precedent, or give you statistics pulled from thin air. It doesn't know it's wrong, and it's not trying to deceive you. It's just predicting what plausible-sounding text looks like, and sometimes plausible-sounding text happens to be completely wrong.
Why it matters for your job
This is actually good news for humans. Hallucinations mean someone still needs to check the work. If you're the person who can spot when the AI has made something up, you're the quality-control layer that prevents your company from publishing nonsense, sending incorrect information to clients, or making decisions based on fictional data. That's a very secure position to be in.
What to do about it
Never submit AI-generated work without verifying the claims, especially numbers, citations, and anything that sounds a bit too perfect. Build a reputation as the person who catches the mistakes. It's the easiest way to make yourself essential in an AI-augmented workplace.