What it is

A transformer is the architecture (the underlying design) that makes modern AI language models work. Before transformers came along in 2017, AI was pretty rubbish at understanding context in language. Transformers introduced a trick called "attention" that lets the model look at all the words in a sentence at once and figure out which ones relate to each other. It's the reason ChatGPT can write a coherent paragraph instead of word salad.
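To make "attention" slightly more concrete, here's a toy sketch in Python. It isn't a real transformer; the words and scores are invented for illustration (real models learn these scores from data), but it shows the core idea: every word gets a relatedness score against the others, and those scores decide how much weight each word carries.

```python
import math

def attention_weights(scores):
    """Turn raw relatedness scores into weights that sum to 1 (a softmax)."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores for how strongly the word "it" relates to each word
# in "the cat sat because it was tired" (numbers made up for this example).
words = ["the", "cat", "sat", "because", "it", "was", "tired"]
scores = [0.1, 2.0, 0.3, 0.1, 0.5, 0.2, 1.5]

weights = attention_weights(scores)

# The biggest weight lands on "cat" -- the model has, in effect, worked
# out that "it" most likely refers to the cat.
print(max(zip(weights, words)))
```

In a real transformer this scoring happens for every word against every other word, simultaneously, across many layers, which is what lets the model track context across a whole passage rather than one word at a time.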

Why it matters for your job

You'll never need to build a transformer. But knowing that this architecture is the foundation underneath basically every major AI tool helps you understand why everything changed so fast. It wasn't a gradual improvement. One architectural breakthrough unlocked LLMs, image generators, code assistants... the lot. And we're still in the early innings of what transformers make possible.

What to do about it

When someone at work dismisses AI as "just a chatbot," you'll know that transformers made these tools fundamentally different from anything that came before. That understanding helps you spot which AI capabilities are real and which are hype. It also makes you sound like you know what you're talking about in meetings, which never hurts.

This glossary is part of the full guide, along with role-specific playbooks and redundancy rights cheat sheets. See what’s inside