
When Your Coworker Automates Your Job (And Doesn't Know What to Do)

There was a post on Reddit that stuck with me for weeks. A developer had built an automation — partly using AI, partly just smart scripting — that replaced the manual work of someone on their team. A colleague who spent their days doing data entry, reconciliation, and report formatting. The automation did it in minutes. The colleague was made redundant within two months.

The developer who posted it wasn't celebrating. They were gutted. They'd built the thing to make the team more efficient. They hadn't thought through the second-order consequence, which was that "more efficient" meant "needs fewer people." They were asking Reddit what they should have done differently.

The comments were a mess. Some people said the developer was naive. Some said it wasn't their fault, it was management's decision. Some said the automated colleague should have seen it coming and upskilled. All of which managed to be partially right and entirely unhelpful at the same time.

I've thought about this a lot because it touches on something we don't talk about enough. AI isn't coming for your job — your company is. But sometimes the person who hands your company the tool is sitting at the next desk.

The automator's perspective

Let's start here because it's the less discussed side.

If you're someone who builds things — a developer, a data scientist, someone who's good with tools and systems — automation is what you do. It's literally your job to find inefficiencies and fix them. When you see a colleague spending four hours a day copying data from one spreadsheet to another, your instinct is to fix it. That instinct is the reason you were hired.

The problem is that "fixing" a process and "replacing" a person can be the same action with very different consequences. And most people who build automations are thinking about the process, not the person.

I know this because I was this person. As a data scientist, I built tools and pipelines that automated things humans used to do. I told myself it was making the team more productive. Which it was. What I didn't always think about was what "more productive with fewer people" looks like from the perspective of the fewer people.

This isn't an argument against automation. It's an argument for thinking about it more carefully. If you're building something that could make someone's role redundant, you have a choice about how you handle it. You can build it quietly and present it to management as a fait accompli. Or you can have a conversation with the person whose work it affects, ideally before management sees the demo.

That conversation is uncomfortable. But it's less uncomfortable than finding out your colleague lost their job because of something you built and you never even warned them.

The automated person's perspective

Now the harder side.

Finding out that a colleague has built something that does your job is a specific kind of humiliation. It's not like being replaced by a new hire, where at least there's another human who presumably needed a job too. It's being replaced by a script. By something someone built in their spare time. The implication, whether intended or not, is that your work wasn't complex enough to resist automation.

That implication is often wrong, by the way. The automation probably doesn't handle the edge cases, the exceptions, the judgment calls, the relationships. But that's cold comfort when you're sitting in an HR meeting being told your role has been "restructured."


If you're the person whose work has been automated, the emotional response is going to be a cocktail of anger, betrayal, shame, and panic. All of which are entirely reasonable. You trusted the person who built this. You worked alongside them. And now their side project is the reason you're looking for a new job.

Here's what I want to say to you, directly: this is not about your worth. The fact that a task can be automated says nothing about the person who was doing it. Humans were doing long division for centuries before calculators existed, and nobody thinks mathematicians are worthless because we have calculators now. The task changed. That's all.

But I also want to be honest: feeling philosophical about it takes time. Right now, if this is happening to you, you're probably just angry. That's fine. Be angry. Then, when the anger has had a chance to settle, start thinking about what comes next.

The management problem

Here's where the real responsibility lies, and it's neither with the automator nor the automated.

When a team member builds an automation that could eliminate a role, management has choices. They can:

  1. Use the automation to make the team faster while keeping everyone employed, with the affected person moving to different work.
  2. Use the automation as an opportunity to cut headcount and save salary costs.
  3. Ignore the automation entirely because organisational change is hard and they don't want to deal with it.

Option one is the ethical choice and the one that preserves institutional knowledge. Option two is the financially expedient choice and the one that boards and CFOs tend to prefer. Option three is more common than you'd think.

The point is: the decision about what happens to your job is a management decision. The person who built the automation gave management a tool. What management does with that tool is on management.

This is why the Reddit debate about whether the developer was "at fault" missed the point entirely. The developer was doing their job. Management made the call. Blaming the developer is like blaming the person who invented the self-checkout for supermarket job losses. The technology enabled the decision, but a human made it.

What this means for your workplace

This dynamic is going to become more common, not less. As AI tools become easier to use, it won't just be developers automating things. Marketing people will use AI to automate design tasks. Finance people will use AI to automate reporting. Managers will use AI to automate the work of their direct reports without fully realising what they're doing.

This creates a strange new workplace tension. Being good with AI tools makes you valuable. But being good with AI tools also potentially makes you a threat to your colleagues. How do you navigate that?

If you're the person building automations:

Be thoughtful about who you show them to and how. Before you demo to your boss, think about who's affected. Have a conversation with them. Frame the automation as something that could change their role, not eliminate it. "I've built something that handles the data entry part — maybe that frees you up for the analysis work?" is very different from "look, I automated Sarah's job."

Advocate for your colleagues. If you know management might use your automation to cut a role, say something. "I built this to help the team, not to reduce it" is a statement worth making, even if management ultimately decides otherwise.

Document what the automation doesn't do. Every automation has limits. Be explicit about them. "This handles 80% of cases but the remaining 20% require human judgment" is an argument for keeping the human.
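That "document the limits" advice can even be built into the automation itself. Here's a minimal, hypothetical sketch (every function and field name is invented for illustration, not taken from any real system) of a reconciliation script that refuses to guess on edge cases and queues them for a human instead of silently mishandling them:

```python
# Hypothetical sketch: an automation that handles the routine cases
# and explicitly routes everything else to a human review queue.

def reconcile(record):
    """Attempt to reconcile one record automatically.

    Returns (result, handled). handled is False when the record
    needs human judgment - missing data, unusual currency, etc.
    """
    if record.get("amount") is None or record.get("currency") != "USD":
        return None, False  # edge case: don't guess, escalate
    return record["amount"], True

def run_batch(records):
    """Split a batch into automated results and a human review queue."""
    automated, for_human = [], []
    for record in records:
        result, handled = reconcile(record)
        if handled:
            automated.append(result)
        else:
            for_human.append(record)  # preserved intact for a reviewer
    return automated, for_human

automated, for_human = run_batch([
    {"amount": 10.50, "currency": "USD"},   # routine -> automated
    {"amount": None, "currency": "USD"},    # missing data -> human
    {"amount": 5.00, "currency": "EUR"},    # unusual currency -> human
])
```

A design like this makes the 80/20 split visible in the output itself: the size of the review queue is a standing, measurable argument for keeping the human in the loop.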

If you're the person whose work is being automated:

Get involved in the automation if you can. Seriously. The person who understands the work and the tool is more valuable than the person who only understands one. Ask to be part of the implementation. Ask to be the person who handles the exceptions. Position yourself as the quality layer, not the manual layer.

Start expanding your role before you're forced to. If the repetitive part of your job is being automated, what's left? What could you be doing that you're not? Client relationships? Process improvement? Training others? The best time to expand your role is before someone builds a script that handles the narrow version of it.

Don't wait for the redundancy meeting. If you can see automation heading for your work, start planning financially and professionally now. Not because you're definitely going to lose your job, but because having a plan removes the paralysis.

The conversation we're not having

The deeper issue here is that most workplaces have no framework for what happens when internal AI automation affects jobs. There's no policy. No process. No ethical guidelines. It's all ad hoc.

When efficiency means layoffs, someone needs to be asking: what's our responsibility to the people whose roles change? How do we handle redeployment? What support do we provide? How do we make sure that the person who builds the automation and the person affected by it aren't left to sort it out between themselves in an awkward conversation by the coffee machine?

These are management questions. They're leadership questions. And most companies are completely winging it.

Until that changes, all you can do is look after yourself. Be aware of the automations happening around you. Be curious about the tools rather than frightened of them. And if you're the person building, remember that behind every process you're optimising, there's a person who gets up every morning, commutes to work, and does that process to pay their rent.

That's not a reason to stop building. It's a reason to build carefully.

The one thing to do today: Look at your own role. Which parts are most vulnerable to automation by a colleague with the right tools? Not to panic about it — but to start thinking about how you'd expand into the parts that aren't.

Get the 30-Day Checklist — $7

Instant download. 30-day money-back guarantee.

Includes 7 role-specific playbooks, AI glossary, and redundancy rights cheat sheets for US & UK.
