You Were Never Paid to Type Code

by Mauro Erta

I once heard a story about a man walking with the devil. They saw someone ahead of them pick something up from the ground.

That man just found a piece of Truth

the man said.

Doesn’t that worry you?

The devil shrugged.

No, he’ll make a religion out of it.

[2022]

It's just hype

When ChatGPT was first introduced in 2022 and people started experimenting with AI-assisted software development, many of us had the same reaction.

The code was buggy. Often unreliable. Sometimes confidently wrong in ways no human would ever be.

If you actually tried to use it for real work, skepticism wasn’t just understandable — it was rational.

The community quickly split.
Some believed software was simply too complex, too contextual, too brittle to ever be meaningfully automated. Others were convinced this was the beginning of something bigger, what felt like the end of software development as we know it.

I grew up as a deeply techy person. I watched the internet become what it is today, social networks turn into the most addictive products ever built, and smartphones collapse entire industries into a slab of glass in our pockets. I went from an indestructible black-and-white Nokia to a world where knowledge, services, and people were one click away.

Growing up in a world like that teaches you something uncomfortable:
change doesn’t ask for permission.

The earlier you recognize its direction, the faster you can adapt. And that’s the part we often get wrong at the beginning.

We judge these moments by asking whether the new thing is better than what we already do. But most technological shifts don’t start by outperforming experts — they start by quietly changing where decisions are made.

The question was never whether AI could write better code than us in 2022. The real question was whether it was already starting to move software development one layer up the abstraction stack.

At the time, it was easy to say “this is just hype.” And in many ways, that was true.

But denial doesn’t fail because it’s irrational. It fails because it focuses on the wrong signal.

[2023]

We're screwed

Forking VS Code and raising billions suddenly became normal. New tools appeared every week: Cursor, Windsurf, Copilot, Devin.

So I did what everyone does.

Let’s give it a try.

I downloaded Cursor and pointed it at a real project.

I have this bug in this codebase. Can you fix it?

It scanned the code.

The issue was caused by this line [...]. I fixed it and added a test.

Then it paused.

Can I help you with anything else?

I stared at the screen.

What the fk?

It was a simple bug, sure — but it was still a bug. One I would normally investigate, fix, test, and commit myself. It did all of that in seconds. Calmly. Correctly. And then asked for more work.

That’s the moment denial stops working.

Because once you see something like that, the fear isn’t abstract anymore. It’s not about “the future of AI” — it’s about your future. Right now.

I wasn’t the only one asking that question. The industry started answering in its own way. Layoffs became routine. Funding rounds for GPT wrappers became news. Junior developers started sounding optional. And words like “vibe coding” entered the conversation, casually flattening years of learning into a mood.

Am I a vibe coder now?
Do I want to be one?

How long before I’m not valuable anymore?

[2024]

It's just a tool

The first week with Cursor went by, then the second.

A month in, it had quietly become part of my routine.
Not in a dramatic way. More like a convenience.

I could ask questions about the codebase instead of digging through files. I could get unstuck faster. I could challenge an approach before committing to it. It felt less like outsourcing thinking and more like having a very patient rubber duck.

Around the same time, I found myself building more AI-heavy systems (what people call AI Engineering, whatever that means) and I saw how much work it takes to build an AI-enhanced product that is genuinely valuable to real people:

Agents, RAG pipelines, tool calling, workflows — new patterns were emerging fast. Powerful, yes, but also fragile. Easy to break. Easy to mislead. The kind of systems that work well until they don’t.
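That fragility shows up even in the smallest tool-calling loop. Here is a minimal sketch; the tool registry, function names, and JSON shape are hypothetical, not any specific framework's API:

```python
import json

# Hypothetical tool registry; names and shapes are illustrative only.
TOOLS = {
    "get_weather": lambda city: f"Sunny in {city}",
}

def run_tool_call(model_output: str) -> str:
    """Parse a model's JSON tool call and dispatch it.

    Each step is a point of failure: one malformed field in the
    model's output and the whole flow falls over unless defended.
    """
    try:
        call = json.loads(model_output)      # models sometimes emit invalid JSON
        tool = TOOLS[call["tool"]]           # ...or hallucinate tool names
        return tool(*call.get("args", []))   # ...or pass the wrong arguments
    except (json.JSONDecodeError, KeyError, TypeError) as err:
        return f"tool call failed: {err}"    # without this, the agent crashes

# A well-formed call works; a slightly wrong one does not.
ok = run_tool_call('{"tool": "get_weather", "args": ["Milan"]}')
bad = run_tool_call('{"tool": "get_wether", "args": ["Milan"]}')  # typo'd name
```

Easy to break, easy to mislead: every line of the happy path depends on the model behaving, which is exactly the point.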

And that was reassuring.

Because fragile tools don’t replace professions. They extend them.

At least, that’s what I told myself.

[2025]

It writes better than me

At some point, the pace changed.

It stopped being about new tools and started being about new baselines.

I realized I wasn’t really writing code anymore — or at least not in the way I used to. Most of the time, I was assisting: nudging a model when it hallucinated, correcting it when it got stuck, reviewing what it produced.

And what it produced was often... better than what I would have written.

Not more creative.
Not more inspired.
Just cleaner. More consistent. More disciplined.

Tests where I would have skipped them. Structure where I would have improvised. No fatigue. No shortcuts. No ego.

That’s the moment the discomfort set in.

Because if the thing I took pride in — being good at writing software — was now reproducible, tireless, and increasingly cheap, then the uncomfortable question wasn’t about productivity anymore.

It was about identity.

[]

The piece of Truth

If software engineering was ever about knowing a specific language or framework, about syntax, patterns, or knowledge in general, then yes, this job is gone. Forever:

Syntax can be generated.
Patterns can be inferred.
Tests can be written.
Refactors can be suggested.

The cost of generating code keeps falling, and it is already lower than the cost of hiring a human developer, often with higher quality.

If your value was typing on a keyboard, you're not valuable anymore.

So... was it ever about that?

I don't think so. I don't think it was ever about that. For years, writing code felt like the hard part because it was the bottleneck. It required patience, discipline, experience. It was visible effort. It looked like value.

But it was never the whole job. It was just the most obvious part of it.

The real value was always upstream.

It was in deciding what to build.
In choosing between imperfect trade-offs.
In recognizing when a feature solves the wrong problem.
In understanding constraints that aren’t written in the ticket.
In seeing second-order effects before they ship to production.
In owning the consequences when things break.

Code was the answer. The hard part was always the question.

Now that answers are cheap, the question is exposed, and that’s uncomfortable.

Because if your identity was “I’m good at implementing”, you feel replaceable. And in many contexts, you are.

But if your identity was “I’m good at deciding what should exist”, then nothing fundamental has been taken from you. The leverage has changed. The surface area has changed. The speed has changed.

The center of gravity has moved up a layer.

This is the piece of truth most people miss.
AI doesn’t eliminate engineering. It eliminates the illusion that implementation was the essence of engineering.

If you thought you were paid to type code, this era feels like erosion.
If you understand you were paid for judgment, taste, mental models, and responsibility, this era feels like exposure.

Exposure of what actually matters.

And maybe that’s the uncomfortable part: there’s less hiding now.
Fewer excuses.
Fewer technical bottlenecks to blame.

When building becomes easier, deciding becomes harder.

That’s where the value concentrates.
Maybe that’s more than just a piece of truth.

Or maybe it’s simply the most useful one — for now.
We’ll see what survives when the next model comes.

Drafted by an AI. Judged by a Human.