AI Opinion

The code factory: why Europe is in danger of missing the era of digital software production

Remy Gieling
March 2, 2026
8 min read
AI now writes the vast majority of the code at Big Tech β€” the software factory has been built and Europe hasn't figured it out yet.

Something has fundamentally changed in Silicon Valley. Not gradually, not as a slow shift that you see coming over years. No β€” abruptly, irreversibly, and at a speed that most European boardrooms don't even have on their radar. The computer programming factory has been built. And it's already running.

No one writes code by hand anymore

At the World Economic Forum in Davos, Anthropic CEO Dario Amodei said something most people have treated as a footnote but which actually marks a seismic shift: there are engineers within Anthropic who no longer write a single line of code. Not one. They use Claude Code as a backbone — the latest model, the latest tools — and instruct the system to build software. They think about system architecture, about the right instructions, about which sub-agents should use which skills. But the code itself? The machine writes it.

This isn't an experiment. This isn't a pilot. Anthropic has confirmed to Fortune that 70% to 90% of all code the company produces is now written by AI. Boris Cherny, the head of Claude Code, said earlier this month that he hasn't written any code himself in more than two months. And his approach is significant: he runs five to 15 parallel Claude Code sessions at the same time β€” five in his terminal, five to ten in the browser, plus sessions that he starts from his phone in the morning and picks up later. One user on X compared it aptly: it no longer feels like coding, it feels like playing StarCraft β€” you're not controlling code, you're commanding autonomous units.

At Spotify, co-CEO Gustav SΓΆderstrΓΆm said at the latest earnings call that the company's top developers haven't written a single line of code since December. They instruct AI via Slack on their phones on their way to the office, merge the results before they sit at their desks, and rolled out more than fifty new features using this method in 2025.

Let that sink in: the creators of the tools no longer write code themselves.

The factory can also be felt internally

At The Automation Group, we recognize this phenomenon firsthand. We have a team of forward-deployed engineers β€” technical specialists who build implementations for customers. Recently, someone internally asked the question: β€œWhen was the last time you wrote a piece of code yourself instead of instructing an AI system to do it?”

The room went quiet. Literally silent.

No one could remember the last time they had manually written a piece of software. Not weeks, but months ago. And our job isn't even to be full-time programmers — we build business solutions, integrations, automations. But even we no longer write software ourselves. Or rather: we wrote software. Now we instruct machines that do it for us.

That's the point where you realize that the shift isn't coming β€” it's already here.

The numbers behind the shift

Anyone who thinks this is limited to a few AI labs is wrong. The figures are sobering and come from the world's largest tech companies:

Microsoft CEO Satya Nadella revealed at Meta's LlamaCon that 20% to 30% of all code in Microsoft's repositories is now written by AI. Microsoft CTO Kevin Scott expects this to reach 95% by 2030. Meta's Mark Zuckerberg expects AI to do half of all development work within a year. And Google's Sundar Pichai confirmed that more than 30% of all new code at Google is AI-generated, contributing to a roughly 10% increase in engineering velocity.

Jensen Huang, CEO of NVIDIA, put it perhaps most visually at COMPUTEX 2025: β€œAI is now infrastructure, and this infrastructure, like the Internet, just like electricity, needs factories. These factories are essentially what we're building today.” And when talking to Citadel Securities, he estimated that the market for agentic AI as a workforce is trillions of dollars, with digital nurses, accountants, lawyers, and marketers complementing the workforce. At NVIDIA itself, there are already more AI agents working on cybersecurity than people.

Huang's prediction: β€œEvery company's IT department will be the HR department of AI agents.” That is not a metaphor. That is an operational model.

The labor market paradox

And this is where it gets painful. Because while these factories are up and running, more people in America are graduating in computer science than ever before — the number of bachelor's degrees doubled from 51,696 in 2013 to 112,720 in 2023. But the labor market has collapsed.

A landmark Stanford University study, led by economist Erik Brynjolfsson, analyzed millions of salary records and found a nearly 20% decline in employment for software developers between the ages of 22 and 25 since ChatGPT launched in late 2022. Inflows into AI-exposed occupations fell 13% compared to less exposed occupations such as nursing. Unemployment among computer science graduates reached 6.1% in 2025 — almost double that of philosophy graduates.

Jan Liphardt, professor of bioengineering at Stanford, summed it up: Stanford computer science graduates are struggling to find entry-level jobs at major tech companies. That's crazy.

And this is where the paradox lies. Why? Because the factories don't need operators — they need architects. AI can perform the structured, repetitive tasks that previously served as the training ground for junior developers. But to run the factory effectively, you need years of experience with unexpected problems, complex systems, and messy real-world scenarios. The Stanford study confirms this: for experienced developers, employment remained stable or even grew. It's the beginners who disappear.

The language of the machines does not appear to be that difficult. But knowing what the machines should say β€” that requires depth.

The return of an exhausted developer

Take the story of the founder of Open Claw — once a respected software developer who built a famous framework years ago, then fell out of love with software and did something else for years. When he returned, he didn't do it by learning to code again in the classical sense. He started vibe coding. Attempt after attempt, iteration after iteration — until, on attempt 44, he launched Open Claw: the first general-purpose, personalized AI agent you could install as a product. Open Claw landed like a bombshell. The product was fully vibe-coded. No handwritten architecture, no team of dozens of developers. One person with a vision and a machine that wrote the code.

OpenAI has now taken over the project. OpenAI's founder expects that 80% of all applications will eventually be created via vibe coding. And that's not just about hobby projects or prototypes. We're talking about production-grade software, SaaS applications, business-critical systems.

According to the 2025 Stack Overflow Developer Survey, 65% of all developers are now using AI coding tools every week. This is not the future. This is the present.

The analogy we need to understand

Think of the textile industry before the industrial revolution. Clothing was made by people behind looms β€” manual, artisanal, limited by the number of hands available. Then came the factory. Not a slightly better version of the loom, but a fundamentally different production system that could generate endless amounts of textile with a fraction of the human effort.

The same thing has now happened with software. Anthropic, OpenAI, Google and a handful of other players built the digital factory β€” a system that can produce endless amounts of computer code, controlled by a relatively small number of people who instruct and monitor the machines.

Satya Nadella himself compared it to the rise of electricity, noting that it took 50 years for most factories to learn how to use electricity to increase productivity. The difference: this time, it's much faster.

The paradigm shift that Europe hasn't experienced yet: these factories are already operational. They're already running.

The downside: the vampire effect

But let's be honest β€” it's not all glory. There is a serious downside that gets too little attention.

Steve Yegge, a veteran engineer with more than 30 years of experience at Amazon and Google, recently issued an urgent warning about what he calls the “vampire effect.” In a widely shared blog post, he described how he suddenly falls asleep after long vibe-coding sessions — in the middle of the day. His colleagues at his startup were seriously considering installing sleep pods in the office. His analysis is sharp: AI gets you excited, you work like mad, you produce enormously — and then it drains you dry.

β€œWith a 10x productivity boost, one engineer with Claude Code provides the value of nine additional engineers,” Yegge wrote. β€œBut building with AI takes an enormous amount of human energy.”

A study by METR confirms the picture. In a randomized trial with 16 experienced open-source developers, those who used AI tools were found to take 19% longer to complete tasks — even though they estimated that they had been 20% faster. The tools feel faster, but the reality is more complex. The cognitive burden of constantly switching between sessions, reviewing output, and adjusting agents is exhausting.

Yegge calls for engineers to limit their intensive coding sessions to a maximum of three hours a day. Companies that treat their people like factory workers — demanding 10x productivity eight hours a day — are driving them into burnout.

Boris Cherny's workflow illustrates this perfectly: yes, he is extremely productive with his fifteen parallel sessions. But he also throws away 10-20% of his started sessions because they lead nowhere. And he's probably the world's most experienced user of his own tool. For the average developer, the learning curve is flat but long, as prominent open-source developer Armin Ronacher aptly put it.

The factory produces without interruption. The person who operates it does not.

People in the factory

Because let's be honest: even in the most advanced physical factories — even those we visited in China — people are still walking around. Not to operate the machines, no, but because machines sometimes fail. Because you need people who understand how the machine works on a deep technical level, who know which bolt needs to be replaced, who have the fundamental knowledge of the software, the hardware, the system as a whole.

The same goes for these digital factories. People are still needed — but people with a fundamentally different profile. People who check the quality of the output. Who provide the correct instructions. Who do the error handling. Who update the models when necessary. Until, of course, the models do that themselves. They are system engineers of the highest level. Not operators, but architects and guardians of a system that runs itself.

Hiring managers now say it out loud: where they used to need ten engineers, they now need two experienced engineers and an LLM-based agent to be just as productive. Skill requirements in AI-exposed occupations are changing 66% faster than in other sectors, according to the 2025 JetBrains State of Developer Ecosystem Report.

The prediction that is no joke

It's no joke when Elon Musk says that his team of AI agents at xAI can compete with Microsoft. That those agents can replicate Excel, PowerPoint and Word in a heartbeat β€” but can also process company-specific information and provide advice on cybersecurity. It sounds like big talk. But the underlying logic is solid: if you have a factory that can endlessly produce software, building a productivity suite is nothing more than an instruction.

And the models have now reached a recursive milestone: they are now substantially helping to build better versions of themselves. OpenAI said about GPT-5.3 Codex that it's “our first model that was instrumental in creating itself.” The machines build the machines that build the machines.

The next abstraction layer

And we are only at the beginning. At the moment, these machines still write in languages like Python and Java — programming languages that were once designed so that people could read and write them. But as Musk recently predicted: soon, machines will simply build machines in machine code. Binary. That is many times more efficient than the detour via a human-friendly programming language. We will have interpretation agents to show us what the machines are doing, so that we can stay in control. But the abstraction layer — the distance between what the machine does and what we understand about it — is widening.

Jensen Huang said it perhaps best at London Tech Week: AI is the big equalizer. β€œVery few people know C++ or Python. But everyone knows 'human'.” The way you program a computer today is to ask nicely. And the amazing thing is that the way you program AI is now similar to how you control a human being.
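Huang's point — that plain language is now the programming interface — can be made concrete with a small sketch. The helper below wraps an English task description in an Anthropic-style Messages API payload; the model name and prompt wording are illustrative assumptions, not details from the article, and the network call itself is shown only as a comment.

```python
def build_code_request(task: str, language: str = "Python") -> dict:
    """Wrap a plain-language task description in an Anthropic-style
    Messages API payload. The 'program' here is the English sentence."""
    return {
        "model": "claude-sonnet-4-5",  # illustrative model name
        "max_tokens": 1024,
        "messages": [
            {
                "role": "user",
                "content": f"Write a {language} function that {task}. "
                           "Return only the code.",
            }
        ],
    }

request = build_code_request("parses an ISO-8601 date string into a datetime")
print(request["messages"][0]["content"])

# With the anthropic SDK installed and an API key configured, the call
# would look roughly like:
#   client = anthropic.Anthropic()
#   response = client.messages.create(**request)
```

The point of the sketch is the asymmetry Huang describes: the only part a human writes is the sentence inside `content` — everything below that line is the machine's job.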

The language of the machines has always been the exclusive domain of a handful of specialists. That monopoly has been broken. AI models speak German, French, English, Mandarin, and Greek with ease. It turns out they speak Python, JavaScript, and C# just as easily. The language of machines was more difficult than human language — but not for machines.

The impact on SaaS and business software

The implications for the software industry are enormous. If 80% of all applications can be built via vibe coding, what does that mean for the thousands of SaaS companies that exist because building software is difficult and expensive?

The answer is simple: their moat evaporates. If an entrepreneur with a clear idea and an AI agent can build a working application in a week and a half β€” like Boris Cherny's team did with Claude Cowork β€” then the value of a SaaS product is no longer the code, but the network, data, and trust.

Seed-stage startups already had 21% fewer employees in the first half of 2025 than in 2020, according to Carta. These founders are using AI to achieve more with small teams than was previously possible with dozens of people. That is not a trend. It is a structural redefinition of what it means to be a software company.

The urgency for Europe

This is the time when I want to address my readers on AI.nl: don't be complacent. Don't sit back with the idea that it's not that far yet, or that it's going to run wild, or that the way we've been building software for decades is also how we'll do it in the coming years.

The heartbeat of Silicon Valley right now is that these factories were not just built, but are being scaled up at a breakneck pace. Infrastructure is being built on a massive scale in India — an entire AI city, in cooperation with America, in which tech giants alone are investing $650 billion. Google has increased its capital expenditures to $85 billion. Software production capacity is being increased exponentially.

Meanwhile, in Europe, we have to do two things at the same time. First, empower people within organizations to automate their own work with the tools that are currently available β€” local, practical, direct. Second, reassess our leadership. The old paradigm, where only a handful of people spoke the language of the machines and were therefore the gatekeepers of digital innovation, is over.

Sundar Pichai told his own people at Google: “In this AI moment, we need to achieve more by taking advantage of this transition to achieve higher productivity.” And at the same time, he expects Google's engineering team to grow — not shrink — because AI makes engineers more effective, not redundant. That nuance is crucial. The job is changing; the need for people isn't going away. But the people that are needed are fundamentally different.

Salesforce's Marc Benioff made perhaps the most confrontational statement at Davos: the current generation of CEOs is the last to manage a fully human workforce.

The real question

The companies that use these digital factories will mercilessly surpass incumbents that still program manually — with incremental software updates, limited by the number of programmers who sit behind a desk from Monday to Friday, nine to five. Not in years. Now.

The limitation is no longer computer access. The limitation is no longer finding programmers. The old limitation β€” that only a handful of people spoke the language of the machines β€” has evaporated.

The new limitation is different knowledge: the ability to understand complex systems, formulate appropriate instructions, perform quality control over output, and diagnose errors when the machine crashes. And β€” perhaps most importantly β€” the human ability to do this without burning out.

The factory is already up and running. The real question is not whether this will happen. The question is whether we jump on the train β€” or watch the competition leave from the platform.

