The collapse of code cost: why AI is a structural accelerator, not magic
While everyone focuses on which roles will die because of AI, the real shift is the collapse of code production costs. Software is becoming a disposable commodity. This article explores the historical evolution of abstraction, the extinction of the super-specialist, and why injecting AI into a broken and highly coupled organization will only accelerate its failure.
So far, the conversation has focused on which roles are dying because of artificial intelligence. No one knows exactly how things will unfold, and much of what is written on the subject is noise. We can, however, make objective observations by using a historical analogy to separate the fluff from what actually makes sense.
If we treat artificial intelligence as something that slashes development costs, we can draw a parallel with what has already happened in the tech industry.
We moved from armies of developers writing assembly, to compiled languages, interpreted languages, virtual machines, libraries, and finally frameworks. The evolution of computer science has always been a compression of complexity: we created meta-languages and abstractions to hide the sheer volume of instructions handed to the machine.
A programmer using Java today is orders of magnitude more productive than an assembly developer from decades ago, thanks to these abstractions, open-source repositories, and the Cloud.
Natural Language as the New Syntax
The AI revolution seems different, and indeed it is. The difference between past simplifications and the current one is that, until yesterday, a developer's intervention was always required to put the pieces together and describe the logical flow.
Even with the most automated frameworks, the core business logic was always reserved for human beings. Today, machines can translate natural language into instructions and execute them. Countless vibe coding and assisted development tools have emerged. As anyone not living under a rock knows, it is no longer strictly necessary to physically write every line of code.
The main effect of introducing AI into development is a collapse in the cost of pure code production. What required 50 developers yesterday might require five today. It is exactly like when 100 people manually did what a single framework handles automatically today.
Software as a Commodity
This scenario reinforces a fundamental idea: code must be viewed as a commodity, not an asset.
If code can be built very quickly by anyone, it no longer represents a massive corporate asset. It is easily replicable. Except for extremely complex architectural layers that AI will struggle to replicate in the next ten years, there is no longer a need to write the vast majority of corporate code from scratch.
This is a crucial aspect for calibrating strategies. The industry is evolving so fast that talking about "perfect stacks" or definitive best practices is premature and secondary to organizational principles.
Treating software as a commodity forces us to think about architecture differently. If the individual pieces of the architecture are easily replaceable and replicable, all the complexity and human intelligence shift to connecting these pieces and understanding the needs of the business.
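To make the idea concrete, here is a minimal sketch of what "replaceable pieces behind a contract" can look like in practice. The names (Notifier, EmailNotifier, SmsNotifier) are hypothetical; the point is only that the business logic depends on the contract, never on a specific implementation.

```python
from typing import Protocol


# The contract: any component that can send a message. Illustrative only.
class Notifier(Protocol):
    def send(self, recipient: str, message: str) -> str: ...


# Two commodity implementations, each a drop-in replacement for the other.
class EmailNotifier:
    def send(self, recipient: str, message: str) -> str:
        return f"email to {recipient}: {message}"


class SmsNotifier:
    def send(self, recipient: str, message: str) -> str:
        return f"sms to {recipient}: {message}"


def alert_user(notifier: Notifier, user: str) -> str:
    # Business logic depends only on the contract, so the component can be
    # swapped, rewritten, or regenerated by an AI tool without touching this.
    return notifier.send(user, "your report is ready")


print(alert_user(EmailNotifier(), "ada"))
print(alert_user(SmsNotifier(), "ada"))
```

The human intelligence lives in `alert_user` and in choosing the contract, not in the interchangeable implementations below it.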
The Extinction of the Super-Specialist
Over the years, demand for super-specialists in narrow coding skills has repeatedly evaporated, and it will again. Twenty years ago, there were experts dedicated exclusively to hand-tuning GUI rendering code; graphics engines absorbed that work. Developers who crafted regular expressions by hand were displaced by libraries that generate them automatically.
Skills based on volume or the rote memorization of thousands of frameworks will tend to vanish.
In the near future, the software engineering profession will move ever higher up the stack. The engineer becomes the professional who puts the pieces together, understands business economics, and translates business needs into systems. We will become digital experts orchestrating components, keeping an awareness of how things work under the hood without writing the implementation details ourselves.
How to Adapt the Architecture
The winning approach today is integrating AI while adapting the architecture so that components are easily and quickly replaceable.
We must invest heavily in the solution design phase and in the use of rigorous contracts (APIs). We must lay down tracks to prevent the architecture from becoming an unmanageable monolith. This is because AI, as it is built today and barring improbable trajectory shifts, will hardly ever be able to grasp the actual complexity of the business and the system architecture. That comprehension requires true reasoning, which these probabilistic tools do not offer.
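A rigorous contract can be as simple as validating every payload at the boundary and rejecting anything that deviates. The sketch below assumes a hypothetical order endpoint; in a real system the same role might be played by JSON Schema or an OpenAPI definition.

```python
from dataclasses import dataclass


# Hypothetical boundary contract; field names are illustrative.
@dataclass(frozen=True)
class OrderRequest:
    order_id: str
    amount_cents: int


def parse_order(payload: dict) -> OrderRequest:
    # Enforce the contract regardless of whether the calling code was
    # written by a human or generated by an AI assistant.
    if not isinstance(payload.get("order_id"), str):
        raise ValueError("order_id must be a string")
    amount = payload.get("amount_cents")
    if not isinstance(amount, int) or amount < 0:
        raise ValueError("amount_cents must be a non-negative integer")
    return OrderRequest(payload["order_id"], amount)


order = parse_order({"order_id": "A-1", "amount_cents": 1999})
print(order)
```

The value of a contract like this is that it stays stable while the components on either side of it are rewritten at AI speed.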
AI as a System Accelerator
Many questions remain open that no one can answer today. What should be the role of junior profiles in a company, given that AI automates exactly the tasks juniors used to learn on? What happens when a team produces ten times the code volume of a few years ago?
In times of great uncertainty, the most important thing is to remember that fundamental principles remain valid and become even more critical.
Artificial intelligence is an accelerator. If we accelerate an inefficient organization full of structural problems, we will only amplify those problems at an unprecedented speed. If we have an efficient and decoupled organization, then we will truly move faster.