Opinion Here are two snapshots of AI in coding in mid-2025. The CEO of GitHub, coding’s universal termite mound, says that AI is going to do all the coding and that’s a good thing. Meanwhile, real-life AI coding tools make coders less productive while spreading the hallucination that they’re more so.
Can something that makes life worse for the workers building the digital world turn around so completely? Put into historical context, it looks not only inevitable but essential.
The proper place to start is, as so often, with “Amazing” Grace, or Rear Admiral Grace Hopper as she was more properly known. Seventy years ago, she began work on the first high-level computer language designed to create data processing programs through English-like words rather than equation-esque symbols. The result, FLOW-MATIC, led in short order to COBOL and our current mayhem. A creation myth of the digital age.
Less well known is the opposition to the idea of high-level languages that Hopper had to overcome. There were around 88 electronic computers in operation in the US in 1955, making computation an incredibly expensive commodity, jealously guarded by its creator class of high-status mathematicians and engineers. Many considered compute cycles and storage far too valuable to waste on anything other than directly crunching numbers. Translating words from people who couldn’t or wouldn’t learn how to express themselves in machine symbols was unconscionable.
Hopper, having been a mathematics professor herself, knew this attitude to be hopelessly limiting even at the time, let alone if computers were going to become widespread. As you may have noticed, she was right. The detractors were correct that resources were too limited at the time, but that time changed with dizzying rapidity, and that specific criticism was extinguished just as quickly.
The underlying theme was not. Resource limitation coupled with entrenched thinking has been used to decry many fundamental advances since. As computers moved towards a mass market, each breakthrough brought initial resource constraints that kept alive previous programming practice optimised for efficiency and speed.
C made cross-platform software work in the early days of minicomputers, but that didn’t stop assembler programmers ridiculing it as a gussied-up macro assembler. Likewise, during the Ediacaran era of microcomputers, there was much snorting at the arrival of intermediate representation, or IR.
IR is where a compiler first produces a common intermediate format that is later translated into executable code via a virtual machine. That buys portability across architectures, but initially at too great a demand on the base hardware. P-code, beloved of Pascal heads, was just too tardy to live. Java and its bytecode were equally sluggish at first, the joke being that it was impossible to tell the difference between a machine that had stopped because of a crashed C program and one that was running Java.
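To see the mechanism in miniature, here is a hedged Java sketch (the class name and values are invented purely for illustration): the compiler emits bytecode, an IR, rather than machine code, and a virtual machine turns that into something the silicon can actually run.

    // javac compiles this source into platform-neutral bytecode (a .class
    // file) rather than machine code; the JVM then interprets or
    // JIT-compiles that bytecode on whatever hardware it finds itself on.
    public class Adder {
        static int add(int a, int b) {
            return a + b;
        }

        public static void main(String[] args) {
            System.out.println(add(2, 3)); // prints 5
        }
    }

Running javap -c on the compiled class shows the IR itself: add() boils down to four instructions, iload_0, iload_1, iadd and ireturn, which any JVM on any architecture can execute. That is exactly the portability being described, and exactly the layer the early hardware struggled to run at a bearable speed.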
Java was saved by a combination of Moore’s Law and ubiquitous networking. Now IR is itself ubiquitous through technologies such as LLVM, and C itself has become an IR in compilers for languages such as Nim and Eiffel. Which makes sense for portability and optimisation, at least for now. It’s impossible to envision a coding world as rich and powerful in this interconnected age without these ideas.
All this illustrates that increased abstraction comes hand-in-hand with increased complexity and capability. In truth, almost no code actually running on silicon in mainstream IT has been touched by human finger or seen by human eye. Even ignoring platform VMs and microcode, the code that actually processes data these days has been written by machines, usually many times over.
Thus to AI. AI is cursed thrice: by name, by hype, and by resource limitations. It is very clever data analytics and inference, not intelligence, and calling it intelligent sets bad expectations. It can be very effective at well-formed, well-bounded tasks, but it is being sold as universal fairy dust. More bad news, more fuel for entrenched attitudes.
Look at how it fits current capabilities and it’s uncomfortable. You can usefully run big AI models on modest local hardware, but training them is a different matter. Huge resource constraints and very questionable business models are not great ingredients for evolving tools that fit well. These are early days, and it’s no wonder coding AI is a very mixed blessing.
This will change, and it can only go one way. As Grace Hopper knew with complete clarity, removing barriers between thought and result accelerates technology. Coding AI will do that, if we advance the art with care and vision. It will mean more of the human work going into forethought, design and consideration of what we actually want, all good disciplines that are badly undercooked at the moment.
There’s one last old programming joke to bear in mind. When computers can be programmed in written English, we’ll find out that programmers can’t write English. Here’s hoping that the law of limited resources and embedded attitudes makes history of that one too. If not – hey – people are still making a living fixing COBOL. ®