On The Shoulders of Giants Paradox

I think AI will create more "software engineers," not fewer.

It's similar to how the early 2000s generation of programmers was less focused on hardware constraints, programming style, and how things work from the operating-system level down to memory and processing. The "program used to program" is reminiscent of Isaac Newton's quote: "If I have seen further, it is by standing on the shoulders of giants." So it is that AI-enabled programmers will build their apps.

TL;DR

I think we are very far from AGI and from AI building apps on its own - if that ever happens, and hopefully it doesn't. I think we are still pretty far from the first AI-mostly (no-code, or very-little-code) developers being able to build successful products or contribute to them as members of an engineering team. Thinking that because AI tools can write code in JavaScript or any other programming language, these tools can therefore build apps, is a misunderstanding of what it takes to build a viable product. I haven't seen real evidence of AI-mostly developers building viable products. I do see a lot of marketing... some fraud, some nonsensical attempts at recreating things that already exist... but no successful AI-mostly developers - yet.

I think AI can copy existing apps fairly well, and maybe run and deploy them on its own. But when it comes to modifying those apps in a meaningful way, or translating business needs into an "app" that weaves many features together, an AI-enabled "programmer" who doesn't have an in-depth understanding of programming languages will need to start memorizing techniques, processes, and tricks of the trade, and develop a detailed mental model of how the various building blocks fit together.

So an AI-enabled (very-little-code) developer will acquire domain-specific knowledge that constitutes a skill - the sort requiring the "10,000 hours" of practice that engineers, and practitioners of myriad other professional disciplines, eventually put in to become capable of things that those who haven't practiced that much simply can't do. In other words, the use of AI as a programming tool could turn into a deep skill that takes multiple years of full-time work to learn and master.

Therefore, the new programmers will likely still be programmers, or even "software engineers" - mostly because they will choose to focus on building software, enjoy it, and learn as much as they can about it. Practicing writing code by hand or doing DevOps work manually might not be required and may even detract from their ability to actually build things, just as a JavaScript developer doesn't necessarily need to learn the stack all the way down to the compiler, browser runtime, operating system, firmware, and hardware. Each layer is very "deep" or "wide" in its detail; truly understanding one would take many years - enough to get to the point where you could actually build an innovative compiler, browser, or operating system. Will the AI-mostly developer really need to know how to write code, or simply what type of code needs to be written?

I think AI-mostly programmers will continue to focus on Software as a Service and on applying LLMs in a Software as a Service context... in essence, building apps, but they will do it by articulating minutiae about user experience and design, permissions, and by personifying data so as to be able to talk about data the same way we are used to talking about human relationships. I think we will likely see many games, or hybrids of games and apps, emerge from the new generation of software developers who use AI as the program to build programs. It could unlock unprecedented expressions of imagination.

This shouldn't be mistaken for something "easy," though, even with AI as a helper, or for something you can or would be able to do without a lot of technical knowledge. I'm trying to think of an example of an engineering effort that has become "easier" with technological advancement. Please share here if you have any examples. I think what actually happens is that an engineer is expected to do "more."

When you are empowered as an engineer with technology, people who aren't all that interested in the technology, or how it works, will assume you can build something with it. The business-minded won't be concerned with "effort" so much as with product. Because they will have seen very sophisticated apps, they will assume you will be able to build something like what they have seen.

How can a technology make something "easier" and yet the effort required stay the same? It's because the expectations change. An AI-empowered developer will run at 1,000 mph where the JavaScript developer ran at 10 mph, but the goal will no longer be to run 10,000 miles - it will be to run one million. And so the effort and talent required to complete the objective or build the product will be the same.

Of course, the fluctuation in effort required to meet expectations is another topic entirely. We might want to consider how to drive it down, instead of just asking more from better-enabled engineers. So far, it seems to have been constant for a long while. This is individualized to some extent: there will be those who put in less effort and those who put in more, more talented programmers and less talented ones. On average, though, the required effort does not seem to be driven down by technological advancement. If anything, it might be going up, for one needs to climb, at least on the surface, the tower of giants.

Which is to say, the advent of AI will not make life "easier" for the average software engineer. It will not lower the barrier to entry for meaningful contribution, because the standard for meaningful contribution keeps getting pushed further away. I suppose this is a sort of paradox of standing on the shoulders of giants.

However, I think it does undercut the idea that AI will replace software engineers.

Signed: Buckley Mower  2025-07-22