Every time a tech company announces they’re rolling out AI code generation to ‘free developers up for higher-level work’, ask yourself: whose productivity is actually being optimized here?

These tools are sold bottom-up to developers but bought top-down by executives who see headcount as a cost to minimize. The pitch to management isn’t ‘your engineers will write better code’ — it’s ‘you can do more with fewer people’. That’s the actual value proposition.

Meanwhile, experienced developers get quietly hollowed out. Why hire a senior who knows systems design when an AI can ‘assist’ a junior through the same motions? The institutional knowledge doesn’t just walk out the door when senior engineers are replaced or driven out; it gets embedded into a proprietary model that the company controls. You didn’t build that knowledge base — you performed labor that trained someone else’s product.

Junior devs don’t get elevated. They get bypassed. Writing code from scratch is how you learn to think about code. When an AI fills in the boilerplate, generates the implementations, and writes the tests, the apprentice never develops the muscle memory they need to eventually do the real work. We end up with a generation of developers who can prompt well but can’t actually build.

And let’s talk about the open source angle, because this is where it gets genuinely sinister. These AI models are trained on publicly available code — including GPL projects whose licenses explicitly state that derivative works must be open source. Training a proprietary model on GPL code and closing the outputs? That’s not a gray area. That’s a license violation, but the copyright holders don’t have the resources to fight billion-dollar companies in court.

So we get: workers deskilled, knowledge monopolized, copyright flouted, and all the productivity gains captured by shareholders. But hey, your standup notes are automated now.

  • HobbitFoot @thelemmy.club · 23 hours ago
    Most research into AI tends to split its usefulness into two categories: it helps the top performers more than the average, or it helps the bottom performers more than the average.

    If it helps the bottom performers more, that industry is dead since it means that AI can credibly do the job well enough to replace people. If it helps out the top performers more, it means you can run on reduced head count, but the standards for the remaining headcount are going to go up.

    What is being optimized in the end? Reduced headcount. The work may be slop, but if it is good enough, it will be cheaper to implement, and cost may be more of an issue than quality.