Hacker News

> In many domains, productivity is already sufficient. What’s being sold is workforce reduction.

This is a blind spot for many. People working on entrepreneurial projects need to build a lot. They start with nothing. They need (for example) features. There's a lot to do.

Most firms are not that. Visa, Salesforce, LinkedIn, or whatnot. They have a product. They have features. They have been at it for a while. They also have resources. They are very often in the position of hunting for nails for a "write more software" hammer.

It's unintuitive because they all have big wishlists and to-do lists and A/B testing systems to pour software into, but...

If there were known "make more software, make more money" opportunities available, they would have already done them.

Actual growth and new demand needs to come from arenas outside of this. E.g., companies that suck at software (either making or acquiring it) might now be able to get the job done.

The problem, bringing this back to the article, is fungibility. A lot of this "human capital" stuff cannot be easily repackaged. It's a "living" thing. Talent and skills pipelines can be cut off, and vanish.

A danger in AI coding (and other fields) is that it leverages preexisting human capital and doesn't generate any for later.




> If there were known "make more software, make more money" opportunities available, they would have already done them.

Sometimes they're available but not palatable, when the opportunity could threaten existing investments or patterns. That might mean "self-cannibalism", or changing the ecology so that the main product's niche is threatened.

Then those opportunities are ignored, or actively worked-against via lobbying, embrace-extend-extinguish, etc.


Ok... but this just generalizes into the "known things" type.

Whether the reason is strategic (like your example), internal politics, or insufficient knowledge... the point is that there is a local equilibrium, and most mature firms are at that equilibrium.

More resources via AI, at first order, go after the diminishing-returns part of the curve... which is a cliff, especially for the highly resourced firms topping the S&P 500.

A lot of AI optimists' mental models of the economy do not account for this stuff at all.

"Save time/money" outcomes are not at all similar to "make more stuff" outcomes. Firing employees does free up labour... but reutilizing that labour is non-trivial, as this article demonstrates quite well.


> doesn't generate any for later.

"any" is quite an assumption.


I didn't mean this as an absolute statement. I meant it relatively, and in the short term.

I agree that any sufficiently complex human operation - whether industrial or scientific or whatever - requires a culture and a living tradition that develops over time and communicates knowledge and understanding across generations. In fact, many problems in our culture can be attributed to a contempt for tradition that has developed. (It is true that tradition can ossify. That can be a problem with attitudes toward tradition rather than with tradition itself, or a sign that something needs to be addressed. A good tradition is a dialogue spanning history.)

However, it is also true that technology develops and produces changes that in the short term cause pain, but in the long term produce a better outcome in some desirable sense. Coding is not an end in itself. Just as switchboard operators and human computers are obsolete, because the conditions that caused the need for them ceased to exist, it may be the case that a certain manual style of programming is also becoming obsolete.

You can imagine human computers decades ago thinking that computing technology is bad, because people will lose numerical facility. But this misunderstands the structure of the value of practical skills and the difference between knowledge of principles and practical skill. Sure, few if any people today can perform numerical computation as quickly and competently in their heads or on paper as human computers could, but...

1. that's different from understanding the principles of computation, which is closer to a theoretical grasp and has eternal, or at least lasting, value

2. the value of that practical numerical facility was rooted in the need to obtain results as quickly as possible, and that particular set of techniques or skills is no longer practical

Perhaps manual coding is like that. I don't know why people are surprised. Generative programming has been a desired end in CS for a long time. CS grads can and should still learn the principles of their field, and learn them well, but the profile of practical industrial techniques and needed skills is changing. As software eats more and more of the world, it is becoming increasingly impractical to manually fiddle with silly bits of plumbing. We obviously haven't been able to develop abstractions well enough to avoid it, and part of the reason is that appetite comes with eating. Once you make something easier, it becomes easier to achieve even greater things... hence new plumbing and implementation complexity.

Let's be honest here. Much of programming is intellectually dull. It's plumbing. It's not algorithmically interesting. It's not interesting from a modeling perspective. It's not interesting conceptually. It's not interesting as a matter of system design. Most programming out in the wild is the same old crap being recapitulated a million times over. If all you want is to become skilled at doing the same thing over and over again, then I can understand why you might find LLMs threatening. Your market value as a maker of yet-another-flask-web-app has plummeted hard. People who enjoy that kind of programming are generally not very intellectually motivated - at least not where programming is concerned - and likely prefer the tedious comforts of rehearsed, ephemeral detail. LLMs can keep us from rabbit-holing and keep us focused on the domain.

In any case, I don't think LLMs are a threat to the field per se. I just think that the skill set is shifting and developing. I think we are still figuring out what it means to develop the right understanding and intuitions for building software without the benefit of having done it manually. Time will tell. However, I also think being able to read code has become relatively more important than writing it. When you have to verify the quality of LLM-generated code and put your name behind it, you have to be able to understand it, and that's a somewhat neglected skill in my view. Programmers very often prefer to write code rather than read it. LLMs might be just the thing to coerce an improvement in that sort of literacy. With this also comes a greater importance of formal specification. That's where I would expect the future of the field to shift.



