I think there's a duality in SW development that some people tend to believe in: that there are two types of work, one highly creative, like product or architecture design, and the other mundane, like coding and testing. (I actually realized this reading a recent HN post: https://news.ycombinator.com/item?id=25775872)
I don't subscribe to this; I think the work is ultimately very similar in creativity (just a different skill set), at both the high and low level. If something seems mundane, it is either a sign that there is lots of hidden entropy you failed to capture (and thus a risk that the high-level understanding is wildly incorrect), or it means we are doing things conceptually wrong (with more effort than required) but don't know how (and so there is potential for innovation).
And from that duality derives the idea that if only we could perfectly specify the mundane work (for instance through "acceptance criteria"), it would just be a matter of getting enough bodies to do it according to some grander plan. From this misconception derives the history of software development methodologies.
So originally, people thought this duality was the same as in building a house: you have an architect who does the design, and workers (coders) who build it. Thus waterfall was born, where the emphasis was on the plan. That was also its weak point, because in the real world of SW development, it turns out the actual implementation influences the plan (so the architect cannot just throw it over the wall, as they say).
Then, out of the disappointments, the Agile movement was born, pretty much from the idea that all development work is creative and cannot be easily formulated/predicted, so we basically need to iterate quickly and hope for the best. Just treat everything like research.
But, as it happens, people promptly misunderstood this (where are my metrics now??). Instead of understanding that the belief in the duality itself is the problem (a kind of wicked problem, really, because it can manifest in many ways), they latched onto superficial recommendations of "Agile practices": Scrum rituals, "embracing change," forgoing proper planning, etc. (For me, a good summary of this discussion is http://steve-yegge.blogspot.com/2006/09/good-agile-bad-agile... and https://simpixelated.com/two-year-work-retrospective)
One can draw an analogy to this duality in other industries; it's a problem older than software. It seems that with technological and process automation, other industries are actually becoming more like SW development (aka "software eats the world"). Elsewhere, this belief in the duality gave rise to Taylorism (https://en.wikipedia.org/wiki/Scientific_management), and the rejection of the duality gave rise to Deming's philosophy (https://en.wikipedia.org/wiki/W._Edwards_Deming) and the related management methods (which were, again, misunderstood, I am sure). There, the fundamental notion coming from the duality is that it's the management (or anybody else not doing the actual production job), not the workers themselves, who decide how to effectively organize it.
And still, to this day, some people intuitively cling to this duality and try to apply Taylorism instead of something like Kaizen or Kanban to SW development (and certainly not only there). So you have JIRA tickets and people beancounting story points.
I think that complex organizations tend to develop caste systems. The caste hierarchy could be arbitrary, or based on imaginary metrics such as being "closer to revenue." It could have some basis in fact, for instance doctor hours are more valuable than nurse hours, which are more valuable than patient hours.
The upper castes have to be responsible for the work environment, otherwise it degenerates: the best people in the lower castes switch jobs to join a higher caste or a company with a better caste system. Or they unionize, either formally (like nurses) or informally by forming silos. Thus the castes become self-fulfilling.
As an aside, one thing that HN has taught me, reinforced by this thread, is that I would not have survived in a high throughput coding environment, and am glad I didn't steer my career in that direction, though I love programming. I don't have the self discipline, and I'd have burned out quickly.