AIorNot's comments | Hacker News

That's because we have a trove of in-depth, specialist YouTube content to mine through these days, including all those old documentaries

YouTube and the internet are a goldmine, and way bigger than old 80s/90s content. I'm over 50 and remember the 80s well enough.. a few great, well-produced documentaries are not comparable to the gigabytes or petabytes of videos and podcasts we have today

The cultural format of exchange has changed, and so have the consequences - the so-called TikTok attention-deficit folks mean perhaps no one watches this content. But I think that too is a generalization; great content is probably watched by a greater proportion of smart, curious people today than back in the 80s, and on your phone nonetheless - we have a pocket TV with an almost unlimited amount of content

I'm an information junkie, and just today I spent 3 hours watching a documentary series on the Incan civilization, followed by a Stanford video on LLMs, and then Blaise Agüera y Arcas's interesting ideas on computational life and intelligence

https://youtu.be/KhSJuqDUJME?si=-TMkLdapsbcWuoft


This doesn't make much sense - the article itself is AI-written

It would have been easy to run a few AI agents to review the code, find these issues, and architect it more clearly


Yup - I like the honesty

Why don't cloud services have reviews like Amazon products? I'm so tired of enterprise sales-speak in corporate docs - just cut the BS and do straight talk

If you haven't used a service, you shouldn't have to search Reddit for dev experiences with it


Is there any CEO out there as insecure as Zuckerberg?

So many kids on hacker news

I'd say this SWE is an experienced engineer, not a senior developer - for Pete's sake, he graduated in 2023; that was 3 freaking years ago

I’ve been developing production software for 20 years now -

What other profession counts someone with 3 years of professional experience out of college as senior?

Maybe competitive sports? Or academic math?

If it means this kid is smart and good at coding, sure, I'll buy that, but experience and wisdom are something else entirely..


I disagree and think the software model described works better when done well. I have seen this within a company, where both the hardware side and software side used the same titles (senior, staff, senior staff, principal). The hardware side largely used a combination of industry tenure and especially whether people had PhDs/patents/inventions to determine these titles, while the software org was very gung-ho about using responsibility and influence to determine promotions. The other consequence was that in the hardware org, people would often get hired on as senior staff or principal, while this almost never happened on the software side (nobody could get hired into these roles, as they couldn't possibly meet the rubric, which required some outsized impact within the company - thousands of people using software you near-singlehandedly developed and maintained).

As other people pointed out in this post in a roundabout way, titles only matter internally to a given company. Considering that, compare these two systems: yes, the software org in this system does end up in a position where a 25-year-old who has been at the company for 3 years could be senior staff, but that's very telling - to do that, they absolutely had to ship something novel, useful to many, and keep it running well. Knowing that someone is a very well-educated graybeard who invented something at Sun in 1989 is also good information, but in the context of communicating with people in other orgs I don't know well, it's more valuable to me personally to understand whether they are responsible for a large running process, and to what degree, more so than how long they have been around and what they did elsewhere.


Yeah I have similar thoughts. I think you have to just consider the situation holistically. Senior with two years experience? Ok, this person is obviously skilled, and has the ability to create value and have impact, and has seriously impressed people in their org, but they're still early in their career and they probably have future mistakes to make and lessons to learn.

Kind of reminds me of martial arts. You've got what some call McDojos, where a 13-year-old can be a "black belt" after 9 months, vs. more "traditional" styles where you get there after 5 years of hard work. In the traditional styles, this black belt is generally viewed as a "serious beginner" who is still internalizing the basics.

Real learning takes time. Someone with 3 years of experience writing software is at the beginning of their professional development.

Of course, time alone is not enough. But time × work × aptitude = progression.

The inflation of "senior engineer" makes us look to many like the McDojo black belts.


Senior Engineer means many different things, even within the same company. It could mean, "This person is more productive than everybody else around them" or it could mean, "This person isn't that great at software development but they know some product area so deeply that it would be too expensive to replace them."

Haha, I know people who have worked on designing a single part smaller than a closed fist for over 5 years and were still considered just over junior because they didn't have enough experience with the system it was used in.

I mean, we have LangGraph, BAML, etc.

Yes, I get your frustration; the same thing is happening across orgs these days as Claude and co. have become widespread.

Wisdom is a thing, and so is competence. Humans either have them or they don't; machines do not (yet). But the massive capabilities of the tools are also something that can't be ignored.

We can't throw the baby out with the bathwater. It's going to take some cycles of learning the ropes with this technology for humans to understand it better.

I would push back - why couldn't the senior devs communicate these issues to senior management? It sounds like a broken human system, not a broken tool or technology. All AI did was shine a light on the human issues in that org.


From past experiences (and I'm sure I'm not alone here), I can almost guarantee that the senior devs did communicate the problems, but they were ignored or brushed aside.

Very seldom does middle/upper management truly listen to engineers, unless there's buy-in from the CTO/VP to champion the ideas and complaints.


Over time, as devs gain experience, they see countless fads come and go. Some worked, some screwed things up, etc. - NONE were the silver bullet / savior that their adherents touted them to be. So they learn a default "no" or "slowly" response to "we need to do this <buzzword> ASAP" from management who only see $$$. Meanwhile, AI companies are telling management that devs will resist AI because "it's so good it will let you replace them", so management gets its views reinforced whenever devs say it's a bad idea.

Yeah, the developers who will argue and gnash teeth for weeks about using an ORM, in the hope it will save a few hours of work perceived as boring or obvious, are simultaneously annoyed and upset at being told to save time with super tools that save time and effort…

Pay no attention to the software output or quality or competitive displacement of the people selling you tools. LLMs, like cheesy sales strategies, are something so lucrative the only thing you can really do is sell them first come first serve to other people. Makes so much sense. Why make infinite money when you can sell a course/tool to naive and less fortunate companies? So logical.


The CTO got fired last month, presumably for poor performance. And the director who has taken his place is now all-in on AI, because he's desperate to turn things around but has no idea how.

He doesn't care. When C-suite execs get fired, they get like half a million in severance and go rinse and repeat somewhere else

And it was the AI's fault. So convenient.

Was the CTO advocating a more measured approach to AI adoption?

I have a feeling that I have witnessed it, although I was told the CTO decided to move on to other challenges.

Have you not seen the principals and seniors being offered the door or buyouts?

So is the answer WebRTC + Kubernetes?


The problem with this article is that he is right, of course, but only right now. There is no reason to believe that future AI platforms won't be able to review code themselves and manage some aspects of themselves with minimal human oversight; yes, we will likely always need a few experts

I'm reminded of this scene from The Matrix: https://www.youtube.com/watch?v=cD4nhYR-VRA where the older wise man discusses society's reliance on AI

"Nobody cares how it works, as long as it works"

We're done. I, for one, welcome our new AI overlords - or, more accurately, the tech-bro billionaires who are pulling the strings


> There is no reason to believe that future AI platforms won't be able to review code themselves and manage some aspects of themselves with minimal human oversight

There are, IMHO, fewer reasons to believe they will be able to do that rather than not, though.


LLMs became much better at both reviewing and writing code over the last 12-18 months. Did you?

The current state of the art is irrelevant. Only the first couple of time derivatives matter.


> Did you?

I would say I got better at both of those over the last 12-18 months. Are your skills static?


Compared to Claude or GPT 5.5? Yeah, my skills are static relative to the progress seen recently. So are yours, unless your grandpa was named von Neumann or Szilard.

My brain got better at thinking deeper when I stopped using llms.

Lmao, why does it seem outlandish to other people? Perhaps they never thought deeply enough in the first place to recognise it.


> There is no reason to believe that future AI platforms won't be able to review code themselves and manage some aspects of themselves with minimal human oversight

Really? That's like someone during an economic boom saying "The economy is the worst it'll ever be. There is no reason to expect things to not continue to improve".


That simile breaks down because - unlike the state of the economy - the collective human capacity to understand, design, and produce these systems essentially only goes one way, barring the apocalyptic.

I think Jimmy Yang summarizes this essay more effectively: https://www.youtube.com/shorts/9wJ6B7z3Hyk
