Hacker News | mschuster91's comments

> Here and on reddit, AI debugging is viewed as some weird shallow pattern-matching that obviously fails to spot real stuff and overload the maintainers.

That's because that is what a lot of people have done in recent years [1] to pad their résumés, or to force developers to backport patches to older (but still supported) kernel versions - patches that wouldn't have gone in without a CVE attached [2]. Maintainers have been legitimately swamped with low-quality spam for a very long time. Only in the last few months has AI actually gotten "good enough"; the problem is that maintainers still have to tell AI slop from wannabes apart from AI-assisted reports reviewed and refined by actual human professionals.

[1] https://www.zdnet.com/article/how-fake-security-reports-are-...

[2] https://opensourcewatch.beehiiv.com/p/linux-gets-cve-securit...


At the end of the day attackers don't give a fuck. "Waaa waaa, AI was bad 6 months ago so I'm going to throw a little fit" doesn't work when it's currently actively exploiting your shit. No one gives a damn if there are 4000 bullshit security PRs lined up. The one real RCE in there means that everything you hold dear has already been carted off by nation states, and probably rediscovered by 3 or 4 other exploitation groups by this point.

It's time for all the little snowflake software writers to pull up their pantaloons and realize that Linus' vision has become real. With enough AIs, all security bugs become shallow. And that software affects the real world, real money, and real people in it. That they are also under attack by well-financed groups with rather evil motivations. If I'm attacking some group using your software (such as another nation) I'm going to flood the fuck out of your PR system till you give up hope and die. I'm going to make you attack your contributors. I'm going to sow confusion so I have the maximum amount of time to lay waste to my enemies and profit to the max.

The internet is hostile. Software is hostile. There are sharks looking to eat you.

Time to face that fact.


I would not be surprised if Tom Scott stated one day that this video was an inspiration for his legendary old video format: just him walking through some random place and explaining things, seemingly completely without script.

> Do you have a source for how little maintenance this will need?

In Germany, a twice-yearly inspection is mandatory for infrastructure [1], but this is only a visual inspection. Once every 6 years you get a major inspection [2] that includes a full going-over of everything, including functionality checks, plus a review of documentation (whether it is still up to code) and of accident documentation, as well as a "knock test" on every m² of surface [3]. Fire safety systems are checked every quarter [4].

And out of these reports you then get action items. Depending on the severity of the findings, it can be anything from "someone needs to take care of this before the next major inspection" to "holy cow, stop ALL traffic NOW".

[1] https://www.stbapa.bayern.de/service/medien/meldungen/2023/2...

[2] https://www.fba.bund.de/DE/Meldungen/20230201_Tunneluntersuc...

[3] https://www.merkur.de/lokales/muenchen/baustellen-besuch-sta...

[4] https://www.autobahn.de/aktuelles/aktuell/tunnelwartung-im-b...


Sometimes the inspections aren't worth much -- or aren't acted on -- and you end up with a bridge collapse, even in Germany:

https://en.wikipedia.org/wiki/Carola_Bridge


The problem is, it was known that the bridge was structurally unsound due to its age, but the elements that corroded and actually caused the failure could not be inspected at all. The report [1] is quite fascinating; the meat is on pages 53-54:

> Auf Grundlage der gewonnenen Erkenntnisse und der positiven Berechnungsergebnisse wurde in der Gesamtbetrachtung weder ein akuter Handlungsbedarf festgestellt noch eine Verstärkung als erforderlich erachtet

> (Based on the findings gained and the positive calculation results, the overall assessment identified neither an acute need for action, nor was reinforcement deemed necessary)

The root cause is deemed to be errors made all the way back during construction, most probably the steel cables being exposed to the elements for too long (see page 108).

Only thanks to this disaster did the actual failure mode, and how to spot it, become known in the first place. The report suggests (page 110) that bridges of a similar construction type (and thus with the same weakness) be retrofitted with acoustic monitoring to detect snapping cables.

[1] https://www.dresden.de/media/pdf/Strassenbau/Gutachten-Carol...


> In compensation I noticed they nap frequently in the day time, often in the hottest part of the day when it's unpleasant to work.

Yeah, a common thing in the Mediterranean as well. But unfortunately, capitalism does NOT like downtime during "productive" daytime hours.


Unfortunately, can't have that in a society that requires workers be mobile to chase wherever the next gig job appears. Can't form trust bonds with neighbors when you gotta move every few years.

> It's definitely beneficial for the "elite" in the long term for people to have kids.

Why? The elites bank on AI and robots doing everything in the future. The plebs have no place in the visions of Musk, Thiel, Altman and the rest of the wankers.


That's not what reproducible builds aim to prevent, and no one claims that. When upstream pushes bad code, that's on upstream.

What reproducible builds aim to prevent is attackers - be they governments (with or without court orders) or criminal organizations - forcing Debian or individual developers and system administrators with access to binary uploads and signing keys to sign and upload tampered binary packages.

As of now, if I were, say, an administrator of Debian's CI infrastructure, technically there would be nothing preventing me from running an "extra" job on the CI infrastructure that builds an openssh package with a knock-knock backdoor, properly signing it and uploading it to the repository. For someone to spot the attack, they'd have to notice that a package in the repository has no corresponding build logs or looks off in some other way.

But with reproducible builds, anyone can set up infrastructure to rebuild Debian packages from source automatically and if there is a mismatch with what is on Debian's repository, raise alarm bells.
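The verification step is conceptually tiny: rebuild the package from source in the documented build environment and check that the result is bit-for-bit identical to what the archive serves. Here's a minimal sketch of that comparison; the function names and file paths are illustrative, not Debian's actual rebuilder tooling (which also has to pin the build environment exactly).

```python
# Minimal sketch of the independent-rebuilder idea: hash a locally
# rebuilt package and compare it against the artifact published in
# the archive. Paths/names are illustrative, not Debian tooling.
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def rebuild_matches(rebuilt_deb: str, archive_deb: str) -> bool:
    """True iff the independent rebuild is bit-for-bit identical
    to the published binary - i.e. no one tampered with it after
    (or instead of) building from the public source."""
    return sha256_of(rebuilt_deb) == sha256_of(archive_deb)
```

Any mismatch is then a signal worth investigating: either the build isn't actually reproducible yet, or someone shipped a binary that doesn't come from the public source.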


Reproducible builds show that, within a specific configuration, the code produced the binary, regardless of who signed or published it.

Indeed, this can mitigate an attacker replacing the binary with something not produced from the code, but it does not mitigate the toolchain or the code itself containing the exploit and thus producing a malicious binary.


Unfortunately, "holistic" investors able (and willing) to look at the bigger picture and recognize that things like "institutional knowledge" cannot be expressed on a balance sheet are not the norm.

The norm - outside of outliers like Warren Buffett - is "when numbers go up, buy; when numbers go down, sell".

The financialization / stonkmarketization of everything is slowly destroying our economies like a cancer.


The problem is, it's still in contact with something, even if it's just the secondary loop. Saltwater isn't just incredibly aggressive toward metal; the major problem with using it for cooling is fouling. Fish, mussels, algae, debris - there are a lot of things that can clog up your entire setup.

That's useless; in fact it makes you stand out even more. There are SDKs that can tell, from an awful lot of signals, whether your user agent string corresponds to your actual browser.
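One trivial example of such a signal: Chromium-based browsers also send `Sec-CH-UA` client-hint headers, so a spoofed `User-Agent` string that doesn't agree with them is an immediate red flag. The sketch below checks just that one pair of headers; the function name is made up, and real fingerprinting SDKs combine many more signals (TLS fingerprints, JS feature probes, timing).

```python
# Illustrative consistency check between the classic User-Agent
# string and the Sec-CH-UA client hints that Chromium-based
# browsers send. Not any vendor's actual SDK - just the idea.
import re

def ua_consistent(headers: dict) -> bool:
    """Return False if the Chrome major version claimed in the
    User-Agent string contradicts the Sec-CH-UA client hints."""
    ua = headers.get("User-Agent", "")
    hints = headers.get("Sec-CH-UA", "")
    m = re.search(r"Chrome/(\d+)", ua)
    if not m:
        return True  # non-Chromium UA: this particular check doesn't apply
    # Sec-CH-UA looks like: "Chromium";v="124", "Google Chrome";v="124", ...
    return f'v="{m.group(1)}"' in hints
```

Spoofing only the `User-Agent` string while the rest of the browser keeps telling the truth is exactly the mismatch such checks are built to catch.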
