Hacker News | past | comments | ask | show | jobs | submit | rvz's comments

With zero down time!

GitLab has achieved "AGI" internally.

Does AGI mean Abhorrent Git Interface?

And absolutely pin, pin, pin, ALL your dependencies.

If I see a package version dependency that looks like this: ^1.0.0, or even this: "*", then stop right there and pin it to a secure version immediately.
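For illustration, here is a minimal hand-rolled sketch of what a caret range actually admits under npm's semver rules for versions with major >= 1 (`caretAllows` is a hypothetical helper written for this example, not npm's real resolver, which handles pre-releases and 0.x versions differently):

```javascript
// Minimal illustration of npm's caret-range semantics for major >= 1.
// "^1.0.0" permits any version >= 1.0.0 and < 2.0.0, so a compromised
// 1.99.0 release would be pulled in automatically on a fresh install.
function caretAllows(range, version) {
  const base = range.slice(1).split(".").map(Number); // strip the "^"
  const v = version.split(".").map(Number);
  if (v[0] !== base[0]) return false;       // major version must match
  for (let i = 1; i < 3; i++) {
    if (v[i] > base[i]) return true;        // any higher minor/patch is in range
    if (v[i] < base[i]) return false;       // below the base version: excluded
  }
  return true;                              // exactly the base version
}

console.log(caretAllows("^1.0.0", "1.99.0")); // true: silently upgradable
console.log(caretAllows("^1.0.0", "2.0.0"));  // false: next major excluded
```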


Npm's package-lock.json already handles pinning everything to exact versions, including subdependencies. Pinning exact versions in package.json doesn't affect your subdependencies.
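For example, a package.json might declare a range while the committed lockfile records the exact resolved version plus an integrity hash (the `left-pad` entry below is illustrative; the digest is truncated):

```json
{ "dependencies": { "left-pad": "^1.3.0" } }
```

The corresponding package-lock.json entry is what actually pins the install:

```json
{
  "packages": {
    "node_modules/left-pad": {
      "version": "1.3.0",
      "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
      "integrity": "sha512-..."
    }
  }
}
```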

You aren't wrong. However, this article does offer some additional advice on this matter, and some potential reasons why it might still be desirable to pin your deps in package.json.

https://docs.renovatebot.com/dependency-pinning/#pinning-dep...

Some excerpts:

> If a lock file gets out of sync with its package.json, it can no longer be guaranteed to lock anything, and the package.json will be the source of truth for installs.

> provides much less visibility than package.json, because it's not designed to be human readable and is quite dense.

> If the package.json has a range, and a new in-range version is released that would break the build, then essentially your package.json is in a state of "broken", even if the lock file is still holding things together.


Or help distributions do the manual process of packaging - which involves at least rudimentary security checks - so they can ship newer versions faster.

And then use distro packages.

(I'm not accepting distro fragmentation as counterargument. With containerization the distro is something you can choose. Choose one, help there, and use it everywhere.)


Are you talking about package.json? What's your threat model? That's what the lock file is for, and it also pins transitive dependencies, which is just as crucial. What's actually insecure is not committing the lockfile, and not installing with `npm ci`.

I think `npx` might pull down new versions, too? I wish npm worked more like Elixir where updating the lock file was an explicit command, and everything else used the lock file directly.
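For reference, npm does already split these roles across commands, even if the defaults are less strict than Elixir's (a command sketch, not a full workflow; exact behavior varies a bit by npm version):

```shell
npm ci        # install exactly what package-lock.json records;
              # fails if the lockfile disagrees with package.json
npm install   # may resolve new in-range versions and rewrite the lockfile
npm update    # the explicit "move the lockfile forward" step
```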


It's so wild to have seen this advice reverse course over the past year.

it used to be that projects that pinned deps were called out as being less secure due to not being able to receive updates without a publish.

different times, different threat model I suppose


> it used to be that projects that pinned deps were called out as being less secure due to not being able to receive updates without a publish.

This is still the right advice for libraries. For security it doesn't matter a whole lot anymore, as package managers can force transitive dependency versions, but ranges allow much better transitive dependency deduplication.

For non-libraries it doesn’t matter as the exact versions get pinned in the package-lock.


I've been collecting things you can't pin:

- Python inline dependencies in PEP-0723, which you can pin with a==1.0, but can't be hash-pinned afaik.

- The bin package manager lets you pin binaries, but they aren't hash-pinned either.

- The pants build tool suggests vendoring a get-pants.sh script[0] but it downloads the latest. Even if you pass it a version, it doesn't do any checks on the version number and just installs it to ~/.local/bin

[0]: https://github.com/pantsbuild/setup/blob/gh-pages/get-pants....
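On the first point, PEP 723 inline metadata can express exact version pins but has no field for artifact hashes (a minimal illustrative script; the `requests` pin is just an example):

```python
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "requests==2.32.3",   # exact version pin: the most PEP 723 can express
# ]
# ///
# The inline metadata above has no hash field; compare a requirements.txt
# line, which can additionally demand an artifact hash:
#     requests==2.32.3 --hash=sha256:<digest>
PINNED_BY = "version"   # not "hash"
print(PINNED_BY)
```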


Once again, Shai-Hulud is wreaking havoc in the JavaScript and TypeScript ecosystems via NPM.

One of the worst ecosystems ever brought into the software industry, and the attacks almost always arrive via NPM. Not even Cargo (Rust) or go mod (Go) get as many attacks, because at least those ecosystems encourage you to use the standard library.

Both JavaScript and TypeScript have no such standard library and push you to import hundreds of packages, increasing the risk of a supply chain attack.

At this point, JS and TS are considered harmful.


I don't really buy this. NPM is targeted because it's the largest attack surface with the biggest payoff for a successful attack.

Other ecosystems package managers are really no different in a lot of ways.

NPM's biggest fault is just it allows post/pre install scripts by default without user intervention.
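Those lifecycle scripts can be disabled with a real npm setting, which is a common hardening step:

```shell
# Opt out of pre/post-install scripts globally:
npm config set ignore-scripts true

# Or for a single install:
npm install --ignore-scripts
```

Note that some packages rely on install scripts for native builds, so disabling them can require an extra manual step for those dependencies.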


Look I love Rust and hate Typescript. But if NPM didn't exist, wouldn't the attackers just hit the next most popular supply chain? Cargo isn't immune to this, as much as I love Rust and wish more shops used it.

It's not a failure of the npm/JS ecosystem. It's a GitHub Actions failure that allowed this to happen.

If cargo was as popular as npm, the same issues would surface.

> Both Javascript and Typescript have none and want you to import hundreds of libraries

There are plenty of very popular packages with zero dependencies like Hono or Zod. If you decide to blindly install something with hundreds of deps it's on you.

That said, I do agree the JS standard library should provide a lot more than it does now.


I wonder whether NPM has surpassed the costs of the billion dollar mistake, null references. NPM hasn't been around as long, but the industry is much bigger today than it was when systems languages were dominant.

Python had these too, no ecosystem is safe.

The Standard C library is also very small. Even though there’s POSIX, for anything that’s not system programming, you will be using libraries.

The difference is that the usual C libraries don't split a project into small molecules for no good reason. You have to be as big as GTK before splitting a library makes sense, in my opinion.


This is somewhat good for Rust if you want to use the language with CUDA. The problem is, it still doesn't really move the needle if you don't like running closed source drivers and runtime binaries and care about open source.

Continuing from this discussion [0], this only makes it a Rust or a CUDA problem, rather than a Python, CUDA, and PyTorch one, if there's a bug in one of them.

Yet at the end of the day, it still uses Nvidia's closed source CUDA compiler 'nvcc', which they will never open source. At least Mojo promises to open source their own compiler, which compiles to different accelerators with multiple backend support.

Unlike this...but uses Rust.

[0] https://news.ycombinator.com/item?id=48067228


IMO this has nothing to do with open source as an ideology; just a practical (and official?) lib for adding GPU interaction to your rust programs.

Small correction: cuda-oxide does not use `nvcc` to compile Rust kernels.

The device path is roughly: rustc frontend/MIR -> cuda-oxide’s Pliron-based IR/lowering pipeline -> LLVM IR -> PTX via LLVM’s NVPTX backend. Host code still goes through normal rustc/LLVM codegen.

You still need the NVIDIA driver/toolkit pieces to run CUDA code and load PTX on NVIDIA GPUs, so this does not change the broader CUDA ecosystem/dependency story. But it is not “Rust source handed to nvcc”; it is a Rust compiler backend generating device code.


It remains to be seen whether Mojo isn't another Swift for Tensorflow; apparently 1.0 won't even support Windows properly.

[flagged]


All the game devs that forced Valve to come up with Proton so the Steam Deck would have any content.

The majority of computer owners on planet Earth

But also the majority of programmers?

Yes, because Windows software didn't spring into existence out of nowhere.

Neither did Linux software.

It actually did; plenty of it is UNIX ports that predate the Linux kernel, looking for a cheaper home.

Note how Linux only has a relevant market share on deployment scenarios that were previously UNIX/POSIX turf.


In AI-focused fields like business analytics and data science, yeah.

The claim is that people are running CUDA on Windows for business analytics and data science? That feels less like an accurate picture; more likely any mass data processing is already happening on Linux K8s clusters.

Yes, if they happen to run tooling like Excel, PowerBI, Tableau,....

Also Linux support for CUDA on laptops, especially with dual GPU setup isn't particularly great.

Most workstation class laptops are Windows based.


AFAIK neither excel nor Tableau has any CUDA functionality to begin with so I’m not sure what point you’re trying to make. No one is doing CUDA number crunching on local laptops - either the problem is big enough to warrant a proper data center or it’s small enough that a local CPU is fine. Local CUDA is a weird middle ground that requires a lot of complexity for marginal compute capability.

Linux support for CUDA in such an environment is irrelevant.


Because you have no idea what a day at the office of a big corporation looks like.

Also, this attitude about people not doing compute on laptops is yet another reason why Intel and AMD keep losing to NVidia.


The K8s clusters do exist, but I've never met anyone in my life that develops their Jupyter notebooks from their business' Kubernetes setup. Most of them don't even use WSL, to my chagrin (and to their detriment).

And they’re running these Jupyter notebooks on CUDA? I’m suspicious there’s a problem size where the complexity of GPU processing is warranted but it’s small enough to be workable with a local Windows machine.

many people

your mom!

My sentiment matches yours exactly. I'm sick and tired of CUDA, but it's really not going to change.

Could maybe be forked with some dynamic smarts, HIP is basically 1:1 with CUDA: https://github.com/amd/amd-lab-notes/blob/release/hipify%2Fs...


Does it support graphical GPU debugging for C++, Fortran and Python JIT GPU code?

Otherwise it isn't 1:1 with CUDA, and that's not counting everything else in the CUDA ecosystem.



All those are far from the 1:1 CUDA experience.

Considering how fast everything is changing with GPUs and how competitive it is, it doesn't make sense to have an open source driver.

> it still doesn't really move the needle if you really don't like running closed source drivers and runtime binaries

Those people probably did not buy an Nvidia GPU for themselves. It should be common knowledge that the "Open" Nvidia drivers still run gigantic firmware blobs to dispatch complex workloads. And Nouveau is close to useless for GPGPU compute.


in (2025)

This is great news for the PyTorch community! Congrats Unsloth and well deserved!

Thank you appreciate the support! It's all thanks to you guys and the community!

It is just software. TempleOS and this project just look cool.

The "AI disruption" coming from the front-running of the OpenAI and Anthropic IPOs.

Exactly. I have said this for a very long time and the EU (and many other governments) are not our friends and they are just as corrupt. Remember ChatControl?

Anytime anyone criticises the EU here, you will get downvoted even after trying to warn the EU defenders that they are not our friends at all.

Three years ago I asked for evidence of what the "disinformation" around the EU digital ID wallets was [0], and not a single link was given.

At this point, being an EU defender and supporting the "open web" are incompatible since you will be using your EU digital identity wallet [1] with your phone to login to your bank and the internet will push age verification with it, locking you out if you don't sign up.

[0] https://news.ycombinator.com/item?id=36105002

[1] https://eudi.dev/latest/


> Remember ChatControl?

That thing that got refused multiple times already?

Just because not all politicians think like you does not mean they are corrupt. Seems like enough politicians have voted against ChatControl so far.

I always wonder what people who say stuff like "politicians discussed this topic I hate and refused it, but the mere fact that they discussed means that they must all be corrupt" understand about politics. You know that it is about people with different opinions (representing people with different opinions) discussing stuff, right?


The Commission got it through on the last round, though, so eventually it passed.

Chat Control hasn't passed yet. But the Chat Control lobbyists are still lobbying for it behind the scenes, and are currently pushing for all phone calls in the EU to be covered.

Source:

https://www.patrick-breyer.de/wp-content/uploads/2026/05/861...

https://digitalcourage.social/@echo_pbreyer


So what should be done about it? EU Commission issue a decree that it should never be spoken or debated again in public? Never proposed? Any other tyrannical idea?

Do you have a list of other things that shouldn't be brought in front of the elected parliament?


So ChatControl was accepted and is in the process of being implemented is what you say?


(ignorant) people proposing things does not mean corruption: the fact that these things are voted down and never pass is proof that the system works, not evidence of corruption.

Corruption would be if it passed despite being unpopular, because some corporate or rich people's interests desired it.


> Exactly. I have said this for a very long time and the EU (and many other governments) are not our friends and they are just as corrupt. Remember ChatControl?

The EU parliament shot down ChatControl.

In fact, without the EU, most likely many member states would have ChatControl in some shape. National governments are the ones all in on this crap.

