Npm's package-lock.json already handles pinning everything to exact versions, including subdependencies. Pinning exact versions in package.json doesn't affect your subdependencies.
You aren't wrong. However, this article does offer some additional advice on this matter, and some potential reasons why it might still be desirable to pin your deps in package.json.
> If a lock file gets out of sync with its package.json, it can no longer be guaranteed to lock anything, and the package.json will be the source of truth for installs.
> provides much less visibility than package.json, because it's not designed to be human readable and is quite dense.
> If the package.json has a range, and a new in-range version is released that would break the build, then essentially your package.json is in a state of "broken", even if the lock file is still holding things together.
Or help distributions with the manual process of packaging - which involves at least rudimentary security checks - so they can ship newer versions faster.
And then use distro packages.
(I'm not accepting distro fragmentation as counterargument. With containerization the distro is something you can choose. Choose one, help there, and use it everywhere.)
Are you talking about in package.json? What's your threat model? That's what the lock file is for, which also pins transitive dependencies, which is just as crucial. What's actually insecure is not committing the lockfile, and not running `npm ci`.
I think `npx` might pull down new versions, too? I wish npm worked more like Elixir where updating the lock file was an explicit command, and everything else used the lock file directly.
> it used to be that projects that pinned deps were called out as being less secure due to not being able to receive updates without a publish.
This is still the right advice for libraries. For security it doesn't matter a whole lot anymore, since package managers can override a transitive dependency's version, but using ranges allows for much better transitive-dependency deduplication.
For non-libraries it doesn’t matter as the exact versions get pinned in the package-lock.
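To make the range-vs-pin distinction concrete, here's a minimal package.json sketch (the package names are just illustrative examples, not from the discussion above):

```json
{
  "dependencies": {
    "left-pad": "^1.3.0",
    "lodash": "4.17.21"
  }
}
```

The `^1.3.0` range lets a fresh `npm install` (with no lock file, or one that's out of sync) resolve to any 1.x version at or above 1.3.0, while `4.17.21` is an exact pin. With a committed package-lock.json and `npm ci`, both end up pinned either way; the pin in package.json only matters when the lock file isn't authoritative.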
- Python inline script dependencies (PEP 723), which you can pin with `a==1.0`, but which can't be hash-pinned AFAIK.
- The bin package manager lets you pin binaries, but they aren't hash-pinned either.
- The pants build tool suggests vendoring a get-pants.sh script[0], but that script downloads the latest release. Even if you pass it a version, it doesn't do any checks on the version number and just installs it to ~/.local/bin
Once again, Shai-Hulud is wreaking havoc in the JavaScript and TypeScript ecosystems via NPM.
It is one of the worst ecosystems ever brought into the software industry, and the attacks almost always arrive via NPM. Not even Cargo (Rust) or go mod (Go) get as many attacks, because, at least with the latter, you're encouraged to use the standard library.
Both JavaScript and TypeScript have none and want you to import hundreds of libraries, increasing the risk of a supply chain attack.
Look I love Rust and hate Typescript. But if NPM didn't exist, wouldn't the attackers just hit the next most popular supply chain? Cargo isn't immune to this, as much as I love Rust and wish more shops used it.
> Both JavaScript and TypeScript have none and want you to import hundreds of libraries
There are plenty of very popular packages with zero dependencies like Hono or Zod. If you decide to blindly install something with hundreds of deps it's on you.
That said, I do agree the JS standard library should provide a lot more than it does now.
I wonder whether NPM has surpassed the costs of the billion dollar mistake, null references. NPM hasn't been around as long, but the industry is much bigger today than it was when systems languages were dominant.
The C standard library is also very small. Even though there's POSIX, for anything that's not systems programming you will be using libraries.
The difference is that the usual C libraries don't split the project into tiny pieces for no good reason. You have to be as big as GTK before splitting up a library makes sense, in my opinion.
This is good news for Rust if you want to use the language with CUDA. The problem is, it still doesn't really move the needle if you don't like running closed source drivers and runtime binaries and care about open source.
Continuing from this discussion [0], this only makes it a Rust or a CUDA problem rather than a Python, CUDA, and PyTorch one if there's a bug in one of them.
Yet at the end of the day, it still uses Nvidia's closed-source CUDA compiler `nvcc`, which they will never open source. At least Mojo promises to open source its own compiler, which targets different accelerators with multiple backends.
Small correction: cuda-oxide does not use `nvcc` to compile Rust kernels.
The device path is roughly: rustc frontend/MIR -> cuda-oxide’s Pliron-based IR/lowering pipeline -> LLVM IR -> PTX via LLVM’s NVPTX backend. Host code still goes through normal rustc/LLVM codegen.
You still need the NVIDIA driver/toolkit pieces to run CUDA code and load PTX on NVIDIA GPUs, so this does not change the broader CUDA ecosystem/dependency story. But it is not “Rust source handed to nvcc”; it is a Rust compiler backend generating device code.
The claim is that people are running CUDA on Windows for business analytics and data science? That feels less like an accurate picture and more like a case where any mass data processing is already happening on Linux K8s clusters.
AFAIK neither Excel nor Tableau has any CUDA functionality to begin with, so I'm not sure what point you're trying to make. No one is doing CUDA number crunching on local laptops: either the problem is big enough to warrant a proper data center, or it's small enough that a local CPU is fine. Local CUDA is a weird middle ground that requires a lot of complexity for marginal compute capability.
Linux support for CUDA in such an environment is irrelevant.
The K8s clusters do exist, but I've never met anyone in my life that develops their Jupyter notebooks from their business' Kubernetes setup. Most of them don't even use WSL, to my chagrin (and to their detriment).
And they're running these Jupyter notebooks on CUDA? I'm skeptical that there's a problem size where the complexity of GPU processing is warranted, yet which is small enough to be workable on a local Windows machine.
> it still doesn't really move the needle if you really don't like running closed source drivers and runtime binaries
Those people probably did not buy an Nvidia GPU for themselves. It should be common knowledge that the "Open" Nvidia drivers still run gigantic firmware blobs to dispatch complex workloads. And Nouveau is close to useless for GPGPU compute.
Exactly. I have said this for a very long time and the EU (and many other governments) are not our friends and they are just as corrupt. Remember ChatControl?
Anytime anyone criticises the EU here, they get downvoted, even when trying to warn the EU defenders that it is not our friend at all.
Three years ago [0] I asked for evidence of what the "disinformation" around the EU digital ID wallets actually was, and not a single link was given.
At this point, being an EU defender and supporting the "open web" are incompatible, since you will be using your EU digital identity wallet [1] on your phone to log in to your bank, and the internet will push age verification with it, locking you out if you don't sign up.
That thing that got refused multiple times already?
Just because not all politicians think like you doesn't mean they are corrupt. It seems enough politicians have voted against ChatControl so far.
I always wonder what people who say things like "politicians discussed this topic I hate and rejected it, but the mere fact that they discussed it means they must all be corrupt" understand about politics. You do know it's about people with different opinions (representing people with different opinions) discussing things, right?
Chat Control hasn't passed yet. But the Chat Control lobbyists are still lobbying for it behind the scenes, and are currently pushing for all phone calls in the EU to be covered.
So what should be done about it? Should the EU Commission issue a decree that it must never be spoken of or debated in public again? Never proposed? Any other tyrannical ideas?
Do you have a list of other things that shouldn't be brought in front of the elected parliament?
(Ignorant) people proposing things is not corruption: the fact that these things are voted down and never pass is proof that the system works, not evidence of corruption.
Corruption would be if it passed despite being unpopular, because some corporate or wealthy people's interests demanded it.
> Exactly. I have said this for a very long time and the EU (and many other governments) are not our friends and they are just as corrupt. Remember ChatControl?
The EU parliament shot down ChatControl.
In fact, without the EU, most likely many member states would have ChatControl in some shape. National governments are the ones all in on this crap.