Should have included the units in the name or required a choice of unit to be selected as part of the value. Sorry, just a bugbear of mine.
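For reference, and assuming the setting being discussed is pnpm's minimumReleaseAge (my reading of the thread; it isn't named here), the value today is a bare number with nothing in the name to say what unit it is:

```yaml
# pnpm-workspace.yaml -- assumed context: pnpm's minimumReleaseAge.
# The value is minutes, but nothing in the name tells you that.
# 10080 minutes = 7 days.
minimumReleaseAge: 10080
```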
Or just use ISO8601 standard notation (e.g. "P1D" for one day)
ISO8601 durations should be used, like PT3M.
Should be easy, just add the ISO8601-duration package to your project ..
/s
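Joking aside, the subset of ISO 8601 durations needed here is small enough to parse without a dependency. A minimal sketch (days/hours/minutes only; stricter validation omitted):

```ts
// Convert a subset of ISO 8601 durations (e.g. "P1D", "PT3M") to minutes.
function durationToMinutes(iso: string): number {
  const m = iso.match(/^P(?:(\d+)D)?(?:T(?:(\d+)H)?(?:(\d+)M)?)?$/);
  if (!m) throw new Error(`Unsupported duration: ${iso}`);
  const [, days = "0", hours = "0", minutes = "0"] = m;
  return Number(days) * 24 * 60 + Number(hours) * 60 + Number(minutes);
}

durationToMinutes("P1D");  // 1440
durationToMinutes("PT3M"); // 3
```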
There are a few commercial products that let you do this for other ecosystems as well (e.g. Maven, NuGet, PyPI, etc.), including ours: https://docs.bytesafe.dev/policies/delay-upstream/
Good to see some OSS alternatives showing up!
I have a question: when I've seen people discussing this setting, people talk about using "3 days" or "7 days" as the timeout, which seems insanely short to me for production use. As a C++ developer, I would be hesitant to use any dependency in the first six months of release in production, unless there's some critical CVE or something (then again, we make client-side applications with essentially no networking, so security isn't as critical for us; stability is much more important).
Does the JS ecosystem really move so fast that you can’t wait a month or two before updating your packages?
NPM packages follow semantic versioning, so minor versions should be fine to auto-update. (There is still the issue that what's minor for the package maintainer might not be minor for you, but let's stick to the ideal world for now.)
I don't think people are seeing major version updates every month; it's more like every 6 months or once a year.
I guess the problem is that people think auto-updating minor versions in the CI/CD pipeline will keep them more secure, since bug fixes should land in minor versions. In reality we see that this is not the case, and attackers use exactly that to spread malware.
Yes, but this is not JS-specific; in PHP (Composer) it's the same.
Normally old major or minor versions don't get updates, only the latest.
E.g. 4.1.47 gets no update, while 4.2.1 does.
So if the problem is in 4.1, you must "upgrade" to 4.2.
With "perfect" semver this shouldn't be a problem, since 4.2 only adds new features... but back in reality, the world is not perfect.
> Does the JS ecosystem really move so fast that you can’t wait a month or two before updating your packages?
In 2 months, a typical js framework goes through the full Gartner Hype Cycle and moves to being unmaintained with an archived git repo and dozens of virus infected forks with similar names.
If everyone is going to wait 3 days before installing the latest version of a compromised package, it will take more than 3 days to detect an incident.
A lot of people will still use npm, so they'll be the canaries in the coal mine :)
More seriously, automated scanners seem to do a good job already of finding malicious packages. It's a wonder that npm themselves haven't already deployed an automated countermeasure.
If only there was a high-ranking official at Microsoft, who could prioritize security[1]! /s
[1] https://blogs.microsoft.com/blog/2024/05/03/prioritizing-sec...
Not really, app sec companies scan npm constantly for updated packages to check for malware. Many attacks get caught that way. e.g. the debug + chalk supply chain attack was caught like this: https://www.aikido.dev/blog/npm-debug-and-chalk-packages-com...
How about requiring some kind of interaction if a package wants to run an install script?
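npm already ships a blunt version of this: lifecycle scripts can be disabled wholesale via a real config flag (the interactive per-package prompt suggested here would need extra tooling on top):

```ini
# .npmrc -- never run preinstall/install/postinstall scripts automatically.
# One-off equivalent: npm install --ignore-scripts
ignore-scripts=true
```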
'Delayed dependency updates' is a response to supply-side attacks in the JavaScript world, but it aptly describes how I have come to approach technology broadly.
Large tech companies, as with most industry, have realized most people will pay with their privacy and data long before they'll pay with money. We live in a time of the Attention Currency, after all.
But you don't need to be a canary to live a technology-enabled life. Much software that you pay with your privacy and data has free or cheap open-source alternatives that approach the same or higher quality. When you orient your way of consuming to 'eh, I can wait till the version that respects me is built', life becomes more enjoyable in myriad ways.
I don't take this to absolute levels. I pay for fancy-pants LLMs, currently. But I look forward to the day, not too far away, when I can get today's quality for libre in my homelab.
That doesn't really solve the problem.
A better (though not perfect) solution: every package update should be analysed by AI before it becomes publicly available, to detect dangerous code and assign a rating.
A rating threshold would be defined in package.json: when the remote package's rating is below that value, it can be updated; if it is higher, a warning should appear.
This will cost money, but I hope that companies like GitHub etc. will let package repositories use their services for free. Or we could find a way to distribute this work across us (the users and devs), like a BOINC client.
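As a sketch of that gate, with everything hypothetical — no registry exposes a risk rating today, and package.json has no such field:

```ts
// Hypothetical sketch of the proposed gate. Neither the scan result
// nor the maxRiskRating threshold exists in any real registry or
// package.json schema.
interface RegistryScan {
  version: string;
  riskRating: number; // hypothetical scanner score; higher = riskier
}

function gateUpdate(maxRiskRating: number, scan: RegistryScan): "update" | "warn" {
  // At or below the threshold defined in package.json: auto-update.
  // Above it: surface a warning instead.
  return scan.riskRating <= maxRiskRating ? "update" : "warn";
}

gateUpdate(3, { version: "4.2.1", riskRating: 7 }); // "warn"
```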
Ah, yes! The universal and uncheatable LLM! Surely nothing can go wrong.
Perfect is the enemy of good. Current LLM systems plus "traditional tools" for scanning can get you pretty far into detecting the low-hanging fruit. Hell, I bet even a semantic search with small embedding models could give you good insight into whether what's in the release notes matches what's in the code. Simply flag the package to be delayed a few hours until a human can review it, or run additional checks.
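A sketch of that release-notes-versus-code check; embed() here is a stand-in for whatever small embedding model you'd use, not a real API:

```ts
// Flag a release when its diff doesn't semantically match its notes.
declare function embed(text: string): Promise<number[]>; // hypothetical model call

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

async function flagForHumanReview(releaseNotes: string, codeDiff: string): Promise<boolean> {
  const [n, d] = await Promise.all([embed(releaseNotes), embed(codeDiff)]);
  return cosine(n, d) < 0.3; // threshold is arbitrary for this sketch
}
```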
As I wrote: "not perfect". But better than anything else, or than nothing.
The Politician's Syllogism[0] is instructive.
[0] https://en.wikipedia.org/wiki/Politician's_syllogism
OK, are we on Reddit or Facebook now?
I thought we discussed problems and possible solutions here.
My fault.
I can't wait to read about your solution.