- cross-posted to:
- rust@lemmy.ml
That can be done. But how do you trust software at runtime if it can’t be trusted at build time?
This is more of a supply chain issue. Users are likely to vet only reputable crates. However, those crates’ dependencies may not be as reputable. For example, malicious actors may buy out a common, deeply buried dependency and use it to propagate malware (this happens regularly with browser extensions - even open source ones). How do we ensure that this doesn’t happen?
Also, do you know of anybody who has solved this in open source?
I forgot to mention that this is a problem on every major language registry - especially PyPI and NPM.
How would you enforce the solution on some dude writing code in his basement to “just make it work” on his 1 day off from an otherwise busy life?
There are two things to consider. The first is that all major open source languages are run by foundations backed by big players, with plenty of funding and donations. It’s probably a good idea to invest in a paid team dedicated to security. I’m sure everyone’s thought about it already, but not enough has been done so far.
The second is that professionals - especially security companies - do occasionally catch and report such attacks. Like this story, for instance. So they are doing something right, and it’s possible. It would be a good idea to fund them and expand their scope (hopefully they won’t introduce any malware just to claim the prize).
Thanks! I missed that one. They are awesome!
This had to happen. Most build scripts could run in a wasm sandbox, and the rest should require manual approval before being executed.
Misuse/Overuse of the word “federated” is apparently all the rage in 2023.