That doesn't seem correct. Almost all of the projects installed on a standard Linux distro need funding. I just stopped applying to NLnet after getting nothing but rejections.
I promise we are actively working on a much better solution we hope any distro can use, but for now we just enforce signed merge commits by a maintainer other than the author, which they only do for code they personally reviewed.
The biggest problem with crev is that it is (last I checked) entirely centralized on Git/GitHub, making it not all that useful for early supply-chain dependencies that often just live as random tarballs on servers, or in SVN, CVS, or Mercurial.
Crev also lacks support for public identity-bound keys, which would force us to give up the highly valuable 25+ year web of trust we have built on identity-bound keys that predate AI and cannot be easily impersonated.
Also, they don't sign their commits or the reviews themselves, because they think crev eliminates the need for such things, which I consider ridiculous.
I really like dpc and worked next to him when he was designing crev, and I tried to explain these exact problems. In the end, though, he wanted to ship something that solved only the limited set of problems he cared about at the time (blessing Rust packages on GitHub), which he is of course entitled to do.
We will certainly still cite crev, and we are incorporating what we feel are its good ideas, such as the general shape of the reviews, the confidence levels, etc.
We are seeking a design that works across every VCS, mirror, or vendored copy of code by generating a universally stable SWHID across all distribution methods of a given version of software source code. The goal is that you can simply request software by SWHID or version, a copy somewhere on the internet will be discovered, and a normalized folder of extracted code matching that SWHID will appear, regardless of whether the source was Git, CVS, SVN, tar, tar.gz, or even slightly different mirrors, as long as each is a superset of the expected files. All resolve to a single SWHID.
Once we have finished compiling that database of SWHIDs, at least for the sources live-bootstrap and stagex rely on, anyone will be able to publish a signed review, in basically any format they want, bound to that SWHID.
We will likely have our own PGP-signed review tools that support a superset of crev's review fields and validate the web of trust, etc. However, nothing at all would stop people from also publishing crev reviews signed with software-exposed SSH keys or whatever, which are still worth much more than nothing.
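One way to keep the review format signature-agnostic is to separate the review payload from the signature: define a deterministic serialization of the review bound to its SWHID, then let reviewers attach detached signatures with whatever they have (gpg, `ssh-keygen -Y sign`, etc.). The field names below are illustrative, loosely modeled on crev's review fields, and not any finalized StageX schema:

```python
import json


def review_payload(swhid: str, reviewer: str, confidence: str,
                   thoroughness: str, comment: str) -> bytes:
    """Canonical bytes of a code review bound to an SWHID.

    Hypothetical schema for illustration. The returned bytes are what a
    detached signature would cover, so serialization must be
    deterministic: sorted keys, no insignificant whitespace.
    """
    record = {
        "version": 1,
        "swhid": swhid,            # identifies the exact source tree reviewed
        "reviewer": reviewer,      # e.g. a key fingerprint or identity string
        "confidence": confidence,  # e.g. "low" | "medium" | "high"
        "thoroughness": thoroughness,
        "comment": comment,
    }
    return json.dumps(record, sort_keys=True,
                      separators=(",", ":")).encode()
```

Since the payload is plain bytes, the same review can carry a PGP signature for web-of-trust users and an SSH or age signature for everyone else, without changing the record itself.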
Anyone would be free to host their own signature servers as well with signed reproducible build proofs or code reviews.
Naturally, spam will be a thing in a distributed system, so we would likely choose to only mirror and make discoverable signatures from an extended web of trust of Linux distro maintainers, security researchers, etc., but anyone could publish subsets of signatures on their own servers under their own criteria. E.g., every distro could host its own, but if the stagex team has not yet reviewed a source while Debian has on its mirrors, we would still assign it limited trust.
Anyway, that is the 50-floor elevator pitch. Absolutely looking to collaborate with anyone interested.
Using SWHID seems like a good choice, although something not based on SHA-1 would be a good idea; hopefully SWHID v2 will fix this.
The entire bootstrap process is a huge amount of code for a single person to review, so presumably individual people will review smaller subsets of the code and sign the SWHIDs of those subsets instead of the main one?
Outside of old-school FOSS folks, OpenPGP is dead and even toxic waste. So StageX folks might want to also publish alternative reviews with other cryptography based on what is popular amongst modern devs; IIRC OpenSSH keys, age, and so on. That way, different sets of folks can trust differently-signed reviews. Or figure out how to make the crypto stuff interoperable between cryptosystems (like the Monkeysphere folks were doing for OpenPGP and the web PKI). I'm thinking a starting point would be to make the reviews unsigned and then add multiple detached signature types alongside the reviews.
I feel like conflating the OpenPGP web of trust with source-code-review trust would be a mistake; you can trust me to sign OpenPGP keys relatively well, but you definitely should not trust me to review machine code, assembly, or Haskell, and I feel like my reviews of POSIX shell would be trustworthy to some, but definitely not everyone.
SWH will never contain all source code, because forge admins keep objecting to having their code archived, and there are hundreds of unsupported forge types and many unsupported code-source (VCS, etc.) types. Hopefully that won't be an issue for your bootstrap processes, but you may want to consider it in your design anyway. As an example: the canonical SQLite repository is in Fossil, which is an unsupported VCS (though of course there are tarball exports). Or codeberg.org archiving has problems due to rate limiting, so the latest versions of some repos might not be archived.
The latest Firefox build that Debian did took just over one hour on amd64/armhf and 1.5 hours on ppc64el. The slowest Debian architecture is riscv64, and the last successful build there took only 17.5 hours, so definitely not days. Your average modern developer-class laptop is going to take a lot less than riscv64, too.
I think Bootstrappable Builds from source without any binaries, plus distributed code audits would do a better job than locking down already existing binaries.
> That sounds like what Software Freedom Conservancy would call a GPL violation
Sure, it is. So what? Have you got 200k for lawyers and years of your life to spend in court fighting over it?
I have personally contacted the SFC with ample evidence of deliberate and wilful GPL violations, such as providing a written offer for source code and then ignoring or flat out refusing requests for the source code. The SFC has acknowledged the vendors are violating the spirit and letter of the GPL.
Nothing happens. The SFC is one organisation with limited resources, FOSS developers don't want to spend their time in court, they'd rather develop software. Vendors know 9 times out of 10 they will get away with the GPL violation scot-free.
It's fine to put on your rose-colored glasses and pretend the GPL forces companies to release source code. In reality, the vendors have a larger marketing budget than the entire SFC endowment, and a vendor's legal team is happy to tar-pit requests ad infinitum.
It is definitely true that any license, including the GPL, requires effort and resources to enforce, and that almost all authors of GPL software don't have enough of either.
If the SFC lawsuit against Vizio succeeds, then there will be another option: since you and others are third-party beneficiaries of the contract embodied in the GPL between Linux kernel developers and hardware vendors that ship Linux, you could start a class action with other users of hardware where GPL violations are present and sue for GPL compliance instead of money. The lawyers would presumably recover their legal costs, and the users should get source code. Some law firms would probably take this on just for the legal costs, especially if the Vizio precedent makes it easy to win future cases.
PS: another tactic I have seen applied for GPL enforcement is for the copyright holder to have customs block devices on import, since they contain illegally obtained software. This is pretty rare, but it can be effective.
ArchiveTeam definitely does not intend to kill websites by crawling too fast, but it has done so unintentionally and will always stop or slow the crawling when that happens.
Even the distributed crawling system has monitoring and controls to ensure it doesn't kill sites.