Hacker News | estebank's comments

Even side-stepping the fact that tokio no longer pulls in multiple packages: it used to be split into multiple packages, in the same way that KDE written in Rust would be hundreds of packages.

Rust projects tend to split their codebase into many smaller packages, for ease of development, faster compiles through parallelization, proper separation of concerns, and code reuse by others. But those packages are equivalent to a single big package: the people who write them are the same, and they get developed in tandem and published at the same time. Take a look at the dependency tree for ripgrep; the split of different parts of that app allows me to reuse the regex engine without dealing with APIs that only make sense in the context of a CLI app, or pulling in code I won't ever use (which might be hiding an exploit, too).
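A minimal sketch of that kind of reuse: a project that only needs pattern matching can depend directly on the `regex` crate, which ripgrep itself builds on, rather than on anything CLI-shaped.

```toml
[dependencies]
# Pull in just the regex engine, not the whole CLI application.
# ripgrep's matching internals are published as separate crates
# (regex, grep-regex, etc.) by the same author.
regex = "1"
```

(Config fragment, so no runnable test; the crate name and split are real, the scenario is illustrative.)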

Counting 100 crates of 100 lines each, all by the same authors, as inherently more dangerous than one 10,000-line crate makes no sense to me.


Don't do this. Use a package manager that lets you specify a specific version to pin against. Vendoring side-steps most automated tooling that can warn you about vulnerabilities. Vendoring is a signal that your tooling is insufficient, 99% of the time.
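In Cargo, for example, pinning is a one-character change in the version requirement (the crate and version below are just placeholders):

```toml
[dependencies]
# "1.0" accepts any semver-compatible release;
# "=1.0.200" pins this exact version until you change it.
serde = "=1.0.200"
```

(Config fragment, so no runnable test.)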

Vendoring means you don't have to fetch from the internet for every build, that you can work offline, that you're not at the mercy of oh-so-close-to-99.999% availability, that the build will keep working in 10 years, and probably other advantages.
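And vendoring doesn't have to mean abandoning your package manager: in the Cargo world it's a supported workflow. `cargo vendor` copies dependencies into a local directory and prints a snippet along these lines for `.cargo/config.toml`:

```toml
# .cargo/config.toml: resolve crates.io dependencies from the
# checked-in ./vendor directory instead of the network.
[source.crates-io]
replace-with = "vendored-sources"

[source.vendored-sources]
directory = "vendor"
```

(Config fragment, so no runnable test.)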

If your tooling can pull a dependency from the internet, it could certainly check whether a more recent version than the vendored one is available.


Is there any package manager incapable of working offline?

> Is there any package manager incapable of working offline?

I think you've identified the problem here: package management and package distribution are two different problems. Both tools have possibilities for exploits, but if they are separate tools then the surface area is smaller.

I'm thinking that the package distribution tool maintains a local system cache of packages, using keys/webrings/whatever to verify provenance, while the package management tool allows pinning, minver/maxver, etc.


Social media being bad is partly because of shady business practices, and partly because a lot of people suck (in different ways, at different times, including us).

Having said all of that, have you tried Mastodon?


Mastodon, Bluesky, etc are neat - both in what they're trying to be and their technology. But ultimately these days I reject them in favor of more local socialization (again, not geographically). What this looks like is a constellation of private (or pseudo private) discord communities. If I make friends in one, I often get invited to another. I recognize the merit in broader social forums like Mastodon, but it is not worth the drawbacks to me.

As an aside, I'm not happy with Discord as a platform so I'm working on my own clone with some common identity stuff but with community servers run independently. That is, there are some "federated" identity providers so community servers can agree on identity across servers, then each community server runs its own thing. The trust model is based on the community server - private channels in a community server are not E2E encrypted, you must trust the server. But DMs and DM groups are E2E encrypted and use mutual community servers as relays (with a special class of relay server for people who want to DM but don't have an actual mutual server). I'm having fun with it. Now if only I could figure out why my video has such high latency (even locally!).


A large part of the problem, imo, is that people haven't used the ability to talk to the entire planet as an opportunity to broaden their horizons, but to build themselves a transnational bubble of like-minded individuals.

Once upon a time, shouting "WTF are they thinking?" into the void was kinda understandable, but these days you can literally just ask them by changing a URL. Don't even have to go to a dodgy pub in an iffy part of town.

That said, assuming bad faith is so common these days, many people assume you're lying if your stated motives don't match their preconceptions.


> That said, assuming bad faith is so common these days, many people assume you're lying if your stated motives don't match their preconceptions.

A brutal reality to navigate if you're not acting in bad faith.


Hey! It also had a barely working physics engine.

Then again the dinosaurs were physics entities, so maybe you already mentioned it. :)


If the generated PDFs are stored encrypted on an accessible server with proper access control, then that is a measurable improvement over an email containing medical information that a random citizen would send, which would be bouncing around unencrypted through at least one third-party SMTP server. Of course, if that person then uses an online fax service, they are sharing that information with at least one other party...

And that's even without considering the security benefit of not receiving files that could be compromised, and instead generating a file from an image stream. (Now I'm trying to picture what daisy chain of exploits would be needed to craft a malicious fax.)


The OP states they've migrated. That might mean that the field on their account database entry is related to that move. The account is older, but when moving countries I've had to do weird dances to get my Google accounts to accept the new locale, and I wouldn't be surprised if their computed account age coincides with them having made that change.

A law can be bad and its implementation can be worse.

I found that I would have enjoyed the movie a bit more if I hadn't read the book, but it was still a solid 8/10. I'm really glad that a movie like this did well in opening weekend.

> Zig vs Rust also shows up with how object destruction is handled.

I often hear the critique that Drop is less efficient for anything arena-like, where batch destruction would be better, held up as the reason defer is a better approach. What's not mentioned is that nothing stops you from having both. In Rust you can perform batch destruction with additional logic (easiest if you control both the container and its contents' types), while the default behavior remains sane.
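A minimal sketch of "both" (all names here are made up for illustration): an arena that hands out slots, where nothing is destroyed during use, and the arena's own Drop impl tears down every contained value in one batch.

```rust
use std::cell::RefCell;

// Minimal batch-destruction arena: values are moved in during use
// and destroyed all at once when the arena itself goes out of scope.
struct Arena<T> {
    items: RefCell<Vec<T>>,
}

impl<T> Arena<T> {
    fn new() -> Self {
        Arena { items: RefCell::new(Vec::new()) }
    }

    // Hand back an index instead of a reference to keep the sketch simple.
    fn alloc(&self, value: T) -> usize {
        let mut items = self.items.borrow_mut();
        items.push(value);
        items.len() - 1
    }
}

impl<T> Drop for Arena<T> {
    fn drop(&mut self) {
        // Every contained value is dropped here in one pass, instead
        // of individual destructor calls sprinkled through the code.
        self.items.get_mut().clear();
    }
}
```

Callers still get the usual RAII guarantee (the arena cleans up on every exit path, early returns and panics included), which is the part defer handles manually.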


That's fair, since you can leak the box. I will say, though, that it's not as ergonomic as defer, since defer handles all exits from the scope, whereas juggling destructors by hand is trickier. Though on further thought, I suppose the arena can have Drop.

EDIT: What you can't really do is this: https://github.com/smj-edison/zicl/blob/ea8b75a1284e5bd5a309...

Here I'm able to swap out std.MultiArrayList's backing to be backed by virtual memory, and correctly clean it up. I'm not sure you can really do that with Rust, barring making custom data structures for everything.


You can look at the discussions on any of the language RFCs to see that increased complexity is one of the recurring themes that gets brought up. RFCs themselves have a "how do we teach this?" section that, IMO, makes or breaks a proposal.

Keep in mind that as time goes on, newly introduced features will be more and more niche: if you could do things in a reasonable way without the new feature, the feature wouldn't be needed. That doesn't mean that everyone needs to learn about every feature; only the people who need that niche have to even know about it, as long as 1) it interacts reasonably with the rest of the language, 2) its syntax is reasonable, in that it is either obvious what's going on, or easy to google and memorable enough that you don't have to look it up again, and 3) it is uncommon enough that you won't see it pop up in every random library you look at.


Thanks for the context. That makes a lot of sense! Those three constraints seem pretty important and a useful way to think about the problem.
