It’s not really about OS differences - as the GP said, games don’t typically use a lot of OS features.
What they do tend to really put a strain on is GPU drivers. Many games and engines have workarounds and optimizations for specific vendors, and even driver versions.
If the GPU driver on Linux differs in behavior from the Windows version (and it is very, very difficult to port a driver in a way that doesn’t), those workarounds can become sources of bugs.
In my experience using newtypes like this causes a constant shuffle between the original type and the newtype.
If a library exposes Foo and I wrap it in MyFoo implementing some trait, I need to convert to MyFoo everywhere the trait is needed and back to Foo everywhere the original type is expected.
In practice this means cluttering the code with as_foo and as_myfoo all over the place.
You could also impl From or Deref for one direction of the conversion, but it makes the code less clear in my opinion.
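A minimal sketch of that shuffle (all names here are made up: `Foo` stands in for the library type, `Pretty` for the trait you want it to implement):

```rust
// `Foo` stands in for a library type; in real code it would come
// from an external crate you don't control.
pub struct Foo {
    pub value: i32,
}

// Hypothetical trait we want `Foo` to implement.
pub trait Pretty {
    fn pretty(&self) -> String;
}

// If both `Pretty` and `Foo` were foreign, the orphan rule would
// forbid `impl Pretty for Foo`, so we wrap it in a newtype we own.
pub struct MyFoo(pub Foo);

impl Pretty for MyFoo {
    fn pretty(&self) -> String {
        format!("Foo({})", self.0.value)
    }
}

// A `From` impl cuts down on explicit as_foo/as_myfoo helpers...
impl From<Foo> for MyFoo {
    fn from(f: Foo) -> Self {
        MyFoo(f)
    }
}

// ...but every boundary still needs a conversion in one direction
// or the other.
fn takes_pretty(p: &impl Pretty) -> String {
    p.pretty()
}

fn takes_foo(f: &Foo) -> i32 {
    f.value
}

fn main() {
    let wrapped: MyFoo = Foo { value: 7 }.into();
    println!("{}", takes_pretty(&wrapped)); // use the trait
    println!("{}", takes_foo(&wrapped.0));  // unwrap to use the original
}
```

The `.into()` hides one direction, but the `&wrapped.0` on the way back is exactly the clutter being complained about.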
One strategy I like is to declare “view” types for serialization and deserialization, since you’re going to be doing that anyway if your serialized format is meant to stay compatible across versions.
Serde also comes with a bunch of attributes and features to make it easy to short-circuit this stuff ad hoc.
I know this only solves the serialization use case, but that seems to be where most people run into this.
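For what it’s worth, the pattern looks roughly like this (names made up; in real code the view struct is the one that derives serde’s `Serialize`/`Deserialize`, which I’ve left out to keep the sketch self-contained):

```rust
// Domain type, possibly containing types you don't own.
pub struct User {
    pub name: String,
    pub age: u32,
}

// Flat "view" struct that defines the wire format. Because you own
// it, you can implement whatever traits the serializer needs
// (e.g. #[derive(Serialize, Deserialize)] with serde).
pub struct UserV1 {
    pub name: String,
    pub age: u32,
}

impl From<&User> for UserV1 {
    fn from(u: &User) -> Self {
        UserV1 { name: u.name.clone(), age: u.age }
    }
}

impl From<UserV1> for User {
    fn from(v: UserV1) -> Self {
        User { name: v.name, age: v.age }
    }
}

fn main() {
    let user = User { name: "ada".into(), age: 36 };
    let view = UserV1::from(&user); // serialize this instead of `User`
    let back: User = view.into();   // convert back after deserializing
    println!("{} {}", back.name, back.age);
}
```

The conversion only happens at the serialization boundary, so the rest of the code never sees the view type.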
honestly in my experience it rarely matters (if you care about stable APIs), as most types you want at an API boundary are written (or auto-generated) by you

this leaves a few, often small, types like `DateTime<Utc>`, which you can handle with serde's serialization override attributes or automatic conversions, without even needing newtypes (though some of these attributes could be better designed)

serde is not perfect but pretty decent; IMHO the proc macros it provides need some love/a v2 rewrite, which would only affect the generated impl code and as such would be fully backward compatible, could be mixed with old code, and could even come from a different author (i.e. it doesn't have the orphan problem)

Anyway, that doesn't make the problem go away; serialization/serde is just both the best and worst example. (Best because it's extremely widespread and "good enough" but not perfect, which is poison for ecosystem evolution; worst because serialization is enough of a special case that its best solution may be unusable for the generic problem (e.g. reflection).)
Other than duck-typed languages (and I count Go as basically that), which languages actually provide this feature?
AFAIK, it’s not really very common to be able to extend foreign types with new interfaces, especially not if you own neither.
C++ can technically do it using partial specialization, but it’s not exactly nice, and results in UB via ODR violation when it goes wrong (say you have two implementations of a `std::hash` specialization, etc.). And it only works for interfaces that are specifically designed to be specialized this way - not for vanilla dynamic dispatch, say.
> Other than duck-typed languages (and I count Go as basically that), which languages actually provide this feature?
There are only like 3 significant languages with trait-based generics, and both the other ones have some way of providing orphan instances (Haskell by requiring a flag, Scala by not having a coherence requirement at all and relying on you getting it right, which turns out to work out pretty well in practice).
More generally it's an extremely common problem to have in a mature language; if you don't have a proper fix for it then you tend to end up with awful hacks instead. Consider e.g. https://www.joda.org/joda-time-hibernate/ and https://github.com/FasterXML/jackson-datatype-joda , and note how they have to be essentially first party modules, and they have to use reflection-based runtime registries with all the associated problems. And I think that these issues significantly increased the pressure to import joda-time into the JVM system library, which ultimately came with significant downsides and costs, and in a "systems" language that aims to have a lean runtime this would be even worse.
> Scala is interesting. How do they resolve conflicts?
If there are multiple possible instances you get a compilation error and have to specify one explicitly (which is always an option). So you do have the problem of upgrading a dependency and getting a compilation error for something that was previously fine, but it's not a big deal in practice - what I generally do is go back to the previous version and explicitly pass the instance that I was using, which is just an IDE key-combo, and then the upgrade will succeed. (After all, it's always possible to get a conflict because a library you use added a new method and the name conflicted with another library you were using - the way I see it this is essentially the same thing, just with the name being anonymous and the type being the part that matters)
You also theoretically have the much bigger problem of using two different hashing/sorting/etc. implementations with the same datastructure, which would be disastrous (although not an immediate memory corruption issue the way it could be in Rust). But in practice it's just not something I see happening, it would take a very contrived set of circumstances to encounter it.
> (although not an immediate memory corruption issue the way it could be in Rust)
Just to note, all of Rust's standard container types are designed such that they guarantee that buggy implementations of traits like `Hash` and `Ord` do not result in UB - just broken collections. :-)
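A quick illustration of "broken collections, not UB", using a deliberately inconsistent `Ord` (names made up):

```rust
use std::cmp::Ordering;
use std::collections::BTreeMap;

// A deliberately broken `Ord`: it claims every value is less than
// every other value, contradicting the derived `Eq`.
#[derive(PartialEq, Eq)]
struct Chaotic(u32);

impl Ord for Chaotic {
    fn cmp(&self, _other: &Self) -> Ordering {
        Ordering::Less // never Equal, never Greater
    }
}

impl PartialOrd for Chaotic {
    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
        Some(self.cmp(other))
    }
}

fn main() {
    let mut map = BTreeMap::new();
    map.insert(Chaotic(1), "first");
    map.insert(Chaotic(1), "second"); // "duplicate" key

    // The map is logically broken: the duplicate wasn't deduplicated,
    // and lookups find nothing, because the comparator never returns
    // `Equal`. But it's all safe: no crash, no memory corruption.
    println!("len = {}", map.len());
    println!("found = {:?}", map.get(&Chaotic(1)));
}
```

The std docs explicitly document this: a `Hash`/`Ord` that violates its contract may cause wrong results or panics, but never undefined behavior.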
C# does not support adding interfaces to foreign types. It does support extension classes to add methods and properties to a type, but nothing that adds fields or changes the list of interfaces implemented by a type. Rust supports this as well, because you can use traits this way.
Dependency injection is a popular solution for this problem, and you can do that as well in Rust. It requires (again) that the API is designed for dependency injection, and instead of interfaces and is-a relationships, you now have "factories" producing the implementation.
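The extension direction mentioned above is just an ordinary trait impl in Rust. A minimal sketch (the `Shout` trait is made up):

```rust
// Our trait: since we own it, the orphan rule lets us implement it
// for any type, including foreign ones.
trait Shout {
    fn shout(&self) -> String;
}

// `str` comes from the standard library, but `Shout` is ours,
// so this impl is allowed.
impl Shout for str {
    fn shout(&self) -> String {
        self.to_uppercase() + "!"
    }
}

fn main() {
    println!("{}", "hello".shout()); // prints "HELLO!"
}
```

Note the limits, matching the C# comparison: this adds methods, but it can't add fields to `str`, and it can't make `str` implement someone else's interface.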
I mean… Sure, if we’re just making stuff up, a compiler that can magically understand whatever you were trying to do and then do that instead of what you wrote, I guess that’s a nice fantasy?
But out here on this miserable old Earth I happen to think that Rust’s errors are pretty great. They’re usually catching things I didn’t actually intend to do, rather than preventing me from doing those things.
> But out here on this miserable old Earth I happen to think that Rust’s errors are pretty great. They’re usually catching things I didn’t actually intend to do, rather than preventing me from doing those things.
As it happens, you are replying to the person who made Rust's errors great! (it wasn't just them of course, but they did a lot of it)
I think there are legitimate criticisms of Rust that fall in this category, but the orphan rule ain’t it.
In most other languages, it is simply not possible to “add” an interface to a class you don’t own. Rust lets you do that if you own either the type or the interface. That’s strictly more permissive than the competition.
The reasons those other languages have for not letting you add your interface to foreign types, or extend them with new members, are exactly the same reasons that Rust has the orphan rule.
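Concretely, the "own either side" rule means a local type can implement a foreign trait (the `Meters` type here is made up):

```rust
use std::fmt;

// Local type.
struct Meters(f64);

// `Display` is a foreign trait (from std), but `Meters` is ours,
// so the orphan rule permits this impl.
impl fmt::Display for Meters {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{} m", self.0)
    }
}

// By contrast, this would be rejected (error E0117), because both
// `Display` and `Vec<u8>` are foreign:
//
//     impl fmt::Display for Vec<u8> { ... }

fn main() {
    println!("{}", Meters(9.8)); // prints "9.8 m"
}
```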
Yes, behavioral genetics is the climate science of the left. If there are PhDs and university departments studying it, I'm not gonna be someone who sticks their head in the sand for the sake of their flavor of political identity.
They are studying it, while you are drawing your own conclusions from a cursory understanding.
Your claim that “wealthy people are more intelligent” is so incredibly problematic on so many levels, starting with the fundamental methodological problem that we do not have any reliable way to actually measure intelligence. Add to this the extremely obvious fact that some rich people have no other qualifications than being born with a trust fund, and some poor people face extreme obstacles to reaching their potential.
This world view is total, utter dogshit, completely removed from reality.
> Your claim that “wealthy people are more intelligent” is so incredibly problematic on so many levels
If you want people to listen to you, then don't advertise your biases like this. Call it "wrong" instead of "problematic"; calling it "problematic" shows you don't want that result to be true, not just that you think it's wrong.
No, I used it as a euphemism for “fascist”, which is a somewhat stronger descriptor than simply “wrong”.
The notion that the deeply oppressive status quo is somehow fair and just is one of the worst post-hoc rationalizations that purportedly smart people can fall into. Open your eyes.
The problem with this stance is that the alternative (making people feel bad) will exacerbate the problem by contributing to feelings of hopelessness and ostracism.
The first prerequisite for making difficult changes is a supportive environment - not a judgmental one.
My personal experience was that the shame I'd been made to feel throughout middle school for being overweight fueled the motivation to buckle down and lose weight when I was independent and mature enough to come up with a diet that I could sustain.
I’ll offer this argument: Society aggressively tells overweight people that it is bad to be overweight, in no uncertain terms, and with significant repercussions. Yet overweight people exist. Hence, the strategy of shaming people until they lose weight does not work in the general or average case.
I’m sure they know; pointing it out doesn’t solve anything.
> Politely watching them die before you is maybe comfortable, but pretty messed up.
I disagree. It’s their choice, and they should be free to do what they want and not be criticized. In fact it’s not comfortable and sometimes I do want to say something but that’s not very kind.
Fair argument, but I don’t think criticizing their weight benefits either party. Unless you’re a super close friend and you do it occasionally as a reminder when they are going off the rails.
My family constantly says I’m on the bigger side - but does it help? Absolutely not. Does it hurt? A little bit, at least. Then they shame you for “going on a diet”, but also ask why you aren’t eating. I don’t need others to pile their opinions on top of it.
This comment is Dunning-Kruger. Some overweight people are very unhealthy. Some thin people are very unhealthy. Some overweight people have genetics that prefer to store fat subcutaneously, where it's not very harmful. Some thin people have genetics which preferentially store fat in and around organs or muscles, which is incredibly damaging and leads to chronic inflammation and eventually T2D and atherosclerosis, among others. Let's just say you can't judge a book by its cover, and biology is complex. Unless you know a person, keep your mouth shut and your mind open!
There are sedentary thin people who live on doritos and active heavy people who eat salads. There are ALL KINDS!
Over 3 gigabases is a lot of room for genetic diversity, don't you think?