As a systems administrator who has been doing this for a while, I can attest to this becoming more and more of a problem.
I've worked with CUDA programmers who weren't aware of what "architecture" means in the context of computing until I asked them to consider running binary code meant for NVIDIA on AMD GPUs.
I've worked with network administrators who couldn't work through the simple math (a single IP, 64k ports, and the approximate memory per entry in a NAT state table) to figure out that it's actually not impossible to have NAT states that persist indefinitely.
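That back-of-the-envelope math is small enough to sketch. The per-entry size below is an assumed round number for illustration (real conntrack-style entries vary by implementation), not a figure from any particular router:

```python
# Rough NAT state table sizing for a single public IP.
# bytes_per_entry is an assumption; actual entry sizes vary by implementation.
ports = 2**16              # one public IP offers at most 64k source ports
bytes_per_entry = 300      # assumed rough size of one state entry

table_bytes = ports * bytes_per_entry
print(f"{ports} ports x {bytes_per_entry} B/entry = {table_bytes / 2**20:.2f} MiB")
# Even a completely full table for one IP fits in under 20 MiB, so keeping
# every state resident indefinitely is cheap, not impossible.
```

The point is that the worst case is bounded by the port space, so the whole table comfortably fits in memory on any modern box.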
I've worked with people to whom I've had to explain that bits are bits: just as in the audiophile world, where a huge amount of money and energy is spent trying to convince people that pricey bits are somehow better than cheap bits, the truth is that unless you process them differently, they're bits, and they're exactly the same.
There's this tendency to believe and follow trends, even when there's ample evidence that these trends aren't much more than marketing fluff and more often than not lead to dead ends. It's fine if others want to believe the marketing fluff, but when they want me to change my workload to support something because it's "all the rage", I have to put it in terms they understand ("You love Go? Great. Now what if I told you to use Rust because it's all the rage, and I expect you to rewrite everything in Rust?").
It's unfortunate that students aren't taught more history of computing. If they were, they'd likely learn that computing is computing, and that things don't really change that much, and that if you make and use things that aren't different for gratuitous reasons, they'll still be around and relevant years from now.
> and just as in the audiophile world where there's a huge amount of money and energy spent to try to convince people that pricey bits are somehow better than cheap bits
I always assumed this came down to the analog part (usually some sort of speaker) being better, and all the extras on the cords were really just there because a cheap looking cord would ruin the aesthetic they're going for.
If you want to go down a deep, dark, sad rabbit hole, look into the digital audio part of the "audiophile" world.
For me it began when I looked at a friend's catalogue that had a $2,500 (25 years ago) CD player that claimed incredible fidelity in part because they bathed the underside of the disc with blue light while reading with a red laser, as though this somehow made reading the bits better.
I explained to my friend that barring scratches and speed fluctuations, bits are bits, and none of that matters one iota until you get to the digital to analogue stage.
My example was that you could write the data on strips of cloth from old pyjamas using free crayons that restaurants give to kids, put them in Dixie Cups and transport them across the Sahara on camelback, and so long as you reassemble them in the right order, the bits will be identical to the bits from a $2,500 CD player.
He's a brilliant guy, but it took him longer to understand what I was saying than I could've predicted. The whole thing - the scammy market, the FUD, the bullshit, the desire to believe - was and still is fascinating, albeit sad.
Someone fairly recently did a review, if you can call it that, of "audiophile" ethernet switches. Yes, they want you to believe that the bits are somehow better if you send them through their special ethernet switches. Might be worth a quick search.