I doubt it. I predict that in a few years, maybe sooner, one or more of the AI companies buying up the supply will either have achieved their goal or collapsed, and then the market will be flooded with a glut of memory, driving prices low again. Or, conversely, demand stays high for a sustained period and the suppliers just increase supply. There's no hard bill-of-materials or technical reason for memory prices to be this high, unlike 20+ years ago.
And in the meantime, major buyers (government, big orgs) adjust by extending the planned lifespan of their computers, and upping the IT wage budget a bit to support that. That adjustment probably won't go away after supply returns.
I'm always shocked at how much good IT equipment is shoved into the trash bin:
At a lot of companies I could make a great deal, either by using it myself or selling it on eBay later on.
Big corporations often trash IT equipment that's only 3-4 years old. And there is no recycling either. Very sad.
Big corporations tend to send old hardware through the surplus marketplace. There's lots of 3-4 year old corporate computers for sale. Often, the company leases the computers and then the lessor will sell them when they're returned.
As long as it's working (and not gross), why do you need a new monitor? My current monitor is a 2010 model; I think I got it around 2013. I don't know what a new monitor would do for me, other than have a worse aspect ratio, because Dell stopped making 30" 16:10 monitors.
In theory, yes. However, a lot of these monitors are still 2560x1440 at 30”+. The PPI is quite low. I'm looking for 4K and something that looks similar enough to the M4 MBP I'm working on. A lot of these just don't look as good as they used to.
1440p is good enough that you aren't going to see individual pixels - just sit far back enough from the screen and use reasonable font hinting (Mac users are sadly out of luck here, but even then 2160p/4K is overkill).
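The pixel-density comparison above is easy to check with a quick back-of-the-envelope calculation. A minimal sketch (the 27" 4K size is an illustrative comparison point, not one mentioned by the commenters):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The 30" 2560x1440 panel discussed above vs. a hypothetical 27" 4K panel
print(round(ppi(2560, 1440, 30), 1))  # ≈ 97.9 PPI
print(round(ppi(3840, 2160, 27), 1))  # ≈ 163.2 PPI
```

At roughly 98 PPI you need to sit further back (or rely on good hinting) for pixels to disappear, whereas ~163 PPI looks sharp at normal desk distances, which is the gap being described.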
> A machine from 5 years ago feels just as fast as a brand new machine.
Except you can't install Windows 11 on it, and the org has to trash it anyway to keep up with security requirements (I know people in that line of work; they're all angry about it).
AI companies aren't buying RAM, they're buying the wafers themselves and turning them into specialized AI parts. So the RAM never exists, and there will be no glut of memory coming. Maybe some DDR5 will dribble out, but HBM isn't something we can use (at the moment).