
If this is true, I feel the Wi-Fi Alliance has a tonne to answer for with the e-waste they generate.

WPA3 moved from symmetric AES to ECDH, which is vulnerable to quantum attacks. Gonna be a tonne of IoT inverter waste.


WPA3 moved from PBKDF2 to ECDH. AES-CCMP and GCMP are still the underlying ciphers in WPA3, with some other extensions for China.
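
For reference, the PBKDF2 step being replaced is WPA2-Personal's PMK derivation: PBKDF2-HMAC-SHA1 over the passphrase, salted with the SSID, 4096 iterations, 256-bit output. Anyone who knows the password computes the exact same PMK. A minimal sketch in Python (the passphrase and SSID are made-up values):

    import hashlib

    # WPA2-Personal PMK per IEEE 802.11i:
    # PBKDF2-HMAC-SHA1(passphrase, salt=SSID, 4096 iterations, 32 bytes out)
    def wpa2_pmk(passphrase: str, ssid: str) -> bytes:
        return hashlib.pbkdf2_hmac("sha1", passphrase.encode(), ssid.encode(), 4096, 32)

    # Every client that knows the password derives this same key.
    print(wpa2_pmk("hunter2", "ExampleSSID").hex())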

For what it's worth, cryptography engineers were generally not happy with the Dragonfly PAKE, and PQC was a legitimate concern even in 2012.

Just yesterday I used an IoT device with WEP as the only WiFi option. Needless to say, I use the wired connection.

They say the 's' in IoT stands for secure, and from my experience that is true. Pretty much nothing is getting thrown out, because it isn't secure.


WPA3 was announced in 2018 [0]. I don't think it's reasonable to blame them for not anticipating the next decade of cryptographic research.

...but even if they had, what realistically could they have done about it? ML-KEM was only standardized in 2024 [1].

also, the addition of ECDH in WPA3 was to address an existing, very real, not-theoretical attack [2]:

> WPA and WPA2 do not provide forward secrecy, meaning that once an adverse person discovers the pre-shared key, they can potentially decrypt all packets encrypted using that PSK transmitted in the future and even past, which could be passively and silently collected by the attacker. This also means an attacker can silently capture and decrypt others' packets if a WPA-protected access point is provided free of charge at a public place, because its password is usually shared to anyone in that place.

0: https://en.wikipedia.org/wiki/Wi-Fi_Protected_Access#WPA3

1: https://en.wikipedia.org/wiki/ML-KEM

2: https://en.wikipedia.org/wiki/Wi-Fi_Protected_Access#Lack_of...


Does it matter if an attacker can decrypt public wifi traffic? You already have to assume the most likely adversary (e.g. the most likely to sell your information) is the entity running the free wifi, and they can already see everything.

It is precisely because the operator of the wifi is not necessarily the adversary a user is most concerned about. They may be one, and they're the one you know about, but they aren't the only one.

> You already have to assume the most likely adversary is the entity running the free wifi

why do you have to assume that?

you're at Acme Coffeeshop. their wifi password is "greatcoffee" and it's printed next to the cash register where all customers can see it.

with WPA2 you have to consider N possible adversaries - Acme Coffee themselves, as well as every single other person at the coffeeshop.

...and also anyone else within signal range of their AP. maybe I live in an apartment above the coffeeshop, and think "lol it'd be fun to collect all that traffic and see if any of it is unencrypted".

with WPA3 you only have to consider the single possible adversary, the coffeeshop themselves.
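
the property that buys you is forward secrecy from the ephemeral exchange: each session uses fresh keys, so learning "greatcoffee" later doesn't let anyone recompute past session keys. WPA3's actual Dragonfly/SAE handshake is more involved, but the forward-secrecy core is plain ephemeral ECDH; a rough sketch with X25519 (using the Python cryptography package, nothing Wi-Fi-specific):

    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

    # each side generates a throwaway key pair for this session only
    client = X25519PrivateKey.generate()
    ap = X25519PrivateKey.generate()

    # each combines its own private key with the peer's public key...
    client_secret = client.exchange(ap.public_key())
    ap_secret = ap.exchange(client.public_key())

    # ...and both arrive at the same shared secret, never sent over the air
    assert client_secret == ap_secret
    # discard the ephemeral private keys, and a later leak of the Wi-Fi
    # password reveals nothing about this session's traffic keys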


Because it's a near certainty (at least in the US) that businesses will spy on you to the extent that they can, but it's actually incredibly rare to be around a nerd with Wireshark? Facebook and the like didn't use HTTPS until long after public wifi was ubiquitous and you could easily sniff people, and it basically didn't matter. Now nearly everything uses TLS, so it really doesn't matter. Actually most public wifi I encounter has no security.

> Actually most public wifi I encounter has no security.

that was also one of the things fixed [0] in WPA3.

it sounds like you don't consider it relevant to your personal threat model. but the experts in charge of the standard apparently thought it was important to have in general.

0: https://en.wikipedia.org/wiki/Opportunistic_Wireless_Encrypt...


Apple is about to deprecate the iPhone 11 / SE 2020 version. I'm gonna repurpose them as webcams, given the 12MP camera put in there is arguably better than the brand-new ones they put on new Macs.

The phone now has a limited lifespan though, because of this prior stupidity: eventually I'm gonna get into spicy pillow territory, and at that point the phone prematurely dies.

We are going into a period where we are throwing away devices with 12MP+ cameras and processors arguably faster than most desktops. It was arguable when the phones were old and legacy, but at this point the cameras on there are stupidly good.

We need these phones to be repurposed for a second life, to actually recoup their manufacturing energy costs.

Frankly, if Apple allowed old iPhones to be used as servers, it is kind of crazy how efficient per dollar that would be.


ARM is a bloody financial hand grenade.

10% of the stock is floated.

90% of the stock is owned by Masa, who used it as collateral for his $18 billion Stargate loan. That is held against 33 banks, who have a strong incentive to dump in a margin-call situation.

Their revenues have been circular for the last 4 years, with 30% of growth coming purely from SoftBank shuffling its own money.

They are gonna be the canary in the coal mine for when the AI bubble implodes.


The US job stats for 2025 were revised down to 181k, but somehow the country gained 130k in January?

Is anyone looking at this and the CBO figures and not just realising the government is straight lying about the figures?

Gonna believe Powell and Waller on this one.


They worked out because there was an excess of energy and water to handle it.

We will see how the maths works out given there is a 19 GW shortage of power, a 7-year lead time for Siemens power turbines, and 3-5 years for transformers.

Raw commodities are shooting up, there isn't enough education to cover nuclear and SMRs, and the RoI is already underwater.


My cynical take is that it'll work out just fine for the data centers, but the neighbouring communities won't care for the constant rolling blackouts.


Okay, but even in that case the hardware suffers significant under-utilisation, which massively hits RoI. (I think I read they only achieve 30% utilisation in this scenario.)


Why would that be the case if we assume the grid prioritizes the data centers?


That is not a correct assumption. https://ig.ft.com/ai-power/

Reports in Northern Virginia and Texas state that existing data centres are being capped at 30% to prevent residential brownouts.


That article appears to be stuck behind a paywall, so I can't speak to it.

That's good for now, but considering the federal push to prevent states from creating AI regulations, and the overall technological oligopoly we have going on, I wonder if, in the near future, their energy requirements might get prioritized. Again, cynical. Possibly making up scenarios. I'm just concerned when more and more centers pop up in communities with fewer protections.


Buy gold.

Current US debt-to-GDP is 124%, at $38.6 trillion. Japan's is even higher, at 230-240%.

Bond markets in both are looking seriously unhealthy (Japan is going through a Liz Truss moment at present).

If the AI bubble falls over, the US government is going to have to print at least $5 trillion to cover the bubble. The only option there is to inflate away anyone holding cash.

If the AI succeeds and people are replaced, the US government faces a massive fiscal cliff from lost tax receipts. They won't be able to service the debt, and again will be forced to inflate it away.

To service current debt projections, AI growth needs to return some 3.2-3.5%; it is currently 0.5%.

Bonds, equities, USD, and housing are all risk assets right now.


I think ARM is the one to watch. SoftBank has only floated 10% of the stock.

90% of the stock is being used as collateral against 33 banks for the $18 billion Stargate loan to OpenAI.

Given Japanese bond markets right now and 30% circular financing, if the AI narrative falls, ARM is gonna blow up.


I am still a little skeptical about utilisation rates. If demand is so extreme, wouldn't we see rental prices for H100s/A100s go up, or at least hold? Wouldn't the cost of such a GPU still be high (you can get 'em for $3k used)?


On RunPod's community cloud, renting a 5090 costs $0.69/hour [1], and it consumes about $0.10/hour of electricity if running at full power and paying $0.20/kWh.

On Amazon, buying a 5090 costs $3000 [2].
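
Spelling out the arithmetic (a quick sketch; electricity is treated as the only running cost):

    price = 3000.00  # 5090 purchase price, USD [2]
    rent  = 0.69     # RunPod community cloud rate, USD/hour [1]
    power = 0.10     # electricity at full load (~0.5 kW at $0.20/kWh), USD/hour

    hours = price / (rent - power)   # ~5085 rental hours to break even
    print(f"{hours / 24:.0f} days")  # -> 212 days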

That's a payback time of 212 days. And Runpod is one of the cheaper cloud providers; for the GPUs I compared, EC2 was twice the price for an on-demand instance.

Rental prices for GPUs are pretty darn high.

[1] https://www.runpod.io/pricing [2] https://www.amazon.com/GIGABYTE-Graphics-WINDFORCE-GV-N5090G...


A 5090 gaming card is a different beast from the 80GB AI cards. Those were $40k USD, so for renting one to hit $1.50 per hour is interesting.
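
For comparison, a quick sketch at those quoted figures (ignoring power costs and assuming full occupancy):

    hours = 40_000 / 1.50    # ~26,667 rental hours to recover the card
    print(hours / 24 / 365)  # -> ~3.0 years, versus ~212 days for the 5090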


The de facto install of the GitHub CLI on Ubuntu systems appears to be a snap, which is owned by some random dude...


There is AMD's Onload: https://github.com/Xilinx-CNS/onload. It works with Solarflare and Xilinx NICs, but also supports generic NICs via AF_XDP.


The price of doing that is losing the OS's controls over emitted packets. For servers, fine. Browsers, not so much.

