
To add a little bit: the reason for that (or at least one of them) is the strict aliasing rule: if the compiler had to assume that a write through an int* might change the value of a double&, it would have to skip some optimizations and perhaps perform expensive redundant reads.

This is why you are not allowed to access an object through a pointer of an unrelated type (the rule can be disabled with -fno-strict-aliasing) [1].

However, one of the exceptions is char (and std::byte): the compiler is not allowed to assume that a write through a char& won't affect the value of a double&, for example [2].

[1]: https://www.gnu.org/software/c-intro-and-ref/manual/html_nod...

[2]: https://en.cppreference.com/w/cpp/language/reinterpret_cast....


Or even better: turn off the device. Cracking cold/BFU (before first unlock) devices is in many cases not supported by tools like Cellebrite [1].

[1] https://discuss.privacyguides.net/t/updated-cellebrite-iphon... : support matrix from 2024; in many cases only AFU (after first unlock) extraction is supported.


According to Nvidia’s fiscal 2025 annual report [1], 34% of their sales for the year came from just 3 customers.

Additionally, they mention that customers can generally cancel purchases with little notice and no penalty [2].

This is not unique to hardware companies, but consider that it would take just one of these customers pulling out for sales to drop by 12% (~$14B).

To cut to the point: my guess is that Nvidia's current run rate is not sustainable, and at some point one or more of these big customers won't be able to keep up with the big orders, which will cause Nvidia to miss earnings, and then the bubble will burst. But maybe I'm wrong here.

[1] https://s201.q4cdn.com/141608511/files/doc_financials/2025/a..., page 155: > Sales to direct Customers A, B and C represented 12%, 11% and 11% of total revenue, respectively, for fiscal year 2025.

[2] same, page 116: > Because most of our sales are made on a purchase order basis, our customers can generally cancel, change, or delay product purchase commitments with little notice to us and without penalty.


I have lots of skepticism about everything involved in this, but on this particular point:

It's a bit like TSMC: you couldn't buy capacity on the $latestGen fab because Apple had already bought it all. Many companies would very much have liked to order H200s and weren't able to, as they were all pre-sold to hyperscalers. If one of the hyperscalers stopped buying, it's very likely Nvidia could sell to other customers, though there might be more administrative overhead.

Now there are some interesting questions about Nvidia creating demand by investing huge amounts of money in cloud providers that will order nv hardware, but that's a different issue.


It's probably not very likely that NVIDIA could just sell to other customers if a large buyer pulled out. A large buyer pulling out is a massive signal to everyone else to begin cutting costs as well: the large buyer either knows something everyone else doesn't, or knows something everyone else has already figured out. Either way, it signals "I don't think the overall market is large enough to support this amount of compute at these prices at current interest rates", and everybody else is doing the same math too.


Or they might build another factory, fulfill all the orders they were previously unable to, and increase their share even more.

Or the US administration suddenly allows exports of top-tier chips to China, and they get more whales on their order book.

It's all guesswork; that's why their share price is high.


None of those customers can afford to cancel their orders. OpenAI, Google, and Meta cannot afford to get cheap on GPUs when they presumably believe AGI is around the corner. The first company to achieve AGI wins, because at that point all gains become exponential.

All the AI companies are locked in a death loop where they must spend as much money as possible, otherwise everything they have invested immediately becomes worthless. No one is going to pay for an LLM when a competitor has AGI. So it's a death loop for everyone who has become involved in this race.


I don't know why you are being downvoted. What you said makes sense to me but I understand I know very little about how companies think. Can someone with a differing point of view elaborate?


No idea why the downvotes; these are valid points. I still don’t fully agree, though:

1. There are alternatives to Nvidia: these 3 companies are probably all developing their own, and at some point they will switch to their own solution or to competitors (for example, Google used TPUs to train Gemini 3 [1], with no Nvidia GPUs, despite being a pretty large Nvidia customer).

2. The market seems to be consolidating: for example, Apple has decided to use Google Gemini for the new Siri [2]. I’m not an expert (or a fortune teller), but I think it increases the chance that other companies will follow and drop out of the AI race.

3. I am sure that OpenAI and related companies would like to sustain these kinds of orders, but I am not sure that is possible without more and more funding, and I don’t know if even Sam himself can estimate how many GPUs they will be able to buy from Nvidia in 2026.

[1] https://x.com/JeffDean/status/1886852442815652188

[2] https://blog.google/company-news/inside-google/company-annou...


My favorite East Germany escape story is that of train driver Harry Deterling, who simply drove his train through the border barrier (which was not yet a full wall at that point, if I understood correctly): https://www.chronik-der-mauer.de/en/chronicle/_year1961/_mon...


What is up with the name?

Edit: Kinkora implies that it has something to do with kinks; at least that was my first impression.

My guess is that it means something in another language, but maybe this is not a good first association for an AI image-generation product that can be used in a professional setting.



Since not everyone here is a native English speaker, it would be good to explain what you mean about the name.


Basically everybody who speaks English will assume it's a pornography site.

(Not unwarranted because pornography is so far the only commercial niche for generative AI.)


A few days ago there were a bunch of people on Twitter making fun of some ad that was apparently AI-generated.


I read it more as "kin-kora", TBH.


Also unfortunately reminiscent of https://en.wikipedia.org/wiki/Kincora_Boys%27_Home


We’ve noticed that the name creates unintended associations for some users, especially in English, and that’s not what we want to emphasize going forward.

We’re actively discussing a rebrand to better reflect the creative and model-focused direction of the product.


That’s fair feedback.



Yep, my first thought too. Substituting the "S" in Sora with "kink" definitely leads one to that conclusion.


The site description says it allows NSFW.


I think that reading all of the data from the SSD should “recharge” it in most cases: the SSD controller should detect any bit flips during the reads and be able to correct them.

However, this is an implementation detail of the SSD firmware. For Linux UBI devices, this does suffice.


Also, FYI for the one person here who uses raw nand flash: run ubihealthd (https://lwn.net/Articles/663751/).

It will trigger reads of random areas of the flash and try to correct any errors found.

Without it, the same issue as in the original article will happen (even if the device is powered on): areas of the NAND that are not read for a long time will accumulate more and more errors, eventually becoming unrecoverable.


Hopefully it will make Qualcomm behave more like Arduino and not the opposite. Qualcomm is one of the worst companies I have had the pleasure of working with.

Their support model is hellish, and they provide very little information and documentation, so you usually end up doing a lot of guessing and reverse engineering. They will tell you to sign a contract with one of their “design partners”, but even the partners can’t get answers to basic questions.

Seriously, if they want more small-cap companies working with them, they have to treat them better. I have worked with them at a small company and at a larger one, and in both cases their support was basically nonexistent, even when we were buying more than $10M of chips from them a year.


Qcom is a corporate behemoth, much like Oracle. In the immortal words of Bryan Cantrill, it is a lawnmower and if you stick your hand in it you'll get it chopped off.


<removed by me>


Tie that chip to a beamformer (Silicon Labs has a few) and you have a phased-array radar: a radar that can steer its beam without moving at all (pretty cool in my opinion).

Also, $15 is not cheap for this kind of chip. You can buy a full Wi-Fi 7 RF/modem or a 4-core arm64 SoC for that kind of money.


You can't use an external beamformer with this chip; it has the antenna built into the package itself. The chip doesn't have pins for RF input/output to bypass the built-in antenna.

60 GHz radar is very different from Wi-Fi. $15 actually seems about right for the functionality this chip offers.

