
> the work product is usually a CoreML model.

What work product? Who is running models on Apple hardware in prod?



An enormous number of people and products. I'm actually not sure if your comment is serious, because it seems to be of the "I don't, therefore no one does" variety.


Enormous compared to what? Do you have any numbers, or are you going off what your X/Bluesky feed is telling you?


I'm super not interested in arguing with the peanut gallery (meaning people who don't know the platform but feel that they have absolute knowledge of it), but enough people have apps with CoreML models in them, running across a billion or so devices. Some of those models were developed or migrated with MLX.

You don't have to believe this. I could not care less if you don't.

Have a great day.


I don't believe it. MLX is a proprietary model format and usually the last to get supported on Huggingface. Given that most iOS users aren't selecting their own models, I genuinely don't think your conjecture adds up. The majority of people are likely using safetensors and GGUF, not MLX.

If you had a source to cite then it would remove all doubt pretty quickly here. But your assumptions don't seem to align with how iOS users actually use their phone.


I didn't know the entire ML world was defined by what appears on Hugging Face.


I never attributed the entire ML world to Huggingface. I am using it to illustrate a correlation.


Cite a source? That CoreML models are prolific on Apple platforms? That Apple devices are prolific? Search for it yourself.

You seem set on MLX, and apparently on a narrow view of what models are. This discussion was about the ANE vs "tensor" units on the GPU, and someone happened to mention MLX in that context. I clarified the role of MLX, and noted that from an inference perspective most deployments are CoreML, which will automatically use the ANE if the model (or some subset of it) fits -- which is actually fairly rare, as it's a very limited, albeit speedy and power-efficient, bit of hardware. These are basic facts.
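For anyone unfamiliar: the "automatically use the ANE" part isn't something the app has to manage layer-by-layer. You just tell CoreML which compute units it's allowed to use, and it schedules eligible parts of the model itself. A minimal Swift sketch (the `Classifier` class name is hypothetical -- Xcode generates one per bundled .mlmodel):

```swift
import CoreML

// Allow CoreML to dispatch across CPU, GPU, and Neural Engine.
// CoreML decides per-layer what actually runs on the ANE.
let config = MLModelConfiguration()
config.computeUnits = .all

// "Classifier" stands in for whatever class Xcode generated
// from the app's bundled .mlmodel file.
let model = try Classifier(configuration: config)
```

There's also `.cpuAndNeuralEngine` if you want to keep the GPU free, but in either case the partitioning between units is CoreML's call, not the developer's.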

>how iOS users actually use their phone.

What does this even mean? Do you think I mean people are running Qwen3-Embedding-4B in pytorch on their device or something? Loads of apps, including mobile games, have models in them now. This is not rare, and most users are blissfully unaware.


> That CoreML models are prolific on Apple platforms? That Apple devices are prolific?

correct and non-controversial

> An enormous number of people and products [use CoreML on Apple platforms]

non-sequitur

EDIT: I see people are not aware of

https://en.wikipedia.org/wiki/Simpson%27s_paradox


[flagged]


[flagged]


Can you share an example of the apps you mean? Maybe it would clear up the confusion.


Any iPhone or iPad app that does local ML inference?


Yes please tell us which apps those are


Wand, Polycam, smart comic reader, Photos of course. Those are just the ones on my phone, probably many more.


The keyboard. Or any of the features in Photos.app that do classification on-device.



