DiabloD3's comments

The article doesn't actually give a coherent answer on why.

People would generally claim "laziness", as that is the Apple way. Why fix code when you can just sell new phones?

The actual answer is plausible deniability. Closed-source software often leaks metadata in hard-to-discover ways so governments can deprive citizens of their rights under the law, and then claim "whoops, we didn't clean up correctly, our bad!".

Apple, like every other major tech company, goes along with it when nudged in the right direction.


Ollama is quasi-open source.

In some places in the source code they claim sole ownership of the code, when it is highly derivative of that in llama.cpp (having started its life as a llama.cpp frontend). They do keep the same license, however: MIT.

There is no reason to use Ollama as an alternative to llama.cpp; just use the real thing instead.


If it's MIT code derived from MIT code, in what way is its openness "quasi"? Issues of attribution and crediting diminish the karma of the derived project, but I don't see how it diminishes the level of openness.

FOSS licensing can only exist in terms of Copyright. Without Copyright, you cannot license FOSS. If something has an incorrect Copyright attribution, then the license can be viewed as invalid until this deficiency has been corrected (obv. depending on local laws, etc).

On top of this, it would not be unreasonable for the numerous authors of llama.cpp to issue DMCA takedown requests if Ollama is unwilling to correct it.


I don't think it does, but llama.cpp does, and it can load models off HuggingFace directly (so, unlike Ollama, it's not limited to Ollama's unofficial model mirror).

There is no reason to ever use Ollama.


> I don't think it does, but llama.cpp does

I just checked their docs and can't see anything like it.

Did you mistake the command that just downloads and loads the model for that?


As a sibling comment answered you, it is `-hf`.

And yes, it downloads the model, caches it, and then serves future loads of that model out of the cache if the file hasn't changed in the HF repo.
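
For example, something like this (the repo name and quant here are just placeholders, and the flags are the ones llama.cpp's CLI currently documents, so double-check against your own build):

    # pull a GGUF straight from Hugging Face (cached locally for later runs)
    # and serve it on llama.cpp's OpenAI-compatible HTTP endpoint
    llama-server -hf bartowski/Llama-3.2-3B-Instruct-GGUF:Q4_K_M --port 8080

    # the same flag works for a one-off interactive run
    llama-cli -hf bartowski/Llama-3.2-3B-Instruct-GGUF:Q4_K_M -p "Hello"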


So, in summary: no, it does not have an equivalent command either.

-hf ModelName:Q4_K_M

Did you mistake the command that just downloads and loads the model for that, too?

Actually, that shouldn't be a question; you clearly did.

Hint: it also opens Claude Code configured to use that model.


Sure there's a reason... it works fine, that's the reason.

Advertising, mostly.

Ollama's org had people flood various LLM/programming-related Reddits, Discords, and elsewhere, claiming it was an 'easy frontend for llama.cpp', and tricked people.

The only way to win is to uninstall it and switch to llama.cpp.


Works fine for me on mobile.

Using Chrome? Firefox and derivatives are a no-go.

I will warn you, Ubuntu is basically dead now.

Canonical announced that they are no longer using Debian as a base, but instead unvetted packages compiled and uploaded by random people to Snap.

Please switch to Linux, but find a distro that actually wants you as a user.


> Canonical announced that they are no longer using Debian as a base, but instead unvetted packages compiled and uploaded by random people to Snap.

Citation very much needed for this claim.


As somebody who has been around Linux for almost as long as it has existed, I must say that is a very strong statement.

In real life: systemd IS useful, Wayland is becoming (has become?) the default, and Ubuntu is the most popular desktop distro family.


In my experience, Snap is frustrating to use, buggy, and opinionated in ways I don't like.

It's also a weird choice for servers running Ubuntu. I recall some CLI utilities being moved to Snap and you can't install them with apt anymore.


Ubuntu on servers has always been "a choice"; Debian is definitely the preferable of the two. Even on desktops, I'd sooner suggest Debian or Mint than Ubuntu. Ubuntu is a dead distro coasting on a reputation 15(+) years out of date.

(And it used to be that Ubuntu was still a defensible choice for maximizing the chance of getting help online, but LLMs have effectively neutralized this advantage.)


Mint still uses Xorg, so it's outdated. I tried it recently; it wasn't working with my iGPU+dGPU (nothing exotic, just a regular PC), and all the other distros have already gone to Wayland, so nobody was talking about this online. I feel bad for anyone who gets convinced to switch from Windows to Mint after being told it's the easy one. The fix was to just install Ubuntu.

Maybe Xorg is inherently better than Wayland, but that doesn't matter; the ship has sailed, and the community evidently doesn't have time to properly support both.


I genuinely don't think Xorg is a deal-breaker for newbs, and I would characterize dual-GPU as at least slightly exotic, maybe because I've never owned such a computer, but that's a fair enough point. Personally I think the polish of Cinnamon makes it the best recommendation for somebody new, and I know a whole lot of people start with that and have a sufficiently good experience that they stick with Linux (while maybe moving on to other distros).

It's not exactly dual GPU; the Intel CPU just has integrated graphics as usual. I'm not surprised if you don't have that, but it has to be the most common desktop setup, and it's quite common on high-end laptops. It was giving a black screen after wake. A solution probably exists somewhere, but even if I found it, the fact that this was broken out of the box and didn't have a clear fix was already reason enough not to trust it.

The GUI layout of Cinnamon vs KDE vs w/e seems like the main thing people argue about, but it doesn't matter compared to this. Anyone who knows enough about what an OS is to go install Linux will figure out how to use whatever GUI you give them, provided it works. The bar needs to be at making sure stuff isn't straight up broken.


To be honest I haven't owned a dGPU in almost 20 years, but I've been led to understand that most users with them use them all the time and ignore their iGPU, unless they're laptop users, in which case they might have to use Nvidia's proprietary drivers, the installation of which is something Mint makes straightforward for novices, or so I've been led to understand. Maybe I'm wrong about some of that.

I definitely agree that KDE vs Cinnamon probably doesn't matter. But I'm afraid I don't think particularly highly of any KDE-first distro; it's great on, for example, OpenSUSE, but that's not a distro I'd recommend to new users for other reasons.

The problem I've got with Ubuntu is they keep doing weird shit like submitting desktop searches to Amazon or putting ads in the motd. They're an erratic organization and I think it's a mistake to send new users in their direction. Mint may not be perfect, but I think it's broadly inoffensive and mild, a good distro to leave a good first impression on a new user fumbling through the process themself.


So Debian is a no because it ships with KDE?

IMO Ubuntu deserved to lose its users when they switched to Unity, not because Unity sucks (it does) but because it's unacceptable for a newbie-focused OS to rug-pull its entire GUI like that for any reason. But it's still #1, so realistically the leader is going to be either Ubuntu or something corp-supported like SteamOS.


> So Debian is a no because it ships with KDE?

I don't think this is a problem at all. I tend to install Debian from the command line (Arch-style), but from what I remember GNOME is the default DE. DEs are largely a matter of opinion, but I find GNOME to be more polished overall. I do, however, use a few extensions to recreate a desktop-centric UX (Dock, boot to desktop, and a few other tweaks).


Debian is a soft no, because despite being an excellent distro, it defaults to GNOME unless the user deliberately chooses something else, which is a problem for giving distro recommendations to noobs: when you start tacking on stuff like "and make sure you enable the..." their eyes start to glaze over, and you risk them thinking the whole affair sounds more complicated than it really is.

I mean, I really do love Debian; if not for OpenSUSE I would be using Debian now, but it's not a great distro to suggest for absolute novices.


He's not wrong, though: the amount of Snap stuff you have to remove in a fresh install is starting to get a bit annoying (I usually remove at least the Snap versions of Firefox and Thunderbird and replace them with binaries from Mozilla; they will still self-update).
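
For anyone curious, the swap looks roughly like this (a sketch only; the exact tarball name and install paths are assumptions, check Mozilla's download page for the current build):

    # drop the Snap build
    sudo snap remove firefox

    # unpack Mozilla's own tarball under /opt and put it on the PATH
    sudo tar xf firefox-*.tar.* -C /opt
    sudo ln -s /opt/firefox/firefox /usr/local/bin/firefox

    # make the directory user-writable so Firefox's built-in updater keeps working
    sudo chown -R "$USER": /opt/firefox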

You don't have to remove them though; they work fine.

You are right, the Snap versions mostly work fine. It's just that there are a lot of annoyances due to the nature of Snap packages (slowness, increased disk space requirements, problematic integration with the rest of the system...), but it definitely is possible to live with them.

My Gentoo system has been fully systemd- and Wayland-based from the start. It might sound like heresy to some users, but it was my decision from the beginning, as I liked how they worked, that they are the future, and that you don't have to wrangle shell scripts for building an OS. I had used systemd a lot via many Ubuntu servers before, so that helps.

> Canonical announced that they are no longer using Debian as a base

When was that? I don't disagree that it appears to be the case (especially with replacing coreutils/sudo/etc and the... varied approach to .deb vs snaps) but I'm not aware of them saying it explicitly in those terms?


Is your name a reference to the Blizzard game? If so, I worked on that :)

You're not wrong, but tbh I'd move upstream to Debian. I use Termux on my phone (Z Fold) with Debian and XFCE, and have been extremely pleased with the performance. Combined with a folding keyboard and some AirNeos, it's become a fantastic micro-development system that fits in a handbag.

Not that I don't like Arch; it has a very few (subtle!) things that Ubuntu has solved recently, like eGPU hotplugging.


Nope, my nick predates Blizzard Entertainment, Co. I chose it as my nick on IRC in early 1991, and have used it everywhere ever since.

If that means that, in the future, package versions for commonly used tools are less than a decade old, that's probably a good thing though ;)

Sorry, but this comment is part of the reason anyone should rightfully be scared to switch to Linux. Not only do you have to pick one of 999 distros, but every choice is wrong according to someone. Which one do you recommend, and is it the kind that will throw random issues or be called evil?

Debian, if you need a rock-stable system; Fedora for cutting edge.

These are good choices; also consider Arch if you want the most agency over what goes into your system. That being said, you can also build a very minimal system with Debian from the command line with the arch-install-scripts. It's just that Debian stable will freeze relatively old packages for the sake of avoiding breaking updates, such as changes in configuration files that require manual intervention. On a gaming rig, however, you will typically want to avoid Debian, as you want the latest drivers, latest Proton/Wine, etc., since the performance uplift can be substantial and compatibility keeps improving.

For the most agency over my system I prefer Qubes OS. I use Debian and Fedora inside VMs. Their minimized versions are available from the Qubes repositories.

Yeah, those are fine choices, but someone is going to say why they're actually horrible, just like Ubuntu, which is also fine.

Something I never liked about this game is that it shows the text in your browser at your chosen font size.

Chrome (assuming you're using Chrome) draws it a specific way. This does not match how Freetype (using typical tuning) or DirectWrite draws it. Chrome's choices in font renderer tuning and blending make it kind of split the difference between Windows-style and OSX-style, so it isn't native to either.

What it should be doing is showing you lossless screenshots of actual in-app renders at different sizes. Some in Chrome (to represent the Electron apps), some in DirectWrite, some in OSX post-Retina, etc.

Some fonts look amazing at larger sizes, but are unreadable at smaller ones. Some perform exceptionally well at smaller sizes. Some look great on every font renderer but OSX's, but some only look right on OSX and look bad everywhere else.

I've sorta played this game with myself, in a semi-objective way: take a bunch of fonts, ignore the subjective art nature of them, and throw them at a bunch of common renderers and see what the optimal size is, and then sort by smallest legible size.

If we define Fira Code, the most popular code font out there, as the bare minimum, 8 of the ones I tested beat it, while 17 were worse.

https://github.com/Diablo-D3/dotfiles/blob/master/fontsizes....


Totally agree; the same fonts at the same pixel sizes often look massively different in different environments. I -love- macOS' native font rendering, but have been unsuccessful in emulating it on Linux :/

Freetype can render almost indistinguishably from OSX post-Retina.

Set `FREETYPE_PROPERTIES="cff:no-stem-darkening=0 autofitter:no-stem-darkening=0"`, and then also enable (S)light hinting on LoDPI, or None on HiDPI. Also, disable subpixel and use greyscale only.

OSX looks the way it does because they use excessive stem darkening combined with incorrect gamma blending. GDI, WPF/UWP/WinUI, most ClearType/DirectWrite consumers, GTK, and most browsers do incorrect gamma blending as well.

Qt is the exception. It enables stem darkening by default, but then uses correct gamma blending. This is the objectively correct way of rendering; unfortunately, everyone else is doing it wrong.

The only thing you can't replicate is that OSX gamma blends incorrectly for 1.8 on a 2.2-2.4 screen, while everyone else blends incorrectly for sRGB/2.2 on a 2.2-2.4 screen. Light-on-dark's obscene behavior on OSX shouldn't be replicated anyway.

If you want the opposite, and make it look like Windows, force stem darkening off (use the above env, but set both to 1), and set hinting to Full to make it look like pre-Cleartype, or Medium to make it look like DirectWrite.
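
Concretely, the two variants end up looking something like this in a shell profile (a sketch only; hinting level and greyscale-only AA are usually set through fontconfig or your DE's font settings rather than this variable):

    # OSX-like: turn stem darkening on (=0 means "don't disable it"), use greyscale AA,
    # and set (S)light hinting on LoDPI or None on HiDPI
    export FREETYPE_PROPERTIES="cff:no-stem-darkening=0 autofitter:no-stem-darkening=0"

    # Windows-like instead: force stem darkening off, then pick Full (pre-ClearType)
    # or Medium (DirectWrite-ish) hinting
    #export FREETYPE_PROPERTIES="cff:no-stem-darkening=1 autofitter:no-stem-darkening=1"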


Since they fired the entire Arc team, and a lot of the senior engineers have already updated their LinkedIn profiles to reflect their new positions at AMD, Nvidia, and others, as well as laying off most of their Linux driver team (GPU and non-GPU), uh...

WTF?


You are exaggerating, right? They didn't really fire the entire Arc team, did they? I couldn't find a source saying that.


Nope, no exaggeration.

The news that Celestial is basically canceled has already hit the HN front page, as has the news that Druid was canceled before tapeout.

Celestial will only be issued in the variant that goes into budget/industrial embedded Intel platforms with a combined IO+GPU tile, but the big-boy performance desktop/laptop parts that have a dedicated graphics tile will ship with an Nvidia-produced tile.

There will be no Celestial dGPU variant, nor a dedicated-tile variant. Drivers will be ceasing support for dGPUs of all flavors, and no new bug fixes will happen for B-series GPUs (as there are no B-series iGPUs; A-series iGPUs will remain unaffected).

They signed the deal like 2-3 months ago to cancel GPUs in favor of Nvidia. The other end of this deal is that future Nvidia SBCs will ship as big-boy variants: Rubin (replacing Blackwell) for the GPU, Vera (replacing Grace) as the on-SBC GPU babysitter, and newest-gen Xeon CPUs to do the non-inference tasks that Grace can't handle.

There is also talk that this deal may lead to Nvidia moving to Intel Foundry, away from TSMC, and even that Nvidia may just buy Intel entirely.

For further information, see Moore's Law Is Dead's coverage off and on over the past year.


You may be a bit too credulous. There has been a "leak" or "rumor" that Intel's GPU initiatives are canceled about once every three months, for over two years. Yet Intel continues to release new SKUs and make new product announcements. Just last month they announced a new data center GPU product (an inference-focused variant of Jaguar Shores).

I can't see the future, but I can see patterns: the media that reports straight from the industry rumor mill LOVES this "Intel has cancelled its GPUs" story, for whatever reason. I have no particular love for Intel (out of my six current systems, my only Intel box is a cheap NUC from 2018), but at this point, these rumors echo the old joke about economists who "accurately predicted nine of the last two recessions".


Ah, so this is MLID. Yeah, I'll wait for the announcement.


MLID has been saying Arc was cancelled since before the first Alchemist cards were released.


MLID is a terrible information source.


The idea that Intel's foundry could replace TSMC is hilarious. No. Maybe a gamer-focused mid-market card based on 30-series.


Pat spent a lot of money on foundry to catch up.


This is a chip they've had lying around for a while. It's the same architecture as used in the Arc B580 that launched at the end of 2024; this is just a slightly larger sibling. Intel clearly knew that their larger part wouldn't make for a competitive gaming GPU (hence the lack of a consumer counterpart to these cards), but must have decided that a relatively cheap workstation card with 32GB might be able to make some money.


Now if they had launched the 32GB workstation card in 2024 with cheap RAM, it would have been a success.


Still seems crooked to sell a GPU that has already lost its driver team and will get no meaningful new updates.


Does it need a huge driver team pushing out big updates in order to be suitable for the kind of Pro use cases it's targeted at? They're explicitly not going after the gaming market so they don't need to be on the treadmill of constant driver updates delivering workarounds and optimizations for the latest game releases.

They're still going to be employing some developers for driver maintenance for the sake of their iGPUs, and that might be enough for these cards.


I didn't know this. Have they officially given up on building discrete GPUs? Is this a last gasp of Arc to offload decent remaining architectures at a lower price than Nvidia?

It is crazy to me that, in a world newly craving GPU architecture for AI, with gamers being largely neglected, Intel would abandon an established product line.


It does sound like a very Intel choice though.


> It is crazy to me that, in a world newly craving GPU architecture for AI, with gamers being largely neglected, Intel would abandon an established product line.

You still need to fab it somewhere. Intel's fabs have been plagued with issues for years, the AI grifters have bought up a lot of TSMC's allotments and what remains got bought up by Apple for their iOS and macOS lineups, and Samsung's fabs are busy doing Samsung SoCs.

And that unfortunately may explain why Intel yanked everything. What use is a product line that can't be sold because you can't get it produced?

Yet another item on my long list of "why I want to see the AI grift industry burn and the major participants rotting in a prison cell".


Man, I love 20+ year old memes. The early Internet was a better place.


¿Por qué no los dos? ("Why not both?")


Claro. ("Of course.")

