Hacker News | mgaunard's comments

Does it support C++?

Ops people are typically more useful given you probably already have devs.

As far as I can tell Google Gemini has the best overall integrations (Android, WearOS, Google Home) with the only voice recognition that actually works (Gemini Live).

Anthropic Claude has the best integrations with coding; what would make sense is for them to focus on that segment.

Other AI companies don't have anything really compelling. Meta has a fully open-source model, but that's not particularly useful beyond helping them stay somewhat relevant; it's not market-leading.


If you haven't used Codex with gpt-5.3-codex (high or xhigh), you are missing out. Claude is still good at conversations, but boy, I can set Codex loose on a problem and it does better than Claude almost all the time. For front-end and product UX, Claude is slightly better, but given Codex's very, very generous limits, it's the best bang for the buck.

This is my experience as well. I just cancelled my Claude subscription, as I'm tired of the 5-hour window being filled up within 30 minutes of use, and of it not even fixing the problem that Codex finds almost immediately. I also found that for frontend, Gemini 3.1 Pro is better than the rest if you really play with it.

How is gemini 3.1 doing in agentic harnesses? Did they catch up?

Absolutely this. Codex right now is the superior coding model.

Has it been sped up at all? Last time I used codex (which was with 5.1 I think), it was pretty slow. I mean, it did a fantastic job at figuring out hard bugs across multiple languages ("why is this image not lining up in this server-rendered template?"; Python, JS, CSS, and the template lang) but it took quite a long time. Long enough that I wouldn't want to use it for anything but the most complex things.

5.1 is 100 years old in AI world.

The tide seems to be shifting toward Codex, but let's also not forget OpenAI has a brand that none of the others have: it's _the_ AI.

Sure, Google can go up against that, but OpenAI is definitely in a much better spot. That's pretty important for a consumer market.


This thing about the OpenAI brand is changing fast. In the dev circles I'm part of, everybody dislikes OpenAI and prefers Claude. How long will it take for the same to happen with the normies?

You see, in my circles (me included), people are shifting _to_ codex since 5.3 codex came out.

The only places where I hear people say Claude is better are:

- Frontend design
- Random computer-use tasks

But people trust Codex for large-scale architecture and changes.


I use Claude for work and Codex for private use due to already having a Plus subscription.

I can't say that I have noticed 5.3-Codex being much better, but it's definitely on par with Opus 4.6, and its limits at $25/month are comparable to Max x5 at a quarter of the cost (not to mention pay-per-token, which we use at work). Claude Code is generally a much better experience, though.


The thing is, though, Google Gemini is pretty good and it's not super hard to switch to. The real moat: Google can just keep improving and integrating Gemini and gathering customers while waiting for OpenAI to go bankrupt. OpenAI basically needs everyone on the planet paying it to stay in business; if it doesn't capture the vast majority of the market, it can't pay its bills. Google is going to just starve OpenAI out.

I'm not sure. I started using Codex last week. Codex, at $20/mo, is a very good value.

Indeed! For now, let's enjoy it as much as we can. The VC-subsidized price of $20 won't last forever, I'm afraid.

> Anthropic Claude has the best integrations with coding; what would make sense is for them to focus on that segment.

The problem with coding is that the value is really in the harness and orchestration, both of which are accessible to the open-source community. Claude Code isn't that big of a deal unless Anthropic makes it so that you can only access the models Claude Code uses through Claude Code. If not, then projects like pi and opencode have the advantage in the long run. Also, these harnesses being node modules (of all things) makes them very easy to reverse engineer with the help of... Claude Code, ironically.


> Anthropic Claude has the best integrations with coding

Disagree, Codex is neck and neck with A/ on the coding front.


Model available != open source

"Think of the children" is merely a political argument to get a law to be popular among normal people.


Cookware that isn't flat is usually because it is warped, which happens if you mistreat your cookware with thermal shock.


Most convection ovens are electric, not sure why you needed the gas...


In 1985, a microwave was affordable and portable, if you had to do a rapid flit. If a house was set up for gas, then the stove (which typically was left in a house) was gas.

If you were illegally occupying a house scheduled to be demolished, with only electricity and a judge's order to quit hanging over your head, what would you do? Go and buy a full oven, or cook in a microwave?


I wouldn't be squatting in the first place, and I personally always invested in improving whatever place I was living in even if just renting.

So it's difficult for me to relate to those issues.


I do not have a microwave, but I remember having one, and never managed to intuitively use it to iterate on my cooking.

Meanwhile, throwing stuff in the pan, moving it around, adjusting the temperature, and adding things as it goes is a much more interactive type of cooking that is much more likely to take me where I want to go (tasty food).


It depends what you're trying to make. There are two things I almost always cook from scratch in the microwave, and that's trifle sponge (because I don't care if the sponge cake is going to be a bit dry and heavy, because I'm about to break it all up, mix it with diced fruit, and pour some sherry and quite a lot of jelly over it) and onion paste for curry.

If you want to make curry from scratch you can either do the whole thing in one pan and get "homestyle" curry - which is good - or you can make an onion paste by either cooking a very mildly spicy but ultimately rather bland onion soup for an hour to make the "base gravy", or by just chopping three or four onions and sticking them in the microwave on full blast for ten minutes before mooshing them with the hand blender.

Then you just bloom your spices in a bit of oil, chuck in some garlic and ginger paste (literally about the same amount of peeled garlic cloves and peeled ginger root mooshed up with the blender in a little oil and water) and let it bubble a bit, chuck in whatever veg and meat you're adding, and then slowly start adding your onion gloop, and boom, restaurant-style curry.

If you make the garlic and ginger paste in advance, and precook the meat a little (beef kind of wants to be stewed until it's tender, and then you can fire in the stock it's stewing in) then you can knock out an incredibly tasty curry in the same amount of time it takes to cook the rice.

And that's how restaurants do it, because you're not going to wait two hours for a homestyle curry to cook off properly.


I think most traditional software engineers do indeed understand what transformations compilers do.


I'd wager a lot of money that the huge majority of software engineers are not aware of almost any transformations that an optimizing compiler does. Especially after decades of growth in languages where most of the optimization is done in JIT rather than a traditional compilation process.

The big thing here is that the transformations maintain the clearly and rigorously defined semantics such that even if an engineer can't say precisely what code is being emitted, they can say with total confidence what the output of that code will be.


> the huge majority of software engineers are not aware of almost any transformations that an optimizing compiler does

They may not, but they can be. Buy a book like "Engineering a Compiler", familiarize yourself with the optimization chapters, and study some papers and the compiler source code (most compilers are OSS). Optimization techniques are not spells locked in a cave under a mountain waiting for the chosen one.

We can always verify the compiler that way, but it's costly. Instead, we trust the developers, just like we trust that the restaurant's chefs are not poisoning our food.


They can't! They can fairly safely assume that the binary corresponds correctly to the C++ they've written, but they can't actually claim anything about the output other than "it compiles".


Yeah, the pervasiveness of this analogy is annoying because it's wrong: a compiler is deterministic and tends to be a single point of trust, rather than a crowdsourced package manager or a fuzzy machine-learning model trained on a dubiously curated sampling of what is often the entire internet. But it's hilarious, because it's a bunch of programmers telling on themselves. You can know, at least at a high level of abstraction, what a compiler is doing with some basic googling, and a deeper understanding is a fairly common requirement in undergrad computer science education.

Don't get me wrong, I don't think you need or should need a degree to program, but if your standard of what abstractions you should trust is "all of them, it's perfectly fine to use a bunch of random stuff from anywhere that you haven't the first clue how it works or who made it" then I don't trust you to build stuff for me


I think you're mistaken on that. Maybe me and the engineers I know are below average on this, but even our combined knowledge of the kinds of things _real_ compilers get up to probably only scratches the surface. Don't get me wrong, I know what compilers do _in principle_. Hell, I've even built a toy compiler or two. But the compilers I use for work? I just trust that they know what they're doing.


Not in any great detail. Gold vs. ld isn't something I bet most programmers know rigorously, and that's fine! Compilers aren't deterministic, but we don't care, because they're deterministic enough. Debian started its reproducible-builds project in 2013 and, thirteen years later, we can maybe have that happen if you set everything up juuuuuust right.


They also realize that adding two integers in a higher-level language could look quite different when compiled, depending on the target hardware, but they still understand what is happening. Contrast that with the average LLM user asking it to write a parser or HTTP client from scratch. They have no idea how either of those things works, nor do they have any chance at all of constructing one on their own.
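The "still understand what is happening" point can be made concrete (the function name below is illustrative): C++ defines unsigned arithmetic to wrap modulo 2^N, so however a given backend lowers the addition, the observable result is pinned down on every conforming target.

```cpp
#include <cassert>
#include <cstdint>

// Unsigned overflow is defined behavior in C++: the result wraps
// modulo 2^32, no matter which add instruction the target emits.
uint32_t wrapping_add(uint32_t a, uint32_t b) {
    return a + b;  // well-defined even when the sum exceeds UINT32_MAX
}
```

So `wrapping_add(4000000000u, 1000000000u)` yields 5,000,000,000 mod 2^32 = 705,032,704 on every platform, even though the emitted instructions differ; note that the same guarantee does not hold for signed overflow, which is undefined behavior.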


why not just use a spreadsheet?


Valid question. Mostly because I wanted linking and couldn't be bothered to look up VLOOKUP. Naturally, the first alternative approach I considered was to build a terminal app.


Sizing only sucks because diet and exercise habits changed since the sizes were introduced.


But then why haven't the sizes been updated? We've been gaining weight slowly over decades, they've had time.


We're all still collectively telling ourselves we'll get back in shape any day now.

