Yeah, I also keep thinking this. I don't see LLMs reliably producing code that is up to my standards. Granted, I have high standards, because I take pride in producing high-quality code (by all manner of metrics). A lot of the time the code works, but unfortunately only for the most naive, mechanical definition of "works".
I think your argument is predicated on LLM coding tools providing significant benefit when used effectively. Personally I still think the answer is "not really" if you're doing any kind of interesting work, rather than mostly writing boilerplate all day.
Define interesting. In my experience most business logic is not innovative or difficult, but there are ways to do it well or ways to do it terribly. At the senior levels I feel 90% of the job is deciding the shape of what to build and what NOT to build. I find AI very useful in exploring and trying more things but it doesn’t really change the judgment part of the job.
How much of a programmer's work is interesting? A fraction of a percent? I'd argue most of us, including most startups, work on things that help make businesses money, and that's pretty "boring" work.
> mapping of specific nerves to intensity of feeling on the CNS would imply complex hardcoding of something which is much easier to solve with "this place important, have more nerves"
Not saying your answer is right or wrong, but I don't think this is a sufficient explanation. If the body can differentiate areas enough to produce more nerves in one area, then it could plausibly instead produce fewer nerves that inherently produce a stronger signal - just as we have nerves that respond differently to different stimuli (heat, light, etc.). Also, it could be neither, and we kinda randomly ended up with what we have because no option was strongly disadvantageous at the time.
Half-serious reason: because with each C++ version, we seem to get less and less of what we want and more and more inefficiency, in terms of both language design and compiler implementation. Are we even at feature-completeness for C++20 on major compilers yet? (In an actually usable, bug-free way, not an on-paper "completion".)
"Feature complete" is a pretty hard goal to reach. It sounds like "added all the features" but is closer to "bug-compatible across compilers" (not saying there are bugs, just that recent versions have removed a lot of wiggle room for implementations).
Also, modules were a huge undertaking and were kind of the reason it took so long. They are wonderful and I want them, but proper implementations (even with many details being implementation-defined) required a lot of work to figure out.
Most of the time the compilers get ahead of the actual release, but in this case there were so many uncertainties that only rough implementations were available beforehand, and then post-release the compilers had to adjust how they handle incremental compilation in a way that's effectively user-facing.
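For anyone who hasn't seen them, here's a minimal sketch of what a named module looks like in C++20 (file extensions and build flags vary by compiler, and the build must compile the interface unit before anything that imports it - which is exactly the incremental-compilation ordering problem):

    // math.cppm - module interface unit (".cppm" is the Clang
    // convention; MSVC uses ".ixx", GCC is more flexible)
    export module math;

    // Only exported names are visible to importers.
    export int square(int x) { return x * x; }

    // main.cpp - "import" replaces a textual #include of a header
    import math;

    int main() { return square(3); }

Unlike headers, the importer consumes a compiled representation of the interface, which is why the toolchains had to rethink dependency ordering and rebuild logic.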
The compiler design is definitely becoming more complicated, but the language design has become progressively more efficient and nicer to use. I've been using C++20 in production for a long time; it has been problem-free for years at this point. It is not strictly complete (e.g. modules still aren't usable), but you don't need to wait for that to use it.
Even C++23 is largely usable at this point, though there are still gaps for some features.
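For what it's worth, here's a small sketch of the kind of C++20 code (concepts and ranges) that has compiled cleanly on recent GCC, Clang, and MSVC for a while now - the names here are just illustrative:

    #include <concepts>
    #include <iostream>
    #include <ranges>
    #include <vector>

    // A concept constraining templates to numeric types.
    template <typename T>
    concept Numeric = std::integral<T> || std::floating_point<T>;

    // Constrained abbreviated function template: a bad call fails with
    // a clear diagnostic at the call site, not a wall of instantiation
    // errors.
    Numeric auto square(Numeric auto x) { return x * x; }

    int main() {
        std::vector<int> v{1, 2, 3, 4, 5};
        // Lazy range pipeline: keep the even numbers, then square them.
        for (int n : v | std::views::filter([](int x) { return x % 2 == 0; })
                       | std::views::transform([](int x) { return square(x); })) {
            std::cout << n << ' ';  // prints: 4 16
        }
        std::cout << '\n';
    }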
Funny how gcc seems to be the top dog now - what happened to clang? I thought their codebase was supposed to be easier and more pleasant to work with? Or maybe just more hardcore compiler devs work on gcc?
Since forever, I've been pretty stubborn about managing my tech/digital stuff myself, even if it's kind of a pain, and this kind of shit reinforces that belief. Other entities, especially big companies, cannot be trusted with your tech. The best place for my hardware is in my house; software, in my own git repo; data, in plain files on my hard drives. The fewer hands that touch it, the better. Just let me have it my own way, please.
I don't even think you're necessarily wrong overall, but damn does the photographer in me want to strongly disagree with this:
> But phone software is so advanced now that it takes real talent and skill to replicate the perceived quality of what users get with their cell phone's software automatically
I don't know man, what you get out of a DSLR/mirrorless is just on another planet compared to a phone camera... The raw quality, detail, and richness of a photo captured with a big sensor and big lens is something special.
Phone photos can look superficially good, and for some photo styles that's enough. But when I look at a phone photo, it often lacks that "draw you in" factor, because so much of the detail and lighting is more or less faked through software. There's no ambience, no mood.
Same! For me this has had the unfortunate side effect of ruining non-art photo-taking. Someone shows me a regular phone photo and the photographer part of my brain is thinking, "why did you even take this...?" Yes, I'm a terrible friend.
I used to take a lot of photos and then cull them afterwards, before editing. It worked, although I often loved the shooting and dreaded the editing, because before doing any actual edits I'd have 200 photos to sift through.
After a lot of practice, I became better at culling in my head, before even taking the photos. This has shifted my relationship with photography to more of a cognitive exercise, with a different set of enjoyments. I take far fewer photos overall - often I go out with a camera and don't take a single shot. Editing is more enjoyable because there's less to do and I already know what edits I want. It's less naively fun, but more quietly fulfilling.
On the point of film, I agree, although I won't say it's necessarily a bad thing overall. Just a bit silly when people try to claim film is somehow "superior" or whatever.
Film is absolutely a cultural experience for many people shooting it today. My main argument for this is that most people's photos are not good to begin with. (Talking about the average joe, not pro photographers.) So any comment about film's technical capabilities is moot. You can take bad photos on film or digital, and you can take good photos on film or digital! Unless you're really doing some serious experimental photography, you gotta admit that the film motivation is vibes.
Also, on editing applications: Lightroom does have a pretty good all-round feature set, which is hard to find elsewhere. Darktable, for example, technically works, but the UI is poor, the performance is poor, and it's generally slower to achieve the same results.
If someone wants to make an open source Lightroom clone, I'll be all for it!
Ah, true. In niche cases one technology can be better than the other, sure. I was more talking about people who buy a disposable film camera and think they're Ansel Adams... (For the record, I think the same about people who buy old point-and-shoot digitals. Totally a fad now.)