Love that you needed to make it clear that it is humans who can explain themselves.
Employees can only be held accountable with severe malice.
There is a good chance that the person actually responsible (e.g. the CEO, or someone delegated to be responsible) will soon prefer to have AIs do the work, since their quality can be quantified.
> I am never letting junior to midlevels into my team again
My point is, you control the experience level of the engineers on your team. The fact that you can say you won't let junior or midlevels on your team proves that.
You do not have that level of control with LLMs. Anthropic and OpenAI are roughly the same quality at any given time. The rest are not useful.
Eh. You want a good mix of experience levels; what really matters is that everyone is talented. Less experienced colleagues are unburdened by yesterday’s lessons that may no longer be relevant today, and they don’t have the same blind spots.
Also, our profession is doomed if we won’t give less experienced colleagues a chance to shine.
Do you not find that depressing and sad? Do you never work with enthusiastic and talented junior developers at the start of their careers? Do you not enjoy interacting with them?
I think it would be more depressing to take in excited junior developers and have them spend years of their lives not believing they are growing into any real career.
> ... the start of their careers
It is exactly this assumption I am challenging.
What comes next, I don't know - and I am not trying to kid myself or anyone else that I am well suited as a mentor for a person starting out their career in the current environment.
We should be making sure everyone has internet access, but hosting some basic pages is about 1000x cheaper, so no, I don't think free internet access should come before that.
You get 80% of the work done in 20% of the time, and the LLM shrinks that 20%. So for a 100-hour job, that first chunk maybe takes 5 hours instead of 20, which is great. But the remaining 80 hours are not improved as much, so the 100-hour job takes ~85 hours - which is still very good.
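The arithmetic above is essentially Amdahl's law applied to developer time: speeding up only one part of a job caps the overall gain. A minimal sketch, using the illustrative numbers from this comment (not figures from any study):

```python
# Amdahl's-law style estimate: how much faster is the whole job
# when the tool only accelerates part of the work?
# All numbers are illustrative, taken from the comment above.

def total_hours(job_hours, accelerated_fraction, speedup):
    """Total time when only a fraction of the job is sped up."""
    accelerated = job_hours * accelerated_fraction / speedup
    untouched = job_hours * (1 - accelerated_fraction)
    return accelerated + untouched

# The "easy 80%" of the output takes 20% of the time (20 of 100 hours),
# and the LLM makes that part 4x faster: 20h -> 5h. The other 80h stay.
print(total_hours(100, 0.20, 4))  # 85.0
```

Even a 4x speedup on the first chunk only shaves ~15% off the whole job, which is the same order as the ~10% productivity gains cited below.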
This is in line with Google's study showing about a 10% productivity increase, and with other research I’ve read. I suspect this will increase with more integrations and workflow adaptations.
But even after power tools changed how quickly carpenters can frame and rough-in a house, the finishing work (which uses power tools too) still takes the majority of the time.
It might seem so, but it is not an AI-promoted post. The book was finished with AI tools, but the bulk of it - along with the structure and direction - was written by myself.
And I am a human who first finished a similar course roughly 20 years ago, worked as a TA, and taught students programming and algorithms.
> The troubling thought I had is that AI does not displace the technicians, or the vending machines. It replaces the manager.
This is really why AI will have a more profound impact on society: it is fundamentally changing the hierarchy of competence we have grown so accustomed to.
Funny, I’ve seen the exact opposite: it brutally reinforces that hierarchy. It’s no longer the ability to do a task that is valuable; it’s the ability to understand what tasks need to be done.