
> that is decidedly not happening

Regardless of anything else, it’s far too early to make such claims. We have to wait until people start allowing “AI agents” to make autonomous black-box decisions with minimal supervision, since right now nobody has any clue what’s happening.

Even if we tone down the sci-fi dystopia angle, not that many people really use LLMs in non-superficial ways yet. What I’m most afraid of is the next generation growing up without the ability to critically synthesize information on their own.



Most people - the vast majority of people - cannot critically synthesize information on their own.

But the implication of what you are saying is that academic rigour is going to be ditched overnight because of LLMs.

That’s a little bit odd. Has the scientific community ever thrown up its collective hands and said “ok, there are easier ways to do things now, we can take the rest of the decade off, phew what a relief!”


> what you are saying is that academic rigour is going to be ditched overnight

Not across all levels, and certainly not overnight. But a lot of children entering the pipeline might end up having a very different experience than anyone before LLMs (unless they are lucky enough to be in an environment that provides them better opportunities).

> cannot critically synthesize information on their own.

That’s true, but if even fewer people try to do that, or even know where to start, it will get even worse.



