Judging from your earlier comment, your curiosity seemed to be more about what happens after we run out.
In your question you stated the running out as a given fact ("When" we run out, not "if").
Whether that is what you meant, I can't tell, but that is definitely how it was received, and that's why you also got the harsh response: it reads a lot like doomsday thinking.
(Example: Does that mean that when we run out of oxygen there are no more humans?)
Yes, my curiosity was about when we run out, because I didn’t know if we would run out. That was the whole point of the question. Have some leniency, we’re not all experts about everything.
I agree; I like some of the directions the fork would take and dislike others. The apparent pattern of fork, publish on HN, then change course (with the change showing not a lot of understanding) makes me thoroughly question its legitimacy and long-term stability.
That's not entirely the case in Germany. Applicants need to give a public lecture. Usually members of the student union will be present and will later have a say within the hiring committee about the quality of teaching.
But I do agree that the ability to produce and procure research is not at all coupled with the ability to teach.
Time of delivery would be the biggest factor. Today we can send multiple quick messages to anyone; at that time people had to batch big discussions into one long, articulated text, since it would take days for it to arrive.
I guess the closest to that nowadays would be blog articles, RFC discussions or long-form email threads.
Are you aware of where neural nets originated, and who came up with the concept of backpropagation to train them?
Sure, nowadays this whole field is pushed forward by industry. But I would argue that for most technological advancements the foundations are laid in traditional academia.
Of course, if you are not in academia you will only ever come into contact with the things that work out and get picked up by industry, reinforcing the impression that only industry is doing valuable stuff. Insert survivorship bias meme here.
The earliest neural network work on perceptrons was done in an applied contract lab for the US Navy; it was not done in an academic setting.
Backpropagation has been reinvented multiple times, because it is a basic application of the chain rule (see the small sketch below). The earliest recognizable usage of it is in control theory at NASA during the Apollo program.
It's a mistake to be dismissive of academic work which has been very important, but it's equally a mistake to think that academia is the sole source of foundational work.
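To make the chain-rule point concrete, here is a minimal sketch (my own illustration, not something from this thread): the gradients of a tiny two-layer function are computed by applying the chain rule by hand and then checked against a finite-difference estimate. The function, weights, and loss are all made up for the example.

```python
import math

def forward(w1, w2, x):
    h = math.tanh(w1 * x)   # first "layer"
    y = w2 * h              # second "layer"
    return y, h

def backward(w1, w2, x):
    # Loss L = 0.5 * y**2; each line below is one application of the chain rule.
    y, h = forward(w1, w2, x)
    dL_dy = y                          # dL/dy
    dL_dw2 = dL_dy * h                 # dy/dw2 = h
    dL_dh = dL_dy * w2                 # dy/dh = w2
    dL_dw1 = dL_dh * (1 - h * h) * x   # d tanh(u)/du = 1 - tanh(u)**2, du/dw1 = x
    return dL_dw1, dL_dw2

w1, w2, x, eps = 0.3, -0.7, 1.5, 1e-6
g1, _ = backward(w1, w2, x)
num_g1 = (0.5 * forward(w1 + eps, w2, x)[0] ** 2
          - 0.5 * forward(w1 - eps, w2, x)[0] ** 2) / (2 * eps)
print(g1, num_g1)  # both numbers should agree to several decimal places
```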
I guess the point of a three-year window is to let the ecosystem, at some point, actually adopt new language features.
When you have some kind of ecosystem rule for that, you can make these upgrade decisions with a lot more confidence.
For example, in my project I have a dependency on zstandard. In 3.14, zstandard was added to the standard library. With this ecosystem-wide three-year support cycle I can, with good confidence, drop the dependency in three years and use the standard library from then on (see the import sketch below).
I feel like it just prevents the ecosystem from going stale because some important core library still supports a really old version, which in turn keeps other, smaller libraries from using new language features so as not to exclude a large user base still on that old version.
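A minimal sketch of what that migration could look like, assuming (as I understand it) that the new stdlib module is compression.zstd and that both it and the third-party zstandard package expose compress/decompress helpers; check the docs before relying on exact names or signatures:

```python
try:
    # Standard library module added in Python 3.14.
    from compression import zstd
except ImportError:
    # Third-party fallback for older Python versions.
    import zstandard as zstd

payload = zstd.compress(b"some bytes")
assert zstd.decompress(payload) == b"some bytes"

# Once every supported interpreter is 3.14 or newer, the fallback branch
# (and the zstandard dependency) can simply be deleted.
```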
For the 20-year war you are probably talking about: I wouldn't call significant civil unrest in opposition to the war "getting bored".