I should clarify that I don't interview interns on those topics, nor do I
expect them to be proficient in them; I wasn't either at that time. It
came up often enough that I started to wonder whether there is generally a
difference between your average computer science degree in the US and in
Europe. All these answers are incredibly interesting, so thank you!
Personally, I still think that you should be exposed to those for a computer
science degree. Here is why:
1. Universities have evolved, and maybe always have, to meet the demands of
the labor market, and most jobs will touch on those topics in one way or
another.
2. Generally, you need to persist data, whether it's on a drive, in a data
store, and so on. Having heard of different data stores, and maybe of
differences between query languages, seems very relevant. This doesn't mean
you know how to write your own database.
3. Which processor these days does not have more than one core? Even in
languages like Python or Ruby, data races can result in subtle bugs. Having
some idea that access to shared resources may need to be protected is
useful.
4. Whether it's writing code in a microservice architecture or integrating
an API, we make network requests. Having an idea of how this might look
different for HTTP, TCP, or UDP provides a lot of context for making better
decisions.
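A minimal Python sketch of the kind of subtle bug point 3 describes (the names here are illustrative, not from any particular codebase): several threads incrementing a shared counter. `counter += 1` is a read-modify-write, so without a lock, updates can be lost even under CPython's GIL; a `threading.Lock` protects the shared resource.

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    """Add 1 to the shared counter n times, holding the lock each time."""
    global counter
    for _ in range(n):
        with lock:  # remove this line and the final total may come up short
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000 with the lock; often less without it
```

Nothing here requires knowing how to implement a mutex, which is the point: just knowing that this class of bug exists is enough to reach for the right tool.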
To me, the idea of a formal degree is some level of exposure, so that even
if you haven't touched those things in years, you have some reference in
your head from which to start looking them up and refreshing your memory.
I also agree that a computer science degree doesn't mean you become a software
engineer, so it might not make sense to force everyone to take those
classes; but then again, see point 1. Alternatively, which classes would be better suited, or a viable alternative, if you had to make a choice?
The common thread between programming for databases, for networks, and for multiple threads/cores/processors is that all of them are about concurrency at different levels of abstraction.
American CS curricula do emphasize concurrency as a theme, and do expose students to all levels of HW/SW abstraction (architecture, OS, networks, etc.). There's just less emphasis on specific trends, because those are expected to change over time anyway. Also, CS is diverse: you wouldn't necessarily expect a quantum computing student or an AI student to know industrial-level practices for concurrent programming.