This turned into a bit of a rant, but hopefully it does add something to the discussion.
You make some interesting points, but in my (limited) anecdotal experience, C is a pretty awful language.
Of all the languages I half-way know, I think I'm still least comfortable with C. Why not start out by teaching assembler? Especially post-32-bit x86, assembler (Intel syntax) is rather pleasant. It allows highlighting lots of things that are commonly hidden in C: calling conventions, different types of strings (e.g. C vs. Pascal), etc.
C is obviously very useful as a "real-world" language. If you need to add a driver to Linux, or patch a bug in your shell (or a shell utility), etc. -- knowing at least enough C to be dangerous can be very helpful. You'll likely be able to put C to good use re-implementing critical sections of Python/Ruby/whatever code. But IMNSHO, C still remains a rather awful language.
For some reason, I feel that pointers are easy in Pascal (and assembler) and hard to grasp in C -- as far as I can tell, this is entirely a syntactic issue for me.
In the future (post Rust 1.0), perhaps I'd recommend assembler, Rust, and some other high-level language (or two...). I find it hard to recommend Pascal -- while compilers exist, it is a little too much of a "teaching language" -- and a little too far on the outside of everything. Ada might be an interesting option.
Perhaps assembler > C > Python. Or assembler > SBCL (Steel Bank Common Lisp) -- because it integrates well with assembler, allowing the user to easily "peek under the covers".
C is awfully low-level -- and yet also surprisingly high-level. There's no sane, canonical way to handle text (that is, UTF-8/Unicode -- not merely ASCII). I'm not entirely sure C++ is much better in this regard, but I still think Stroustrup's contrast between "beginner C" and "beginner C++" from 1998 is interesting[1]. I'd be curious what kind of production-level code a beginning C programmer produces, and for what kind of problems.
Fundamentally though, I'm not sure there is a good "one size fits all". A lot of people will be served fine with only Python or Ruby (or Julia). Thinking too much about how the computer works can also be a hindrance, I think. Getting actual work done does seem to hinge on joining together proven, useful tools -- be that Berkeley DB, SQLite, Postgres, LevelDB, Riak, Redis, or plain text files. Or Protobuf, JSON, or XML. Etc.
At any rate, from the standpoint of "understanding how the computer works", I fail to see how it helps very much to learn (just) C. Without assembler, I think most of the useful lessons are lost -- and I'm not convinced there are many lessons that C makes obvious but that aren't already covered by assembler on one end and a high-level language on the other.
For programmers familiar with Ruby or Python, my experience is that the main points of confusion are all around what is happening in memory. What is a Ruby String/Array/Hash/etc., _really_? That sort of thing.
It might seem trivial to you, but the idea that you have a bunch of 1s and 0s in memory that the computer somehow "knows" is an integer or a character or whatever else is a HUGE intellectual hurdle for people who haven't considered it before. For that reason, I think it's a mistake to put a student in a situation where they're grappling with both memory and things like calling conventions simultaneously and for the first time.
I can't comment on Pascal, but I don't think C++ is an accurate reflection of the computer's own perspective on how work is happening. The main thing C does is make the way your program interacts with memory much more visceral.
There's the added benefit that the most popular interpreters for Ruby and Python (MRI and CPython, respectively) are written in C. This gives the teacher an opportunity to connect concepts in C _directly_ to already-understood concepts from another context.
All in all, I think the debate about which language(s) to learn is overstated, though it's one programmers love to have for tribal reasons. The same concepts are embedded in all of them in different ways; the important thing is extracting, isolating, and abstracting those concepts.
[1] http://www.stroustrup.com/new_learning.pdf