
You are starting with an axiom that I'm talking nonsense and then bending everything I write in that direction instead of reading what I write. E.g.,

"I regret to say that that's not as easy as it sounds. There are issues of tractability, and more importantly, there's this not-quite-cottage industry of software development, which needs good-enough languages to use _today_."

I didn't say it was "easy", and clearly my point is progress and not current software development. Sure, for my project, I'm typing in code in Visual Basic .NET. For writing code today, all things considered for my context, that is about the best option.

But what I'm typing in is hardly different from what anyone might have typed in all the way back to Algol.

Here's the point: People type in all that code, and then what? Can some software go over it, report properties, do some transformations with known, useful properties? Not very much. Such things won't be better for Visual Basic than they were for Algol or anything between.

So, they type in the code, and then all they have is just that code. They can desk check it, test it, try it, revise it, etc., but it's all still digging the Panama Canal with a teaspoon, one teaspoonful at a time. That is, there's no real automation and no significant exploitation of mathematical properties that could lead to automation.

Or, return to most of the rest of engineering, where we have specifications of resistors, capacitors, inductors, transistors, fans, copper wire, steel sheets, aluminum bars, etc. with ohms, farads, density, tensile strength, electrical conductivity, etc. Okay, now compare with some DLL or its source code: What engineering properties do we have? Essentially none. It's like building with iron before we knew about tensile strength.

Sure, with some such research progress, eventually programmers would benefit. Obviously the intention is real progress in software productivity. We're not going to get such productivity by more of the same that got us C, C++, Java, Python, Visual Basic .NET, C#, etc.

Also for another of your points, I'm not talking about anything like branch prediction or deep pipelining. That's basically what to do with the hardware to execute an existing instruction set.

Also the point is not math or not. The point is progress. Math will be necessary but not sufficient. That is, just stuffing in some math won't yield more than the old trick of putting two extra, unused transistors in a radio to claim nine transistors instead of seven.

Your point that there's a lot of good CS to do without 'mathematizing' the field is not promising for research or significant progress.



> Sure, for my project, I'm typing in code in Visual Basic .NET. For writing code today, all things considered for my context, that is about the best option.

Maybe you should try something other than VB.NET, then. There are ways of constructing correct code, if you're patient enough, that is. If you're using the CLR, why not use F# instead of VB? That'd be a big step towards being able to prove some things about your code. Or if you really want to take out the big guns, go for Coq, Isabelle, Agda, Epigram, etc.
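To make the point concrete, here is a minimal sketch of what "proving some things about your code" buys you, written in Rust as a stand-in for F# (both have ML-style type systems; the `lookup` function and table are made up for illustration). A lookup that can fail returns `Option`, so the compiler itself guarantees that no caller can forget the "not found" case:

```rust
use std::collections::HashMap;

// A lookup that can fail returns Option<i32>, not a bare i32 or a null.
// The type is the proof obligation: every caller must handle both cases.
fn lookup(table: &HashMap<String, i32>, key: &str) -> Option<i32> {
    table.get(key).copied()
}

fn main() {
    let mut table = HashMap::new();
    table.insert("x".to_string(), 42);

    // Omitting either arm below is a compile-time error, not a runtime
    // surprise -- a small property, but one the machine checks for you.
    match lookup(&table, "x") {
        Some(v) => println!("found {}", v),
        None => println!("missing"),
    }
    match lookup(&table, "y") {
        Some(v) => println!("found {}", v),
        None => println!("missing"),
    }
}
```

In VB.NET the equivalent lookup typically hands back `Nothing`, and nothing forces the caller to check for it; here the check is not optional.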

> Can some software go over it, report properties, do some transformations with known, useful properties?

Yes, in many cases. There's a whole lot of work in static code analysis and refactoring tools. It's not perfect (the halting problem being undecidable and all that) but there have been _significant_ improvements since the Algol 60 days, and that's not counting functional languages and type-theoretic approaches.
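A small example of the kind of property modern compilers check statically (shown in Rust; the `Shape` enum is invented for illustration, and similar exhaustiveness checks exist in F#, OCaml, Haskell, etc.). If you later add a variant to the enum, every `match` in the program that fails to cover it is reported at compile time, which is exactly "software going over the code and reporting properties":

```rust
// A closed set of cases, declared as data the compiler can reason about.
#[derive(Debug, PartialEq)]
enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
}

// The compiler verifies this match covers every Shape variant; add a
// Triangle variant above and this function stops compiling until the
// new case is handled.
fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
    }
}

fn main() {
    println!("{}", area(&Shape::Rect { w: 3.0, h: 4.0 }));
}
```

That is a transformation-safety property Algol-era tooling could not give you: the analysis is sound for this error class, not a heuristic.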

> Or, return to most of the rest of engineering

That goes a bit off topic, but at the stage we are in, for most practical purposes, "software engineering" is not really engineering. Except maybe when done by NASA, but then again, that's not a practical approach either.

> Also for another of your points, I'm not talking about anything like branch prediction or deep pipelining. That's basically what to do with the hardware to execute an existing instruction set.

Oh, but you were talking about compilers. Modern optimizing compilers have to take those things into account, among many other things.

> Clearly compiling and executing a program are necessarily mathematically something, understood or not, powerful or not. For progress, we need to understand the subject mathematically.

(from your previous post) My point was that very significant progress has been made in many fields, even if not necessarily formalized.

> Also the point is not math or not.

Sorry, but when you state that CS should be a footnote in a math book, you are kinda making the point that everything in CS (even in those sub-domains that are eminently practical) should be math-based to do anything meaningful. This is demonstrably not true.

> The point is progress.

And I've agreed with you on this. More and better maths can help advance CS. But we knew that already.

> Your point that there's a lot of good CS to do without 'mathematizing' the field is not promising for research or significant progress.

I contend that there has been significant progress in many CS areas without 'mathematizing' them. That is a fact. I also stated, in my previous post, that I agree that maths could help improve this progress. I think my problem with your position is that you're talking in absolutes in topics where those absolutes clearly don't hold.


"Sorry, but when you state that CS should be a footnote in a math book, you are kinda making the point that everything in CS (even in those sub-domains that are eminently practical) should be math-based to do anything meaningful. This is demonstrably not true."

I'm exaggerating, but still there is a reasonable point here. CS contains some 'science': call that the mathematical part, where we have solid material worth being called 'science'. For the rest, call it 'computer practice' or some such.

The upside of my view is that for some serious progress we're going to have to use some serious methodology. So, I'm proposing if not mathematical-physics envy, then applied-math envy. Applied math didn't take on how to design the display lights in a scientific pocket calculator, although without the lights the thing wouldn't work.

My view is not the most extreme: Last year I communicated with a CS prof whose position was that CS is looking for the 'fundamentals of computation'. Hmm .... It sounds like he believes that the P versus NP question should be right at the top of the list, and I don't. I'll settle for anything that is solid and a contribution, even if small, to the 'science' and not just to current practice.

Likely many people here know the current state of programming language research much better than I do. If that field has gotten nicely mathematical with some solid material, great. The progress from Fortran, Cobol, Algol, Basic, PL/I, Pascal, C, C++, etc. was pragmatic and important and of enormous value to the economy, but not much progress in a 'science' of programming languages, and that progress, with poor methodology, has slowed as we might have expected.

Maybe I'm saying that, in 1920, if we wanted a really good airplane, then maybe we should set aside the wood, linen, and glue and go do some aerodynamics calculations, discover Reynolds number, and discover that those really thin wings were a mistake. Or, observational astronomy was just a lot of curiosity until Newton came along and made progress in understanding the universe. The practical chemists had discovered a LOT, but by applying quantum mechanics they made HUGE progress.

If we are going to make the huge progress we want in computing, then history suggests that we can't be just pragmatic and that "theory is under-rated". We can't expect that chemistry will do much to help CS, but the obvious tool is math, to turn the 'science' part of CS into some applied math.



