
I think there's validity to your argument, but you come across as a bit arrogant. A few comments to your post:

>The CS profs make a mess out of math.

Properly done CS _is_ maths. Maybe most of it is not in the branches you're interested in, but the core of CS is maths (think Turing, Church, von Neumann, Codd, Hoare, Martin-Löf, etc.).

>it's a strain to call 'flows on networks' computer science.

I've always heard it called part of operations research, which happens to be a rather useful field to teach CS students. Yes, it is probably taught at a very basic level, but it is still useful as taught. And saying that:

>So, CS wants to hijack flows on networks. Not really good.

is disingenuous at best. Actual networks, like the one you're using now to post about your discontent with CS are basically run on computers, and are, not surprisingly, part of what CS people worry about. There's a whole field of very applied CS that's a direct application of some basic operations research results.

>For both teaching and research, CS needs to clean up its act in math.

In a general sense, yes, it would be good. Maybe you also need to read some better CS stuff?

>I have an example: Some years ago I took a problem in practical computing and derived some new probabilistic math for a solution.

You go on to rant how editors of CS journals weren't smart enough to understand your paper. That might well be the case, but it is obvious that you were not any smarter, sending a paper which is pure maths in nature to several CS journals, when the "problem in practical computing" was merely an application of your result, and not the main point of your paper. If you were a physicist working on some very deep results of quantum mechanics that happen to have a direct application to, say, molecular biology, you would still send it to a physics journal, not a biology journal, wouldn't you?

>Here's what they need to do:...

That's a good plan for learning a bunch of maths, not CS.



"Properly done CS _is_ maths."

Okay, then you would agree with my thought on this thread that the math departments should just take over CS! CS would get some much better math, and the math departments would get a big head and tummy ache from the much needed contact with applications. Fine with me.

"is disingenuous at best."

No. That chemistry uses some group theory for molecular spectroscopy and makes a mess out of it, which it has been known to do, doesn't mean that a mathematician who sits in a chair with fake plastic leather shouldn't comment. So, yes, IP routing has been based on some network shortest path algorithms. So, they used some operations research material, since really it was operations research that first dug into networks that deeply -- yes, I know, there was a Nobel prize in economics for some work by Kantorovich (?) in the late 1940s on the transportation problem and, earlier, work in circuit theory as in Kirchhoff. Still, e.g., with Ford and Fulkerson and then Jack Edmonds, network shortest paths and flows on networks should be attributed to operations research. If IP routing got to use it, then fine. Once I used Cunningham's version of the network simplex algorithm on a problem in allocating sales forces in a tricky situation: The problem looked like just integer linear programming, but if one looked again it was just network flows, where one can get integer solutions for no extra effort. That doesn't mean that network flows should be part of courses in selling!
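For readers who haven't seen the connection, it fits in a few lines: distance-vector IP routing (e.g. RIP) is essentially a distributed Bellman-Ford shortest-path relaxation, straight out of operations research. A minimal single-machine sketch, with made-up graph data:

```python
# Bellman-Ford shortest paths: the relaxation step that distance-vector
# IP routing protocols are built on. Graph and weights are illustrative.
import math

def bellman_ford(nodes, edges, source):
    # dist[v] = best-known distance from source to v so far
    dist = {v: math.inf for v in nodes}
    dist[source] = 0
    # Relax every edge |V|-1 times; with no negative cycles,
    # the distances are then optimal.
    for _ in range(len(nodes) - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist

nodes = ["A", "B", "C", "D"]
edges = [("A", "B", 1), ("B", "C", 2), ("A", "C", 5), ("C", "D", 1)]
print(bellman_ford(nodes, edges, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```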

But CS is not just using network flows but calling the subject part of computer science. That's hijacking! Or it justifies my "Not really good."

But, such things will happen. There is no law that says one has to be an operations research person to publish in an operations research journal, or to publish about networks in a computer science journal. The NSF even works to encourage 'cross-cutting'.

Since my writing is so 'succinct', maybe I wasn't clear! If CS wants to take a field like network flows for their own, then actually do a competent job of it and don't mess it up. So, when there is need, as there is, for the simplex algorithm, and not just using it but calling it part of CS, then do a competent job with the simplex algorithm. If probabilistic analysis of the performance of algorithms is to be part of CS, then CS should do a competent job with the probability. Else, CS is diluting the quality of these subjects. I argued elsewhere on HN that CS has diluted the meaning of dynamic programming, also essentially hijacked by CS.

If you want to say that CS done well is math, then they should do the math well.

When I was in grad school, I took some courses in measure theory and functional analysis from a department not the pure math department. And I took some statistics. When the statistics got to sufficient statistics, which is from the Radon-Nikodym theorem of measure theory, the prof totally blew it. I just walked out of the class; mostly I knew the material at the elementary level anyway, and at the level I wanted to learn the material the course was a waste of my time.

Well, in the courses in measure theory and functional analysis, the profs maintained quality only slightly short of Bourbaki. They didn't have less than a beautifully polished proof for anything. Any hint or suggestion that there was a logical gap was a very big deal in the department. When I walked out of the stat course, the department dumped on me. Then I dumped back on the mess made of the R-N theorem. The next year the stat prof was gone.

That department, not pure math, took the math fully seriously: A prof who blew a good proof of one important theorem was looking for a job. The situation was SERIOUS. We weren't a pure math department, but we did the math in rock solid ways. The department also had some operations research: Similarly, the math was rock solid. E.g., at one point in the Kuhn-Tucker conditions, I objected that a proof had used but not assumed continuous differentiability. I worked up a proof with just differentiability, and my proof was in the course the next semester.

So, broadly, just because we were interested in some math topics not common in pure math departments, and interested in applications, didn't mean we did the math at a lower level of quality. We were as close to Bourbaki as anyone. It can be done. CS should do it, too. Sorry 'bout that. Low quality is not good; I called out some low quality. I was fully correct to do so. The problem is not my arrogance but their low quality.

"In a general sense, yes, it would be good. Maybe you also need to read some better CS stuff?"

So, we are in agreement, "it would be good".

For my reading, that's about me, and that's not appropriate. The issue is the book CLRS or whatever it is called. I'm not the issue.

But for some "better CS stuff", you mentioned von Neumann. Right: In my favorite source of the R-N theorem, the proof is by von Neumann. In my current project, there's some math at the core, some original, and some due to von Neumann. I credit him and Kolmogorov.

For the paper I sent, I sent it to an appropriate journal. The motivation for the research was a problem in computer science. The main application for the original part of the research was just to that problem. That there was some original math at the core of the paper was part of the work as in your remark that CS should be math.

You are agreeing with my main point that CS should clean up its math and then calling me arrogant for making this point. Can't have it both ways.

For the course of study I outlined, what important about CS does that omit! I know; I know; it omits heap sort! And AVL trees. Gee, looks like that course of study needs more, maybe a lecture, maybe even two, of an hour each?

Do the math. For the CS, do that in the exercises and footnotes!


>Since my writing is so 'succinct', maybe I wasn't clear! If CS wants to take a field like network flows for their own, then actually do a competent job of it and don't mess it up

I really hope you were being sarcastic when referring to your writing as succinct. Anyway, who is claiming anything for their own? If some problem in CS leads to some new result in operations research, then a paper will be published in a nice operations research journal, and everyone will be happy. If the math is not good enough, the paper will not be published, and the author will have to rework it. Now maybe your claim is that CS journals with a focus on operations research are not sufficiently rigorous for your taste, but that's a different matter altogether.

>Okay, then you would agree with my thought on this thread that the math departments should just take over CS!

No, I don't. Firstly, CS is in itself a branch of maths, but in practice, most of the stuff that CS people do is of basically no interest to pure mathematicians, and vice versa. Hence, I don't see any point in having a math department take over a CS department. Secondly, there are many things that normally fall within the purview of CS departments which are not at all related to maths (in any meaningful, abstract way), such as OSs, computer architecture courses, software engineering, compilers, etc. While all of those have a basis in maths, they certainly do not belong under a maths department. Of course it could be argued that there should be some "Applied CS" department or "Software Engineering" department or some such that would take those courses and research areas out of the "Pure CS" field, but that would end up doing a disservice to students and researchers alike.

>For the paper I sent, I sent it to an appropriate journal.

Sorry, but I disagree. If none of the reviewers, none of the editors, or anyone they knew was able to understand your paper, either you wrote something incomprehensible, or you submitted it to the wrong venue. I don't doubt that you're a math genius who rocks measure theory and functional analysis, but I find it hard to believe that an on-topic paper flew over the heads of all the editorial staff and reviewers in any given area.

>You are agreeing with my main point that CS should clean up its math

Only up to a point. Better maths in CS research could be an improvement, as long as it doesn't veer into non-CS areas just for the sake of doing maths. I don't agree at all that CS students should master mostly unrelated areas of pure maths, though, as there are much more relevant issues within CS (plus its "applied" branches) and only so much time students can spend at the university.

>For the course of study I outlined, what important about CS does that omit! I know; I know; it omits heap sort! And AVL trees. Gee, looks like that course of study needs more, maybe a lecture, maybe even two, of an hour each?

I would think Donald Knuth would be in disagreement with you there. Also, your suggested program veered into pure maths for things that are 99.99999% out of scope in CS, while neglecting seemingly all applications of CS.

Also, I called you arrogant because that's how you come through in your writing. As I said, I don't doubt you're a very gifted mathematician, but from what you just wrote here, I think your grasp of actual CS topics is not so great.


Part II

For the CS topics you listed, e.g., programming languages, that field has for decades just cried out for some good math but gotten nearly none! Clearly compiling and executing a program are necessarily mathematically something, understood or not, powerful or not. For progress, we need to understand the subject mathematically. E.g., given a programming language statement, what are its mathematical properties, so that, given a sequence of two such statements, we know the properties of the sequence? Can't see any? Okay: Change programming languages until we can. Then with the properties, what can we conclude about the program? Anything useful, say, about correctness, running time, storage utilization, approaching limitations? Given some properties, can we have some code transformations with some known properties? Are some such useful? If not, can we define some useful programming languages that admit such transformations?

There are some simple, illustrative, pregnant cases that just leap off the screen: Write a collection of routines for sort by surrogate, i.e., finding a permutation, maybe honoring sort 'stability', applying a permutation to an array, chasing through the fact that each permutation is a product of disjoint cycles, etc. Then with such routines, we will see some properties that are close to algebraic, e.g., where some combinations of the routines are the identity transformation, where one routine is the algebraic inverse of another, where we might have commutativity and/or associativity, or combinations equivalent to other combinations, etc. Can we preserve sort 'stability'? So, we see that we should have an 'algebra' of those routines and be able to derive known properties. Then we see that due to 'boundary' limitations, etc., our algebra is only roughly correct. So, maybe we should fix that.
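The near-algebraic properties described above can be sketched in a few lines (my own toy illustration; it ignores the 'boundary' complications the text mentions):

```python
# Sort-by-surrogate routines, where the 'algebra' shows up directly:
# an inverse that undoes a permutation, and composition with the
# inverse giving the identity permutation.

def sorting_permutation(a):
    # Indices that sort a; Python's sort is stable, so ties keep order.
    return sorted(range(len(a)), key=lambda i: a[i])

def apply_perm(p, a):
    # Rearrange a according to permutation p.
    return [a[p[i]] for i in range(len(p))]

def inverse(p):
    # q with q[p[i]] = i: the algebraic inverse of p.
    q = [0] * len(p)
    for i, j in enumerate(p):
        q[j] = i
    return q

def compose(p, q):
    # apply_perm(compose(p, q), a) == apply_perm(p, apply_perm(q, a))
    return [q[p[i]] for i in range(len(p))]

a = ["b", "a", "c", "a"]
p = sorting_permutation(a)
assert apply_perm(p, a) == sorted(a)                  # p sorts a, stably
assert apply_perm(inverse(p), apply_perm(p, a)) == a  # inverse undoes it
assert compose(p, inverse(p)) == list(range(len(p)))  # identity element
```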

This is old stuff. It has leaped off the screen for decades. We want some known properties we can use to derive new, useful properties. Instead all we get is programming languages that compete in some absurd 'beauty contest' or that 'encourage' programming 'styles' seen as 'helpful'. Sicko.

E.g., it has been common now in server farms to be concerned about reliability. So, we set up systems so that when system A gets too busy or fails, then system B takes over, etc. However, what we've seen, including at some of the most important farms, for decades, up to this year, is that these intuitive approaches far too easily go "splat" in the mud: They are 'unstable', have problems that propagate, and are surprisingly UNreliable.

So, we need some new math of 'reliability' for such server farms, math that can write us some guarantees. That math will start with the real problems, have definitions, theorems, and proofs, and, if successful, some solid tools for building reliable server farms.
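As a hedged toy illustration of the instability claim, not a real reliability model: take a farm of identical servers where a failed server's load is spread over the survivors. Even this crude sketch (all numbers made up) shows a sharp threshold between harmless failures and a total cascade:

```python
# Toy cascading-failure model: n_servers each carry `load`, with
# per-server `capacity`. When servers fail, total load is conserved
# and spread over survivors; any overloaded survivor fails too.

def cascade(n_servers, load, capacity, initial_failures):
    alive = n_servers - initial_failures
    total = n_servers * load          # total offered load is conserved
    while alive > 0 and total / alive > capacity:
        alive -= 1                    # one more overloaded server tips over
    return alive

# 10 servers, each carrying 7 requests/s against a capacity of 10:
print(cascade(10, 7, 10, 3))  # 7: survivors run exactly at capacity
print(cascade(10, 7, 10, 4))  # 0: one more failure cascades to collapse
```

The jump from "fine" to "everything down" with one extra failure is exactly the kind of behavior a real mathematical treatment would have to characterize and bound.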

All across computing, CS needs to give us progress, and for that they need to proceed essentially only mathematically.

It's the math. The CS is the math. Server farm reliability is just the application of the math and is not the CS. Ignoring math, CS is "A little people. A silly people."! Until CS gets serious about the math, it is writing checks its methodology can't cash.

To get serious about math, CS first has to learn LP well!

Then, would you believe, probability? How about stochastic processes? Any stochastic processes in server farms and networks? Any uses of the axiomatic derivation of the Poisson process? Any roles for one of the strongest inequalities in math, the martingale inequality, and the associated convergence theorem? Measure preserving transformations as in ergodic theory -- I'll answer that one, "Yes!". The renewal theorem and loads at server farms? How 'bout a little in probability?
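For instance, the axiomatic Poisson process has i.i.d. exponential interarrival times, so simulating arrival counts at a server and checking them against the rate takes only a few lines (a sketch; the rate and horizon are made-up numbers):

```python
# Simulate a Poisson arrival process via exponential interarrival
# times and check the long-run arrival rate.
import random

random.seed(0)
rate = 3.0          # assumed arrival rate, per second
horizon = 10_000.0  # seconds simulated

t, count = 0.0, 0
while True:
    t += random.expovariate(rate)   # exponential interarrival time
    if t > horizon:
        break
    count += 1

print(count / horizon)  # long-run rate estimate, close to 3.0
```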

Understanding now?


>Understanding now?

Not really, as you're raving all over the place and still not making a lot of sense, argument-wise.

I guess I'll just agree to disagree; I feel bad for making you write so much.

Just a few parting shots:

>Clearly compiling and executing a program are necessarily mathematically something, understood or not, powerful or not. For progress, we need to understand the subject mathematically.

If you think that there has been no progress in these areas, I'm sorry, but you really don't know what you are talking about. Could that progress have been faster, had the maths behind it been formalized? Maybe so, yet I don't see that many mathematicians working on branch prediction in deeply pipelined processor architectures, or on plenty of other things that lowly CS-ers have worked out. Want to see more maths in all of CS? Maybe you need to bring the maths to it, in the form of useful tools that can deal with fast moving targets.

>what are the mathematical properties? Can't see any? Okay: Change programming languages until can.

I regret to say that that's not as easy as it sounds. There are issues of tractability, and more importantly, there's this not-quite-cottage industry of software development, which needs good-enough languages to use _today_.

> this is old stuff. It has leaped off the screen for decades. We want some known properties we can use to derive new, useful properties.

You mean like Agda, Epigram, and countless other formal systems for program specification, derivation and correctness proving?

> Instead all we get is programming languages that compete in some absurd 'beauty contest' or that 'encourage' programming 'styles' seen as 'helpful'. Sicko.

This coming from a mathematician is strange to hear. Elegance and expressiveness are important when programming. They make for succinctness and fewer bugs, and improve programmer productivity (which, down here in the real world, means $$$).

>How about stochastic processes? Any stochastic processes in server farms and networks?

How about them? There is plenty of very CS-oriented research based on stochastic processes. I should know: I did my doctoral research within a 45-person team on networks and network modelling, where most of the big guys were mathematicians. These people (not me, I do very applied stuff in a very bastardized field) breathe and live Markov processes and queueing systems. I don't see what you're complaining about here.

>E.g., it has been common now in server farms to be concerned about reliability. So, we set up systems so that when system A gets too busy or fails, then system B takes over, etc. However, what we've seen, including at some of the most important farms, for decades, up to this year, is that these intuitive approaches far too easily go "splat" in the mud: They are 'unstable', have problems that propagate, and are surprisingly UNreliable.

The folks over at Google would like a word or two with you on this.

> Until CS gets serious about the math, it is writing checks its methodology can't cash.

Maybe, and yet it seems to be doing not at all bad, as far as I can see. Would more and better maths be useful? Surely so. Would relegating CS to "footnotes" make any sense? We surely disagree there.

There's a whole continuum in CS, from the highly abstract and "math-y" to very applied stuff that has a very faint, if any, relation with what mathematicians normally worry about, and yet is critical for actual applications. I just find it odd that you don't see that fact.

There is a trend towards more maths in plenty of CS topics that were once just "practical stuff", so maybe the day will come when maths departments will branch out into real-world CS, and CS students will need more math, as you'd like. Given that we've been doing "maths" for 3K years, and CS for less than 100, you might need to wait a bit, though.

Anyway, thanks for the discussion, and have a nice weekend.


You are starting with an axiom that I'm talking nonsense and then bending everything I write in that direction instead of reading what I write. E.g.,

"I regret to say that that's not as easy as it sounds. There are issues of tractability, and more importantly, there's this not-quite-cottage industry of software development, which needs good-enough languages to use _today_."

I didn't say it was "easy", and clearly my point is progress and not current software development. Sure, for my project, I'm typing in code in Visual Basic .NET. For writing code today, all things considered for my context, that is about the best option.

But what I'm typing in is hardly different from what anyone might have typed in all the way back to Algol.

Here's the point: People type in all that code, and then what? Can some software go over it, report properties, do some transformations with known, useful properties? Not very much. Such things won't be better for Visual Basic than they were for Algol or anything between.

So, they type in the code, and then all they have is just that code. They can desk check it, test it, try it, revise it, etc., but it's all still digging the Panama Canal with a teaspoon, one teaspoonful at a time. That is, there's no real automation and no significant exploitation of mathematical properties that could lead to automation.

Or, return to most of the rest of engineering, where we have specifications of resistors, capacitors, inductors, transistors, fans, copper wire, steel sheets, aluminum bars, etc., with ohms, farads, density, tensile strength, electrical conductivity, etc. Okay, now compare with some DLL or its source code: What engineering properties do we have? Essentially none. It's like building with iron before we knew about tensile strength.

Sure, with some such research progress, eventually programmers would benefit. Obviously the intention is real progress in software productivity. We're not going to get such productivity by more of the same that got us C, C++, Java, Python, Visual Basic .NET, C#, etc.

Also, for another of your points, I'm not talking about anything like branch prediction or deep pipelining. That's basically what to do with the hardware to execute an existing instruction set.

Also the point is not math or not. The point is progress. Math will be necessary but not sufficient. That is, just stuffing in some math won't yield more than the old trick of putting two extra, unused transistors in a radio to claim nine transistors instead of seven.

Your point that there's a lot of good CS to do without 'mathematizing' the field is not promising for research or significant progress.


> Sure, for my project, I'm typing in code in Visual Basic .NET. For writing code today, all things considered for my context, that is about the best option.

Maybe you should try something other than VB.NET, then. There are ways of constructing correct code, if you're patient enough, that is. If you're using the CLR, why not use F# instead of VB? That'd be a big step towards being able to prove some things about your code. Or if you really want to take out the big guns, go for Coq, Isabelle, Agda, Epigram, etc.

> Can some software go over it, report properties, do some transformations with known, useful properties?

Yes, in many cases. There's a whole lot of work in static code analysis, and refactoring tools. It's not perfect (the halting problem being non-decidable and all that) but there have been _significant_ improvements since the Algol60 days, and that's not counting functional languages and type-theoretic approaches.
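To make "report properties" concrete, here is a tiny sketch using Python's standard `ast` module: a simplified "read but never assigned" check over a source snippet, without running it. The snippet and the check are my own illustration and fall far short of real static analyzers:

```python
# Minimal static analysis: parse source, walk the syntax tree, and
# report names that are read but never assigned anywhere.
import ast

source = """
def f(x):
    y = x + 1
    return y + z      # z is never assigned
"""

tree = ast.parse(source)
assigned, used = set(), set()
for node in ast.walk(tree):
    if isinstance(node, ast.Name):
        # Store context = assignment target; Load context = a read.
        (assigned if isinstance(node.ctx, ast.Store) else used).add(node.id)
    elif isinstance(node, ast.arg):
        assigned.add(node.arg)        # function parameters count as assigned

print(sorted(used - assigned))  # ['z']
```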

> Or, return to most of the rest of engineering

That goes a bit off topic, but at the stage we are in, for most practical purposes, "software engineering" is not really engineering. Except maybe when done by NASA, but then again, that's not a practical approach either.

> Also for another of your points, I'm not talking about anything like branch prediction or deep pipelining. That's basically what to do with with the hardware to execute an existing instruction set.

Oh, but you were talking about compilers. Modern optimizing compilers have to take those things into account, among many other things.

> Clearly compiling and executing a program are necessarily mathematically something, understood or not, powerful or not. For progress, we need to understand the subject mathematically.

(from your previous post) My point was that very significant progress has been made in many fields, even if not necessarily formalized.

> Also the point is not math or not.

Sorry, but when you state that CS should be a footnote in a math book, you are kinda making the point that everything in CS (even in those sub-domains that are eminently practical) should be math-based to do anything meaningful. This is demonstrably not true.

> The point is progress.

And I've agreed with you on this. More and better maths can help advance CS. But we knew that already.

> Your point that there's a lot of good CS to do without 'mathematizing' the field is not promising for research or significant progress.

I contend that there has been significant progress in many CS areas without 'mathematizing' them. That is a fact. I also stated, in my previous post, that I agree that maths could help improve this progress. I think my problem with your position is that you're talking in absolutes in topics where those absolutes clearly don't hold.


"Sorry, but when you state that CS should be a footnote in a math book, you are kinda making the point that everything in CS (even in those sub-domains that are eminently practical) should be math-based to do anything meaningful. This is demonstrably not true."

I'm exaggerating, but, still, there is a reasonable point here. CS is about some 'science'; call that the mathematical part, where we have some solid material worth being called 'science'. For the rest, call that 'computer practice' or some such.

The upside of my view is that for some serious progress we're going to have to use some serious methodology. So, I'm proposing if not mathematical physics envy then applied math envy. Applied math didn't take on how to design the display lights in a scientific pocket calculator although without the lights the thing wouldn't work.

My view is not the most extreme: Last year I communicated with a CS prof whose position was that CS is looking for the 'fundamentals of computation'. Hmm .... It sounds like he believes that the P versus NP question should be right at the top of the list, and I don't. I'll settle for anything that is solid and a contribution, even if small, to the 'science' and not just to current practice.

Likely many people here know the current state of programming language research much better than I do. If that field has gotten nicely mathematical with some solid material, great. The progress from Fortran, Cobol, Algol, Basic, PL/I, Pascal, C, C++, etc. was pragmatic and important and of enormous value to the economy, but not much progress in a 'science' of programming languages, and that progress, with poor methodology, has slowed as we might have expected.

Maybe I'm saying that, in 1920, if we wanted a really good airplane, then maybe we should have set aside the wood, linen, and glue, gone to do some aerodynamics calculations, discovered Reynolds number, and discovered that those really thin wings were a mistake. Or, observational astronomy was just a lot of curiosity until Newton came along and made progress in understanding the universe. The practical chemists had discovered a LOT, but by applying quantum mechanics they made HUGE progress.

If we are going to make the huge progress we want in computing, then history suggests that we can't be just pragmatic and that "theory is under-rated". We can't expect that chemistry will do much to help CS, but the obvious tool is math, to turn the 'science' part of CS into some applied math.


Part I

"Anyway, who is claiming anything for their own?"

From CLRS and much more, it is fully fair to say that CS is "claiming" that linear programming (LP) is part of CS.

Beyond what I've mentioned so far about LP in CLRS, there is a much bigger reason to claim that CS is "claiming" LP: The history is that operations research wrestled with integer LP (ILP). There is a claim that for a while G. Dantzig, the inventor of the simplex algorithm, looked at ILP and guessed that he would be able to modify the simplex algorithm to handle integers easily. Of course, decades later, no one has!

So, slowly the image appeared: In worst case, for exactly optimal solutions, the ILP algorithms had running time exponential in the size of the input data. So, they were like total enumeration. Bummer. So, then, asymptotically, polynomial would be faster, and Jack Edmonds said that a 'good' algorithm had polynomial running time in the worst case.

Klee and Minty showed that with one of the most popular pivot rules, simplex had exponential running time; the assumption has been that with any pivot rule, simplex is exponential.

Practice showed that with the usual practical problems on an LP with m constraints, the running time was about 3m iterations and, thus, polynomial on average.

K. Borgwardt did some 'random geometry' and showed that on average simplex would have polynomial running time, explaining the 3m performance.

Eventually Shor, Khachiyan, etc. showed a polynomial algorithm for linear programming, the ellipsoid method, based on cutting planes for ellipsoids. Alas, apparently whenever the polynomial algorithm was faster than simplex, both ran too long to be practical!

So, for the first test of the importance of polynomial algorithms, the effort fell face down in the mud: Simplex is exponential worst case and polynomial average case, and the polynomial algorithms are essentially never faster in a practical way!
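One way to see what simplex exploits: an optimum of an LP occurs at a vertex (basic feasible solution) of the feasible polytope, and simplex walks from vertex to vertex. For a tiny two-variable LP, the vertices can simply be enumerated by brute force (illustrative numbers only; simplex is the efficient way to do this walk):

```python
# Enumerate vertices of a 2-variable LP's feasible region and pick the
# best, illustrating that the optimum sits at a vertex:
#   maximize 3x + 2y  s.t.  x + y <= 4,  x + 3y <= 6,  x >= 0,  y >= 0
from itertools import combinations

# Each constraint as (a, b, c), meaning a*x + b*y <= c (bounds included).
cons = [(1, 1, 4), (1, 3, 6), (-1, 0, 0), (0, -1, 0)]

def intersect(c1, c2):
    # Solve the two constraint boundary lines by Cramer's rule.
    (a1, b1, d1), (a2, b2, d2) = c1, c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None                      # parallel boundary lines
    return ((d1 * b2 - d2 * b1) / det, (a1 * d2 - a2 * d1) / det)

def feasible(p):
    return all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in cons)

vertices = []
for c1, c2 in combinations(cons, 2):
    p = intersect(c1, c2)
    if p is not None and feasible(p):
        vertices.append(p)

best = max(vertices, key=lambda p: 3 * p[0] + 2 * p[1])
print(best, 3 * best[0] + 2 * best[1])   # (4.0, 0.0) 12.0
```

Brute force like this blows up combinatorially with dimension, which is exactly why the order in which simplex visits vertices, the pivot rule Klee and Minty attacked, matters so much.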

So, dear Bell Labs was working on designing communications networks, discovered the role of ILP, encountered the worst case exponential execution time, and got with the theory of NP-completeness and wrote:

Michael R. Garey and David S. Johnson, 'Computers and Intractability: A Guide to the Theory of NP-Completeness', ISBN 0-7167-1045-5, W. H. Freeman, San Francisco, 1979.

Here we can see the role of the 'satisfiability' (SAT) problem -- it is NP-complete.

Presto! CS, long interested in 'computational complexity', especially running time, and, of course, regarding SAT as part of both EE and CS but not really operations research or optimization, takes NP-completeness as its own!

In money, students, profs, department sizes, headlines, etc., poor, little operations research, optimization, and even all of applied math can't compete with CS. So, CS grabs LP, ILP, network flows, NP-completeness, etc. along with SAT!

Okay, CS, if you are going to hijack all of 'combinatorial optimization', then realize that progress will require some good math research, and step up to the math.

Now back to this thread: So far, as in CLRS, CS is having trouble even writing with high quality about the simplex algorithm. So, putting combinatorial optimization and NP-completeness in the butter-fingered, dirty-fingernailed, unwashed hands of CS is a bad move for the future.

That's some of the larger story about the significance of CS hijacking topics from applied math.

Am I torqued? Sure: CS grabs the ball, claims to want to run with it, and then falls into the mud. I'm calling them out on it. It needs to be done.

"Throw me the ball! Throw it to me! I can score .... Splat." CS, even at MIT, too often can't even write with high quality about the simplex algorithm. REALLY big BUMMER.

Just like on a ball team, just ain't throwing CS the ball anymore. Again, they need to clean up their act in math. As CS hijacks LP, simplex, ILP, network flows, combinatorial optimization, and NP-completeness, they are getting into some of the hardest math research in all of history. For that they will need to be good at, would you believe, a junior level ugrad course in LP, and even there, even the MIT profs, need to 'pull up their grade'.

There's a war story: In some parts of 'enterprise software', a user gets to enter 'constraints'. E.g., let's figure out what we are going to do in the factory today. We know that we want all the trucks loaded by 5 PM. We want all the painting done by noon so that it can be dry by 3 PM, in time to be packed. We only have 2000 gallons of paint, so whatever we do today has to use no more than the 2000 gallons. Etc. So, we are not really optimizing but just 'satisfying'.

So, seeing this enterprise software problem, CS started the field of 'constraint logic programming'. Great! Hot new field! Charging forward!

Only a few weeks later, CS discovered that writing software to take just any collection of constraints and satisfy all of them is not so easy. I mean, even permitting using things as awesomely powerful as neural nets, AI, and, even, genetic programming, it wasn't so easy!

A few weeks after that they discovered that even if all the constraints are linear, it's not so easy! Soon they learned a lesson from operations research: They were looking for a feasible solution to an LP, and generally finding if there is a feasible solution is about the same 'difficulty' as finding an optimal solution. So, really, the linear 'constraint satisfaction' problem is not much different than just LP. Sorry guys. Better luck next time. Pick yourselves out of the mud and hit the showers.

So, SAP got interested in a French company that was interested in R. Bixby's LP software CPLEX.

And the hot, new CS field of 'constraint logic programming' got to look a bit silly. Splat.
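The lesson from operations research, that feasibility is about as hard as optimization, can be sketched directly via the classic "phase one" construction: to test whether A x <= b has a solution, minimize a slack s >= 0 in A x - s <= b, which is itself an LP; the system is feasible iff the optimum is s = 0. A sketch assuming scipy is installed; the constraint data is made up, and the third constraint (x >= 10) deliberately clashes with x + y <= 4:

```python
# Phase-one LP: constraint satisfaction reduced to LP optimization.
# Variables are (x, y, s); we minimize the slack s.
from scipy.optimize import linprog

res = linprog(c=[0, 0, 1],
              A_ub=[[1, 1, -1],    # x + y - s <= 4
                    [1, 3, -1],    # x + 3y - s <= 6
                    [-1, 0, -1]],  # -x - s <= -10, i.e. x >= 10 - s
              b_ub=[4, 6, -10],
              bounds=[(0, None)] * 3)

print(res.fun)  # minimum slack 3.0 > 0: the constraint set is infeasible
```

Drop the clashing third constraint and the same program returns a minimum slack of 0, certifying feasibility; no neural nets or genetic programming required.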

On my research paper, grow up! We're talking adult level activity here! No longer is it fair to plead that we haven't covered that material yet! Instead, the goal is progress in research. If we start with a computing problem, then the research is 'CS'. We do the research the best way we can. If we have a powerful, clean, rock solid solution with some new math theorems and proofs, then okay. So be it. That's just part of 'research'. If CS doesn't yet know this material, then maybe they will have to learn it.

Parts of academic finance learned about stochastic integration. Parts of mathematical physics learned about that math also along with Riemannian geometry and Hilbert space theory. Parts of chemistry learned about group representation theory. Parts of chemical engineering learned about ILP and dynamic programming.

Actually, reviewing my paper was not so difficult: I suspect that the editor in chief walked the paper to the math department, found a good probabilist, and got told "The math looks fine; I don't know what it means for CS.". Then he walked to the CS department, found an appropriate guy, and got told "It looks good for CS, but I can't say if the math is correct.". Done. How hard is that?

Grow up: The math is fair game. Not only that, the math is the main tool for the future of CS. If you didn't realize that, then you can say that you learned it first here. I can understand why one wouldn't get this lesson from CLRS!

Net big picture: Math is by a wide margin the oldest, deepest, most solid, most powerful, and most general subject in all of academics. Moreover, nearly uniformly, in all fields, the best progress is from 'mathematization' of the field. Get used to it! For CS, do the math. For the applications to computing, do that in the footnotes or the exercises!



