
As long as it still allows arbitrary-length vectors - and I can't imagine it wouldn't - this would be impossible due to the halting problem. Or I guess you could derive a lower bound on memory usage, but you could not get an upper bound or an exact number.


What about: allocation is only permitted when you can prove an upper bound on the size? Then this would be allowed:

    if (sz < 1000000)
        vec = std::vector<int>(sz);
    else
        throw std::length_error("size exceeds upper limit");


I'm saying you could know "it will use at least X memory", but you could never know "it will use at most X memory" unless you seriously cripple the language's capabilities.


Maybe there could be a compiler flag. Any program compiles normally, but if you enable the flag, a program only compiles if its maximum memory usage can be computed, and if that maximum is under the limit you specify.

That means the language isn't always crippled, but you can get compiler enforcement for certain embedded programs.


It would effectively become a stack-only language (heap allocations' sizes would always have to be known at compile time, just like on the stack). I could see that serving an interesting special subset of use-cases, but I was under the impression we were talking about a general programming language, which that would not be.
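A minimal sketch of what that restricted style looks like in C++ (names are illustrative): every container size is a compile-time constant, so the total memory footprint is statically known.

    #include <array>
    #include <cstddef>

    // All sizes are compile-time constants: no heap allocation,
    // and the stack usage of each call is fixed and known.
    template <std::size_t N>
    int sum(const std::array<int, N>& xs) {
        int total = 0;
        for (int x : xs)
            total += x;
        return total;
    }

    int main() {
        std::array<int, 4> data{1, 2, 3, 4};
        return sum(data) == 10 ? 0 : 1;  // exits 0 on success
    }
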


For example, MISRA C disallows dynamic memory allocation completely, and still pretty complex applications have been written to that spec. Similar guidelines are common in other high-reliability or safety-critical software specifications. Another example is the "JPL Institutional Coding Standard for the C Programming Language", which specifies: "Do not use dynamic memory allocation after task initialization".
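The "allocate at initialization, never afterwards" pattern from the JPL rule can be sketched like this (the struct and function names are hypothetical, not from either standard): all buffers are sized once up front, and the steady-state code only reuses them.

    #include <cstddef>
    #include <vector>

    // Hypothetical sketch: all storage is acquired once, at startup.
    struct Buffers {
        std::vector<double> input;
        std::vector<double> output;
        explicit Buffers(std::size_t n) : input(n), output(n) {}
    };

    // The processing loop touches only pre-allocated storage;
    // no allocation can happen here, so memory use stays fixed.
    void process(Buffers& b) {
        for (std::size_t i = 0; i < b.input.size(); ++i)
            b.output[i] = b.input[i] * 2.0;
    }

Since the buffer sizes are fixed at initialization, peak memory usage is decided by one number chosen at startup rather than by runtime behaviour.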


Fortran didn't have dynamic memory allocation until Fortran 90. If you wanted to run a problem larger than the original code's author had anticipated, you needed to recompile the source with larger values. This could indeed be an annoying restriction. But you might be surprised how much software was successfully written in Fortran.


Ada/SPARK does not allow dynamic allocations; all allocations must be proven at compile time.


That's actually Ada's behaviour for stack allocations.



