
As a former academic gone mercenary, I love that programmers of all skill levels can get into organization, language and specification through Python. It's a wonderful gateway to learning, and in this case, it seems to have done exactly what Python was specified to do!

For my own projects, I've used pretty much all the fad languages from BASIC to Pascal to C. I've used industry-adopted languages like Tcl, Java and Perl. I've experimented with Lisp and Erlang.

After all is said and done, I like Python/Cython/C. It gives me the option to use a glue language for the Windows programming I've done, which is perfect. It gives me a compiled (albeit ctypes-restricted) subset language for writing hashing/storage/whatever functions. It gives me a super-awesome library set that comes built-in.

I know it'll go away someday, but I'm glad to have worked in Python for as long as I have.



Python is remarkable for having a dozen or so major language features disallowed by fiat (removal of the GIL, unrestricted anonymous functions, TCO, braces) but still being extremely powerful, expressive, and used by programmers of all skill levels.

It's difficult to be obscure in Python. Yet if I really need to do something jinky, like remapping an outside library's methods, I can do it with a few lines of slightly-uglier code. That's a very hard balance to strike.
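A minimal sketch of what "remapping an outside library's methods" can look like in practice (monkey-patching). Here `json.dumps` stands in for any third-party function; the wrapper name is invented for illustration:

```python
import json

# Keep a reference to the original so the patch stays reversible.
_original_dumps = json.dumps

def dumps_sorted(obj, **kwargs):
    # Force sorted keys everywhere without touching any call sites.
    kwargs.setdefault("sort_keys", True)
    return _original_dumps(obj, **kwargs)

# Remap the library's function at runtime.
json.dumps = dumps_sorted

print(json.dumps({"b": 2, "a": 1}))  # {"a": 1, "b": 2}
```

It's a few lines of slightly-uglier code, exactly as described, and every caller of `json.dumps` picks up the new behavior.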

Good design involves a lot of "no". I don't think languages are an exception.


Minor correction. The removal of the GIL is not disallowed by fiat. The removal of the GIL was attempted (via granular locking) and destroyed single-threaded performance. That's unacceptable for a variety of reasons. In the same thread, Guido has repeatedly stated that the GIL was a tradeoff, and that he would welcome anything that lessened its impact or removed it, should it be proven not to significantly harm Python's primary function.

If someone showed up tomorrow with a patch that allowed free threading without seriously breaking Python, crippling single-threaded performance, or making extension modules impossible to write, I could very well see it getting in. In reality, doing this requires a serious reworking of the interpreter (a la Unladen Swallow) to make it even feasible. There are a lot of things the GIL actually helps with, for example the ease of writing C-based extension modules; unfortunately, they come with a price.

The price is/was seen as an acceptable tradeoff, and I think it has served Python well, even if no one would shed a tear were it to disappear tomorrow.

Also, braces aren't a feature :p


> Minor correction. The removal of the GIL is not disallowed by fiat.

To piggyback: don't forget that CPython ≠ Python. The GIL is an implementation detail of CPython, not a feature of the language. Other Pythons -- Jython, for one -- have no GIL.


True enough.

BTW, if you'd like to chat more about concurrency, I'm around all weekend (although I'm late getting to the University of Belgrano today). As my comments during your talk yesterday indicated, I think you're pretty off-base arguing that people aren't trying out interesting concurrency solutions in Python — ZODB is a (somewhat broken) STM that's been production-stable for many years, and Stackless is specifically targeted at actor-based modeling.

So I think people in the Python community have actually been way out in front on these issues. It's just that most people use Python for production work, not research, so the researchy stuff doesn't make it into the mainline.


> It's difficult to be obscure in Python.

It's occasionally difficult to be obscure with Python syntax. It's not at all difficult to be obscure with anything in Python above that level. Even then, if you do weird things with metaclasses or decorators, it's easy to be obscure in Python.
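To illustrate the "weird things with metaclasses" point, here's a deliberately obscure (but valid) sketch: a metaclass that silently rewrites every method of a class. All names here are invented for illustration:

```python
class Loud(type):
    """Metaclass that uppercases any string a method returns."""
    def __new__(mcls, name, bases, ns):
        def wrap(f):
            def inner(*args, **kwargs):
                out = f(*args, **kwargs)
                return out.upper() if isinstance(out, str) else out
            return inner
        for key, val in list(ns.items()):
            if callable(val) and not key.startswith("__"):
                ns[key] = wrap(val)
        return super().__new__(mcls, name, bases, ns)

class Greeter(metaclass=Loud):
    def hello(self):
        return "hi"

print(Greeter().hello())  # HI
```

Nothing at the call site hints that `hello` was rewritten, which is precisely the kind of obscurity the syntax alone can't prevent.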


Decorators are an odd case -- I find that they make code dramatically clearer at the call-site, but that the decorator definitions get hairy fast (and get worse the more life-saving they are). They ended up being one of the best macro systems ever devised (they're applied at compile-time!).


> they're applied at compile-time!

I seriously don't understand that statement. Decorators are callables that return callables, nothing else, nothing special. Decorator syntax is just sugar.

Decorators can be applied any time including runtime. I'm not even sure what you mean by "compile time" in regards to Python, import time?
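A quick sketch of that point: since decorators are ordinary callables, the same wrapping the `@` sugar performs can be done "by hand" at any moment during runtime. The names below are invented for illustration:

```python
import functools

def shout(func):
    """An ordinary decorator: a callable returning a callable."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

def greet(name):
    return f"hello, {name}"

# Applied at runtime, long after the def executed; identical in
# effect to writing @shout above the def.
greet = shout(greet)

print(greet("world"))  # HELLO, WORLD
```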


They're applied at parse-time, and they completely mask the def that they are applied to.

Yes, the function being used as a decorator could just as easily be called at runtime, and that does break the similarity to macros somewhat.


Only decorator syntax happens somewhere around parse-time (which I call import time, but meh).

> completely mask the def that they are applied to.

@decorator reassigns the name of the def. But if there are other refs to that def, they do not change and are not "masked". It is no different from assignment. You know that these two are identical, right?

  @foo
  def bar():
    pass
  
  def bar():
    pass
  bar = foo(bar)
Also, this is a perfectly valid decorator that doesn't mask the original func:

  def deco(func):
    # do something terribly clever here
    return func

decorator syntax != decorators is my main point. But by conflating the two I can begin to understand your statement.


> I know it'll go away someday,

Aside from the obvious that just about everything will go away someday, is there a reason you believe this?

From my vantage point (albeit limited), Python looks like it has staying power. It is easy to use, efficient enough (and with Unladen Swallow in development it may get a lot more efficient soon), and currently seems to be gaining ground rather than losing it. Eventually, I am sure, it and any other language you can possibly name will be supplanted, but I am not aware of any particular reason to expect that any time remotely soon.


> is there a reason you believe this?

http://en.wikipedia.org/wiki/Anicca

> python looks like it has staying power

So did the Roman Empire, JOVIAL, FORTH, and PL/1.


Heh, first to be clear, I agree that very few things are truly permanent. In fact, I said that in my original question. What I said is that I do not think Python will be going away any time soon, and your examples play nicely into that theme.

The Roman Empire lasted for many centuries; JOVIAL, FORTH, and PL/1 all had very long runs. In fact, they are all still used in some limited capacities today.


> From my vantage point

I might have dated myself above by mentioning Tcl and Pascal, but I've been programming for a few decades. Without ruining the cosmic punchline: all things go away surprisingly fast.


You must be old. Basic hasn't been a fad since the 80s.


BASIC was never a 'fad' - it was frequently all you had (other than machine code). Remember, no internet, no CDs, even no modems for the vast unwashed such as myself - just a tape deck, 16K ROM and a TV.

The first language where I got to choose it was Lisp for the Sinclair QL, ~ 1985.


You must have an interesting definition of 'old'.


I assume he means VB(.NET).


Now that this conversation is closed up, I'll date myself further. I do not mean Visual Basic (nor Victoria Bitter.)



