An interview with Barbara Liskov (quantamagazine.org)
160 points by theafh on Nov 20, 2019 | 58 comments


> In my version of computational thinking, I imagine an abstract machine with just the data types and operations that I want. If this machine existed, then I could write the program I want. But it doesn’t. Instead I have introduced a bunch of subproblems — the data types and operations — and I need to figure out how to implement them. I do this over and over until I’m working with a real machine or a real programming language. That’s the art of design.

Love the way she put this. Good software design is a process of coming up with layers of vocabularies that most expressively describe what needs to be described, until you hit the bedrock of something that already exists that you can hand the rest off to.


Which is why I prefer working with a strongly typed functional language. Domain-Driven Design (well, my version of it) is the first step: get all the types right. The necessary operations should be obvious by that point. Code them up! Write unit tests as you go. You're likely to do some refactoring as your operations materialize, and unit tests will help you stay safe.
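To make the types-first idea concrete, here is a minimal Python sketch (not the commenter's actual workflow; the money-transfer domain and all names are hypothetical). Once the types pin down the domain, the operations largely write themselves:

```python
from dataclasses import dataclass
from typing import NewType

# Hypothetical domain: money transfers. Model it with precise types
# first; the operations then suggest themselves.
AccountId = NewType("AccountId", str)

@dataclass(frozen=True)
class Money:
    cents: int  # minor units, to avoid float rounding

    def __add__(self, other: "Money") -> "Money":
        return Money(self.cents + other.cents)

@dataclass(frozen=True)
class Transfer:
    source: AccountId
    target: AccountId
    amount: Money

def total(transfers: list[Transfer]) -> Money:
    # With the types in place, this operation is obvious -- and easy
    # to unit-test as the design is refactored.
    return sum((t.amount for t in transfers), Money(0))
```

The frozen dataclasses double as the "vocabulary" Liskov describes: each one is a small abstract machine with exactly the operations the domain needs.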


That's the fun part. Too often forgotten is that, now that you've invented a new vocabulary, you need to teach it to others, or you're just speaking your own private dialect.


I like the way you just phrased it even better!


That's more succinctly defined as abstraction.


The key point is the top-down approach. Many programmers come up with their abstractions by thinking "what can I make?", but it's often better to think "what do I want to make, and in a perfect world, how would I want to express that?", and going down from there.


SICP called it design by wishful thinking; I always liked that.


You really should approach from both ends.

You have to consider the hardware if you want any amount of efficiency. You should also consider your 'wishful thinking', because those wishes sometimes come true (not always, but often enough that it's worthwhile to think about the ideal).
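SICP's "design by wishful thinking" can be sketched in a few lines of Python (a hypothetical example; the function names are made up for illustration): write the top layer first against operations you wish existed, then implement each wished-for operation one layer down, repeating until you reach real primitives.

```python
# Top layer: written first, against an imagined vocabulary.
def spell_check(document: str) -> list[str]:
    return [w for w in words(document) if not in_dictionary(w)]

# Next layer down: implement the wished-for operations, possibly
# introducing further subproblems, until only real primitives remain.
def words(document: str) -> list[str]:
    return document.lower().split()

def in_dictionary(word: str) -> bool:
    return word in {"the", "cat", "sat"}  # stand-in for a real lexicon
```

This is exactly the "abstract machine with just the data types and operations that I want" move from the interview, applied one layer at a time.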


If you're interested in computer history, here's a transcript of a long, in-depth interview with Barbara Liskov in 2016 about how she got into computing, the CLU project, and many other aspects of her career. Quite inspiring!

https://amturing.acm.org/pdf/LiskovTuringTranscript.pdf


Such a waste of a good title. Should have been:

"An interview with Barbara Liskov. Accept no substitutes."


It would have been a great title.

But the pedants like me would have crept up to say "but Liskov tells us we can accept substitutes".


You know, I've heard about the Liskov substitution principle but have never really connected it with an actual person (especially a living one). It's nice to see that some pioneers in our field are still alive and well.


"Barbara Liskov invented the architecture that underlies modern programs."

Basic explanation of this claim: she invented CLU, the first language without goto statements.


I don't think your statement is intended to be dismissive, but it is a bit... sparse.

Related to CLU: It introduced a lot of concepts together in one language. "Key contributions include abstract data types,[6] call-by-sharing, iterators, multiple return values (a form of parallel assignment), type-safe parameterized types, and type-safe variant types." [1]
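Two of those CLU contributions survive almost unchanged in modern languages; a small Python sketch (hypothetical example, not from the thread) shows how directly they map:

```python
from typing import Iterator, Tuple

# CLU's iterators yielded elements to a for-loop one at a time --
# essentially what Python generators do today.
def elements(limit: int) -> Iterator[int]:
    n = 0
    while n < limit:
        yield n
        n += 1

# Multiple return values, consumed via parallel assignment:
def divmod_pair(a: int, b: int) -> Tuple[int, int]:
    return a // b, a % b

q, r = divmod_pair(17, 5)  # parallel assignment of both results
```

Python's generators explicitly cite CLU's iterators as an ancestor, so the resemblance is lineage rather than coincidence.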

Liskov also has other contributions[2]. Per the article, she says herself in the early days there were a lot of big problems to solve, so it was somewhat easier to make a "big" contribution.

It's interesting to look at where many modern concepts came from. It would be interesting to dig in to these a little more. Time is the one resource that's hard to come by though.

[1] https://en.wikipedia.org/wiki/CLU_(programming_language) [2] https://en.wikipedia.org/wiki/Barbara_Liskov


Reading about the history of so many major design decisions coming from one person also makes me wonder how different modern languages would be without the contributions of that person. Would we be more or less where we are, but a couple years behind? Would we have gone a different route with some of these decisions?


"The first language without goto's" is quite obviously HUGE! On it's own.


The original title was just clickbait that said something like 'an interview with the architect of modern software' so I was giving a very sparse overview that was still not just a mystery title.


Mostly she is known for the Liskov substitution principle (LSP), contravariance of argument types and covariance of the return type. Many OO languages/designers still don't get that right. https://en.wikipedia.org/wiki/Liskov_substitution_principle

"FUNCTIONS THAT USE POINTERS OR REFERENCES TO BASE CLASSES MUST BE ABLE TO USE OBJECTS OF DERIVED CLASSES WITHOUT KNOWING IT." - https://web.archive.org/web/20151128004108/http://www.object...



Both links you provided are about emphasis. GP was not using caps for emphasis, they were using caps in a direct quote. I totally get where you're coming from on the yelling thing, but it doesn't apply to GP's comment by my reading of the guidelines.


A charitable reading would allow that if the guidelines are saying to not use all caps for emphasis, then it's guiding against all caps in general. There's a reason they are called "guidelines" and not "rules."


It's not for emphasis, it's a literal citation from the URL.


Ok, but it's the convention here not to post entire sentences in uppercase, even if they appeared that way in the original. It's the same with titles; some titles are typeset in uppercase, or come from older documents where that was the convention. We edit those for the same reason: in the context of a text-based internet forum, allcaps is like yelling.


Yes, generally I'm with you not to shout or let other people shout. But in this case I thought it was appropriate to let him shout.


Except in direct literal quotes.


Isn't it also the convention not to bicker about the rules in the comments?


Except in direct literal quotes.


The article doesn't seem to mention https://en.wikipedia.org/wiki/Liskov_substitution_principle at all, which I think is her greatest accomplishment, formulating it.


pg (amongst others) talked a lot about this: middle-layer language design, eDSLs, language-oriented programming à la Racket.

Fun to see Liskov mention it when so few, if any, of the OO world thought this way.


Liskov is one of the few programmers who truly understood OO-programming waaaaayyyy back in the 80s.

https://dl.acm.org/citation.cfm?doid=62139.62141

And later developed into the Liskov Substitution Principle ('94): https://www.cs.cmu.edu/~wing/publications/LiskovWing94.pdf

Over the next 20 years, people would be writing about bad examples like "Class Car extends class transportation", but the fundamentals of good OO-programming were laid out by Liskov pretty early on...


The thing with pioneers is that most of the followers in the decades after have no clue about what they meant.


Back in the day, I treasured my copy of Liskov and Guttag's "Program Development in Java". I enjoyed its discussion of a disciplined programming style centered around maintaining invariants in your data structures. It was also the only book I've seen which talked about modeling the state of your design using a graphical notation inspired by Alloy. I wanted more exposure to this material, so I even purchased a copy of their earlier work (which is also excellent). Point being, I have a lot of respect for her.

Like some of the other comments, I loved how she expresses her thoughts about computing in this article, e.g: "In my version of computational thinking, I imagine an abstract machine with just the data types and operations that I want..." (reminds me of Norvig/Graham talking about Lisp). But I don't like this article overall because I think it takes cheap shots at men in computing.

"I’m worried about the divorced couple in which the husband publishes slander about the wife, including information about where she lives." When I first encountered this statement, I thought it was being unnecessarily specific but didn't think anything else of it. After finishing the rest of the article, I think it was a purposeful choice, and in the current environment, I don't think it's a fair statement to make. A male making the opposite statement would likely be called out, so I'm calling her out here.

"At Berkeley, I was one of one or two women in classes of 100. No one ever said, “Gee, you’re doing well, why don’t you work with me?” I didn’t know such things went on. I went to graduate school at Stanford. When I graduated, nobody talked to me about jobs. I did notice that male colleagues, like Raj Reddy, who was a friend of mine, were recruited for academic positions. Nobody recruited me." She didn't say this explicitly, but my interpretation is that she's using this as evidence of a gender bias in recruiting. Clearly, gender bias is one possibility, but what about giving people the benefit of the doubt? There are other more charitable explanations (somebody who understands conditional probability better than me could comment on this situation from a probabilistic perspective).

"Back then, advisers placed graduates through deals with departments around the country.

Yes, but nobody made deals for me. In the ’90s, I went back to Stanford for a department celebration. A panel of the old professors, without knowing what they were doing, described the old boy network. They said, “Oh, my friend over there told me that I’ve got this nice young guy you should hire.” It was just how it was. They were clueless. They talked about a young woman who did so well because she married a professor! Clueless. Another colleague had a pinup in his office. I asked him, “What’s that pinup in your office?” Clueless." I love her directness here, but on the other hand her choice of words suggests to me that she has a chip on her shoulder, something I wasn't expecting from somebody so accomplished!

This reminds me of how greatly our beliefs affect our perception of the world, like in the beginning of a relationship when everything seems perfect, but after a fight, suddenly even the smallest things about your partner start to annoy you. The quirkiness of their laugh, or the way they chew their food – how did I miss how annoying that was before? But in reality they didn't change at all – only my perceptions did, because I no longer liked them.

"Even so, is it correct that there were approximately 1,000 faculty members when you started at MIT, only 10 of whom were women?

That was my recollection.

So there was progress, but …" I don't like the use of the word "progress" here, but I realize that is potentially another highly inflammatory discussion. Here, it's just more evidence that this discussion is politically charged.

"My sense is that all scientific fields have failed to recognize some foundational contributions by women.

In the 10 years before I was head of computer science at MIT, the department identified only one woman worth hiring. When I was the head [from 2001 to 2004], I hired seven women. We didn’t scrape the bottom of the barrel. All three junior women I hired are outstanding. There was a long period of time where women were not considered at all." Is that last sentence meant to be taken literally? I would be surprised if it were true. I'm ignorant of the goings-on at these levels, but I have faith that the majority of people involved in these decisions (around the time that she's referring to) were not sexist (the mere fact that she became the department head is evidence of this). I would also be interested in more details about why she was successful in hiring women compared to the previous head.

The big tech companies are making an effort to hire more women. One way they do this is by recruiting heavily at events with a preponderance of women. Somehow this is seen as not being discriminatory, but I disagree. It's like there are two separate lines of applicants, and by choosing to focus on one, the people in the other line are hurt (and to be clear, my position is that I'd rather gender not be part of the equation at all, at least for tech jobs).

I could go on about the article, but I think I've made my basic point.

I also realize that I'm subject to the same biases that I allude to above. I too have a chip on my shoulder because I believe I was unfairly accused of being a sexist at work. As a result, I've become highly sensitive around this topic (e.g. I followed the James Damore events very closely) and I too view things through an (even more) distorted lens.


You are, in fact, the one who sounds like you have a chip on your shoulder.


OP says as much, which I respect, even if I don't share his position. It's just a rough subject these days.


Liskov's work on programming languages is (of course) well known. She then switched research topic to distributed systems and among other things developed the "Practical Byzantine Fault Tolerance" (PBFT) algorithm.

I thought this was interesting, because like 10 years ago I had never really heard of her distributed-systems work, and kind of thought of Liskov as a one-hit wonder. But now in the last few years PBFT is suddenly pretty hyped, because it's a key part of a bunch of new blockchains, so maybe she was far ahead of the curve...


She's done both for a long time. She came up with the replication protocol that Lamport termed Paxos a year before Lamport did (she called it Viewstamped Replication). That was 1988.


Meta: Until I clicked this link, I had no idea who Barbara Liskov is.

Perhaps re-title this to "An interview with Barbara Liskov, The Architect of Modern Algorithms"


Not every title should explain itself. On HN, it's good for readers to work a little. https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...

Among other reasons, it interrupts the reflexive circuits and gives time for the reflective ones to kick in. https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...


Liskov is one of the more famous people in the history of computing, which is right in HN's wheelhouse.

I don't know who half the people are who are mentioned on the HN front page. On the one hand, I'd like it if every person mentioned in a title had a description of who they are. On the other hand, that's overkill for a lot of famous people, so you've gotta draw a line somewhere.


For me it was "The only Liskov I know is from the liskov substitution principle" and then I discovered this is the same Liskov, which was rather neat.


Poor journalism and writing ruined an otherwise thoughtful piece on an influential computer scientist.

The root cause is probably that the author doesn't understand the ideas well enough, and so is tempted to overblow the implications...


Yeah, "You came of age professionally during the development of artificial intelligence" in the opening question told me right off the bat that something was off with the interviewer. Some version of AI research has been around nearly as long as computers themselves, but the late 60s weren't exactly a focal point like, say, the 80s were. Plus, her primary contribution to computer science has little to do with AI directly. It felt like misunderstanding and/or a wide reach for a hot topic.


The perceptron, which preceded both the Minsky-induced AI winter and the current hype around deep learning, was in fact invented in 1958.


In this case the author's background is "PhD in Mathematics at Dartmouth College, Master of Arts in Teaching Mathematics at Smith College, and BA in Anthropology at Bard College."

With that background I wouldn't expect expertise in programming or programming-language history. The latter actually seems to be an academic field in its own right, albeit a small one.

https://www.amazon.com/History-Programming-Languages-Thomas-... (including a chapter on CLU/Barbara Liskov) is a fantastic collection of papers from the ACM/SIGPLAN Second History of Programming Languages Conference. It was my first purchase from Mr Bezos' UK online book store sometime in 1999 or so, I think.

This is her paper from that book:

https://dl.acm.org/citation.cfm?id=1057826


Then it is baffling that this article comes to be...


Maybe I wasn't clear. The article author has zero academic experience in programming languages, afaik. Barbara Liskov has, though.


Hmmm...

"The language, CLU (short for “cluster”), relied on an approach she invented — data abstraction — that organized code into modules."

Let's ignore how preposterous it is to claim you invented data abstraction and move to the module claim... that a simple search for "who discovered programming modules" shows:

"The Modula programming language ... by Niklaus Wirth, the same person who designed Pascal. The main innovation of Modula over Pascal is a module system, "

And it seems others were thinking the same way: 'After you won the Turing Award, a comment appeared online saying, “Why did she get this award? She didn’t do anything that we didn’t already know.”'

Not sure what to think here but I never heard CLU was the first to add modules. I'll just leave it at that.


> Let's ignore how preposterous it is to claim you invented data abstraction

It's not exactly fair to blame a researcher for the overhyped claims made on their behalf by a popular science writer.


And it seems others were thinking the same way: 'After you won the Turing Award, a comment appeared online saying, “Why did she get this award? She didn’t do anything that we didn’t already know.”'

This is not only distasteful but false.

First, modules in Modula were primarily about scope, not abstraction.

Liskov pioneered a core aspect of abstraction which, among other things, allows modules to rely on abstract interfaces rather than their implementations (or a type or class rather than its subtypes or subclasses).

Abstraction only works reliably if any provably true property of a type is also provably true about its subtype.

Want to learn more? You’ll find it referenced in the literature as the Liskov Substitution Principle.

https://dl.acm.org/citation.cfm?doid=197320.197383


When did ML gain its module feature (which does enable the equivalent of data abstraction)? From a very casual search, the earliest references I can find are from the early 1980s, so well after CLU – but the very first versions of ML itself were being worked on around the same time.


That’s a good question, I don’t know the history well enough to comment on that.

Much more important than CLU itself — groundbreaking as it was — was Liskov’s formalization of substitutability as a rigorous principle one can use to reason about the quality or correctness of an arbitrary abstraction (or model thereof).


I'm pretty sure this was around 1983/1984. See the paper "Modules for Standard ML" ('84) by Dave MacQueen; MacQueen had previously written "Modules for Hope" ('81), which likely influenced it.

Unfortunately the SML-Family site doesn't have links to these in its history section, afaict. But ML languages didn't get modules until the standardization effort that produced Standard ML.


If I recall correctly, ML first appeared as part of the Edinburgh LCF theorem prover in 1979 and it did not have modules at that time.


To echo what pulisse said:

Please let's be careful to distinguish anything Liskov claims about her own work from claims made on her behalf by a non-specialist author. Most times the latter happens, the actual subject ends up wincing over the inaccuracy. To get that wrong as a reader is to repeat the error.


To be fair, the first version of Modula was released around the same time as CLU, and the latter additionally includes a number of modularity-enhancing features which were quite influential on later programming languages. So I'm inclined to endorse the claim made in the article.


The "simple search" would be wrong. Wirth says [https://inf.ethz.ch/personal/wirth/Articles/Modula-Oberon-Ju...]:

> I encountered the language Mesa [… which] allowed to develop parts of a system – called modules – independently

Mesa was first released to the public in 1976, so CLU might have been slightly earlier.

Modules as a concept predate both Mesa and CLU. David L Parnas was one of the popularizers of the concept, and in one of his early papers on the subject [https://www.win.tue.nl/~wstomv/edu/2ip30/references/criteria...] his quotes show that the term was well established by 1970.

But CLU was one of the first programming languages to feature abstract data types, iterators, classes, and templates. Even in the mid-80s, when I first heard about it, it seemed shockingly modern.


Modula took its module system from Mesa, after Niklaus Wirth spent his first sabbatical year at Xerox PARC; he did not invent modules.

And as others have already pointed out, the word "modules" is the same, but not the semantics.


I'm not sure it's preposterous. Based on what I've read, she and her team focused on teasing out the abstraction and modularity features in CLU based on experience building previous large- and medium-scale software systems.

That doesn't preclude others inventing it before, after, or at the same time, nor the same ideas being explored in other fields earlier with other terminology or focus.

I'm sure there is some field of human endeavour that has explored and documented the same ideas in a different context much earlier.



