[PRL] [plt-internal] Video of Clojure talk

Sam TH samth at ccs.neu.edu
Thu Oct 2 14:33:49 EDT 2008


On Thu, Oct 2, 2008 at 2:02 PM, Shriram Krishnamurthi <sk at cs.brown.edu> wrote:
> Part I: It's a Lisp; it has lots of data structures under a unified
> type (I didn't quite catch how these are all treated similarly while
> preserving their distinctive properties -- maybe that was the
> interesting part);

That is certainly part of the interesting part.  More interesting to
me was the presence of multiple high-performance, purely functional
data structures.  Clojure is a much less mutation-heavy language than,
say, PLT Scheme.


> Btw, about the absence of tail calls, he repeats this fact in a bit
> more detail at around 63:30, saying:
>
>  SISC Scheme does tail calls on the JVM, right?  It does it with this
>  whole other programming infrastructure.  Their calling convention is
>  nothing like Java's, right?  You have to pass additional things, or
>  trampoline, or whatever... Clojure does not do that.  Clojure has
>  pedal-to-the-metal calling conventions that match Java's.  So I
>  don't have tail recursion because you can't do that unless the JVM
>  does that.
>
> Maybe we should have written "Continuations from Generalized Stack
> Inspection" as "Tail-Calls from Generalized Stack Inspection".

I don't think that he'd be willing to take the performance hit from
that either.  His point is that if you want a loop on the JVM, you
can't write it with tail calls and get the performance you want.  It's
a problem with trying to implement functional languages on the JVM,
but I sympathize with his choice.
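
To make the trampolining he dismisses concrete, here's a sketch (my own
illustration, not from the talk) of how mutual recursion gets rewritten in
trampolined style using Clojure's own `trampoline`: each step returns a
thunk instead of making a direct call, and `trampoline` loops over the
thunks so the JVM stack stays flat -- at the cost of an extra closure
allocation and dispatch per call, which is exactly the overhead he won't pay:

```clojure
;; Mutually recursive even/odd check in trampolined style.  Direct
;; mutual calls would grow the JVM stack; here each step returns a
;; zero-argument function, and `trampoline` keeps invoking the result
;; until it gets a non-function value back.
(declare my-odd?)

(defn my-even? [n]
  (if (zero? n) true #(my-odd? (dec n))))

(defn my-odd? [n]
  (if (zero? n) false #(my-even? (dec n))))

;; (trampoline my-even? 100000) => true, in constant stack space
```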

> So he has RECUR for loops.  He says something that sounds
> contradictory (65:00) about tail calls -- I don't quite get it.

What I think you're referring to is him saying that this:

(defn tail-loop [n]
    (if (= 0 n) "whee" (recur (- n 1))))

runs in constant stack space, even though the `loop' form is not used.
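
That is, a `recur` in tail position of a function body jumps back to the
top of the function itself.  For comparison, here is a sketch of the same
function written with an explicit `loop`, which establishes its own recur
target (the names here are my own):

```clojure
;; `loop` binds i to n and becomes the recur target, so `recur`
;; rebinds i and jumps back to the loop head -- no stack growth.
(defn tail-loop-2 [n]
  (loop [i n]
    (if (= 0 i) "whee" (recur (- i 1)))))

;; (tail-loop-2 1000000) => "whee"
```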

> The concurrency is at 19:45 in Part II.  That's where this whole
> presentation gets interesting.  So skip to there.  (Sadly he never
> repeats questions, the questions are inaudible, and most of the
> interesting bits here seem to be in the responses to questions.  He
> has some random babble about type systems.)

If you flag points where you'd like me to try to remember the context
of the answer, I can probably do that (as for the bit at 65:00 above).

> Punch line: he implements an STM.

Well, yes, but not just that.  He restricts mutation to transactions,
while reads can happen outside of transactions, and he has some
additional concurrency features that are more like message passing.
Plus the commute operator, which is the first new idea I've seen in TM
in a while.
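
For concreteness, a minimal sketch of commute on a ref (my example, not
his): because `inc` is commutative, the STM is free to reorder concurrent
increments rather than retrying whole transactions that touch the counter:

```clojure
;; A counter held in a ref.  `commute` tells the STM the update
;; function commutes with itself, so two transactions incrementing
;; the same ref concurrently need not conflict and retry.
(def hits (ref 0))

(defn record-hit! []
  (dosync (commute hits inc)))

;; (dotimes [_ 5] (record-hit!))
;; @hits => 5
```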

> The COMMUTE construct is nice.  Unfortunately, there is no checking
> that the operations actually are commutative, and they may not be.
> [Btw, the CSCW (Computer-Supported Collaborative Work) folks have a
> huge amount of theory and examples for making surprising things
> commutative.]

Do you have some examples or something I could google?  The Wikipedia
article seems to be about HCI, and I could sort of imagine how it
would be related, but some pointers would be useful.

Overall, I think the talk is interesting for a few reasons:

1. He has a very clear vision of what *he* wants from his language,
and how to get it, which I think is a rare quality among language
designers.
2. His data structure library made me envious.
3. His ideas about concurrency were the first in several years that
seemed notable to me.
-- 
sam th
samth at ccs.neu.edu
