Arc Forum
Adding internal definitions by extending `do`
3 points by aw 2178 days ago | 13 comments
I was curious, has anyone implemented "internal definitions", that is, a way to introduce a variable like `let`, but without having to indent the code that uses it?

For example, I may have a function like:

    (def foo ()
      (let a (...)
        (do-something a)
        (let b (... a ...)
          (do-something-else b)
          (let c (... b ...)
            etc...))))

and the indentation keeps getting deeper the more variables I define. `withs` can help, but only if I'm not doing something in between each variable definition.

In Racket, this could be written

    (define (foo)
      (define a (...))
      (do-something a)
      (define b (... a ...))
      (do-something-else b)
      (define c (... b ...))
      etc...)

In Racket, this is called "using define in an internal definition context"; it's like using `let` to define the variable (the scope of the variable extends to the end of the enclosing form) except that you don't have to indent.

In Arc, I'd like to use `var`, like this:

    (def foo ()
      (var a (...))
      (do-something a)
      (var b (... a ...))
      (do-something-else b)
      (var c (... b ...))
      etc...)

In Arc 3.2, `do` is a macro which expands into a `fn`, which is a primitive, builtin form. We could swap this, so that `fn` was the macro and the function body would be wrapped in a `do` form.

Thus, for example, `(fn () a b c)` would expand into `($fn () (do a b c))`, where `$fn` is the primitive, builtin function form.
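If `$fn` were the name the compiler gave the primitive, the `fn` macro itself would be tiny. A sketch (untested; `$fn` stands for whatever the builtin form ends up being called):

    (mac fn (parms . body)
      `($fn ,parms (do ,@body)))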

Many macros that have bodies, such as `def` and `let`, expand into a `fn`, and this change would mean that the body of these forms would also be wrapped in a `do`.

For example, `(def foo () a b c)` expands (roughly) into `(assign foo (fn () a b c))`, which in turn would expand into `(assign foo ($fn () (do a b c)))`.

Thus, in most places where we have a body where we might like to use `var`, there'd be a `do` in place which could implement it.

Why go to this trouble? `var` can't be a macro since a macro can only expand itself -- it doesn't have the ability to manipulate code that appears after it. However `do`, as a macro, can do whatever it wants with the code it's expanding, including looking to see if one of the expressions passed to it starts with `var`.
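Here's roughly what such a `do` could look like, as an untested sketch (it uses `caris` and `uniq` from arc.arc, and ignores for now the lexical-scope complication discussed below):

    (mac do body
      (if (no body)
           nil
          (no (cdr body))
           (car body)
          (caris (car body) 'var)
           (let (v name val) (car body)   ; v is just the symbol var
             `(let ,name ,val (do ,@(cdr body))))
           ; otherwise evaluate the first expression for effect
           `(let ,(uniq) ,(car body)
              (do ,@(cdr body)))))

Each `(var name val)` becomes a `let` whose body is the rest of the original `do`, so the scope of `name` extends to the end of the enclosing form, much like the Racket version.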

As a macro, this could be an optional feature. Rather than being built into the language, where it would be there whether you wanted it or not, if you didn't like internal definitions you wouldn't have to load the version of `do` which implements `var`.

A further complication is what to do with lexical scope. If I write something like,

    (let var (fn (x) (prn "varnish " x))
      (var 42))

I'm clearly intending to use `var` as a variable. Similar to having lexical variables override macros, I wouldn't want `var` to become a "reserved keyword" in the sense that I now couldn't use it as a variable if I wanted to.

The Arc compiler knows, of course, which lexical variables have been defined at the point where a macro is being expanded. (In ac.scm the list of lexical variables is passed around in `env`.) We could provide this context to macros using a Racket parameter.
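For example, ac.scm could wrap the macro application in a `parameterize` and expose the parameter to Arc with `xdef`. A rough sketch from memory (the helper names `ac-mac-call`, `ac-denil`, and `ac-niltree` are approximately what Arc 3.1/3.2 use; `lexenv` is a made-up name):

    ; in ac.scm
    (define lexenv (make-parameter '()))

    (define (ac-mac-call m args env)
      (let ((expansion (parameterize ((lexenv env))
                         (apply m (map ac-niltree args)))))
        (ac (ac-denil expansion) env)))

    ; let Arc code ask which variables are lexically bound
    ; at the point of expansion
    (xdef lexenv (lambda () (ac-niltree (lexenv))))

Then the `do` macro could leave a `(var ...)` form alone whenever `var` shows up in `(lexenv)`.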



3 points by rocketnia 2176 days ago | link

"`var` can't be a macro since a macro can only expand itself -- it doesn't have the ability to manipulate code that appears after it"

It can't be a regular Arc macro, but Arc comes with at least two things I'd call macro systems already: `mac` of course, but also `defset`. Besides those, I keep thinking of Arc's destructuring syntax and reader syntax as macro systems, but that's just because they could easily be extended to be.

I would say it makes sense to add another macro system for local definitions. As a start, it could be a system where the code (do (var a 1) (var b 2) (+ a b)) would expand by calling the `var` macro with both the list (a 1) and the list ((var b 2) (+ a b)).
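To make that concrete, here's an untested sketch of what registering such a macro could look like (every name here -- `body-macs*`, `mac-in-do` -- is made up for illustration):

  (= body-macs* (table))

  (mac mac-in-do (name parms rest-parm . body)
    `(= (body-macs* ',name) (fn (,parms ,rest-parm) ,@body)))

  ; `var` is handed its own argument list, e.g. (a 1), plus the rest
  ; of the body, e.g. ((var b 2) (+ a b)), and expands both at once
  (mac-in-do var (name val) rest
    `(let ,name ,val (do ,@rest)))

Then `do` would look up the head of each expression in `body-macs*` and, when it finds an entry, call it with the form's arguments and everything after the form, instead of expanding the expressions one by one.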

But I think that's an incomplete design. All it does is help with indentation, and I feel like a simpler solution for indentation is to have a "weak opening paren" syntactic sugar where (a b #/c d) means (a b (c d)):

  (def foo ()
    (let a (...)
    #/do (do-something a)
    #/let b (... a ...)
    #/do (do-something-else b)
    #/let c (... b ...)
      etc...))
The "#/" notation is the syntax of my Parendown library for Racket (and it's based on my use of "/" in Cene). I recently wrote some extensive examples in the Parendown readme: https://github.com/lathe/parendown-for-racket/blob/master/RE...

(For posterity, here's a link specifically to the version of that readme at the time of this post: https://github.com/lathe/parendown-for-racket/blob/23526f8f5...)

While I think indentation reduction is one of the most important features internal definition contexts have, that benefit is less important once Parendown is around. I think their remaining benefit is that they make it easy to move code between the top level and local scopes without modifying it. I made use of this property pretty often in JavaScript; I would maintain a programming language runtime as a bunch of top-level JavaScript code and then have the compiler wrap it in a function body as it bundled that runtime together with the compiled code.

In Racket, getting the local definitions and module-level definitions to look alike means giving them the same support for making macro definitions and value definitions at every phase level, with the same kind of mutual recursion support.

They go to a lot of trouble to make this work in Racket, which I think ends up making mutual recursion into one of the primary reasons to use local definitions. They use partial expansion to figure out which variables are shadowed in this local scope before they proceed with a full expansion, and they track the set of scope boundaries that surround each macro-generated identifier so that those identifiers can be matched up to just the right local variable bindings.

The Arc top level works much differently than the Racket module level. Arc has no hygiene, and far from having a phase distinction, Arc's top level alternates between compiling and running each expression. To maintain the illusion that the top level is just another local scope, almost all local definitions would have to replicate that same alternation between run time and compile time, meaning that basically the whole set of local definitions would run at compile time. So local definitions usually would not be able to depend on local variables at all. We would have to coin new syntaxes like `loc=` and `locdef` for things that should interact with local variables, rather than using `=` and `def`.

Hm, funny.... If we think of a whole Arc file as being just another local scope, then it's all executed at "compile time," and only its `loc=` and `locdef` code is deferred to "run time," whenever that is. This would be a very seamless way to use Arc code to create a compilation artifact. :)

-----

1 point by akkartik 2176 days ago | link

"While.. indentation reduction is one of the most important.. that benefit is less important once Parendown is around."

One counter-argument that short-circuits this line of reasoning for me: the '#/' is incredibly ugly, far worse than the indentation saved :) Even if that's just my opinion, something to get used to, the goal with combining defining forms is to give the impression of a single unified block. Having to add such connectors destroys the illusion. (Though, hmm, I wouldn't care as much about a trailing ';' connector. Maybe this is an inconsistency in my thinking.)

-----

2 points by rocketnia 2175 days ago | link

"the '#/' is incredibly ugly, far worse than the indentation saved"

I'd love to hear lots of feedback about how well or poorly Parendown works for people. :)

I'm trying to think of what kind of feedback would actually get me to change it. :-p Basically, I added it because I found it personally helped me a lot with maintaining continuation-passing style code, and the reason I found continuation-passing style necessary was so I could implement programming languages with different kinds of side effects. There are certainly other techniques for managing continuation-passing style, side effects, and language implementation, and I have some in mind to implement in Racket that I just haven't gotten to yet. Maybe some combination of techniques will take Parendown's place.

---

If it's just the #/ you object to, yeah, my preferred syntax would be / without the # in front. I use / because it looks like a tilted paren, making the kind of zig-zag that resembles how the code would look if it used traditional parens:

  (
    (
      (
  /
  /
  /
This just isn't so seamless to do in Racket because Racket already uses / in many common identifiers.

Do you think I should switch Parendown to use / like I really want to do? :) Is there perhaps another character you think would be better?

---

What's that about "the impression of a single unified block"? I think I could see what you mean if the bindings were all mutually recursive like `letrec`, because then nesting them in a specific order would be meaningless; they should be in a "single unified block" without a particular order. I'm not recommending Parendown for that, only for aw's original example with nested `let`. Even in Parendown-style Racket, I use local definitions for the sake of mutual recursion, like this:

  /...
  /let ()
    (define ...)
    (define ...)
    (define ...)
  /...

-----

1 point by akkartik 2175 days ago | link

My objection is not to the precise characters you choose but rather the very idea of having to type some new character for every new expression. My personal taste (as you can see in Wart):

a) Use indentation.

b) If indentation isn't absolutely clear, just use parens.

Here's another comment that may help clarify what I mean: https://news.ycombinator.com/item?id=8503353#8507385

(I have no objection to replacing parens with say square brackets or curlies or something. I just think introducing new delimiters isn't worthwhile unless we're getting something more for it than just minimizing typing or making indentation uniform. Syntactic uniformity is Lisp's strength.)

I can certainly see how I would feel differently when programming in certain domains (like CPS). But I'm not convinced ideas that work in a specific domain translate well to this thread.

"What's that about "the impression of a single unified block"? I think I could see what you mean if the bindings were all mutually recursive like `letrec`, because then nesting them in a specific order would be meaningless; they should be in a "single unified block" without a particular order. I'm not recommending Parendown for that, only for aw's original example with nested `let`."

Here's aw's original example, with indentation:

    (def foo ()
      (var a (...))
      (do-something a)
      (var b (... a ...))
      (do-something-else b)
      (var c (... b ...))
      etc...)
The bindings aren't mutually recursive; there is definitely a specific order to them. `b` depends on `a` and `c` depends on `b`. And yet we want them in a flat list of expressions under `foo`. This was what I meant by "single unified block".

(I also like that aw is happy with parens ^_^)

Edit: Reflecting some more, another reason for my reaction is that you're obscuring containment relationships. That is inevitably a leaky abstraction; for example error messages may require knowing that `#/let` is starting a new scope. In which case I'd much rather make new scopes obvious with indentation.

What aw wants is a more semantic change where vars `a`, `b` and `c` have the same scope. At least that's how I interpreted OP.

-----

2 points by rocketnia 2174 days ago | link

"What aw wants is a more semantic change where vars `a`, `b` and `c` have the same scope. At least that's how I interpreted OP."

I realize you also just said the example's bindings aren't mutually recursive, but I think if `a` and `c` "have the same scope" then `b` could depend on `c` as easily as it depends on `a`, so it seems to me we're talking about a design for mutual recursion. So what you're saying is the example could have been mutually recursive but it just happened not to be in this case, right? :)

Yeah, I think the most useful notions of "local definitions" would allow mutual recursion, just like top-level definitions do. It was only the specific, non-mutually-recursive example that led me to bring up Parendown at all.

---

The rest of this is a response to your points about Parendown. Some of your points in favor of not using Parendown are syntactic regularity, the clarity of containment relationships, and the avoidance of syntaxes that have little benefit other than saving typing. I'm a little surprised by those because they're some of the same reasons I use Parendown to begin with.

Let's compare the / reader macro with the `withs` s-expression macro.

I figure Lisp's syntactic regularity has to do with how long it takes to describe how the syntax works. Anyone can write any macro they want, but the macros that are simplest to describe will be the simplest to learn and implement, helping them propagate across Lisp dialects and thereby become part of what makes Lisp so syntactically regular in the first place.

- The / macro consumes s-expressions until it peeks ) and then it stops with a list of those s-expressions.

- The `withs` macro expects an even-length binding list of alternating variables and expressions and another expression. It returns the last expression modified by successively wrapping it in lexical bindings of each variable-expression pair from that binding list, in reverse order.

Between these two, it seems to me `withs` takes more time and care to document, and hence puts more strain on a Lisp dialect's claim to syntactic regularity. (Not much more strain, but more strain than / does.)

A certain quirk of `withs` is that it creates several lexical scopes that are not surrounded by any parentheses. In this way, it obscures the containment relationship of those lexical scopes.
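Concretely, in Arc (`foo`, `bar`, and `baz` are just placeholders):

  (withs (a (foo)
          b (bar a))
    (baz a b))

  ; ...behaves like the nested version, where each scope gets its own
  ; parentheses:
  (let a (foo)
    (let b (bar a)
      (baz a b)))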

If we start with / and `withs` in the same language, then I think the only reason to use `withs` is to save some typing.

So `withs` not only doesn't pull its weight in the language, but actively works against the language's syntactic regularity and containment relationship clarity.

For those reasons I prefer / over `withs`.

And if I don't bother to put `withs` in a language design, then whenever I need to write sequences of lexical bindings, I find / to be the clear choice for that.

Adding `withs` back into a language design isn't a big deal on its own; it's just slightly more work than not doing it, and I don't think its detriment to syntactic regularity and containment clarity is that bad. I could switch back from / to `withs` if it was just about this issue.

This is far from the only place I use / though. There are several macros like `withs` that I would need to bring back. The most essential use case -- the one I don't know an alternative for -- is CPS. If I learn of a good enough alternative to using / for CPS, and if I somehow end up preferring to maintain dozens of macros like `withs` instead of the one / macro, then I'll be out of reasons to recommend Parendown.

-----

2 points by akkartik 2174 days ago | link

"A certain quirk of `withs` is that it creates several lexical scopes that are not surrounded by any parentheses. In this way, it obscures the containment relationship of those lexical scopes."

That's a good point that I hadn't considered, thanks. Yeah, I guess I'm not as opposed to obscuring containment as I'd hypothesized ^_^

Maybe my reaction to (foo /a b c /d e f) is akin to that of a more hardcore lisper when faced with indentation-based s-expressions :)

Isn't there a dialect out there that uses a different bracket to mean 'close all open parens'? So that the example above would become (foo (a b c (d e f]. I can't quite place the memory.

I do like far better the idea of replacing withs with a '/' ssyntax for let. So that this:

  (def foo ()
    /a (bind 1)
    (expr 1)
    /b (bind 2)
    (expr 2)
    /c (bind 3)
    (expr 3))
expands to this:

  (def foo ()
    (let a (bind 1)
      (expr 1)
      (let b (bind 2)
        (expr 2)
        (let c (bind 3)
          (expr 3)))))
It even feels like a feature that '/' looks kinda like 'λ'.

But I still can't stomach using '/' for arbitrary s-expressions. Maybe I will in time. I'll continue to play with it.

-----

2 points by rocketnia 2174 days ago | link

"Isn't there a dialect out there that uses a different bracket to mean 'close all open parens'? So that the example above would become (foo (a b c (d e f]. I can't quite place the memory."

I was thinking about that and wanted to find a link earlier, but I can't find it. I hope someone will; I want to add a credit to that idea in Parendown's readme.

I seem to remember it being one of the early ideas for Arc that didn't make it to release, but maybe that's not right.

---

"I do like far better the idea of replacing withs with a '/' ssyntax for let."

I like it pretty well!

It reminds me of a `lets` macro I made in Lathe for Arc:

  (lets
    a (bind 1)
    (expr 1)
    b (bind 2)
    (expr 2)
    c (bind 3)
    (expr 3))
I kept trying to find a place this would come in handy, but the indentation always felt weird until I finally prefixed the non-binding lines as well with Parendown.

This right here

    (expr 1)
    b (bind 2)
looked like the second ( was nested inside the first, so I think at the time I tried to write it as

       (expr 1)
    b  (bind 2)
With my implementation of Penknife for JavaScript, I finally started putting /let, /if, etc. at the beginning of every line in the "block" and the indentation was easy.

With the syntax you're showing there, you at least have a punctuation character at the start of a line, which I think successfully breaks the illusion of nested parens for me.

---

It looks like I implemented `lets` because of this thread: http://arclanguage.org/item?id=11934

And would you look at that, ylando's macro (scope foo @ a b c @ d e f) is a shallow version of Parendown's macro (pd @ foo @ a b c @ d e f). XD (The `pd` macro works with any symbol, and I would usually use `/`.) I'm gonna have to add a credit to that in Parendown's readme too.

Less than a year after that ylando thread, I started a Racket library with a macro (: foo : a b c : d e f): https://github.com/rocketnia/lathe/commit/afc713bef0163beec4...

So I suppose I have ylando to thank for illustrating the idea, as well as fallintothis for interpreting ylando's idea as an implemented macro (before ylando did, a couple of days later).

-----

2 points by rocketnia 2169 days ago | link

"Isn't there a dialect out there that uses a different bracket to mean 'close all open parens'? So that the example above would become (foo (a b c (d e f]. I can't quite place the memory."

Oh, apparently it's Interlisp! They call the ] a super-parenthesis.

http://bitsavers.trailing-edge.com/pdf/xerox/interlisp/Inter...

  The INTERLISP read program treats square brackets as 'super-parentheses': a
  right square bracket automatically supplies enough right parentheses to match
  back to the last left square bracket (in the expression being read), or if none
  has appeared, to match the first left parentheses,
  e.g.,    (A (B (C]=(A (B (C))),
           (A [B (C (D] E)=(A (B (C (D))) E).
Here's a document which goes over a variety of different notations (although the fact they say "there is no opening super-parenthesis in Lisp" seems to be inaccurate considering the above):

http://www.linguistics.fi/julkaisut/SKY2006_1/2.6.9.%20YLI-J...

They favor this approach, which is also the one that best matches the way I intend for Parendown to work:

"Krauwer and des Tombe (1981) proposed _condensed labelled bracketing_ that can be defined as follows. Special brackets (here we use angle brackets) mark those initial and final branches that allow an omission of a bracket on one side in their realized markup. The omission is possible on the side where a normal bracket (square bracket) indicates, as a side-effect, the boundary of the phrase covered by the branch. For example, bracketing "[[A B] [C [D]]]" can be replaced with "[A B〉 〈C 〈D]" using this approach."

That approach includes what I would call a weak closing paren, 〉, but I've consciously left this out of Parendown. It isn't nearly as useful in a Lispy language (where lists usually begin with operator symbols, not lists), and the easiest way to add it in a left-to-right reader macro system like Racket's would be to replace the existing open paren syntax to anticipate and process these weak closing parens, rather than non-invasively extending Racket's syntax with one more macro.

-----

1 point by akkartik 2174 days ago | link

What are the semantics of `lets` exactly? Do you have to alternate binding forms with 'body' expressions?

-----

2 points by rocketnia 2174 days ago | link

No, you can interleave bindings and body expressions however you like, but the catch is that you can't use destructuring bindings since they look like expressions. It works like this:

  (lets) -> nil
  (lets a) -> a
  (lets a b . rest) ->
    If `a` is ssyntax or a non-symbol, we treat it as an expression:
      (do a (lets b . rest))
    Otherwise, we treat it as a variable to bind:
      (let a b (lets . rest))
The choice is almost forced in each case. It almost never makes sense to use an ssyntax symbol in a variable binding, and it almost never makes sense to discard the result of an expression that's just a variable name.
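Spelled out as a macro, those rules come to roughly this (an untested reconstruction, assuming Arc exposes `ssyntax`; the real version linked below may differ in its details):

  (mac lets args
    (if (no args)
         nil
        (no (cdr args))
         (car args)
        ; ssyntax or a non-symbol: treat it as an expression
        (or (ssyntax (car args)) (no (isa (car args) 'sym)))
         `(do ,(car args) (lets ,@(cdr args)))
        ; otherwise: treat it as a variable to bind
         `(let ,(car args) ,(cadr args) (lets ,@(cddr args)))))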

The implementation is here in Lathe's arc/utils.arc:

Current link: https://github.com/rocketnia/lathe/blob/master/arc/utils.arc

Posterity link: https://github.com/rocketnia/lathe/blob/e21a3043eb2db2333f94...

-----

3 points by aw 2174 days ago | link

> What aw wants is a more semantic change where vars `a`, `b` and `c` have the same scope. At least that's how I interpreted OP.

Just to clarify, in my original design a `var` would expand into a `let` (with the body of the let extending down to the bottom of the enclosing form), and thus the definitions wouldn't have the same scope.

Which isn't to say we couldn't do something different of course :)

Huh, I thought I had fixed the formatting in the post, but apparently it didn't get saved. Too late to edit it now.

-----

3 points by aw 2174 days ago | link

Post with formatting fixed:

I was curious, has anyone implemented "internal definitions", that is, a way to introduce a variable like `let`, but without having to indent the code that uses it?

For example, I may have a function like:

    (def foo ()
      (let a (...)
        (do-something a)
        (let b (... a ...)
          (do-something-else b)
          (let c (... b ...)
            etc...))))
and the indentation keeps getting deeper the more variables I define. `withs` can help, but only if I'm not doing something in between each variable definition.

In Racket, this could be written

    (define (foo)
      (define a (...))
      (do-something a)
      (define b (... a ...))
      (do-something-else b)
      (define c (... b ...))
      etc...)
In Racket, this is called "using define in an internal definition context"; it's like using `let` to define the variable (the scope of the variable extends to the end of the enclosing form) except that you don't have to indent.

In Arc, I'd like to use `var`, like this:

    (def foo ()
      (var a (...))
      (do-something a)
      (var b (... a ...))
      (do-something-else b)
      (var c (... b ...))
      etc...)
In Arc 3.2, `do` is a macro which expands into a `fn`, which is a primitive, builtin form. We could swap this, so that `fn` was the macro and the function body would be wrapped in a `do` form.

Thus, for example, `(fn () a b c)` would expand into `($fn () (do a b c))`, where `$fn` is the primitive, builtin function form.

Many macros that have bodies, such as `def` and `let`, expand into a `fn`, and this change would mean that the body of these forms would also be wrapped in a `do`.

For example, `(def foo () a b c)` expands (roughly) into `(assign foo (fn () a b c))`, which in turn would expand into `(assign foo ($fn () (do a b c)))`.

Thus, in most places where we have a body where we might like to use `var`, there'd be a `do` in place which could implement it.

Why go to this trouble? `var` can't be a macro since a macro can only expand itself -- it doesn't have the ability to manipulate code that appears after it. However `do`, as a macro, can do whatever it wants with the code it's expanding, including looking to see if one of the expressions passed to it starts with `var`.

As a macro, this could be an optional feature. Rather than being built into the language, where it would be there whether you wanted it or not, if you didn't like internal definitions you wouldn't have to load the version of `do` which implements `var`.

A further complication is what to do with lexical scope. If I write something like,

    (let var (fn (x) (prn "varnish " x))
      (var 42))
I'm clearly intending to use `var` as a variable. Similar to having lexical variables override macros, I wouldn't want `var` to become a "reserved keyword" in the sense that I now couldn't use it as a variable if I wanted to.

The Arc compiler knows, of course, which lexical variables have been defined at the point where a macro is being expanded. (In ac.scm the list of lexical variables is passed around in `env`.) We could provide this context to macros using a Racket parameter.

-----

1 point by akkartik 2174 days ago | link

I recall a quote -- by Paul Graham, I think -- about how one can always recognize Lisp code at a glance because all the forms flow to the right on the page/screen. How Lisp looks a lot more fluid while other languages look like they've been chiseled out of stone. Something like that. Does this ring a bell for anyone?

-----