Arc Forum
1 point by i4cu 24 minutes ago | link | parent | on: Ask AF: Ordered Tables?

relative to... its previous incarnation.

Microbenchmarks against a previous version only mean they've made a relative improvement.

The benchmarks that would matter to us (or at least me) are:

1. How does it compare to an equivalent implementation that doesn't have to maintain insertion order?

2. How does it hold up under stress (larger data sets, under heavy load where gc/compaction have to occur)?

Obviously none of this should matter to you, as you've said your data load is low with no growth. So Bob's your uncle.


Wow. I hadn't read that post before. And here I was thinking I'm too critical sometimes.

Oh right, Clojure has unhygienic macros too. Clojure symbols also have namespaces as in Common Lisp.

Isn't Common Lisp a language with a package system and unhygienic macros?

Common Lisp's approach is that the way a symbol is read incorporates information about the current namespace. That way, symbols, even quoted ones, usually collide only if they collide within the same file, which makes hygiene problems easier to debug on a per-file basis.

I don't think it's my favorite approach, but it could very well be a viable approach for Arc. I was using an approach somewhat like this in Lathe's namespace system, although instead of qualifying symbols at read time, I was qualifying each of them individually as needed, using Arc macros.


Some pretty good quotes from this thread: https://www.reddit.com/r/programming/comments/67gu9/take_the...

---

paulgraham: "Really? You've been mad at me for years for writing a new Lisp dialect? But new dialects are so common in the history of Lisp. I've probably used 20 in my life. And why be so attached to CL specifically?

In the old days, Lisp hackers always used multiple dialects, and basically tried to program as close to the platonic form of Lisp as they could modulo the flaws of whatever one they happened to be using. Don't things work that way now? Are there lots of people who are attached to CL specifically rather than Lisp generally?"

---

demoss: "What is annoying is that for 6 years now you have been building a following of people who go "Lisp is theoretically nice, but all the existing ones are SO full of onions! I'm going to wait for Arc to come out before I learn Lisp!""

---

death: "The cardinal rule of Lisp: don't reinvent, integrate."

paulgraham: "I don't know where you picked this up, but it seems the very opposite of the Lisp spirit to me. E.g. Steele and Sussman. Are you sure you didn't mean the cardinal rule of Java or something?"

2 points by rocketnia 3 hours ago | link | parent | on: Ask AF: Advantages of alists?

"No, I think I disagree there, assuming I'm understanding you correctly."

That's interesting.... How would you describe what quotation does, then, if you wouldn't say it lets you write certain data directly in the code?

---

In your data migration example, I notice you're reading and writing the data. You're even putting newlines in it, which suggests you might sometimes view the contents of that written data directly. If you're viewing it directly, it makes sense to want the code that generates it to look similar to what it looks like in that representation.

It's not always feasible for code to resemble data, but since that file is plain text with s-expressions, and since the code that generates it is plain text with s-expressions, it is very possible: First you can pretend they're the exact same language, and then you can use `quasiquote` for code generation.

You might not have thought of it in that order, but I think the cases where `quasiquote` fails to be useful are exactly the cases where it's hard to pretend the generated data is in the same language as the code generating it.

---

"I've always thought there's a deep duality between quasiquote and destructuring."

I've always thought it would be more flexible if the first element of the list were a prefix operation, letting us destructure other things like tables and tagged values.

I built the patmac.arc library to do this:

Current link: https://github.com/rocketnia/lathe/blob/master/arc/patmac.ar...

Posterity link: https://github.com/rocketnia/lathe/blob/7127cec31a9e97d27512...

One of the few things I implemented in patmac.arc was a `quasiquote` pattern that resembles Arc destructuring just like you're talking about.

Racket doesn't need a library like patmac.arc because it already comes with a pattern-matching DSL with user-definable match expanders. One of Racket's built-in match syntaxes is `quasiquote`.


For what it's worth, my approach here is pattern-matching. In Lathe Comforts for Racket I implement a macro `expect` which expands to Racket's `match` like so:

  (expect subject pattern else
    then)
  ->
  (match subject
    [pattern then]
    [_ else])
If Arc came with a similar pattern-matching DSL and `expect`, we could write this:

  (def map1 (f xs)
    (expect xs (cons x xs) ()
      (cons (f x) (map1 f xs))))
The line "expect xs (cons x xs) ()" conveys "If xs isn't a cons cell, finish with an empty list. Otherwise, proceed with x and xs bound to its car and cdr."

"Clojure has good interop with java and that's what made Clojure explosive. If we can do that with Arc/Racket then we are better off for it."

Do we ever expect Anarki values to be somehow better than Racket values are? If so, then they shouldn't be the same values. (The occasional "interop headaches" are a symptom of Anarki values being more interchangeable with Racket values than they should be, giving people false hope that they'll be interchangeable all the time.)

I think this is why Arc originally tossed out Racket's macro system, its structure type system, and its module system. Arc macros, values, and libraries could potentially be better than Racket's, somehow, someday. If they didn't already have a better module system in mind, then maybe they were just optimistic that experimentation would get them there.

Maybe that's a failed experiment, especially in Anarki, where we've had years to form a consensus on better systems than Racket's, and maybe aligning the language with Racket is for the best.

But I have a related but different experience with Cene; I have more concrete reasons to break interop there.

I'm building Cene largely because no other language has the kind of extensibility I want, even the languages I'm implementing it in. So it's not a surprise that Cene's modules aren't going to be able to interoperate with Racket's modules (much less JavaScript's modules) as peers. And since the design of user-defined macros and user-defined types ties into the design of the modules they're defined in, Cene can't really reuse Racket's macro system or first-class values either.

2 points by akkartik 7 hours ago | link | parent | on: Ask AF: Advantages of alists?

> I bet we can at least agree, on a definitional level, that quotation is good for constructing data out of data that's written directly in the code.

No, I think I disagree there, assuming I'm understanding you correctly.

One common case where I used to use quasiquote was in data migrations, and there was never a macro in sight. I don't precisely remember the real use cases (they involved RSS feeds and user data back in the day), so here's a made-up example.

Say you're running an MMORPG that started out in 2D, but you're now adding a third dimension, starting all players off at an elevation of 0m above sea level. Initially your user data is 2-tuples that look like this:

    (lat long)
Now you want it to look like this:

    (x y z)
...where x is the old latitude and z is the old longitude.

Here are two ways to perform this transform. Using quasiquote:

    (whiler (other-user-data ... (lat long) ...)  (read)  eof
      (prn `(,other-user-data ... (,lat 0.0 ,long) ...)))
And without quasiquote:

    (whiler (other-user-data ... (lat long) ...)  (read)  eof
      (prn (list other-user-data ... (list lat 0.0 long) ...)))
Hopefully that conveys the idea. Maybe the difference doesn't seem large, but imagine the schema gets more complex and more deeply nested. Having lots of `list` and `cons` tokens around is a drag.

I've always thought there's a deep duality between quasiquote and destructuring. Totally independent of macros.

1 point by akkartik 8 hours ago | link | parent | on: Ask AF: Advantages of alists?

Relying on the order arguments are evaluated in is always going to result in grief, regardless of programming language. It's one of those noob mistakes that we've all made and learned from. I think we shouldn't be trying to protect people from such mistakes. I'd rather think about how we can get people to make such mistakes faster, so they can more rapidly build up the requisite scar tissue :)
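A minimal sketch of the kind of mistake I mean, where the result silently depends on which argument gets evaluated first:

    (let i 0
      (list ++.i ++.i))  ; (1 2) under left-to-right evaluation, (2 1) otherwise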

So yes, we should document this, but not just in this particular case of tables. It feels more like something to bring up in the tutorial.

Edit: to be clear, I'm not (yet) supporting Kinnard's original proposal. I haven't fully digested it yet. I'm just responding to your comment in isolation ^_^


There's definitely a tension between being a concise language and being a safe language. Arc doesn't try to help newcomers avoid simple mistakes. It gives them enough rope to hang themselves, like with unhygienic macros. That's partly why I stopped using Arc to teach my students programming (http://akkartik.name/post/mu).
1 point by kinnard 9 hours ago | link | parent | on: Ask AF: Ordered Tables?

> "The microbenchmarks that we did show large improvements on large and very large dictionaries (particularly, building dictionaries of at least a couple 100s of items is now twice faster) and break-even on small ones (between 20% slower and 20% faster depending very much on the usage patterns and sizes of dictionaries)."
2 points by rocketnia 9 hours ago | link | parent | on: Ask AF: Advantages of alists?

I've replied separately about why I would say quasiquotation is only useful for code generation. In this reply I'll focus on the topic of the quirks we might have to deal with if we have Arc tables as quasiquotable syntax.

I think they're mostly unrelated topics, but I was using the quirks of tables in `quasiquote` to motivate keeping the number of quasiquotable syntaxes small and focused. Since I believe quotation is essentially only good for code generation (as I explain in more detail in the other reply), my preference is generally to focus the quasiquotable syntaxes on that purpose alone.

---

"In general it feels unnecessarily confusing to include long doc comments in code fragments here. We're already using prose to describe the code before and after."

Sorry, and thanks for the feedback on this.

There's a deeper problem here where my posts can get a bit long, with a lot of asides. :) I thought of those code examples as an aside or a subsection. If you were going to skim over the code, I wanted it to be syntactically easy to skim over the related prose at the same time.

This was something I felt was particularly worth skipping over. Ultimately, the quirks of using tables as syntax are mostly just as easy to put up with as the quirks of using tables for anything else. (I've gone to the trouble to make what I think of as non-quirky tables for Cene, but it's a very elaborate design, and I wouldn't actually expect to see non-quirky tables in Arc.)

Since I was only using these quirks to motivate why `quasiquote` would tend to be focused on code generation, I probably didn't invest enough space to fully explain what the quirks were. I'll try to explain them now....

---

"Those two fragments are the same?"

Whoops, those two fragments were supposed to be '(let i 0 `{,++.i "foo"}) and '(let i 0 `{,++.i "bar"}).

---

"Finally, both your examples seem to be more about side effects in literals? That is a bad idea whether it's a table literal or not, and whether it uses quasiquoting or not. Do you have a different example to show the issue without relying on side-effects?"

I don't know if I'd say the unquoted-key example depends on side effects, but the unquoted-value example very much does. Here it is again:

  (let x 0
    `{"foo" ,(= x 1) "bar" ,(= x 2)}
    x)
The quirk here is that the usual left-to-right evaluation order of Arc can't necessarily be guaranteed for table-based syntax, and if the evaluation order matters for any reason, it must be because of some kind of side effect.

Removing side effects from the language is a great remedy for this, but typically that kind of effort can only go so far. In an untyped language, we usually have to deal with the side effects of run time type errors and nontermination, even if we eliminate everything else:

  `{key1 ,(accidentally-cause-a-run-time-error) key2 ,(loop-forever)}
Even if we commit to programming without any run time errors or nontermination (perhaps enforcing termination with the help of a type system like that of Coq or Agda), we still have some cases like this where the order matters:

  `{key1 ,(compute-with-64TB-of-space) key2 ,(compute-for-800-years)}
A programmer in Arc or Racket might expect this program to reach a space limit relatively soon on machines with less than 64TB of space available, since Arc and Racket guarantee left-to-right evaluation order.

If the programmer actively intends for this program to fail fast, you and I will probably agree they would be better off sequencing the operations a little more explicitly, maybe like this:

  (let val1 (compute-with-64TB-of-space)
    `{key1 ,val1 key2 ,(compute-for-800-years)})
But suppose the programmer doesn't initially realize the program will fail at all. It only crosses their mind when they come back to diagnose bugs in their code, at which point they expect these expressions to evaluate from left to right because that's what Arc and Racket normally guarantee.

That's when they have to realize that the tables in their syntax have gotten in the way of this guarantee.

Simple solution: We clearly document this so people don't expect left-to-right evaluation order in this situation.

Alternative simple solution: We make tables order-preserving so they can be evaluated as expected.

That covers the unquoted-value example.

Now let's consider the unquoted-key example:

  '(let i 0
     `{,++.i "foo" ,++.i "bar"})
In this one, the quirk is that the two occurrences of ,++.i are expressed with the same syntax, so at read time the table would have two identical keys, even though the programmer may expect them to express different behavior.

While it looks like this example depends on side effects (in this case mutation), I'm not so sure it does. Here's an alternative example which shows the same issue without necessarily using side effects:

  '`{,(current-location) "foo" ,(current-location) "bar"}
This involves a hypothetical macro (current-location) which would expand to a string literal describing the filename, line, and column where it was expanded.

Is it a side effect? Maybe not; a file of code that used (current-location) would usually be semantically equivalent to a file that spelled out the same string literal by hand. In a language with separately compiled modules, both files might compile to the same result, which would make that semantic equivalence precise. In such a language, we typically wouldn't have any reason to mind if a module used (current-location) in its source code, even if we preferred to avoid it for some reason in our own code. This makes it into some kind of "safe" side effect, if it's even a side effect at all.

Nevertheless, within a single file, the expression (current-location) could look the same in two places but give different results.

That's where using `unquote` in table keys becomes quirky: The source code of two table keys may look identical (and hence cause a duplicate key conflict at the source code level) even if the programmer thinks of them as being different because they eventually generate different results.

Because of this quirk, the programmer may have to use some kind of workaround, like putting slightly different useless code into each key:

  '`{,(do 1 (current-location)) "foo" ,(do 2 (current-location)) "bar"}
Simple solution: We clearly document this so programmers can use that workaround with confidence. To help make sure programmers are aware of this documentation, we report descriptive errors at read time or at "quasiquotation construction time" if a table would be made with duplicate keys.

Alternative simple solution: We decide never to allow table keys to be unquoted. If a table key appears to be unquoted, the table key actually consists of a list of the form (unquote ...). We still report errors at construction time or read time so programmers don't mistakenly believe `{same-key ,(foo) same-key ,(bar)} will evaluate both expressions (foo) and (bar).

2 points by rocketnia 9 hours ago | link | parent | on: Ask AF: Advantages of alists?

"My immediate reaction is to disagree. A lot of the reason Lisp is so great is that quasiquotation is orthogonal to macros/metaprogramming."

Do you have particular reasons in mind? It sounds like you're reserving those until you understand what I'm saying with my quasiquoted table examples, but I think those examples are mostly incidental to the point I'm making. (I'll clarify them in a separate reply.)

Maybe I can express this again in a different way.

I bet we can at least agree, on a definitional level, that quotation is good for constructing data out of data that's written directly in the code.

I contend quotation is only very useful when it comes to code generation.

If there were ever some kind of data we could quote that we couldn't use as program syntax, then we could just remove the quotation boundary and we'd have a fresh new design for a program syntax, which would bring us back up to parity between quotation and code generation.

In a Lispy language like Arc, usually it's possible to write a macro that acts as a synonym of `quote` itself. That means the set of things that can be passed to macros must be a superset of the things that can be passed to `quote`. Conversely, since all code should be quotable, the set of things that can be passed to `quote` must be a superset of the things passed to macros, so they're precisely the same set.
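For instance, here's a minimal sketch (the macro name is made up) of a macro that acts as a synonym of `quote`, which is why anything quotable must also be something a macro can receive:

  (mac my-quote (x)
    (list 'quote x))

  (my-quote (a b c))  ; => (a b c), same as '(a b c)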

This time I've made it sound like some abstract property of macro system design, but it doesn't just come up in the design of an axiomatic language core; it comes up in the day-to-day use of the language, too. Quoted lists that don't begin with prefix operators are indented oddly compared to practically all the other lists in a Lispy program. I expect similar issues arise with syntax highlighting. In general, the habits and tooling we use with the language syntax don't treat quasiquoted non-code as a seamless part of the language. So, reserving quasiquotation for actual code generation purposes tends to let it help out in the places it really helps while keeping it out of the places where it causes awkward and distracting editor interactions.


I actually like `#<void>`, because it makes more of a distinction between pure and impure functions.

I read a good blog post[0] recently on how failing to make that distinction makes it difficult to guess the behaviour of simple, short code snippets (in JavaScript, but the same could apply to Arc).

[0]: https://medium.com/@winwardo/the-principle-of-least-astonish...


First option makes sense.

Second option, part a: I actually find `~empty?` clearer than `some` in this case. Also `some` means something different in Arc.

Second option, part b: wait, then `(if xs ...)` would sometimes not do the opposite of `(if (no xs) ...)`!
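To spell that out (a sketch, assuming '() becomes truthy while `no` is extended to treat it as empty):

    (if '()      'then 'else)  ; => then, because '() is truthy
    (if (no '()) 'then 'else)  ; => then as well, because (no '()) is also truthy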


Two points.

- The assumption baked into this argument is that the cdr of an empty list returns an empty list. Switching nil to #f and letting the empty list be truthy avoids this problem.

- Good names are important. ~empty? isn't really a fair characterization; Lumen uses (some? xs). There is also another way: update `no` to be (or (is x nil) (empty x)), and then use (~no xs). A sketch follows below.
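A literal sketch of that last suggestion (note that, at least in arc3.1's arc.arc, `empty` is itself defined in terms of `no`, so a real patch would have to break that cycle):

    (def no (x)
      (or (is x nil) (empty x)))

    ; and then map1 could stay almost as it is:
    (def map1 (f xs)
      (if (~no xs)
          (cons (f car.xs)
                (map1 f cdr.xs))))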


I tried to build a package manager for Arc; it's partially implemented.

I don't know, but I can't see how it would be feasible when any module or package could arbitrarily and globally redefine existing symbols, functions, operators, etc.

In my example above, making the empty list truthy would cause this change:

    (def map1 (f xs)
      "Returns a list containing the result of function 'f' applied to every element of 'xs'."
   -  (if xs
   +  (if (~empty? xs)
        (cons (f car.xs)
              (map1 f cdr.xs))))
We can argue how important this is, but disambiguation does definitely make Arc programs less terse.
1 point by kinnard 16 hours ago | link | parent | on: Ask AF: Ordered Tables?

I agree. It's a big jump for me to try to implement an insertion-ordered table, whatever it ends up being called.

And it's unclear what the behavior should be: http://arclanguage.org/item?id=21066

1 point by kinnard 16 hours ago | link | parent | on: Ask AF: Ordered Tables?

For some reason I can't escape associating "journal" with "jour" (day) in French, journaling having to do with something done daily.

Maybe ledger => ltable, which is insertion-ordered, would be better, but that has all sorts of other implications.

3 points by kinnard 16 hours ago | link | parent | on: Ask AF: Ordered Tables?

Ah, I overlooked that! I'm unfamiliar with that usage of journal in this context. Just looked it up.
2 points by i4cu 16 hours ago | link | parent | on: Ask AF: Ordered Tables?

Well, he gave you a pretty good 'name', which is what the topic is about (semantics, I know :).

'jtable'

Journaling implies logging in insertion order.

So jtable or log-table are good, no?

This is kind of niggly stuff. Normally we create something before debates ensue about naming... :)

2 points by kinnard 16 hours ago | link | parent | on: Ask AF: Ordered Tables?

Know a more succinct term for insertion order?
2 points by i4cu 17 hours ago | link | parent | on: Ask AF: Ordered Tables?

I think he's just saying that their use of the term 'order' is a poor choice, since the word 'order' is non-specific. I.e. people use 'order' to categorize things that are stable and predictable in their ordering, but let's not pin the term 'order' to a single variant such as insertion order.

http://arclanguage.org/item?id=18098 (Why should zero be truthy?)

Tangentially related, with some overlapping content.


> Users need JSON. JSON has true, false, null, strings, numbers, arrays, tables, and empty arrays.

JSON supports empty lists (arrays) and 'null'. Currently I can pass in () for the array and nil for null. How would this be managed?
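A quick illustration of the collision, since in current Arc the empty list and nil are literally the same value:

    (is () nil)  ; => t, so an empty JSON array and a JSON null
                 ;    decode to the same Arc value today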


Has anyone tried to pull off a module + package system with unhygienic macros?
1 point by kinnard 18 hours ago | link | parent | on: Ask AF: Ordered Tables?

Do you mean other possible orderings like those dependent on the implementation of the map/table/obj structure?

Or orderings dependent on key-value pairs, not just keys?
