> Now I try to minimize the use of margin, padding and float in my CSS.
Ah, could you elaborate on your technique? I may be missing out on some handy table attributes that you know. The main one I use is colspan - I still use CSS padding to adjust most of my spacing.
Not at all. I meant that if I find myself using more than the absolute simplest combinations of margin/padding/float, I consider using a table instead.
Readwarp may help triangulate on what I mean. The layout right now is simple enough that I just went with CSS. But if you inspect the logo with firebug, I didn't bother making things perfect with CSS. I just used tables.
If you're not curious, well then I'm sorry for posting this on Arc Forum! It really does have very little to do with Arc, but web development topics come up here often enough. And I feel so peculiarly attached to the community here, I think I just wanted to share the realization in case it could help anyone.
Still searching for the best lispy abstraction over html, css and javascript...
Well, there are definitely fewer conses in ((a b) 1 2) than in ((a 1) (b 2)), but ((a . 1) (b . 2)) has them both beat. (Maybe I'm missing something.)
As for being palatable to read and write, I actually disagree, even with the decrease in punctuation.
When writing a literal bucket, to add or remove a key-value pair, I'd have to edit in two places. Besides that, I'd encounter horizontal layout annoyances: I'd have to word-wrap the keys and the values in the same way in order to see the bindings clearly. And every time I added or removed a binding, I'd have to rewrap.
Reading is a bit more of a wash. When reading a bucket in debug output, it's harder to look up specific bindings but a lot easier to see what keys exist. I think I do those two things in about equal proportions (when I'm working on Penknife, at least).
> Well, there are definitely fewer conses in ((a b) 1 2) than in ((a 1) (b 2)), but ((a . 1) (b . 2)) has them both beat. (Maybe I'm missing something.)
I resorted to cons counting to confirm it for my own simple mind and... you're correct! ^_^
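For posterity, here's the tally (writing each list fully dotted and counting the conses):

  ((a b) 1 2)          ; (a b) is 2 conses + a 3-cons spine = 5
  ((a 1) (b 2))        ; two 2-cons pairs + a 2-cons spine  = 6
  ((a . 1) (b . 2))    ; two 1-cons pairs + a 2-cons spine  = 4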
> In ar macros are implemented by vectors (same as Arc 3.1)
I did not know this. Are functions implemented by vectors too, or does it have to do with macros being annotated functions (i.e. that annotate is implemented as some sort of vector-wrapper)?
annotate is implemented as some sort of vector-wrapper
That's the truth. The result of (annotate 'foo 'bar) is #(tagged foo bar), where #(...) is the external representation of a Racket vector. Note that 'tagged here is just a symbol like 'foo or 'bar.
I wonder if Arc could instead use a custom Racket type so that we don't have quirky abstraction leaks when using Racket vectors. I never actually find myself wanting to use vectors, and if I did, I'd just [annotate 'vector _] them, but still. :-p
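To illustrate (typed from memory, so take the exact REPL transcript with a grain of salt):

  arc> (= x (annotate 'foo 'bar))
  #(tagged foo bar)
  arc> (type x)
  foo
  arc> (rep x)
  bar

And if I remember ac.scm right, the check is just "a vector whose first slot is the symbol tagged", so any plain Racket vector that happens to look like that gets mistaken for an annotated value. That's the leak.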
I did an experiment where annotate produced a list, of the form ('tagged x type1 type2 ...), making the tagged lists transparent objects to the arc system. In doing so, I extended the type system to include the original type of the object in the list and support more than one type, allowing a simple form of inheritance through coercion. I.e. if you annotated a list as type 'foo, until you defined an actual coercion method for converting 'foo to a cons it would be treated as a normal list, and you wouldn't lose any ability to work with it using existing operations.
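The shape was roughly this (a reconstruction from memory rather than the original code, with starred names so it doesn't clobber the builtins):

  ; tag as a transparent list instead of an opaque vector
  (def annotate* (typ x)
    (list 'tagged x typ (type x)))

  (def type* (x)
    (if (and (acons x) (is (car x) 'tagged))
        (x 2)      ; the most recently added type
        (type x)))

  (def rep* (x)
    (if (and (acons x) (is (car x) 'tagged))
        (x 1)
        x))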
The thing that annoys me the most about arc's current type system is that if you tag something, it becomes an unusable vector according to all existing arc code, so you need to re-implement or extend any function that you actually want to support the new type.
I think he made the additional change of delegating all function calls to the "parent" type when they didn't explicitly handle the type at hand. This makes a lot of sense. If you have a list that's tagged 'foo, it should behave exactly like a normal list does until you've specified some diverging behavior for type 'foo.
Do you still have the code for your experiment, shader?
Like I said, it hardly matters to me 'cause of the [annotate 'vector _] workaround.
More worthwhile would be having custom 'iso hashing and stuff like that. ^_^ That's something that would probably benefit from a dedicated Racket type for tagged values, but that's not necessarily the only way to go about it. I'm still trying to figure out how to go about it for Penknife....
This is reminiscent of a comment at the bottom of arc.arc:
; solution to the "problem" of improper lists: allow any atom as a list
; terminator, not just nil. means list recursion should terminate on
; atom rather than nil, (def empty (x) (or (atom x) (is x "")))
It also reminds me of a feature in PicoLisp: symbols evaluate to nil by default instead of raising an undefined variable error.
:) This is a pretty good exhibition of features: https://github.com/akkartik/wart/blob/master/023conditionals.... You can see that the '? for optional' syntax hasn't changed, and that $vars and o$vars are convenient and make several macros shorter. I suppose the fact that all args are optional is a little subtle and needs context to read between the lines.
But now I've let you put off trying wart again :) At some point I want to put a little webapp together to make reading wart code more convenient. So you can click 'next' from boot.lisp to go to 000.lisp (next in load sequence), tests are just a click away, etc. Really my only goal is to make it a pleasure to read, and I'd give an arm to hear experiences of lisp programmers reading it.
I haven't had any trouble with optional args (though this is the first time I've noticed them being useful). Or breaking constraints in general. I suspect the problem is not constraints I've broken but constraints I'm not even aware of.
I suspect that this change to association lists, together with functional position lookups [1], destructuring-bind and a judicious use of conses, could potentially eliminate the need for tables in Arc's core and:
- solve the optional arg problem [2]
- permit apply to be subsumed by the dot notation [3]
I still have a lot of details to work out before I can make a compelling case for this, though.
I don't see the point. o.o I'd much prefer to write '((a 1) (b 2)) rather than '((a . 1) (b . 2)) and destructure using (let (k v) ...) rather than (let (k . v) ...). Actually, I'd write '((a 1) (b 2)) as (objal a 1 b 2), but the destructuring issue is something I'd just deal with and fume over. :-p
Is there any downside?
Those are the downsides. :)
It's more efficient and more isomorphic to a hash table.
I think you save about 1/3 the conses when creating and adding to alists, so there is that.
But isomorphic to a hash table? The most official way we can compare them is with 'tablist and 'listtab, which use the list-of-two-element kind of alist.
Also, IIRC, Rainbow displays tables as #hash((a 1) (b 2)), and I couldn't be happier. There's so much " . nil" cruft when viewing big tables in official Arc.
could potentially eliminate the need for tables in Arc's core
Arc doesn't have enough table support. XP Keys are compared via Racket 'equal? (or via weirder methods in Rainbow and Jarc), and I haven't gone to the trouble to make tables that somehow dispatch via an extensible 'iso.
I want efficient lookup in big tables for the sake of Lathe's namespace system and Penknife's environments, and I'll get that by dropping to the underlying platform if I need to--I already do for weak tables--but I'd rather not. If official Arc ever removes table support, I hope it also adds 'defcall so I can put tables back in.
- solve the optional arg problem
- permit apply to be subsumed by the dot notation
How are those related? The only point of connection I see is that they're other things that could use dotted lists, but even that's not especially true for optional args. Did you mean to say that you suspect some change regarding dotted lists (or just the way we look at them) will help with both alists and these other cases?
I'm talking about the core notion of a hash table. It's composed of key-value pairs, not key-value "lists of two". :P This is a restatement of bogomipz's point made elsewhere in this thread.
> Arc doesn't have enough table support.
If alists were better supported, you could use them in place of tables in every case except where the utmost efficiency is required.
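"Better supported" might not take much, either. If I remember the argument order right, Arc's assoc already accepts dotted pairs (its (acons (car al)) test doesn't care what's in the cdr), so a variant of alref that returns the cdr instead of the cadr would handle them (alref* is a made-up name):

  (def alref* (al key)
    (cdr (assoc key al)))

  (alref* '((a . 1) (b . 2)) 'b)   ; => 2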
But I'm not sure it's even correct to frame this as an axioms vs. efficiency debate. Something I've learned from PicoLisp is that heterogeneous data structures slow down the general case by complicating memory allocation and garbage collection. PicoLisp manages to be a fast interpreter (say what?), in part because it uses the cons cell for everything [1].
> I'd much prefer to write '((a 1) (b 2)) rather than '((a . 1) (b . 2)) and destructure using (let (k v) ...) rather than (let (k . v) ...).
I think this is a cosmetic issue that has to do only with our visual representation of cons pairs and Arc's incumbent ssyntax.
For example, if you changed the ssyntax so that a.b expanded to (a . b) instead of (a b), then these snippets would be more pleasant to write: '(a.1 b.2) and (let k.v ...). I'm not actually proposing this particular solution, but it should illustrate my point that the issue is only syntactic/cosmetic.
> How are those related? [...] Did you mean to say that you suspect some change regarding dotted lists (or just the way we look at them) will help with both alists and these other cases?
Well I did say I still have some details to work out. ;)
I think your paraphrase is accurate. A "change regarding dotted lists (or just the way we look at them)" is what I was trying to express with "judicious use of conses" in the grandparent.
A pair is a list of two in English because English lists aren't nil-terminated. But Arc lists are, so we're talking about the difference between (key . val) and (key . (val . nil)).
I don't have a great answer to your triple/singleton question yet except to ask that you consider the following:
- The fundamental data structure of lisp is the cons pair, so perhaps pairs warrant some special treatment over singletons, triples, etc.
- The demand for associative arrays in general-purpose programming is far greater than that for any kind of triple-based data structure, which is why tables have their own type in Arc to begin with
Update: Cons pairs are so powerful that we've used them as the base for almost our entire language. And yet the associative array structure (which screams "pair"!) that we've made from them (i.e. alists) is so inadequate that we all outsource that functionality to tables instead. Around tables we've then developed the conveniences for syntax, etc.... Doesn't this seem a bit kludgy for The Hundred-Year Language?
The main advantage of cons pairs, in my mind, is that they're all the same size, so it's easier to reason about them and memory-manage them on a low level. They're also just as powerful as they need to be to support an expressive language. But that doesn't make them ideal abstractions for exploratory programming, especially when an equivalent abstraction in the same language takes fewer characters to type out and is even better supported thanks to 'map, 'any, etc.
Yes, that makes sense. I may have gone somewhat overboard / overly dramatic in this subthread. :) I think I mostly just want alists to be more convenient. Need to think about this more...
I've been overly dramatic here too. I mostly wanted to help you make sure you were on a path that held water while giving you some hooks to convince me by... but I brought some external pet peeves into the mix and got worked up. XP Please do continue with your train of thought. ^^ Here's hoping the train mixes underwater hooks, or something.
There's something about Arc's built-in types that bothers me. They seem so ad hoc. You have this beautiful axiomatic thing going on in the core with conses, and then suddenly tables enter the mix. From that point forward, odd utilities get defined with an if branch that checks for the table type.
In this thread, I've been worried about tables cluttering the core language and you about them not being well-supported enough. In truth, I think both of our concerns are legitimate (yours is for sure, because tables really are better than alists for some applications). The problem is that the present implementation doesn't do either of them justice.
I'd like to know what you think of this proposal: keep the core language definitions to symbols and conses. Then support each additional type in a dedicated file (e.g. numbers.arc, tables.arc, queues.arc). These types can either reach down into Racket to borrow one of its types (likely for numbers or tables) or be annotated constructs built from existing types (likely for queues, trees or alists), and then use the extend idiom to give them support in the various utilities and the reader.
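To make the proposal concrete, here's the sort of thing I have in mind for one of those files (a sketch; the extend here is a minimal hand-rolled version, not anarki's exact macro):

  ; minimal extend: wrap a function, diverting calls that pass the test
  (mac extend (name args test . body)
    `(let orig ,name
       (= ,name (fn ,args
                  (if ,test (do ,@body) (orig ,@args))))))

  ; queues.arc: an annotated construct built from existing types;
  ; rep is a cell whose car holds the items
  (def queue () (annotate 'queue (cons nil nil)))

  (extend len (x) (isa x 'queue)
    (len (car (rep x))))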
"support each additional type in a dedicated file.. either reach down into Racket to borrow one of its types or be annotated constructs built from existing types, and then use the extend idiom to give them support in the various utilities.."
or defgeneric? 8-) I was moved by the same concerns you describe: I never want to see an (if (isa x 'table) ..) in arc code.
Agreed with both of you, but I'd go further: I don't want to see (isa x 'cons) or (isa x 'sym) either, if possible. I'd rather every type be treated as equally non-fundamental. Of course, s-expression syntax special-cases those types sorta intrinsically, but I'm sure 'defgeneric could be used along with one of aw's "access the Arc compiler from Arc" patches. ^_^
It might be difficult and/or impossible though, considering that 'defgeneric needs to be defined in terms of something. So does calling, since the most obvious way to specify a custom calling behavior is to give that behavior as a function to call! XD
Like so many other opinions of mine, this is something that's going into Penknife if at all possible, even if the core currently needs a bunch of rewriting to get it to work.
To fix the "'defgeneric needs to be defined in terms of something" issue, I'm currently considering having most things be built-in rulebooks, with rulebook being a built-in type if necessary.
For the calling issue, I'm going to have the built-in call behavior try certain hardwired things first, and only move on to the customizable calling rulebook if those don't work. I intend for it to be possible to replace the interaction environment with one that uses a different call behavior, so even that hardwired-ness should be kinda seamless with the language.
For now, these things are all hand-wavy, and I'm open to better ideas. ^^
> but I'd go further: I don't want to see (isa x 'cons) or (isa x 'sym) either, if possible. I'd rather every type be treated as equally non-fundamental. Of course, s-expression syntax special-cases those types sorta intrinsically
Wow, I'm really interested in whether there's a way to have s-expressions that don't special-case conses and symbols. shader's suggestion that just came in [1] makes me think there might be a way to merge conses and symbols into a single type, though. Could it be possible?
Essentially, all you need to do is extend 'ac, since compiling is almost all that happens to Arc expressions. In the short term, there's no need to worry about whether a custom type represents a function call, a literal, etc. As long as it compiles, you can start returning it from macros or reader syntaxes.
In the long term, there may be other things that would be useful to extend, like 'ac-macex, 'ac-expand-ssyntax, and 'expand=. Also, it may be easier for a custom syntax type to support 'expand= if there's a separate utility it can extend in order to have all the functionality of a function call. That way it can be 'sref'ed.
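In extend-idiom terms, the short-term version might look something like this (a hand-wavy sketch: my-syntax? and compile-my-syntax are placeholders, and it assumes one of those compiler-access patches has already made 'ac callable and settable from Arc):

  ; divert compilation of the custom syntax type, delegating
  ; everything else to the original compiler
  (extend ac (s env) (my-syntax? s)
    (compile-my-syntax s env))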
Thanks for this guide. It should come in handy for me. :)
If I start messing around with Arc's internals too hard though, I may not be able to resist trying to turn it into an interpreter [1]. I'm too attracted to the notion of first-class environments, eval and fexprs lately. (In this case, I'd be extending eval rather than ac, correct?)
Or maybe I should just stop being such a damn purist. Have to take things one step at a time anyway. ac is a logical place to start.
Thanks for this guide. It should come in handy for me. :)
Well, I hope it actually works. :-p
If I start messing around with Arc's internals too hard though, I may not be able to resist trying to turn it into an interpreter.
Yeah. I would have just turned it into Penknife. >.>
I'm too attracted to the notion of first-class environments, eval and fexprs lately. (In this case, I'd be extending eval rather than ac, correct?)
Sure, but there's no interpreting 'eval to build on in Arc (unless you repurpose the macroexpander XD ). I'd find it easiest to approach by building it from scratch--hence kernelish.arc.
Or maybe I should just stop being such a damn purist. Have to take things one step at a time anyway. ac is a logical place to start.
It's all up to whatever you can figure out how to build on, I think.
Also, I'd tell you not to write off purism so quickly, but unfortunately I only like purism in an irrational way. ^_^;
It's been on my radar before [1]. I've read some of the thread you linked to and some of what's on his github [2]. I like the general idea of giving ' and , more power to control evaluation, but I'm afraid I don't grok the language very well yet. :-/
Update: To clarify my confusion, the documentation talks a lot about closures (e.g. that ' does some kind of closure-wrapping), but I thought the language was supposed to be fexpr-based. I don't understand yet what fexprs have to do with closure-wrapping, but I really should study the language more closely.
Eight's documentation is in a terrible state (in part because there are still many things about which I've yet to make up my mind), so blame me for any confusion.
Here's the gist: Fexprs, like macros, take expressions as arguments (duh). Those expressions are made up of symbols (duh). Because a fexpr is evaluated at runtime, those symbols may already be bound to values when the fexpr is called. Eight keeps track of which symbol is bound to which value at the place the expression originated (where the programmer wrote it) --- even if you cons expressions together, or chop them into pieces. This eliminates the need for (uniq), but still allows for anaphoric fexprs when symbol-leaking is desired.
When I wrote the docs on github, I called an expression plus any accompanying bindings a 'closure' (even though it wasn't a function). I also didn't know the word 'fexpr'. I've read a few dozen more old lisp papers since then, and hopefully on the next go-round my vocabulary will be much improved.
"there might be a way to merge conses and symbols into a single type"
Interesting idea. This might help a lot with implementing lisp in strongly typed languages. I suppose atoms could just be cons cells with nil in their cdr slot. The only problem then is: how do you get the actual value out of an atom, and what is it?
This might help a lot with implementing lisp in strongly typed languages.
Don't most of them have option types or polymorphism of some kind? If you've got a really rigid one, at least you can represent every value in the lisp as a structure with one element for the internal dynamic type (an integer if necessary), at least two child elements of the same structure type, and one element for every built-in type you'll ever need to manipulate from the lisp (like numbers and sockets). Then you just do manual checks on the dynamic type to see what to do with the rest. :-p
The only problem is then how do you get the actual value out of an atom, and what is it?
I say the programmer never gets the actual value out of the atom. :-p It's just handled automatically by all the built-in functions. However, this does mean the cons cell representation is completely irrelevant to a high-level programmer.
> I suppose atoms could just be cons cells with nil in their cdr slot.
Could they be annotated conses with symbol in the car and value in the cdr (initialized to nil)? nil itself could then be a cons with the nil symbol in the car and nil in the cdr. This should achieve the cons-symbol duality for nil that's usually desired. (Follow-up question: annotate is an axiom, right?)
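In code, the idea would be something like this (purely hypothetical, and note that printing the circular nil* would loop):

  ; a symbol is a tagged cons: (name . value), value defaulting to nil
  (def sym* (name)
    (annotate 'sym (cons name nil)))

  ; nil is the symbol named nil whose value is nil itself
  (= nil* (sym* 'nil))
  (scdr (rep nil*) nil*)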
I don't want to see (isa x 'cons) or (isa x 'sym) either
Totally with you there.
I don't want to get too hung up on 'purity'. It's ok to use tables in the core if you need them for defgeneric or something. It's ok to have a few isas early on. iso is defined as a non-generic bootstrap version in anarki before eventually being overridden, so stuff like that seems fine to me. I just want to move past the bootstrap process as quickly as possible.
iso is defined as a non-generic bootstrap version in anarki before eventually being overridden
Sure, that's an okay way to go about it. ^_^ Since I'm doing the Penknife core library stuff from the top down right now, I'm just writing things the way I want to write them, trying to determine what axioms I need before the core library is loaded. If the high-level axioms are defined in another lower-level library, that's just fine, but I don't know why I'd bother with that when I can just put them in the Arc part of the core. :-p
I think I had trouble digesting it a few months ago because it depended on so many utilities I was unfamiliar with: vtables, defmethod, pickles (and it compared with extend, which I didn't understand back then :-o ). Giving it another try...
vtables and pickles aren't utilities, just implementation details for defgeneric.
Basically vtables contains a hashtable for each generic function mapping a type to an implementation. "If len gets a string, do this. If it gets a table, do that." The body given to defgeneric sets up vtable entries for a few default types (cons, mainly :), and defmethod lets you add to vtables later.
If the generic function doesn't find an entry in vtables it falls back on searching the pickles table for a procedure to convert that type to cons, before retrying.
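Here's a compressed sketch of that shape, in case it helps (a reconstruction for this comment, not the actual anarki code; the table layouts are guesses):

  (= vtables* (table) pickles* (table))

  (mac defgeneric (name args . body)
    `(do (= (vtables* ',name) (table))
         ; the body becomes the default implementation, for conses
         (= ((vtables* ',name) 'cons) (fn ,args ,@body))
         (def ,name ,args
           (aif ((vtables* ',name) (type ,(car args)))
                 (it ,@args)
                ; no entry: try pickling the arg down to a cons and retry
                (pickles* (type ,(car args)))
                 (,name (it ,(car args)) ,@(cdr args))
                 (err "no method for" ',name)))))

  (mac defmethod (name typ args . body)
    `(= ((vtables* ',name) ',typ) (fn ,args ,@body)))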
Let me know if this makes sense.
(names: I believe vtables comes from C++, and pickle is Python's primitive for serialization)