Thought: is language design the right problem to be solving?
I've had an argument brewing in my head that many of the things people consider language features are better thought of as IDE features. Compile-time type checking, for example, ought to be part of the IDE as well as the language. Syntactic sugar, too.
I would love to see the "best language" argument split into two: "best language/IDE" and "best VM".
Stop it with the meta-decisions - learning Arc will take up one thorough afternoon. It's worth it because you'll start seeing ways of writing code in any language that mimic features Arc has.
What about "100 year language" don't people understand? PG's point isn't to make the language popular now, or in a decade... it's to make it popular 100 years from now.
Personally, I think talking about what programming languages will look like 100 years from now underestimates how different the world will be by then (just imagine what kind of "computers" we'll have!). But I suppose the "100" is arbitrary... I think PG just means taking the long-term view rather than the short-term one, and having a community is a short-term advantage.
PG won't be alive in 100 years... and there's a difference between slowly searching for an optimal language and not working on Arc at all.
It is a bit sour for the community if their efforts aren't affecting the progress of the language toward its goal.
To be fair, PG has a point that n months isn't that much if Arc is supposed to be 'finished' about 30 years from now. We might just have wrongly expected things to move at a somewhat faster pace.
Speaking of optimizing for "100 years", what are some design decisions which would make the language more flexible/easy to use that, while possibly sacrificing speed or efficiency, would make the language more useful in the long run?
It seems to me that computers (currently; it may not last) are doubling in speed rather frequently, so any efficiency problems in the language will quickly be outweighed by its usability. That's partly why Ruby is one of the most popular tools for making websites: it may not be fast, but it's cheaper to double the server capacity than to hire more programmers.
If you're looking at long-term prospects, my understanding is that the idea that computers have "speed", which "matters", is actually temporary (should stop around 2040 if current trends continue). At that point you run into the thermodynamic limits of computation, where the question is how much energy you're willing to put into your computation to get it done.
However, I'm not sure that affects language design a great deal. It seems to me the goal of a language is to let you express algorithms in a concise and efficient manner, and I don't see that becoming unimportant any time soon.
Well, I think we're sort of agreeing with each other. You said that speed should be irrelevant, because language design was about describing algorithms concisely and efficiently; I said that we should try and think outside the box, by ignoring our sense of efficiency. If it didn't matter how the feature was implemented, what features could we come up with that could radically change how easy it is to code? Maybe there aren't any, but it is an interesting idea. That is not to say that I'm only interested in inefficient improvements ;)
Another possibility:
Lisp is an awesome family of languages partly because they have the ability to create higher and higher levels of abstraction via macros. "Complete abstractions" they were called in Practical Common Lisp. Are there any other means that could create "complete abstractions"? Or easily allow higher level abstraction?
Moore's law has actually been failing as of late. I suspect that exponential increase in computer capability is a feature of the still-nascent state of computer technology; assuming it will continue into the foreseeable future is a bad idea. Performance is important; it's just not as important as some think, and in particular, it's not so much all-around performance as the ability to avoid bottlenecks that matters. This is why a profiler for Arc would be nice, as would some easy and standard way to hook into a lower-level language for speed boosts.
If you generalize Moore's law to cost of computation per second, it isn't failing. In 100 years, network availability, speed, and capacity will lead to yet-unimagined changes.
How will programming languages take advantage of a thousand or a million CPU's?
There was some reason I didn't do it that way (I'm the author of ArcLite). IIRC, it was because the semantics of Arc forms differed in a way that would basically require an Arc interpreter anyway, even if you compiled them. For example,
(foo "bar" 1 2)
is only a function call if foo is a function object. If foo is an array, it's an array index; if it's a hash, it's a dictionary get (both ill-formed in this example, since they take one argument). Other operations are similarly polymorphic, e.g. overloading + for addition and concatenation. In order to translate that reliably to JavaScript, you'd need to turn every funcall form into a type dispatch. At that point, you're basically writing an interpreter anyway.
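To make the dispatch concrete, here's a minimal sketch of what every compiled call site would have to route through. The helper name arcCall and the set of cases are illustrative, not ArcLite's actual code:

```javascript
// Hypothetical dispatcher: a compiled Arc form (f a b ...) cannot
// become a plain f(a, b) in JavaScript, because what the form means
// depends on the runtime type of f.
function arcCall(f, ...args) {
  if (typeof f === "function") return f(...args);      // ordinary call
  if (Array.isArray(f))        return f[args[0]];      // array index
  if (typeof f === "string")   return f.charAt(args[0]); // string index
  if (f instanceof Map)        return f.get(args[0]);  // hash/table get
  throw new Error("not callable: " + f);
}

// The same source form does different things for different values:
console.log(arcCall([10, 20, 30], 1));        // array index -> 20
console.log(arcCall(x => x + 1, 41));         // function call -> 42
console.log(arcCall(new Map([["a", 1]]), "a")); // table lookup -> 1
```

Since the dispatch happens at run time, the "compiler" has done very little compiling; it has mostly generated calls into a runtime that inspects types, which is the interpreter-in-disguise point above.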
I think Common Lisp is a bit foreign, and recommend Arc as a first Lisp to my friends - it's clean, the tutorial is great, and the concepts you'll learn apply to all Lisps. The lack of a "proper" IDE and libraries doesn't really matter if you're just toying around learning the habits of a language.
Don't worry about wasting time - frankly, it sounds like an excuse. It takes one night's work to learn Arc, and the knowledge is very applicable to a lot of other Lisp derivatives out there.
Scheme is a great follow-up after Arc, The Little Schemer series of books being especially fun. Once you feel ready for Common Lisp, "On Lisp" is great (at least the first half, which is as far as I got).
Programming languages are things understandable by brains and executable by computers. In the long run, the interface to the latter will change as computer architectures change. What PG is looking for with Arc (I think) is the optimal interface to the former.
Do you really think you're going to be calling a C++ function through an FFI in 100 years?
I don't think we can imagine how we'll be programming in 100 years. Arc is a bit too easy to imagine - it was just about imagined nearly a half-century ago.
How would you design a language if you knew you had more than 2^50 times the processing power you have today to compile/interpret it?
Are you copy and pasting directly from the browser? If so, you might be getting weird line endings that Scheme doesn't understand. Try copying it into a text file first and then copying from there into Arc.