Currently, ensure-dir and date use system to run Unix commands. It would be much more portable if they used mzscheme operations instead. This has caused me trouble not only on Windows but also on different versions of Linux. And I think make-temporary-file could replace the use of /dev/urandom.
I changed date to get the date from Mzscheme, but it's not so easy to change ensure-dir. Mz's make-directory doesn't create intermediate directories like mkdir -p, and I don't want to get into trying to understand pathnames.
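For what it's worth, here's a rough sketch of how an ensure-dir could be written with mzscheme's own path primitives (directory-exists?, split-path, make-directory), creating each missing parent the way mkdir -p does. This is an untested guess with a made-up name, not the code in arc.arc:

  ; ensure-dir* : create dir and any missing parent directories.
  ; split-path peels off the last path component; the recursion
  ; makes the parents first, then the directory itself.
  (define (ensure-dir* dir)
    (unless (directory-exists? dir)
      (let-values (((base name must-be-dir?) (split-path dir)))
        (when (path? base)          ; base is 'relative or #f at the top level
          (ensure-dir* base)))
      (make-directory dir)))

  ; (ensure-dir* "arc/foo/bar")   ; makes arc/, arc/foo/, arc/foo/bar/ as needed

(If the version of mzscheme Arc targets has make-directory* in its file.ss library, that may already do exactly this.)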
If anyone wants to take on being a distributor for a Windows port (or for different variants of Linux, for that matter), I do have a patch for date here: http://catdancer.github.com/date.html
Another option is the Anarki stable branch (http://github.com/nex3/arc/commits/stable/), which has most of the fixes necessary to make most of Arc work portably on Windows and other OSes. (And, being a bug-fix branch, the amount of other random material is limited.)
I agree with that option, but I also think these posts will fall off the front page and into the archives in about 2 months. By then new members may only discover the option after they've discovered the problem. So we're not really saving new members wasted effort unless the install page guides people there.
pg: Even if you'd rather not think about supporting Windows portability yourself, providing a link to the Anarki stable branch would help new Windows users. Finding this stuff on the forum after it's fallen off the top couple of pages is not very easy.
It looks like you're using the Anarki-specific bash script to start Arc. The script runs uname (which prints the OS name) and checks whether the output is 'Darwin'. You could probably change `uname` to 'junk' (note that the backquotes, which make the shell run the command, become single quotes around a literal string) to disable that check. But if you're on Windows, running the bash script at all seems like asking for trouble.
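Incidentally, mzscheme can tell you which OS it's running on without shelling out to uname, so in principle that check could live in Scheme rather than in the bash script. A rough sketch (not what the Anarki script actually does):

  ; system-type returns a symbol such as 'unix, 'windows, or 'macosx
  (define (on-windows?)
    (eq? (system-type) 'windows))

  (if (on-windows?)
      (display "skip the Darwin-specific setup\n")
      (display "assume a Unix-ish system\n"))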
As a note: I am not using a bash script to launch Arc; I'm using a basic Windows shortcut. The arc.sh file is being invoked on its own (as eds suggested, maybe because Cygwin is in my path?).
Changing uname to 'junk' didn't work; I even deleted the file, with no luck. (pg's arc2 doesn't have the uname problem, but it errors on other things, like creating directories and loading asv.)
Anyone know how to prevent Anarki from thinking my Windows machine is a Unix machine?
Never mind - I figured it out. I just changed the lib.arc so that ffi.arc doesn't get loaded. I hope someone who knows how to use git can put ffi into the lib folder and not have it load by default.
T.
I've been meaning to write a blog post on my view of what Arc is doing right and wrong, but don't have the time. Instead I'll suggest that the generic language design critique at http://lambda-the-ultimate.org/node/687#comment-18074 is worth applying to Arc; hopefully someone can answer those questions with respect to Arc.
I think all of those questions were pretty thoroughly answered in the original rationale essays for Arc. My only reservation is with question 4, because most of Arc could have been implemented as macros on another Lisp. That might change as it gets fleshed out.
An interesting article is "The Origins of the Turing Thesis Myth", which explains why it's a myth that a Turing machine can do anything a computer can.
The quick summary is that Turing machines can compute any algorithmic function. However, real computers do more than compute functions, and today's applications cannot be modeled by Turing machines.
And this is what comes of trying to be flippant around people who know more than I do :)
That's a very interesting paper. I hadn't really thought about the theoretical ramifications of everything we let computers do these days... On the "I agree" front, there really isn't much to say.
Though I'm not convinced it's strictly untrue that "all computable problems are function-based." Take the robotic car: you could model it as a sequence of functions being called with new inputs, where each call represents the motion of the car over one "tick," which could be as short or as long as you like. And if there's a hole in that argument (which there may well be), then if worst comes to worst, we can use quantum mechanics to model the region of the universe containing the computer running the program and obtain a mathematical description that way :)
Eh, but the problem is the assumption that the world itself cannot be modelled as part of the tape that the Turing Machine eats.
From a quick glance through the paper and the LtU comments, it seems its point is that interactive I/O cannot be modelled by a Turing machine.
But as I've learned in Haskell, I/O can itself be treated mathematically, specifically with monads: conceptually, the world-before-the-I/O-event is the value a function takes in, and the world-after-the-I/O-event is the value it returns. And I'm pretty sure monads themselves can be modeled by a TM: they can be represented by a part of the tape that the TM reads and modifies, just like any other function-to-TM mapping.
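If it helps, here is a toy version of that world-passing idea in Scheme (just an illustration I made up, not how Haskell's IO is actually implemented): the "world" is an ordinary value (pending input lines plus accumulated output), and each "I/O action" is a pure function from a world to a result paired with the next world.

  ; world = (pending-input-lines . output-lines), just ordinary data
  (define (make-world inputs) (cons inputs '()))

  ; read-line* : world -> (value . new-world)
  (define (read-line* w)
    (cons (car (car w))                      ; the line "read"
          (cons (cdr (car w)) (cdr w))))     ; world with that line consumed

  ; write-line* : string world -> (value . new-world)
  (define (write-line* s w)
    (cons 'done
          (cons (car w) (cons s (cdr w)))))  ; world with s appended to output

  ; echo one line: pure from start to finish
  (define (echo w)
    (let* ((r (read-line* w))
           (line (car r))
           (w2 (cdr r)))
      (cdr (write-line* (string-append "you said: " line) w2))))

  (echo (make-world (list "hello")))
  ; => (() "you said: hello")   i.e. no pending input, one output line

Nothing impure happens anywhere, yet the interaction is fully described, which is the sense in which a TM could "model" it.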
The problem is that the paper uses words like 'model' and 'function-based' rather vaguely. You can model I/O with a TM, but you can't actually do it, which is what they're getting at.
Implementing numerically stable and accurate transcendental functions is rather difficult. If you're going down that road, please don't just use Taylor series, but look up good algorithms that others have implemented. One source is http://developer.intel.com/technology/itj/q41999/pdf/transen...
That said, I don't see much value in re-implementing math libraries in Arc, given that Arc is almost certainly going to be running on a platform that already has good native math libraries.
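To make the Taylor-series warning concrete, here's a small made-up mzscheme example: summing the series for exp(x) directly at x = -20 adds alternating terms as large as ~4e7 to produce an answer around 2e-9, so nearly all the significant digits cancel, while the mathematically equivalent 1/exp(20) is fine.

  ; naive Taylor sum of exp(x), 'terms' terms -- fine for x >= 0,
  ; disastrous for large negative x because of cancellation
  (define (exp-taylor x terms)
    (let loop ((n 1) (term 1.0) (sum 1.0))
      (if (> n terms)
          sum
          (let ((term (* term (/ x n))))     ; x^n/n! from the previous term
            (loop (+ n 1) term (+ sum term))))))

  (exp-taylor -20.0 100)         ; roundoff on the order of the answer itself,
                                 ; so essentially no correct digits
  (/ 1.0 (exp-taylor 20.0 100))  ; ~2.0611536e-9, close to
  (exp -20.0)                    ; the library value, 2.0611536...e-9

Good library implementations handle this kind of thing (argument reduction, careful summation) everywhere, which is exactly why it's better to lean on them than to roll your own.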
I briefly tried porting Arc to the new PLT Scheme earlier, and ran into problems. I have some suggestions at http://arclanguage.org/item?id=7057
As for lojic's question about why not just use PLT Scheme: that's a very good question. Personally, I find Arc has a lot of negatives compared to PLT Scheme, and the only positive I see is that Arc's macro system isn't confusing the way Scheme's is. Arc also provides the excitement of exploring new territory, but I think I've about exhausted that.
What about the rest of you? Why use Arc instead of Scheme? (Maybe this should be a top-level question?)
The reasons I use Arc (well, not these days, due to an incredible lack of time, but...):
- excellent macro system (even if mzscheme seems to have a "dirty" defmacro too),
- the empty list is the false boolean,
- the syntax (particularly [] and :) is really a great improvement,
- I never built webapps that fast (and with so much fun),
- my list is close to almkglor's, but he's before me in the leaders' list, and I'll never catch him, so I can't follow him on this point :)
Now, sure, PLT is an excellent language/environment. But it's waaaaay older, that helps. Actually, if today I was given the opportunity to program in a Lisp of my choice (for work I mean, not for fun), I'd use Arc for a basic webapp and mzscheme for almost everything else.
Ditto. I especially like the fact that, because there are so few people working on it, I actually have a chance at making a difference. Not a very big chance, since as yet I'm still a noob, but it's still something to hope for ;)
I thought it would be cool to be part of new Lisp community also, and it may eventually develop into that, but it's increasingly seeming like just a pet project of pg's that may or may not develop into something more. That's not a criticism - I think it was generous of pg to open source Arc and invite folks to participate, and he's been straightforward with his intentions. It's just a slightly different project than I expected initially.
I'm also a Lisp newbie. If I were already experienced in Common Lisp or Scheme, I might be more interested in investing time in Arc to learn something new, but at this point I'm simply looking for the best Lisp to learn. Given that I primarily develop web-based software, you'd think Arc might be the one; however, the ease of creating simple web apps is offset by the lack of other stuff I need and the question of long-term viability.
So, learning PLT Scheme seems like a reasonable course of action because I expect I can bring most of that knowledge back to Arc if it gains more traction. Who knows, if developing web apps with PLT Scheme becomes too painful, maybe I'll just switch back to Arc and give it a few months of learning.
I agree with your comments, and those above, and I actually think it's a huge strength of this project and why I remain interested. I think way, way too much stuff in the software community is focused on DO LOTS OF STUFF RIGHT AWAY GET IT DONE YESTERDAY WAAAAHHHHHHHH FASTER FASTER FASTER!!! That may be the right way to start a company, but it's a really crappy way to build a "hundred-year language".
I like the fact that this project is moving slowly, and that the community (from newbies like me to pros like Tilton) has time to try things and reflect, and not in internet time.
I'm 36. I've already learned and thrown away about 3 different major development ecosystems in my career. I accept that I'll continue to do that in order to pay my mortgage, but in my own time, when I'm actually programming for fun, I want to hitch my wagon to something that will actually have some lasting value. And Arc (or at least something very similar) is the best-looking option to me at the moment.
1. It's very similar to Cadence Skill, which is the Lisp-like language we use in the office: CL-style list-based macros, t and nil, lexical variables (at least in Skill++).
Usually, it seems to be either (n log n) or (n log n) - 1 digits.
And in this case I would leave off the O, as that usually refers to the performance of an algorithm in time or space. I suppose you could construe the number of digits as "space", but multiplying O(n) numbers doesn't make that much sense.