I started Jarc by writing an s-expression reader. Then I implemented the 9 Arc primitives in Java and then I started writing eval in Java. Then I incrementally added the missing
global functions used in arc.arc - that took a while, but wasn't particularly difficult.
I spent a lot of time thinking about the Arc-to-Java interface. I wanted it to be
as lightweight and as brief as possible, in keeping with the Arc goal
of making programs shorter. In particular, I wanted to avoid any 'import' or 'defcall'
requirements if I could.
Another important consideration is how much of your interpreter you can write in Arc.
I was anxious to stop writing Java and be able to add missing global functions in Jarc
as soon as possible. At last count, I have 61 functions implemented in Java and
73 in Jarc itself.
Jarc doesn't have first-class continuations or fractional numbers. The interpreter also doesn't do tail call elimination. (I did write a compiler this year to do
tail call elimination in recursive functions.)
From what I learned working on Jarc and talking with Conan about Rainbow, implementing a
continuation-passing-style interpreter is the way to go if you want first-class
continuations and tail call elimination.
Writing Jarc has been great fun, and I've learned a lot.
"I started Jarc by writing an s-expression reader. Then I implemented the 9 Arc primitives in Java and then I started writing eval in Java. Then I incrementally added the missing global functions used in arc.arc - that took a while, but wasn't particularly difficult."
That's how I planned to start the implementation, but I've also considered writing a compiler that translates the arc code into MSIL using Reflection.Emit. That would allow me to do tail-call optimization, and possibly CPS.
So what do you have currently for your Arc/Java interface, and what are your thoughts on difficulty of implementation, ease of use, etc. associated with that choice? Are there other options that you considered and discarded, or haven't gotten around to implementing yet? I was thinking of trying to treat assemblies, namespaces, classes, and objects like hash tables, and letting (obj 'name) access the member, but I don't know how well that will work with mostly static typing.
I'm not sure what to think about fractional numbers. I don't think I've ever had a good reason for using them, so I won't bother implementing them myself until I do.
In fact, CPS benefits from TCO. By converting a program into explicit CPS, every function call becomes a tail call (to the current continuation); without TCO, stack space would run out quickly. Certain TCO implementations can also use CPS; see http://en.wikipedia.org/wiki/Tail_call#Implementation_method....
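To make that concrete, here's a small sketch in Arc (the function names are just for illustration, not from Jarc or Rainbow): in the CPS version both the recursive call and the call to the continuation sit in tail position, so without tail call elimination every one of them still burns a stack frame.

  ; direct style: the recursive call is wrapped in (* n ...),
  ; so it is not a tail call
  (def fact (n)
    (if (is n 0) 1 (* n (fact (- n 1)))))

  ; explicit CPS: the call to fact-cps and the call to the
  ; continuation k are both tail calls
  (def fact-cps (n k)
    (if (is n 0)
        (k 1)
        (fact-cps (- n 1) (fn (r) (k (* n r))))))

  (fact-cps 5 idfn)  ; -> 120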
I'm starting to view arc not as a language but as a thin membrane over an underlying lisp.
When I first read this I thought, "But what about arc on top of other languages beside scheme?" Then I realized I started writing Jarc because I wanted a thin membrane over Java. Interesting way to think about it.
Yeah, I started thinking this way after spending some time atop sbcl. I'll release that at some point. It's quite surprising how little code you need to build the arc transformer when you choose the right lisp implementation and don't go against the grain of the underlying lisp.
You announced your jvm.arc 161 days ago, right? Interesting. Seems useful.
So Groovy allows dynamic classes, not just interfaces? I'd like to do that in Jarc also, but I don't see how to do it without generating bytecode. Jarc now includes Jasmin (a JVM assembler), so generating bytecode is pretty straightforward.
Sun's implementation of JDK 6 is co-bundled with the Mozilla Rhino based JavaScript script engine. (...)
Rhino's JavaAdapter has been removed. JavaAdapter is the feature by which a Java class can be extended by JavaScript and Java interfaces may be implemented by JavaScript. This feature also requires a class generation library. We have replaced Rhino's JavaAdapter with Sun's implementation of the JavaAdapter. In Sun's implementation, only a single Java interface may be implemented by a JavaScript object.
In other words, the Rhino distributed with JDK 6 provides no JavaAdapter functionality the Proxy class can't already achieve....
I've fixed these Jarc bugs in Jarc 21. I would not have thought to try nesting optional args inside destructuring. Should work, though, of course. And actually the Jarc code is cleaner this way.
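For anyone wondering what that combination looks like, here's a hypothetical example (the names are made up, and the exact behavior may differ between Jarc and official Arc):

  ; a destructuring parameter list with an optional arg nested inside
  (def greet ((name (o greeting "hello")))
    (string greeting ", " name))

  (greet '("Ken"))          ; -> "hello, Ken"
  (greet '("Ken" "howdy"))  ; -> "howdy, Ken"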
arc> (obj a 1 b 2)
#hash((a . 1) (b . 2))
arc> (let list list:list
       (obj a 1 b 2))
#hash((((a 1 . nil) . nil) . ((b 2 . nil) . nil)))
Currently, (obj a 1 b 2) expands into (listtab (list (list 'a 1) (list 'b 2))), which means it uses the local meanings of 'listtab and 'list during evaluation.* The Arc community already removes much of the need for hygiene by following the convention of using (uniq) for local variable names generated by macros, but that doesn't help in this case.
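For reference, the arc.arc definition of obj is just a plain backquote template along these lines (quoted from memory, so treat it as a sketch); since listtab and list appear as bare symbols in the expansion, whatever they mean at the call site wins:

  (mac obj args
    `(listtab (list ,@(map (fn ((k v)) `(list ',k ,v))
                           (pair args)))))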
Technically, we could fix this as a community too, as long as we adopt a convention like one of these:
- Have (obj a 1 b 2) expand into ('#<procedure: listtab> ('#<procedure: list> ('#<procedure: list> 'a 1) ('#<procedure: list> 'b 2))), for instance.
- Have (obj a 1 b 2) expand into (eval!listtab (eval!list (eval!list 'a 1) (eval!list 'b 2))), for instance, and avoid using "eval" as a local variable name.
- Avoid using existing global variable names as local variable names, and vice versa.
Whenever possible, I like being able to repair a REPL session just by fixing one buggy global binding, so I don't recommend the first option here. The third option is closest to what we have in practice, but I don't like the way the global/local distinction is hazy without introducing some kind of boilerplate naming convention. So the second option is kind of my favorite, but it still smacks of namespace qualification, and it doesn't help Arc code that already exists.
Semi-Arc takes a "none of the above" approach and just implements the hygiene, despite the incompatibilities that risks. I've only found one positive use of "free symbol capture" in practice,* so that probably won't be the top issue when porting code to Semi-Arc.
* IIRC, an example I wrote had one macro that bound a gensym-named local variable and another macro that referred to it using the same gensym. The second macro referred to the variable plainly enough that it was no more useful than an anaphoric variable, and it could just as well be replaced with a dynamic variable or a global variable. Nevertheless, it may turn out to be a useful technique for a variable that needs to be captured by lexical closures but only needs to be used for a specific purpose (incrementing it and broadcasting an update event, maybe).
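A rough reconstruction of that pattern, with made-up names:

  (= hits* (uniq))

  ; one macro binds the gensym-named variable...
  (mac w/hit-counter body
    `(let ,hits* 0 ,@body))

  ; ...and another refers to it by the same gensym, so only code
  ; written with these macros can see or update the binding
  (mac record-hit ()
    `(++ ,hits*))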
I thought about using the same pattern in the Jarc sql package to turn tracing of SQL statements on and off dynamically. But it seemed inelegant to make all the sql functions into macros just to get dynamic tracing, so I decided to use thread-local variables instead in that case.
I've been away from the forum (trying to get my startup launched). I'll look into the bugs reported in http://arclanguage.org/item?id=12269 this week. Thanks for the bug reports, as always.
I've changed the default behavior for printing Java types that don't have a fn in writefns* -- Jarc now prints them as
#package.class(hashCode)
The messages about missing writefns* entries were getting annoying. You can still define your own, but Jarc doesn't pester you to do so anymore.
I'm working on a Rainbow compatibility module so that Jarc and Rainbow can share modules that call out to Java classes (like Swing and GAE datastore).