Macros are the main reason I decided to create Adder, a Lisp-on-Python with minimal impedance mismatch. Unfortunately, the first macro-heavy program I wrote turned out to be really slow, because macros engage the compiler, which, of course, is in Python.
When I first tried it, it took something like 50s at 2.4GHz, virtually all of which was the compiler. (The compiler runs at load time; obviously, saving the compiled code for the next run would help.) I got it down to...let me try it now...7s at 3GHz, but that's still too slow for a 200-line program.
If anybody's interested, the code's on GitHub [1]. To see the macro-heavy example, look at samples/html.+, which is an HTML generator. The framework takes 169 lines; the sample page starts at line 171. To see the output, run:
./adder.py samples/html.+
[1] http://github.com/metageek/adder
(Edit: it requires Python 3.x.)
> Unfortunately, the first macro-heavy program I wrote turned out to be really slow, because macros engage the compiler, which, of course, is in Python.
The Python byte-compiler, specifically, appears to be fairly slow. I hadn't really thought much about this, but while contributing to a benchmark yesterday (http://news.ycombinator.com/item?id=1800396), it wound up staring me in the face. There's surprisingly little performance difference between running Lua from source and running it precompiled (both pretty fast), whereas the gap between Python run from source and from .pyc in my benchmark was wider than that between any other pair in the chart, except interpreted Python vs. "echo Hello World". (I didn't have any JVM languages, though.)
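That overhead is easy to observe directly: CPython's `compile()` builtin performs the byte-compilation step, and macro-heavy code pays that cost on every load unless the result is cached. A rough illustrative sketch (the 2000-function generated source is made up for the example, not taken from the benchmark above):

```python
# Hypothetical micro-benchmark of CPython's byte-compilation overhead
# (illustrative only; not the benchmark linked above).
import time

# Generate a source string with many small function definitions.
src = "\n".join(f"def f{i}(x):\n    return x + {i}" for i in range(2000))

t0 = time.perf_counter()
code = compile(src, "<generated>", "exec")   # byte-compile only
compile_time = time.perf_counter() - t0

ns = {}
t0 = time.perf_counter()
exec(code, ns)                               # run the compiled bytecode
exec_time = time.perf_counter() - t0

print(f"compile: {compile_time:.4f}s  exec: {exec_time:.4f}s")
```

On typical CPython builds the compile step dominates for code like this, which is exactly the cost a .pyc cache (or Adder saving its compiled output) avoids on subsequent runs.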
I don't really do eval-based metaprogramming in Python, but do so on occasion in Lua. I thought I felt better about doing so because Lua is syntactically much simpler (and has scoping rules that make avoiding unexpected variable capture easy), but the Lua compiler itself also appears to be substantially faster than Python's. (It doesn't do much analysis, but still usually runs faster than Python.)
And yes, it's not as good as straight-up Lisp macros, but Lua's reflection also covers a lot of low-hanging fruit that macros would otherwise handle. The biggest thing lacking in Lua compared to Lisp is an explicit compile-time phase for static metaprogramming. (Code generation is an inferior alternative.) Lisp macros win big in part because they can avoid the overhead of parsing, but parsing Lua is fairly cheap thanks to its small, LL(1) grammar (http://www.lua.org/manual/5.1/manual.html#8).
I quite like the approach taken in MetaLua (http://metalua.luaforge.net/) for allowing more advanced cases of meta-level programming. After some exposure, moving up and down levels using +{} and -{} becomes about as readable as meta-programming gets, IMHO. The explicit modification of the compiler is not so nice, but I'm guessing it is hard to do anything better in a programming language with some actual syntax.
Of course, what I really would like to use is MetaLuaJIT, to get the best of all worlds :)
Macros were also the reason I wrote Noodle, which was a similar effort (Lisp compiling to Python bytecode) back around 2004. The project died when I couldn't figure out sane, easy-to-learn rules about how to make macros live with modules and imports, and I found I didn't like the syntax concessions I had to allow to make attribute access less of a terrible pain.
I still think something like this would be worthwhile, and I wish you the best of luck! I particularly like your (.bar.baz foo) syntax. And good on you for moving away from bytecode generation; I also found that to be a dead end.
So what are your plans for macros+namespaces? Will macros from imports live in the same namespace?
And what did you mean by "Python supports only two levels of lexical scope" at http://www.thibault.org/adder/ ? I don't recall any constraints regarding lexical scope levels in the VM.
and I wish you the best of luck! I particularly like your (.bar.baz foo) syntax.
Thanks!
And good on you for moving away from bytecode generation; I also found that to be a dead end.
Glad to hear I'm not the only one. :-)
So what are your plans for macros+namespaces? Will macros from imports live in the same namespace?
Actually, I hadn't thought about it; I haven't gotten to the point of being able to write modules in Adder. I think I see what you mean, though: any module has to be a Python module, so macros would have to be expressed as functions.
Strawman: the module could contain a variable listing the macros. When compiling an (import) form, the Adder compiler would check the imported module for that list, and update its internal structures accordingly.
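A minimal sketch of that strawman, in Python. Everything here is hypothetical: the `__adder_macros__` variable name, the `Compiler` class, and the list-based source forms are invented for illustration, not part of Adder:

```python
# Hypothetical sketch of the macro-export strawman: an imported module
# lists its macro expanders in a well-known variable, and the compiler
# registers them when it handles an (import) form.
import types

def make_module():
    # Stand-in for a compiled Adder module: macros are plain Python
    # functions from source forms to expanded forms.
    mod = types.ModuleType("sample")
    def when_macro(test, *body):
        return ["if", test, ["begin", *body]]
    mod.when_macro = when_macro
    mod.__adder_macros__ = {"when": "when_macro"}  # invented convention
    return mod

class Compiler:
    def __init__(self):
        self.macros = {}

    def handle_import(self, mod):
        # On (import), check for the macro list and register each expander.
        for name, attr in getattr(mod, "__adder_macros__", {}).items():
            self.macros[name] = getattr(mod, attr)

    def expand(self, form):
        # Expand one level of a list-shaped source form.
        if isinstance(form, list) and form and form[0] in self.macros:
            return self.macros[form[0]](*form[1:])
        return form

compiler = Compiler()
compiler.handle_import(make_module())
print(compiler.expand(["when", "ready", ["launch"]]))
# expands to: ['if', 'ready', ['begin', ['launch']]]
```

One open question this sketch dodges is whether imported macros land in their own namespace or shadow local ones, which is the namespace problem you raised.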
And what did you mean by "Python supports only two levels of lexical scope"
That was a mistake; I've removed it. I think maybe I just didn't figure out how to generate bytecode for a triply nested function.
Of course, there is the remaining problem that Python can't define a scope without defining a function. I didn't want to use the standard tactic for turning (let) into a nested function (I don't trust Python functions to be fast enough), so the current compiler uses name mangling, tagging each variable with the scope depth. (Global variables are untagged, so that they can be accessed cleanly by Python modules.)
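To make the mangling concrete, here's a toy illustration (not the actual Adder compiler, which works on a proper AST rather than strings; `mangle` and `compile_let` are invented names):

```python
# Toy illustration of compiling (let) via scope-depth name mangling
# instead of a nested function. Not the real Adder compiler: a real
# implementation rewrites the AST, not raw source strings.
def mangle(name, depth):
    # Globals (depth 0) keep their plain names, so Python modules can
    # access them cleanly; inner scopes get a depth tag.
    return name if depth == 0 else f"{name}_{depth}"

def body_with_mangled_names(body, bindings, depth):
    # Crude textual substitution, good enough for the toy example.
    for name, _ in bindings:
        body = body.replace(name, mangle(name, depth))
    return body

def compile_let(bindings, body, depth=1):
    # (let ((x 1) (y 2)) body) becomes flat assignments plus the body,
    # with every bound name tagged by its scope depth.
    lines = [f"{mangle(name, depth)} = {val}" for name, val in bindings]
    lines.append(body_with_mangled_names(body, bindings, depth))
    return "\n".join(lines)

print(compile_let([("x", "1"), ("y", "2")], "print(x + y)"))
# x_1 = 1
# y_1 = 2
# print(x_1 + y_1)
```

The payoff is that the emitted code stays flat: no closure is created per (let), so you never pay Python's function-call overhead just to open a scope.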