Sat Sep 20 15:58:53 CEST 2008
more about macros and MetaML
More than I can process, but there was some talk about macros on LtU
(Lambda the Ultimate).
Some things that struck me:
The Template Haskell school (for want of a better phrase - it
might better be called the MetaML school) of compile-time
meta-programming inverts the traditional Lisp notion of
macros. Put very simply, in Lisp macros are special things that
are explicitly identified, but macro calls aren't (they're
normal-looking function calls that happen to do special stuff at
compile-time); in Template Haskell, macros are normal functions
(that just so happen to return ASTs) but macro calls are
explicitly identified. This means that you can do anything you
would normally do with functions, so 'macros' are first-class in
such an approach.
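To make that inversion concrete for myself, a little Python sketch, using the stdlib ast module as an untyped stand-in for Template Haskell's quotation monad (all names here are made up for illustration): the 'macro' is an ordinary function returning an AST, and the splice point is explicit.

```python
import ast

# A "macro" in the Template Haskell sense: just an ordinary function
# that happens to return an AST.  This one unrolls x**n into repeated
# multiplication.
def unrolled_power(n):
    expr = ast.Name(id="x", ctx=ast.Load())
    for _ in range(n - 1):
        expr = ast.BinOp(left=expr, op=ast.Mult(),
                         right=ast.Name(id="x", ctx=ast.Load()))
    return expr

# The "macro call" is explicitly identified: we splice the AST into a
# lambda by hand, mirroring TH's $(...) splice syntax.
def splice(expr_ast):
    fn = ast.Expression(body=ast.Lambda(
        args=ast.arguments(posonlyargs=[], args=[ast.arg(arg="x")],
                           kwonlyargs=[], kw_defaults=[], defaults=[]),
        body=expr_ast))
    ast.fix_missing_locations(fn)
    return eval(compile(fn, "<splice>", "eval"))

cube = splice(unrolled_power(3))
print(cube(2))   # 8
```

Since unrolled_power is a plain function, it can be mapped, composed, passed around -- that is what "first-class macros" buys.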
Interesting, since this is one of the things that I consider a feature
in Staapl: heavily metaprogrammed source code can be read as if it
were ordinary code.
Staging is strongly-typed runtime compilation. So you can replace
an interpreted parser expression built from closures with a
compiler using staging, and get a specialized parser that executes
at full speed with no interpretive overhead.
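A minimal untyped sketch of that staging idea (Python, not MetaML, so no type guarantees; all names invented): an interpreted matcher built from closures, next to a staged version that generates straight-line code once.

```python
# Interpreted combinators: each call re-traverses the closure structure.
def lit(c):
    return lambda s, i: i + 1 if i < len(s) and s[i] == c else None

def seq(ps):
    def parse(s, i):
        for p in ps:            # interpretive overhead on every call
            i = p(s, i)
            if i is None:
                return None
        return i
    return parse

word_interp = seq([lit(c) for c in "let"])

# Staged version: the grammar is known statically, so generate
# specialized straight-line code with no loop over combinators.
def stage_word(word):
    checks = " and ".join(f"s[i+{k}] == {c!r}" for k, c in enumerate(word))
    src = (f"lambda s, i: i + {len(word)} "
           f"if i + {len(word)} <= len(s) and {checks} else None")
    return eval(src)

word_staged = stage_word("let")
print(word_interp("lets", 0), word_staged("lets", 0))   # 3 3
```

Same observable behavior, but the staged matcher is a single flat expression -- the "specialized parser with no interpretive overhead" from the quote, minus the static typing that MetaML adds on top.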
I'm probably missing a lot of nuances trying to compare Staapl with
MetaML by ignoring the type system..
The problem with LISP-style macros is that they force the user to
solve those problems over again. But, if the interpreter/compiler
writer has already done that, why should they have to? Why can't
they reuse that functionality? To put it another way, the problem
is that LISP doesn't sufficiently abstract from the way programs
are represented. It forces you to work with surface syntax,
represented as trees of atoms, when in fact a program has a great
deal more structure.
Now, hygienic macros do solve quite a lot of problems here, since at
least the identifiers are handled appropriately, but since the "great
deal more structure" in lisp is hidden behind the effect of a macro,
there is no way to check that structure before executing a macro with
the "wrong parameters" and generating a "run-time error at compile
time": you can't statically check macro code, and have to wait until
it is used in a transformation for it to fail.
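The "run-time error at compile time" point can be sketched in Python too (a made-up example, using an AST-returning function as the macro): nothing checks the macro body statically, so misuse only surfaces when the macro is actually run during expansion.

```python
import ast

# An AST-returning "macro": it assumes its parameter is a list of
# field names, but nothing enforces that assumption statically.
def getter_asts(fields):
    return [ast.parse(f"lambda self: self.{f}", mode="eval")
            for f in fields]

# The wrong parameter shape fails only at expansion time -- the
# macro's own code is never checked before it is used.
try:
    getter_asts(42)
except TypeError as e:
    print("expansion failed:", e)
```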
This leads to the following references:
Peter Van Roy:
Higher-order programming is more expressive. Macros are
syntactically more concise and give more efficient code. (Note
that expressiveness and verbosity only have a tenuous connection!)
In Staapl, expressiveness is traded for efficiency. The thing that
glues HOFs and macros together seems to be a good partial evaluator.
I.e. applications of higher-order traversal functions get replaced
with static 'structured programming' code.
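A toy Python illustration of that gluing (names are mine, and eval stands in for a real code generator): partially evaluating a map over a list of statically known length makes the higher-order traversal disappear into straight-line code.

```python
# "Partially evaluate" map for a statically known list length n:
# the loop over the list is unrolled away at generation time.
def stage_map(fn_name, n):
    elems = ", ".join(f"{fn_name}(xs[{k}])" for k in range(n))
    return eval(f"lambda {fn_name}, xs: [{elems}]")

# stage_map("f", 3) builds: lambda f, xs: [f(xs[0]), f(xs[1]), f(xs[2])]
map3 = stage_map("f", 3)
print(map3(lambda x: x * x, [1, 2, 3]))   # [1, 4, 9]
```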
Peter Van Roy:
My definition of expressive power has nothing to do with Turing
equivalence. It's something like this: adding a new language
construct increases the expressive power of a language if it can
only be expressed in the original language with nonlocal
changes. For example, adding exceptions to the declarative model
of chapter 2 in CTM increases the expressive power, since the only
way to do the same thing in the declarative model is to add
boolean variables and checks to all procedures (checks like: if
error then return immediately). This was proved in the 1970s! (All
of CTM is structured in this way, with computation models becoming
more and more expressive, in the precise sense defined above. See
Appendix D of CTM for a brief summary.)
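PVR's "nonlocal changes" claim is easy to make concrete (a Python sketch of my own, not from CTM): the same computation with exceptions, and then without them, where every procedure in between has to be rewritten to thread an error flag.

```python
# With exceptions: intermediate code stays untouched by the error path.
def parse_digit(c):
    if not c.isdigit():
        raise ValueError(c)
    return int(c)

def sum_digits(s):
    return sum(parse_digit(c) for c in s)

# Without exceptions: every procedure grows a (value, error) pair and
# every call site grows a check -- the nonlocal change PVR describes.
def parse_digit_flag(c):
    if not c.isdigit():
        return None, True
    return int(c), False

def sum_digits_flag(s):
    total = 0
    for c in s:
        v, err = parse_digit_flag(c)
        if err:                 # "if error then return immediately"
            return None, True
        total += v
    return total, False
```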
There's some more talk about quasiquotation being the really important
idea, not s-expressions. Frank Atanassow talks about macros just
being notational convenience, not really extending expressiveness
according to PVR's definition.
Some threads about MetaML:
One of the limitations of MetaML, incidentally, is that its
representation of programs is too abstract. Although you can
reflect a program---turn it into a value---the only thing you do
with it is compose it with other reflected programs, and evaluate
it. You can't, for example, rewrite parts of a reflected
program. This is sufficient to turn an interpreter into a
compiler, but not into an optimizing compiler.
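What MetaML's abstract code type rules out can be shown with a concrete representation (again a Python stand-in of my own): when reflected programs are real trees rather than opaque values, you can rewrite them, e.g. constant-fold.

```python
import ast

# MetaML code values can only be composed and run.  A concrete AST
# can also be taken apart and rewritten -- here, folding constant
# additions, the kind of pass an optimizing compiler needs.
class Fold(ast.NodeTransformer):
    def visit_BinOp(self, node):
        self.generic_visit(node)
        if (isinstance(node.left, ast.Constant)
                and isinstance(node.right, ast.Constant)
                and isinstance(node.op, ast.Add)):
            return ast.Constant(node.left.value + node.right.value)
        return node

tree = ast.parse("1 + 2 + x", mode="eval")
folded = ast.fix_missing_locations(Fold().visit(tree))
print(ast.unparse(folded))   # 3 + x
```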
Need to read that again..
But it does bring me back to the idea that the combinator approach in
Staapl makes things really a lot simpler.. But what is lost?
Tim Sheard: A Taxonomy of meta-programming systems.
Quasiquotation is defined as a different way to write syntax trees
built from algebraic data types, instead of writing the trees by
hand with explicit constructor applications.
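The contrast looks like this in Python (my own sketch, with ast nodes playing the role of the algebraic data type and name substitution playing the role of unquote): the same tree once via a quoted template with a hole, once constructor by constructor.

```python
import ast

# Quasiquotation: write the tree in concrete syntax with a hole N...
def quoted(n):
    tree = ast.parse("x + N", mode="eval")

    # ...and fill the hole by substitution (a poor man's unquote).
    class FillN(ast.NodeTransformer):
        def visit_Name(self, node):
            return ast.Constant(n) if node.id == "N" else node

    return ast.fix_missing_locations(FillN().visit(tree))

# The same tree spelled out constructor by constructor.
def by_hand(n):
    tree = ast.Expression(body=ast.BinOp(
        left=ast.Name(id="x", ctx=ast.Load()),
        op=ast.Add(),
        right=ast.Constant(n)))
    return ast.fix_missing_locations(tree)

print(ast.unparse(quoted(5)) == ast.unparse(by_hand(5)))   # True
```

The quoted version reads like the program it denotes; the hand-built version makes the algebraic structure explicit. Sheard's point, as I read it, is that these are two notations for the same trees.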