Thu Aug 7 08:33:59 BST 2008

Is classic code analysis necessary?

The Coma compiler includes a somewhat traditional, but at the moment
quite limited, intermediate code processor for implementing
optimizations. Between compilation and assembly there is a point where
the code is structured as a control flow graph. This is not used yet,
but it is there to perform optimizations that are not possible in the
rewrite framework (e.g. jump chaining, loop tricks, ...).
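
To make the kind of optimization meant by "jump chaining" concrete,
here is a minimal sketch in Python. The representation (a dict of
labeled blocks holding instruction tuples) is my own invention for
illustration, not Staapl's actual data structure: a jump whose target
block is itself just a jump gets redirected to the final destination.

```python
def chain_jumps(blocks):
    """blocks: dict mapping label -> list of instructions.

    An instruction is a tuple such as ('jump', label) or ('op', name).
    If a block consists solely of a jump, any jump to it is redirected
    to the end of the chain."""

    def resolve(label):
        seen = set()  # guard against jump cycles
        while label not in seen:
            seen.add(label)
            body = blocks.get(label, [])
            if len(body) == 1 and body[0][0] == 'jump':
                label = body[0][1]  # follow the chain one step
            else:
                break
        return label

    # Rewrite every jump to point at its resolved target.
    return {l: [('jump', resolve(i[1])) if i[0] == 'jump' else i
                for i in body]
            for l, body in blocks.items()}
```

With `{'a': [('op', 'x'), ('jump', 'b')], 'b': [('jump', 'c')],
'c': [('op', 'y')]}` the jump in block `a` is rewritten to target `c`
directly, making block `b` dead code that a later pass could remove.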

However, I'm still not convinced such a postprocessor is actually
necessary. For very simple target architectures, I suspect the
generated code will already be quite optimal because:

  * Generating good specialized code is the whole point of staging,
    and mostly the responsibility of the person writing the code

  * The partial evaluation for the functional subset works quite well
    to eliminate obvious parameterizations, and the pattern matching
    allows the specification of a good impedance match to the machine
    instruction set.
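
As an illustration of what "eliminating obvious parameterizations" by
partial evaluation means, here is a toy sketch in Python (my own
construction, not Staapl's actual implementation): a compile step for
a Forth-like stack language that folds an operation when both operands
are compile-time literals, so no runtime code is generated for it.

```python
def compile_forth(tokens):
    """Compile a token list for a stack machine, folding constants.

    Output instructions are ('lit', n) for literals and
    ('prim', name) for runtime primitives."""
    out = []
    for t in tokens:
        if t == '+' and len(out) >= 2 \
           and out[-1][0] == 'lit' and out[-2][0] == 'lit':
            # Both operands known at compile time: evaluate now.
            b = out.pop()[1]
            a = out.pop()[1]
            out.append(('lit', a + b))
        elif isinstance(t, int):
            out.append(('lit', t))
        else:
            out.append(('prim', t))  # left for runtime
    return out
```

So `[1, 2, '+', 'dup']` compiles to `[('lit', 3), ('prim', 'dup')]`:
the addition has disappeared from the generated code. When an operand
is unknown, as in `['x', 2, '+']`, the `+` stays residual.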

As practical evidence for this, the PIC18 rewrite language already
produces quite optimal code in the practical uses I've encountered,
which is low-level embedded programming with some simple language
extensions, mainly to generate data lookup and code dispatch.

On the other hand, when implementing more elaborate DSLs on top of
this system, it might be interesting to perform proper data flow
analysis and register allocation. For more complex targets, however,
I suspect that Staapl loses its benefits.