[<<][compsci][>>][..]
Wed Mar 17 10:26:47 CET 2010

Hardware Mapping

Let's start at the meta-level[1]:

         "What are the important problems in my field?"

In an attempt to make things more explicit, what are the important
problems, and why am I not working on them?  What am I actually doing?
What is my main goal?

    Decouple function from implementation in numerical processing.

This is translated to the following problem:

       1. express numerical processing in mathematics (the DSL)

       2. find a way to express hardware mapping

There are plenty of examples of 1, so not much re-invention is
necessary.  There are plenty of examples of 2 also, but this field is
really quite broad, and there are many design decisions to make.

All I've been doing for the last couple of years is learning about
languages and compilers, and while I did learn a lot, I'm still
struggling to make the target explicit.

My conclusion up to now is that the 1/2 distinction is better viewed
as a continuum, or at least a sequence of steps:

                  2       2'       2''
               1 ---> 1' ---> 1'' ----> 1'''

I've known for a while that I'm building a compiler, or more
specifically, a method for building multiple compilers.

What I'm starting to see now is that this is all about semantics and
proof.  When you move down the chain from specification to
implementation, you want to preserve meaning, or at least, preserve
meaning relative to a certain set of conditions that express
approximation.
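To make that idea concrete, here is a toy sketch of my own (not from
the Staapl code base): a specification over the reals, a fixed-point
implementation of the same function, and a correctness statement that
only holds relative to an approximation condition.  The Q8 format and
the function itself are invented for illustration.

```python
SCALE = 2 ** 8  # hypothetical Q8 fixed-point format (8 fractional bits)

def spec(x):
    """Specification: exact arithmetic over the reals."""
    return 0.5 * x + 0.25

def to_fixed(x):
    return round(x * SCALE)

def from_fixed(n):
    return n / SCALE

def impl(n):
    """Implementation: the same function over Q8 integers.
    The shift approximates the multiplication by 0.5."""
    return (n >> 1) + to_fixed(0.25)

# The correctness statement is conditional: meaning is preserved only
# up to quantization error, i.e. for all x in range,
#   |spec(x) - from_fixed(impl(to_fixed(x)))| <= epsilon.
epsilon = 2 / SCALE
for x in [0.0, 0.5, 1.0, 1.5, -0.75]:
    assert abs(spec(x) - from_fixed(impl(to_fixed(x)))) <= epsilon
```

Each refinement step 1 -> 1' -> 1'' carries such a side condition; the
chain is only as meaningful as the conjunction of those conditions.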

It seems I've been looking through the wrong set of glasses.  The
aspect to focus on is really _correctness_, not ease-of-use.

Actually, building a tower of languages is easy once you know what
problem to solve.  Writing DSLs becomes second nature with a bit of
practice.  Scheme, (Meta)OCaml and Haskell are all quite suited to do
the job.

However, while these tools provide a lot of structure to eliminate
silly mistakes, they don't solve the correctness problem: ultimately
you need to define the beef as low-level computation (i.e. by pattern
matching).

The hard part is making sure that you preserve the intended semantics
in the face of a mountain of implementation details.

The real problem is managing those details, replacing them with a
structure that is ``obviously correct''.

I see it in the Staapl PIC18 compiler.  It consists of an ad-hoc set
of transformation rules that define the semantics of a concatenative
language in terms of generation and transformation of machine code.
And that is the _only_ thing it does.  There is no structured
description of what the compiler is supposed to do, or of the
conditions under which it breaks.
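To give a flavor of what such ad-hoc rules look like, here is a much
simplified, invented example (Staapl's actual PIC18 rules are far more
involved and operate on real machine code): compiling a concatenative
program to pseudo-instructions, with one transformation rule that
folds constant additions during code generation.

```python
def compile_word(code, word):
    """Append the code for one word, applying transformation rules
    against the instructions already generated."""
    if isinstance(word, int):
        return code + [("LIT", word)]
    if word == "+":
        # Rule: LIT a, LIT b, + => LIT (a+b)   (partial evaluation)
        if len(code) >= 2 and code[-1][0] == "LIT" and code[-2][0] == "LIT":
            return code[:-2] + [("LIT", code[-2][1] + code[-1][1])]
        return code + [("ADD",)]
    raise ValueError(f"unknown word: {word}")

def compile_program(words):
    code = []
    for w in words:
        code = compile_word(code, w)
    return code

# The rule fires on constant operands, so this compiles to a single
# literal rather than two pushes and an add.
assert compile_program([1, 2, "+"]) == [("LIT", 3)]
```

The rule is plausible, but nothing in this style of definition states
*why* it preserves meaning, which is exactly the gap described above.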

So I'm getting really interested in correctness proofs[2], and in
adding static semantics to language towers[3].  Looks like I need to
start reading again.

As for the Staapl PIC18 compiler, I'm trying to assess whether testing
the compiler against an ``obviously correct'' semantics (a reference
implementation) is good enough.  It is definitely more trustworthy to
have a correctness proof, but from a practical point of view, a test
suite with broad coverage might be sufficient.
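A sketch of that testing strategy (my own construction, using a
trivial stack language rather than PIC18 code): run randomly generated
programs through both a direct interpreter, which plays the role of
the obviously-correct reference semantics, and the compile-then-execute
pipeline, and check that the two agree.

```python
import random

def reference(words):
    """Reference semantics: interpret the stack program directly."""
    stack = []
    for w in words:
        if w == "+":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        else:
            stack.append(w)
    return stack

def compile_program(words):
    """The compiler under test (here a trivial 1:1 translation)."""
    return [("LIT", w) if isinstance(w, int) else ("ADD",) for w in words]

def execute(code):
    """Target machine: execute the compiled instructions."""
    stack = []
    for op in code:
        if op[0] == "LIT":
            stack.append(op[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
    return stack

# Differential testing: random programs, two evaluators, one oracle.
random.seed(0)
for _ in range(1000):
    n = random.randint(1, 5)
    prog = [random.randint(-9, 9) for _ in range(n)]  # push n literals
    prog += ["+"] * (n - 1)                           # then fold with +
    assert execute(compile_program(prog)) == reference(prog)
```

This catches many bugs cheaply, but unlike a proof it says nothing
about the programs the generator never produces.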


[1] http://www.chris-lott.org/misc/kaiser.html
[2] http://compcert.inria.fr/doc/index.html
[3] http://lambda-the-ultimate.org/node/3179



