Wed Sep 2 14:08:44 CEST 2009

Why 2 stacks?

I've been trying to answer this question in a straightforward way for
a while.  Usually this translates to something like ``decoupling
argument passing from procedure calls''.  As Chuck Moore says[1]:

  ``Forth is highly factored code. I don't know anything else to say
    except that Forth is definitions. If you have a lot of small
    definitions you are writing Forth. In order to write a lot of
    small definitions you have to have a stack. Stacks are not
    popular. Its strange to me that they are not. There is a just lot
    of pressure from vested interests that don't like stacks, they
    like registers. Stacks are not a solve all problems concept but
    they are very very useful, especially for information hiding and
    you have to have two of them.''

My attempt at this is to approach what Chuck calls ``small
definitions'' from the point of view of the locality and structure of
primitive and composite rewrite rules.  When reductions and expansions
are local, two stacks emerge quite naturally from a purely syntactic
description.

Let's take as an example the formal variant of the Joy language, where
we don't consider semantics.  Semantics is usually expressed either
directly, as functions operating on a stack, or operationally, where
function evaluation itself is made explicit using 2 stacks: a
parameter stack on which functions operate and a control / return /
continuation stack used for function nesting.

Formally (syntactically), we're only interested in describing the
equivalence of two quotations, not in what they actually mean.
Equivalence can be proven by supplying a sequence of quotations (a
proof) in which each consecutive quotation is related to the previous
one by the application of a rewrite rule.

Syntactic rewrite rules can be classified as:

  * local reduction: primitive operators are defined by rewrite rules
    that reduce them in combination with adjacent values.

  * local expansion: composite operators can be replaced by the
    constituents of the quotation that defines them.

Note that the concept of `value' is important.  It is exactly this
concept, and the form of the primitive rewrite rules, which are
specified as reductions on values, that gives the language its proper
form.  For example, the primitive `swap' is defined by the rule:

       <v1> <v2> swap   =   <v2> <v1>

Two details are important: `swap' is the only non-value in the
expression, and it appears on the _right_ of a concatenation of
values.
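
To make the shape of such rules concrete, here is a small sketch in
Python (my own illustration, not anything from Joy or Staapl) that
encodes a few primitive rules as data: each word records how many
values it consumes on its left and which values it leaves behind.

# A rule "<v1> .. <vn> word  =  <w1> .. <wm>" is encoded as a pair
# (n, f): the word consumes n values and f maps them to the values it
# produces.  Names and encoding are made up for illustration.
PRIMITIVE_RULES = {
    'swap': (2, lambda v1, v2: [v2, v1]),   # <v1> <v2> swap  =  <v2> <v1>
    'dup':  (1, lambda v:      [v, v]),     # <v> dup         =  <v> <v>
    '+':    (2, lambda a, b:   [a + b]),    # <a> <b> +       =  <a+b>
    '*':    (2, lambda a, b:   [a * b]),    # <a> <b> *       =  <a*b>
}

n, f = PRIMITIVE_RULES['swap']
print(f(1, 2))    # [2, 1], i.e.  1 2 swap  =  2 1

Note how the two details above show up in the encoding: the reduction
function only ever sees values, and those values sit to the left of
the word being reduced.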

Let's examine the behaviour of this formal system, taking numbers as
values and assuming that every symbol either has a reduction rule of
the kind above (primitive words) or expands into a quotation
(composite words).

The structure of the rules allows the following notation to be used.
Let the symbol `|' denote the separation between values on the left
and the remaining program on the right.  This is only an _annotation_
of reduction proofs (a visual reminder of which objects are values),
not an essential part of the rules!  Whenever the next symbol is a
value, we move the `|' one position to the right; when it is not a
value, we perform a primitive reduction or a composite expansion.

An example of an equivalence proof based on two primitive rules:

| 2 5 + 3 *
2 | 5 + 3 *
2 5 | + 3 *
7 | 3 *
7 3 | *
21 |

An equivalence proof including the expansion of a composite program

  square   =   [dup *]

looks like:

| 12 square dup
12 | square dup
12 | dup * dup
12 12 | * dup
144 | dup
144 144 |

This illustrates that the 2 stacks _emerge_ as a notational device for
presenting equivalence proofs: the parameter stack on the left of `|'
and the program (continuation) stack on the right, with their tops
facing each other.
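
In fact the `|' bookkeeping is mechanical enough to write down as a
program.  Below is a small sketch in Python (my own illustration, not
Staapl code): the values to the left of the bar live in a list that
plays the role of the parameter stack, the remaining program is the
list to the right, and each step either shifts a value across the bar,
applies a primitive reduction, or splices in the definition of a
composite word.

# Reduction with the `|' annotation made explicit as two lists:
# `values' (left of the bar) and `program' (right of the bar).
# Encoding and names are made up for illustration.

PRIMITIVES = {
    'swap': (2, lambda v1, v2: [v2, v1]),
    'dup':  (1, lambda v:      [v, v]),
    '+':    (2, lambda a, b:   [a + b]),
    '*':    (2, lambda a, b:   [a * b]),
}

COMPOSITES = {
    'square': ['dup', '*'],               # square  =  [dup *]
}

def reduce_program(program):
    values = []                           # left of `|'
    program = list(program)               # right of `|'
    while program:
        print(values, '|', program)       # one line of the proof
        item = program.pop(0)
        if isinstance(item, int):         # value: move `|' one step right
            values.append(item)
        elif item in PRIMITIVES:          # primitive: local reduction
            n, f = PRIMITIVES[item]
            args = values[len(values) - n:]
            del values[len(values) - n:]
            values.extend(f(*args))
        else:                             # composite: local expansion
            program = COMPOSITES[item] + program
    print(values, '|')
    return values

reduce_program([2, 5, '+', 3, '*'])       # steps end in [21] |
reduce_program([12, 'square', 'dup'])     # steps end in [144, 144] |

The printed steps reproduce the two proofs above, only written in
Python list notation.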

( Note that I am begging the question.  _Of course_ the reduction
rules are structured in such a way that stacks emerge.  My point is
that you don't need stacks to explain concatenative stack languages,
just a principle of locality and some concept of reduction order, here
enforced by the concept of `value'.  Semantically this then neatly
corresponds to function composition, where the reduction order
directly relates to the fact that function composition in general does
not commute. )
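
To spell out that last correspondence with a tiny sketch (again mine,
in Python, with made-up names): each word denotes a function from
stacks to stacks, a program denotes the left-to-right composition of
its words' functions, and composing in a different order generally
gives a different function.

# Denotations of the programs `3 +' and `2 *' as stack -> stack
# functions, plus left-to-right composition.  Illustration only.

def add3(stack):
    return stack[:-1] + [stack[-1] + 3]   # denotes the program `3 +'

def mul2(stack):
    return stack[:-1] + [stack[-1] * 2]   # denotes the program `2 *'

def compose(*fs):
    def run(stack):
        for f in fs:                      # apply left to right
            stack = f(stack)
        return stack
    return run

print(compose(add3, mul2)([5]))   # [16]: (5 + 3) * 2
print(compose(mul2, add3)([5]))   # [13]: (5 * 2) + 3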

[1] http://www.ultratechnology.com/1xforth.htm


