Tue Oct 26 00:22:37 CEST 2010
The ultimate tension between static and dynamic. Looking from afar, the only argument I can distill from the static "eval-is-bad" camp is that reasoning about code is complicated by late-bound semantics. The argument from the dynamic "eval-is-good" side is that late-bound semantics are the most flexible (and simple?) starting point and thus preferable as a language base.

Macros float somewhere in the middle, between the fully dynamic Smalltalk approach, where meaning is completely defined at run time, and the statically typed functional languages, where a large part of the meaning of code (its types) is available at compile time. In Racket, for instance, macro bindings are always well defined (a macro name maps to a precise transformer function known at compile time), but the way a macro transforms code does not preserve any other invariants.

As Dave puts it [4]: "Fexprs are bad for two reasons: they make the language hard to compile efficiently and they make programs hard to understand by subjecting the basic program definition to dynamic reinterpretation." Thomas Lord's comment [5] is quite interesting though. Also check out the Kernel [6] programming language. (A small Racket sketch after the links below illustrates the compile-time vs. run-time distinction.)

[1] http://kazimirmajorinc.blogspot.com/2010/10/on-pitmans-special-forms-in-lisp.html
[2] http://en.wikipedia.org/wiki/Fexpr
[3] http://lambda-the-ultimate.org/node/3861
[4] http://calculist.blogspot.com/2009/01/fexprs-in-scheme.html
[5] http://lambda-the-ultimate.org/node/3861#comment-57967
[6] http://web.cs.wpi.edu/~jshutt/kernel.html
[7] http://lambda-the-ultimate.org/node/3861#comment-57972
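
A rough sketch of the distinction, in Racket since that's the example used above. The names (swap!, run-later) are made up for illustration, and quote-plus-eval only approximates a real fexpr (a Kernel $vau would also capture the caller's dynamic environment), but it shows why a macro's meaning is fixed before the program runs while the eval'd form is only given a meaning at run time:

  #lang racket

  ;; Macro: the binding of swap! and its expansion are resolved at
  ;; compile time, before the program runs.
  (define-syntax-rule (swap! a b)
    (let ([tmp a])
      (set! a b)
      (set! b tmp)))

  ;; Fexpr-like (made-up helper): the operand is passed unevaluated and
  ;; only gets a meaning at run time, via eval -- which is exactly what
  ;; makes such code hard to compile and to reason about.
  (define ns (make-base-namespace))
  (define (run-later unevaluated-expr)
    (eval unevaluated-expr ns))

  (define x 1)
  (define y 2)
  (swap! x y)
  (printf "after swap!: x=~a y=~a\n" x y)          ; => x=2 y=1
  (printf "run-later: ~a\n" (run-later '(+ 1 2)))  ; => 3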