Thu Sep 10 13:01:50 CEST 2009
The point in [1] is that the ordinary recursive first/rest approach is inherently serial; to make it parallelizable, the decomposition should be more vector-like: split / process / append. The interesting remark (p.68) is that any serial program written as a composition of functions might be parallelizable thanks to associativity, as long as some kind of decoupling of the intermediate functions can be found. This reminds me of something ;) Anyway, apparently ``parallel prefix function composition'' is something that can be used here [2].

The conclusion seems to be that MapReduce is a big deal, and that there are ways around the need for commutativity. One thing that caught my eye is this [3]: ``If you can construct two sequential versions of a function that is a homomorphism on lists, one that operates left-to-right and one right-to-left, then there is a technique for constructing a parallel version automatically.'' Just derive a weak right inverse function and then apply the Third Homomorphism Theorem [4]: ``A function on lists that can be computed both from left-to-right and from right-to-left is _necessarily_ a list homomorphism--it can be computed according to any parenthesization of the list.''

A remark: I'm trying to relate this to the construction of an algebra of recursion operators on algebraic data types in [5], applied to vectors. (A minimal Haskell sketch of the split/combine idea follows the references.)

[1] http://research.sun.com/projects/plrg/Publications/ICFPAugust2009Steele.pdf
[2] http://wwwtcs.inf.tu-dresden.de/~voigt/popl202-voigtlaender.pdf
[3] http://www.ipl.t.u-tokyo.ac.jp/~takeichi/attachments/pldi07.pdf
[4] http://researchspace.auckland.ac.nz/bitstream/2292/3514/1/005thirdht.pdf
[5] http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.41.125
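
To make the split / process / append idea concrete, here is a minimal Haskell sketch. It uses plain summation as a stand-in list homomorphism; the names sumL, sumR and sumH are mine for this note, not anything from [1] or [3]. Because (+) is associative, the left-to-right, right-to-left and divide-and-conquer versions all agree, which is what the Third Homomorphism Theorem predicts for a function computable in both directions.

  import Data.List (foldl')

  -- A list homomorphism: h (xs ++ ys) = h xs `op` h ys, with op associative.
  -- Summation is the simplest instance; its combining operator is (+).

  -- Sequential, left-to-right (the serial first/rest style).
  sumL :: [Int] -> Int
  sumL = foldl' (+) 0

  -- Sequential, right-to-left.
  sumR :: [Int] -> Int
  sumR = foldr (+) 0

  -- Split / process / append style: split the list, recurse on both
  -- halves, combine.  Since (+) is associative the split point does not
  -- matter, and the two recursive calls are independent.
  sumH :: [Int] -> Int
  sumH []  = 0
  sumH [x] = x
  sumH xs  = sumH left + sumH right
    where (left, right) = splitAt (length xs `div` 2) xs

  main :: IO ()
  main = do
    let xs = [1 .. 100] :: [Int]
    -- All three give the same result regardless of parenthesization.
    print (sumL xs, sumR xs, sumH xs)

The two recursive calls in sumH are the natural place to insert actual parallelism (e.g. sparking one of them with par from Control.Parallel); the sketch above only shows the decomposition.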