[<<][meta][>>][..]Sat Feb 27 16:35:30 CET 2010

For ordinary computations a lot can usually be gained by reusing intermediate results. The same is probably also true for IIR and FIR filters: when is it more useful to implement serial structures vs. parallel ones? This is about ladder and lattice filters. So it would probably be useful to think about all kinds of refactorings of rational functions, and especially the effect of such transformations on the coefficients.

What would be _really_ interesting is to do on-line filter design, i.e. adaptive filtering driven by some optimization problem. That's the essential insight: it's not about filters, it's about coefficients and how to update them. Given a certain fixed filter topology, how do the coefficients influence a certain cost function? Kalman filters etc.

This brings us pretty close to one of Jacques Carette's papers[1][2]. Another interesting one is [3].

[1] http://www.cas.mcmaster.ca/~carette/newtongen/
[2] http://www.cas.mcmaster.ca/~carette/newtongen/verif_gen.pdf
[3] http://www.cas.mcmaster.ca/~carette/publications/CaretteEtAl2008_AISC.pdf
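As a concrete instance of "fixed topology, updated coefficients", here is a minimal sketch in Python of an LMS adaptive FIR filter. This is my example, not something from the note: the quadratic error cost, the step size `mu`, and all function names are assumptions.

```python
# LMS adaptive FIR filter: the topology (a tapped delay line) is fixed;
# only the coefficients change, by a gradient step on the squared error.
def lms_filter(x, d, n_taps=4, mu=0.05):
    """Filter input x to track desired signal d.

    Returns (output samples, final coefficient vector).
    mu is the (hypothetical) adaptation step size.
    """
    w = [0.0] * n_taps      # filter coefficients, updated on-line
    buf = [0.0] * n_taps    # delay line holding the last n_taps inputs
    y_out = []
    for xn, dn in zip(x, d):
        buf = [xn] + buf[:-1]                       # shift in new sample
        y = sum(wi * bi for wi, bi in zip(w, buf))  # FIR output
        e = dn - y                                  # error vs. desired signal
        # gradient descent on the instantaneous cost e^2
        w = [wi + mu * e * bi for wi, bi in zip(w, buf)]
        y_out.append(y)
    return y_out, w

# Example: identify an unknown 2-tap system [0.5, -0.3] from input/output pairs.
x = [1.0, 0.0, 1.0, -1.0, 0.5, 1.0, -0.5, 0.0] * 50
d = [0.5 * a - 0.3 * b for a, b in zip(x, [0.0] + x[:-1])]
_, w = lms_filter(x, d, n_taps=2, mu=0.1)
```

The point of the sketch is that the update rule only touches `w`; refactoring the filter into a ladder or lattice form would change what the coefficients mean, and hence how the same cost gradient maps onto them.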

