% here I'm falling into the ellipse-parabola-hyperbola trap again..
%% Some intuitive leaps. Separability versus displacement structure and
%% differential equations like the Laplace and d'Alembert equations. And
%% symplectic matrices.

So what's this displacement business for symmetric matrices all about? In short: finding a way to write the sum/difference of two transforms of a single matrix $A$ as a ``sum/difference of squares'':
$$\Delta A = F A - A \tilde{F}^H = G \tilde{G}^H = \sum_{k=1}^r g_k \tilde{g}_k^H,$$
or, more generally, as a limited outer-product sum. This should be done in a way that lowers the rank $r$, making the matrix effectively separable. This way \emph{the original structure that we were forced to introduce to write our problem as an ordinary matrix problem can be recovered}. So the matrices $F$ and $\tilde{F}$ should really be obtained from the problem domain; for filtering problems, for example, they are shift matrices. Care has to be taken, though, that the original matrix $A$ can be reconstructed from this representation, since we want an alternative representation, not something completely different.

One thing that has struck me when thinking about structured matrices is the question: if your equation is structured, why blow it up into a matrix in the first place? Naivety can be beautiful, because it usually hides the answer right there in the question: we can't do much equation solving in practice without algorithms based on matrices and their factorizations, so we have to stick to formulating our problem in the language we know and look for a more general approach, instead of exploiting the structure of the problem directly. This of course means that some of the (more obvious) fast algorithms we find by applying this philosophy have already been encountered before, by exploiting the structure of the problem in a more direct fashion. The key example here is the Levinson algorithm.
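As a minimal numerical sketch of the low-rank displacement idea (my own illustration, not from the notes above): for a symmetric Toeplitz matrix $T$ and the down-shift matrix $Z$, the Stein-type displacement $T - Z T Z^T$, a close relative of the Sylvester form $FA - A\tilde{F}^H$, has rank at most $2$ regardless of the size of $T$, because only its first row and first column are nonzero. The helper functions below (`toeplitz`, `down_shift`, `rank`) are ad hoc, pure-Python stand-ins for the usual library routines.

```python
# Sketch: the displacement of a Toeplitz matrix has low rank.
# Here D = T - Z T Z^T (Stein form) with Z the down-shift matrix,
# so (Z x)_i = x_{i-1}. For Toeplitz T, only the first row and
# column of D survive, hence rank(D) <= 2 independent of n.

def toeplitz(col):
    # symmetric Toeplitz matrix from its first column
    n = len(col)
    return [[col[abs(i - j)] for j in range(n)] for i in range(n)]

def down_shift(n):
    # ones on the first subdiagonal, zeros elsewhere
    return [[1 if i == j + 1 else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def rank(A, tol=1e-9):
    # plain Gaussian elimination with partial pivoting
    M = [row[:] for row in A]
    n, m = len(M), len(M[0])
    r = 0
    for c in range(m):
        if r >= n:
            break
        pivot = max(range(r, n), key=lambda i: abs(M[i][c]))
        if abs(M[pivot][c]) < tol:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, n):
            f = M[i][c] / M[r][c]
            for j in range(c, m):
                M[i][j] -= f * M[r][j]
        r += 1
    return r

n = 6
T = toeplitz([2, 1, 0, 0, 0, 0])   # tridiagonal Toeplitz, positive definite
Z = down_shift(n)
ZTZt = matmul(matmul(Z, T), transpose(Z))
D = [[T[i][j] - ZTZt[i][j] for j in range(n)] for i in range(n)]

print(rank(T))  # 6: T itself is full rank
print(rank(D))  # 2: the displacement rank stays 2 for any n
```

The point of the experiment: $T$ carries $O(n^2)$ entries but the displacement compresses its structure into two generator vectors, which is exactly the separability that fast algorithms such as Levinson's exploit.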