Sat May 9 15:05:26 EDT 2020
A while ago I ran into something I believe was called the D-Transform.
It is a reformulation of the Z-transform based on the transformation
d = z - 1 (or, in its scaled form, d = (z - 1)/T), which turns unit
delays into first differences.
Recently this idea has come up in private conversation related to
audio processing, synthesis, and analog modeling. The
basic setting is to oversample a signal to remove digital sampling
artefacts. That part is quite straightforward, but doing so exposes
numerical precision problems when signal operations are expressed in
terms of differences between subsequent sample values, as is usually
the case: at high sample rates subsequent samples are nearly equal,
so subtracting them cancels most of the significant digits.
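
A minimal sketch of that precision loss (the signal, sample rates, and
dtype choices here are my own illustration, not from the original
setting): a low-frequency sine at a heavily oversampled rate barely
moves between samples, so float32 first differences keep far fewer
correct digits than float32 itself would suggest.

```python
import numpy as np

fs = 48000.0 * 128            # heavily oversampled rate (illustrative)
n = np.arange(1024)
x64 = np.sin(2 * np.pi * 100.0 * n / fs)   # 100 Hz sine, float64
x32 = x64.astype(np.float32)               # same signal in float32

d64 = np.diff(x64)                         # reference differences
d32 = np.diff(x32).astype(np.float64)      # differences of rounded samples

# Relative error of the float32 differences: far worse than the ~1e-7
# relative precision of float32, because subtracting nearly equal
# samples cancels the leading digits.
rel = np.max(np.abs(d32 - d64)) / np.max(np.abs(d64))
print(f"max relative error of float32 differences: {rel:.2e}")
```

The absolute rounding error of each sample stays fixed while the
differences shrink with the oversampling factor, so the relative error
of the differences grows accordingly.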
Instead of keeping track of delays, it is possible to reformulate
filter topologies by keeping track of the differences themselves, all
the while assuming that signals do not change much from one sample to
the next. In essence it is a way to use the available precision where
it is needed most.
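
The coefficient side of that reformulation can be sketched on a
one-pole lowpass (the filter, rates, and names are my own assumptions;
the text names no specific topology): y[n] = a*y[n-1] + (1 - a)*x[n]
rewritten incrementally as y[n] = y[n-1] + g*(x[n] - y[n-1]) with
g = 1 - a. At high oversampling a crowds toward 1.0, so storing a in
float32 wastes the mantissa on the leading 0.999... digits, while the
small gain g keeps full relative precision.

```python
import numpy as np

fs = 48000.0 * 128                      # oversampled rate (illustrative)
fc = 20.0                               # cutoff frequency (illustrative)
g = -np.expm1(-2 * np.pi * fc / fs)     # exact 1 - a, computed in float64

a32 = np.float32(1.0 - g)               # delay-form coefficient, near 1.0
g_from_a32 = 1.0 - np.float64(a32)      # gain left after rounding a
g32 = np.float64(np.float32(g))         # gain stored directly in float32

err_delay = abs(g_from_a32 - g) / g     # precision lost storing a near 1
err_delta = abs(g32 - g) / g            # precision lost storing g itself
print(f"relative coefficient error, delay form: {err_delay:.2e}")
print(f"relative coefficient error, delta form: {err_delta:.2e}")
```

The gain g is a well-scaled small number, so rounding it to float32
costs only the usual ~1e-7 relative error, whereas recovering it from
a rounded a costs several orders of magnitude more.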
Finite Difference Equations - H. Levy and F. Lessman
At least I think that's quite similar to the '61 first edition I found
in a used book store.
TODO: Some previous log entries.