Fri Dec 14 15:59:31 CET 2007

meshy finished?

looks like we're at the end. got 8 devices talking to each other. so
it's time for a "what did i learn?" section..

  * for DSP, use a dsPIC instead of a PIC chip, OR write a highlevel
    (but slow) set of primitives on PIC. i spent too much time
    writing "fast" code that eventually didn't get used, or that got
    extensively modified in ways that destroyed the optimizations.

    DSP apps have the property that a lot of the code volume needs to
    be fast, which screams for a SEPARATE algorithm design and
    implementation/optimization phase. the problem here is on-target
    debugging. as long as the app scales time-wise (i.e. the sample
    rate can be reduced without changing other variables),
    optimization can be postponed.
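
    the kind of slow-but-clear primitive meant above might look like
    this sketch (a hypothetical Q15 FIR loop; the names and the Q15
    format are my illustration, not code from meshy):

```c
#include <stdint.h>

/* hypothetical slow-but-clear DSP primitive: one Q15 FIR output.
   written for correctness first; the fast version (MAC instruction,
   circular addressing on a dsPIC) would replace this only once the
   algorithm itself is known to work. */
int16_t fir_q15(const int16_t *coef, const int16_t *hist, int ntaps)
{
    int32_t acc = 0;                          /* Q30 accumulator */
    for (int i = 0; i < ntaps; i++)
        acc += (int32_t)coef[i] * hist[i];    /* Q15 * Q15 -> Q30 */
    return (int16_t)(acc >> 15);              /* back to Q15 */
}
```

    being this naive also keeps the rate-reduction escape hatch open:
    the same code runs unchanged at a lower sample rate while the
    algorithm is being debugged.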

  * get it to work FAST, and start with the most difficult part, even
    if it means dirty hacked up proof of concept, then incrementally
    improve while keeping it working. don't spend time on things
    that solve needs that are not immediate when there are other
    immediate needs to serve:

       - debug network: eventually didn't get used
       - the hardware layer: it delayed everything else

    the mistakes had quite severe consequences in the end. i could
    have gained 2 weeks by not making the debug network.

    the causes of the mistakes seem to be

       - mismatch in skill (no analog electronics hands-on experience,
         and dusty theoretical understanding) but mostly misplaced
         confidence in non-tested skill.

       - underestimation of importance of debugging.

  * debugging deserves its own bullet. ironically, i lost a lot of
    time building a debugging tool. building that tool was a good
    idea, but i forgot a couple of steps:

       - underestimated the difficulty in getting the debug net
         working properly. this actually required an intermediate
         debugging phase to monitor the behaviour of both send and
         receive. i didn't anticipate these problems, which was a
         mistake. the lesson: never underestimate the problems that
         can arise, even if the application seems really simple.

       - doing high-bandwidth work (DSP) requires high-bandwidth
         debugging tools or at least a large storage space on chip for
         traces and logs. a solution here would be to make a separate
         circuit only for logging, or use a high-bandwidth host
         connection. an example could be a circuit that records to
         a flash card, or a USB connection to host.
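
         a sketch of the "large storage for traces" idea, assuming a
         power-of-two ring buffer that a host tool dumps afterwards
         (all names invented for illustration):

```c
#include <stdint.h>

/* hypothetical on-chip trace ring: a single writer logs event words
   cheaply, and the buffer is dumped to the host (or a flash card)
   after the fact.  TRACE_SIZE must be a power of two so the index
   wraps with a mask instead of a divide. */
#define TRACE_SIZE 256
static uint16_t trace_buf[TRACE_SIZE];
static uint16_t trace_head;

static void trace(uint16_t event)
{
    /* old head value indexes the slot, then the head advances */
    trace_buf[trace_head++ & (TRACE_SIZE - 1)] = event;
}
```

         the point is that writing a trace word is a few instructions,
         so it can run inside the DSP loop without distorting timing
         the way a console print would.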

       - need a better host-side software extension system for
         special-purpose debugging tools. it should be written the
         same way the host system itself is, so that tools can be moved
         into the main distro when polished. to make this easier, the
         number of extensible points needs to be limited such that
         they are better accessible, i.e. the consoles need to be
         the obvious place to hook things in.

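one way to keep the number of extension points small is a single
command registry that both polished and ad-hoc tools plug into. a
minimal sketch, with hypothetical names (not the actual host code):

```c
#include <string.h>

/* hypothetical single extension point for host-side debug tools:
   everything registers a console command here, so promoting a tool
   into the main distro changes nothing structural. */
typedef void (*cmd_fn)(const char *args);

struct cmd { const char *name; cmd_fn fn; };

#define MAX_CMDS 32
static struct cmd cmds[MAX_CMDS];
static int ncmds;

int register_cmd(const char *name, cmd_fn fn)
{
    if (ncmds == MAX_CMDS)
        return -1;                    /* table full */
    cmds[ncmds].name = name;
    cmds[ncmds].fn = fn;
    return ncmds++;
}

int dispatch(const char *name, const char *args)
{
    for (int i = 0; i < ncmds; i++)
        if (strcmp(cmds[i].name, name) == 0) {
            cmds[i].fn(args);
            return 0;
        }
    return -1;                        /* unknown command */
}

/* example tool: records the last argument string it was given */
static const char *last_args;
static void cmd_ping(const char *args) { last_args = args; }
```

a quick-and-dirty tool calls register_cmd("ping", cmd_ping) and is
immediately reachable from the console like everything else.
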
so, to summarize:


     don't optimize and design at the same time if there is a lot of
     opportunity for optimization (e.g. a DSP app on a PIC18, where
     an order of magnitude of speed gain is easy to find). as long as
     time-critical cores are small, this is ok, but when the core is
     all there is, you need to get it to work first using a highlevel
     approach, and ONLY THEN make it fast.


     do not underestimate the difficulty of getting something right in
     reality, even if the logical model is trivial. programming
     problems seem to be about managing complexity, while electronics
     problems are about managing external influences, non-ideal
     behaviour, and tons of exceptions and hacks. these are entirely
     different. programming = abstraction, electronics = debugging.