Mon Feb 12 10:55:24 CET 2018

Too hard to generalize

I like Sussman's quip that programming these days requires "basic
science", i.e. trial and error.

How did it get to this point?

An example: I have a script that makes several ssh connections in a
row, which is slow over a high-latency link.  The fix is to reuse a
single connection via ssh's ControlMaster, i.e. a cache.  But the
master connection behaves like a global variable, and needs extra
machinery to get RAII-style setup and teardown.
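A minimal sketch of what that machinery looks like, using an EXIT
trap as the poor man's RAII.  The host argument and socket path are
placeholders of mine, not from the actual script:

```shell
#!/bin/sh
# Reuse one ssh connection via ControlMaster, tearing it down on exit.

with_master() {
    host=$1; shift
    sock="/tmp/ssh-master-$$"
    # Start a background master (-M -f) with no remote command (-N),
    # bound to an explicit control socket (-S).
    ssh -M -N -f -S "$sock" "$host"
    # Ask the master to exit when the script does: RAII via trap.
    trap 'ssh -S "$sock" -O exit "$host" 2>/dev/null' EXIT
    # Each subsequent ssh reuses the master through the socket,
    # skipping the TCP and key-exchange round trips.
    for cmd in "$@"; do
        ssh -S "$sock" "$host" "$cmd"
    done
}

# Run only when a host is supplied on the command line.
if [ -n "${1-}" ]; then
    with_master "$1" uptime hostname
fi
```

Note the global-variable flavor: the socket lives in the filesystem,
outside the script, which is exactly why the trap is needed.  A
`ControlPersist` setting in ssh_config can time the master out
instead, trading explicit cleanup for a cache-expiry policy.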

If this were implemented in a programming language, it would be easy
to express.  Instead it is part of the OS quilt.

So many things that should be simple are not, mostly because they
are implemented in a concrete way.

EDIT: Fixed the control master thingy.  Took about an hour.

EDIT: The problem lies in distributed, connected systems that are
constantly upgraded.  If you just build a dedicated widget (even with
"unix-level" tools) and don't change it, it seems to work just fine.