Sat Nov 9 05:55:01 EST 2019

DHT11 state machine

I want a pure event-driven parser with two kinds of events:
transitions and timeouts.

See electronics.txt

  - assert 0
  - wait 18000
  - release (1)
  - start receiver
  - assert 0
  - wait ACK:80 / BIT:80
  - release (1)
  - wait ACK:80 / BIT0:25 / BIT1:70

The end pulse should probably be detected by a timeout.

Events (interrupts)
- line change: 
  - save state + elapsed time
  - reset timer
- timer:
  - disable timer
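
A minimal sketch of those two handlers.  The timer here is a plain
variable standing in for a hardware counter (the names and the
simulated platform layer are placeholders, not a real HAL), which
keeps the logic testable off-target:

```c
#include <stdint.h>
#include <stdbool.h>

#define MAX_EVENTS 64

/* Simulated platform layer: on a real part these would be a hardware
   timer read/reset/disable.  Names are placeholders. */
static uint16_t sim_timer;                /* elapsed microseconds */
static bool     sim_timer_enabled = true;

struct event { uint8_t level; uint16_t elapsed; };

static struct event ev_log[MAX_EVENTS];
static int nlog;

/* Line-change interrupt: save state + elapsed time, reset timer. */
void on_line_change(uint8_t new_level)
{
    if (nlog < MAX_EVENTS)
        ev_log[nlog++] = (struct event){ new_level, sim_timer };
    sim_timer = 0;                        /* reset timer */
}

/* Timer interrupt: the line went quiet, disable the timer. */
void on_timer(void)
{
    sim_timer_enabled = false;            /* frame is over */
}
```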

Now I want to write this in a high level language such that I just
need to fill in the platform-dependent details.

Turn this around: it is simpler to think of a data bit as:

- wait 25 / 70
- send 0
- wait 50
- send 1
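
With 25 for a 0 and 70 for a 1, anything on either side of the 50 us
midpoint classifies cleanly.  A one-line sketch:

```c
#include <stdint.h>

/* Classify a measured high-pulse width (microseconds) as a data bit:
   ~25 us -> 0, ~70 us -> 1.  The 50 us midpoint splits them. */
static inline int dht11_bit(uint16_t high_us)
{
    return high_us > 50;
}
```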

This means the DHT init sequence is:

- send 0
- wait 80
- send 1
- wait 80
- send 0
- wait 50
- send 1

To parse:
- reset timer at the last 0->1 transition
- at a 1->0 transition, queue the time delay
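
Those two rules as an edge handler over (level, timestamp) pairs; the
microsecond timestamp argument is an assumption standing in for a
free-running hardware counter:

```c
#include <stdint.h>

#define MAX_PULSES 48

static uint16_t pulse_q[MAX_PULSES];  /* queued high-pulse widths */
static int      npulse;
static uint32_t t_rise;               /* time of last 0->1 transition */

/* Feed one line transition: new level plus a microsecond timestamp. */
void dht11_edge(uint8_t level, uint32_t t_us)
{
    if (level == 1)
        t_rise = t_us;                /* reset timer at the 0->1 edge */
    else if (npulse < MAX_PULSES)     /* 1->0 edge: queue the delay */
        pulse_q[npulse++] = (uint16_t)(t_us - t_rise);
}
```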

The machine is initialized after the request sequence is sent out.

This means the first two measurements that come out can be ignored, as
they are:
- the initial response time (20-40)
- the ack pulse 80
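
So post-processing is: drop the first two measurements, threshold the
rest.  A sketch (the function name is mine):

```c
#include <stdint.h>

/* Decode queued high-pulse widths into bits, dropping the first two
   measurements (initial response, ack pulse).  Returns the bit count. */
int dht11_decode(const uint16_t *pulses, int n, uint8_t *bits)
{
    int nbits = 0;
    for (int i = 2; i < n; i++)           /* skip response + ack */
        bits[nbits++] = pulses[i] > 50;   /* 25 us -> 0, 70 us -> 1 */
    return nbits;
}
```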

EDIT: So I don't really need a code generator.  I just need some
"scrap paper" to properly define the state machine, the initial state,
and possible post-processing.  Because once understood, state machines
(in my case) are very simple.

How does this generalize to more complex machines?

I want something that works both on a uC and on an FPGA.

The main difference is that on a uC, events can be modeled as
procedure calls, while on an FPGA they are state update functions
evaluated at each clock.  These need to be merged into the same
representation somehow.
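
One candidate merge: write the machine as a pure step function
next = f(state, event).  On a uC the ISRs call it with a transition or
timeout event; on an FPGA the same function, transliterated to HDL, is
evaluated every clock with a "no event" case.  A sketch under those
assumptions (all names mine):

```c
#include <stdint.h>

enum ev_kind { EV_NONE, EV_RISE, EV_FALL, EV_TIMEOUT };

struct ev { enum ev_kind kind; uint16_t elapsed_us; };

struct st {
    uint8_t  done;       /* frame finished (timeout seen) */
    uint8_t  nbits;      /* data bits collected so far */
    uint64_t bits;       /* shifted-in data bits */
    uint8_t  skip;       /* measurements still to ignore */
};

/* Pure state update: serves as an ISR body on a uC, or as a per-clock
   update rule on an FPGA (EV_NONE on idle clocks). */
struct st dht11_step(struct st s, struct ev e)
{
    switch (e.kind) {
    case EV_FALL:                         /* end of a high pulse */
        if (s.skip)
            s.skip--;                     /* response / ack pulse */
        else {
            s.bits = (s.bits << 1) | (e.elapsed_us > 50);
            s.nbits++;
        }
        break;
    case EV_TIMEOUT:
        s.done = 1;
        break;
    default:                              /* EV_NONE, EV_RISE: no data */
        break;
    }
    return s;
}
```

The same function can be unit-tested on the host by replaying a
recorded edge sequence, which is the main payoff of keeping it pure.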