Security Camera

Entry: Simple solution
Date: Thu Aug 14 22:55:17 EDT 2014

What I want is a circular buffer: simply use the whole hard drive to keep
as much data as possible.  Then add metadata processing on top of this to
do motion detection.  Don't save frames whose difference from the key
frame is below threshold; the last saved frame is the key frame.  Tag
each frame with a timestamp.

Entry: Motion detection works
Date: Fri Aug 15 00:31:46 EDT 2014

Algorithm (a C sketch follows after the Aug 16 notes below):
- 10 frames burn-in
- compute initial avg_diff from sqrt(sum diff^2) of the two last frames
- loop forever:
  - compute rms diff of the new frame against the key frame
  - if 10% over the average (= noise level), this is a key frame
  - save the key frame
  - update the noise level average

Entry: Next
Date: Fri Aug 15 18:49:01 EDT 2014

- upload frames to remote storage
- connect "dumb" capture from other cameras

Needs more information..

Entry: Hard
Date: Sat Aug 16 10:22:14 EDT 2014

Problem is that this is actually not that easy.  A tough requirement is
that things get beamed out as soon as they are recorded.  Also, doing
this in Rust is probably too much of a limitation.  Up to now it was
interesting, but the rest is best done in C.

So how to send the image data over?
- tcp
- over vpn
- with keepalive (i.e. ping once every couple of seconds)
- send jpeg frames?

Entry: Reuse libprim
Date: Sat Aug 16 11:29:20 EDT 2014

So time to get libprim back up.  Basic idea is that libprim is my
"standard toolkit" for media-oriented programming.  Users are pdp, pf,
lua, rust, ...

OK, got it linked in.

Entry: network protocol
Date: Sat Aug 16 16:36:02 EDT 2014

So the next step is serialization to a tcp stream.  Is that currently in
libprim?  Doesn't look like it.  Only human-readable write and dump.
How was it handled before?

Really it isn't necessary.  What I have now is more than adequate, as the
tcp pipe can be used for restarts.  However, it is not extendable in a
trivial way, and it can't send keepalives.

But still, to keep it interesting, the protocol is: send a line of text
that describes the raw bytes that follow (sketch below).

Entry: TODO
Date: Sat Aug 16 17:56:38 EDT 2014

- cross-compile for arm / mips

notes:
- maybe not run the video code on the router, for fear of crashing it?

Entry: Thoughts
Date: Sat Aug 16 21:43:55 EDT 2014

This is the kind of problem that looks simple, but requires a lot of
thought to figure out what the real problems are.

Back to basics?  The main problem is data reduction by ignoring
irrelevant data.  But is it really necessary?  There is plenty of local
storage, but that might be compromised, so beaming it out is necessary.
What is the bandwidth that can be used?  It seems that there is no way
around streaming.  So there are 2 reductions:

- don't record frames that have no motion
- use lossy compression / noise reduction

The AGC pictures are really noisy.

So about beaming: beam compressed frames, but record higher quality
frames locally.  That solves the real problem: there is plenty of local
storage, which can possibly be manually flushed.

Entry: Today
Date: Sat Aug 16 22:27:11 EDT 2014

So what got done?
- fixes for zl / libprim and users: pdp, kmook
- rewrite of .rs to .c to eliminate dep on small targets (for now)
- bare bones tcp protocol
- python server prototype for other end

What is missing?  A "real" language with all the libprim objects
attached.  Maybe putting this in Rust is a good next step.
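
For reference, the key frame test from the Aug 15 entry as a rough C
sketch.  This is not the actual code: the names (rms_diff, is_key_frame)
and the 0.95/0.05 averaging weights are made up, and it assumes 8-bit
grey frames.

  #include <math.h>
  #include <stddef.h>
  #include <stdint.h>

  /* RMS difference between two 8-bit grey frames of n pixels. */
  static double rms_diff(const uint8_t *a, const uint8_t *b, size_t n) {
      double sum = 0;
      for (size_t i = 0; i < n; i++) {
          double d = (double)a[i] - (double)b[i];
          sum += d * d;
      }
      return sqrt(sum / n);
  }

  /* A frame is a key frame when its rms diff against the last key frame
     is 10% above the running noise level.  The noise level is a running
     average of the per-frame diffs. */
  static int is_key_frame(double diff, double *noise_level) {
      int key = diff > 1.10 * *noise_level;
      *noise_level = 0.95 * *noise_level + 0.05 * diff;  /* update average */
      return key;
  }

The 10 frame burn-in would just run the update without saving key
frames, to seed noise_level.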
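
And the bare bones tcp protocol, sketched from the sender side: one
ASCII header line announcing the raw bytes that follow.  Everything here
(send_frame, the header fields and their order) is illustrative, not the
actual wire format; a keepalive is then just a header with a zero
payload size.

  #include <stdint.h>
  #include <stdio.h>
  #include <sys/types.h>
  #include <unistd.h>

  /* Write the whole buffer; a short write means the connection is gone
     and the caller should reconnect. */
  static int write_all(int fd, const void *buf, size_t len) {
      const uint8_t *p = buf;
      while (len) {
          ssize_t n = write(fd, p, len);
          if (n <= 0) return -1;
          p += n;
          len -= n;
      }
      return 0;
  }

  /* Header line: frame type, timestamp, payload size.  The raw payload
     bytes follow immediately after the newline. */
  int send_frame(int sock, const char *type, double timestamp,
                 const uint8_t *data, size_t len) {
      char header[128];
      int n = snprintf(header, sizeof(header), "%s %.3f %zu\n",
                       type, timestamp, len);
      if (n < 0 || (size_t)n >= sizeof(header)) return -1;
      if (write_all(sock, header, n)) return -1;
      return write_all(sock, data, len);
  }

Usage would be something like send_frame(sock, "jpeg", t, buf, len); the
python prototype on the other end can read up to the newline and then
the announced number of bytes.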

Entry: TODO
Date: Sun Aug 17 21:48:26 EDT 2014

- contrast for quickam (grey)
- doesn't work on raspi, but fswebcam does
- compile leaf without sc/ex/pf (fix build system)
- 2 android devices -> doesn't work
- gather data in main app + store + transmit

Entry: Streaming server
Date: Fri Aug 22 14:54:29 CEST 2014

Maybe best to set up standard streaming with command line tools before
finishing motion detection.  Needs ffserver setup.

OK, got something working, though this is a heterogeneous system so
there are plenty of small problems.  Raspi /dev/video0 still seems to
have problems.  strace:

  ioctl(4, VIDIOC_QBUF or VT_SETACTIVATE, 0xbe8da428) = 0
  ioctl(4, VIDIOC_QBUF or VT_SETACTIVATE, 0xbe8da428) = 0
  ioctl(4, VIDIOC_STREAMON, 0xbe8da360) = 0
  ioctl(4, VIDIOC_G_STD, 0xbe8da360) = -1 EINVAL (Invalid argument)
  poll([{fd=4, events=POLLIN}], 1, 30100) = 0 (Timeout)

Probably it was just the resolution.  Working now at 320x240 (V4L2
sketch below, after the next entry).

Entry: Data reduction
Date: Sun Aug 24 10:45:19 CEST 2014

So streaming with ffmpeg works.  Is there a way to configure the codecs
to do better data reduction?  The bitrate seems high compared to what
the streams look like..  Mostly, it is encoding noise.
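
The V4L2 sketch mentioned above: instead of relying on the driver
default, request 320x240 explicitly with VIDIOC_S_FMT and check what the
driver settled on.  The pixel format and the error handling here are
placeholders.

  #include <fcntl.h>
  #include <stdio.h>
  #include <string.h>
  #include <unistd.h>
  #include <sys/ioctl.h>
  #include <linux/videodev2.h>

  int open_camera(const char *dev) {
      int fd = open(dev, O_RDWR);
      if (fd < 0) { perror(dev); return -1; }

      struct v4l2_format fmt;
      memset(&fmt, 0, sizeof(fmt));
      fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
      fmt.fmt.pix.width       = 320;
      fmt.fmt.pix.height      = 240;
      fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;  /* placeholder format */
      fmt.fmt.pix.field       = V4L2_FIELD_NONE;
      if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) {
          perror("VIDIOC_S_FMT");
          close(fd);
          return -1;
      }

      /* The driver may adjust the request; fmt now holds what it chose. */
      printf("got %ux%u\n", fmt.fmt.pix.width, fmt.fmt.pix.height);
      return fd;
  }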
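
Codec knobs aside (lower target bitrate, a denoise filter in the encoder
pipeline), one way to attack this at the source is to temporally smooth
each pixel before the frame reaches the encoder, so the AGC noise stops
eating bitrate.  A minimal sketch, assuming 8-bit grey frames; the
fixed-point scheme and the name denoise_frame are mine.

  #include <stddef.h>
  #include <stdint.h>

  /* Exponential moving average per pixel, 8.8 fixed point.
     acc[] persists across frames; initialize it with the first frame << 8.
     w is out of 256: smaller w means stronger smoothing (more lag). */
  void denoise_frame(uint8_t *frame, uint16_t *acc, size_t n, int w) {
      for (size_t i = 0; i < n; i++) {
          int target = frame[i] << 8;
          int avg = acc[i] + ((target - acc[i]) * w) / 256;
          acc[i] = (uint16_t)avg;
          frame[i] = (uint8_t)(avg >> 8);
      }
  }

Smaller w smooths more but leaves motion trails, so it would need tuning
against the motion detector threshold.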