mixer with direct out connected to delta1010 in, all post-fader:

  1 guitar/bass clean
  2 guitar effect chain
  3 microbrute
  4 volca keys
  5 volca bass
  6 volca beats
  7 monotron / paia (todo)
  8 paia (todo)

from delta1010 - digital mix chain
  13/14   out 1/2

from delta1010, for routing to analog effects
  15/16   out 3/4
  9       out 5
  10      out 6
  11      out 7
  12      out 8

from intel built-in audio - pulseaudio digital mix chain
  17/18
  19/20   free

sends:
  aux1  pre       free
  aux2  pre/post  guitar effects (pre default)
  aux3  post      free

notes:
* aux send to digital in (e.g. for reverb/delay) is not really necessary as they can be sent digitally.
* aux1 somehow gives feedback trouble on high-gain effects chain (self-send not off completely?).  using aux2 in pre mode works fine.

bcr2000
  top row:       mixer (0-127) midi control
  buttons:       part mute
  bottom 3 rows: synth control (incremental)

TODO:
- sync microbrute to the volcas (all on midi sync?)
- find a good sequencer
- mount all the "knobs" to a movable surface
- set up a good echo/delay digitally
- set up midi for all
- route guitar effects chain through mixer
- focus on non-portable setup first built around delta1010
- later add "easy detach" portability
- find a simple touch screen

Entry: Sequencer
Date: Fri Mar 4 18:38:06 EST 2016

Find a good sequencer, or build one.

Entry: Analog
Date: Fri Mar 4 19:31:29 EST 2016

8 channels should be enough.  All of them have direct out to delta1010.
I have 4 channels direct

Entry: Monoprice
Date: Fri Mar 4 20:04:54 EST 2016

- TS Single angle
- TS

Entry: Effects send
Date: Fri Mar 4 21:06:59 EST 2016

Not sure why this didn't work earlier.  Too noisy?  Let's try again..

The reason was feedback, since this needs to be set pre-fader, and the self-send on the return channel is too strong.  It's doable for most cases but easy to trigger.

Entry: MIDI
Date: Fri Mar 4 22:37:04 EST 2016

- map all midi devices to some other messages (osc?)
- program bcr2000 for mixer, mute button + inc control operation

SYNC:
http://www.sweetwater.com/sweetcare/articles/how-do-i-set-up-my-korg-volca-series-synth-module-to-sync-to-an-external-midi-clock/

Entry: udev
Date: Sat Mar 5 11:02:02 EST 2016

Fixed device names in /etc/net/udev

Next is to find a way to share the device.  Using the legacy midi devices (and alsa's rawmidi?) is single-access only.  Two options:

- create a daemon that converts midi to osc
- learn how to use the alsa sequencer

Since I need the "accumulator" anyway, let's do this in erlang (exo).

Entry: Sending clock sync from Erlang
Date: Sat Mar 5 23:08:03 EST 2016

At 120 bpm, 2 bps: (* 2 4 24) 192 messages per second, 5ms per message.

Entry: alsa midi
Date: Wed Mar 16 11:49:52 EDT 2016

Looks like it's necessary to use the alsa midi interface, as multi-port devices only show up with a single legacy /dev/midi? port.

Making exo/c_src/midi_in.c to simply map back to raw MIDI in erlang.  Alternatively, pass through the full binary event structs.

Next: figure out how to set tempo, send midi sync, send start/stop to all clients at once, and configure the volcas to sync to midi.
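For reference on the clock-sync arithmetic above: MIDI beat clock is 24 pulses per quarter note, so the 192/sec figure in the entry presumably counts 4 clocked outputs.  A minimal sketch of that math (the "4 devices" reading is my assumption, not stated in the log):

  #include <stdio.h>

  int main(void) {
      double bpm       = 120.0;
      double ppqn      = 24.0;               /* MIDI beat clock: 24 pulses per quarter note */
      double ticks_sec = bpm / 60.0 * ppqn;  /* 48 ticks/s at 120 bpm */
      int    outputs   = 4;                  /* assumed: number of devices being clocked */
      printf("per device: %.1f ticks/s, %.1f ms apart\n",
             ticks_sec, 1000.0 / ticks_sec);
      printf("total:      %.0f messages/s (%.1f ms apart)\n",
             ticks_sec * outputs, 1000.0 / (ticks_sec * outputs));
      return 0;
  }

This prints 48 ticks/s (20.8 ms) per device and 192 messages/s (5.2 ms) aggregate, matching the numbers in the entry.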
Entry: amidi / aconnect
Date: Wed Mar 16 22:16:43 EDT 2016

  tom@zoo:~$ amidi -l
  Dir Device    Name
  IO  hw:0,0    M Audio Delta 1010 MIDI
  IO  hw:3,0,0  USB Midi 4i4o MIDI 1
  IO  hw:3,0,1  USB Midi 4i4o MIDI 2
  IO  hw:3,0,2  USB Midi 4i4o MIDI 3
  IO  hw:3,0,3  USB Midi 4i4o MIDI 4

  tom@zoo:~$ aconnect -l
  client 0: 'System' [type=kernel]
      0 'Timer '
      1 'Announce '
  client 14: 'Midi Through' [type=kernel]
      0 'Midi Through Port-0'
  client 16: 'M Audio Delta 1010' [type=kernel]
      0 'M Audio Delta 1010 MIDI'
  client 28: 'USB Midi 4i4o' [type=kernel]
      0 'USB Midi 4i4o MIDI 1'
      1 'USB Midi 4i4o MIDI 2'
      2 'USB Midi 4i4o MIDI 3'
      3 'USB Midi 4i4o MIDI 4'

So some questions.

- Should I use the sequencer or the raw midi?  Maybe jack midi (which is on top of raw midi).  The oss midi is likely not a good idea.
- There are 4 ports, but only one /dev/midiCD0 device.

This http://wiki.linuxaudio.org/faq/start mentions the alsa sequencer should be used for interconnecting applications.  Jack MIDI might just be for recording, then?  No, looks like the reason has to do with high-resolution timing, 250Hz not being enough.

I wonder, is it possible to have the time sync sent out by the sequencer?

http://www.alsa-project.org/~tiwai/alsa-sync.html

Entry: alsa timers
Date: Wed Mar 16 23:32:34 EDT 2016

  tom@tp:~$ cat /proc/asound/timers
  G0: system timer : 4000.000us (10000000 ticks)
  G3: HR timer : 0.001us (1000000000 ticks)
  P0-0-0: PCM playback 0-0-0 : SLAVE
  P0-0-1: PCM capture 0-0-1 : SLAVE
  P0-3-0: PCM playback 0-3-0 : SLAVE

  tom@zoo:~$ cat /proc/asound/timers
  G0: system timer : 4000.000us (10000000 ticks)
  P0-0-0: PCM playback 0-0-0 : SLAVE
  P0-0-1: PCM capture 0-0-1 : SLAVE
  P1-0-0: PCM playback 1-0-0 : SLAVE
  P1-0-1: PCM capture 1-0-1 : SLAVE
  P1-2-1: PCM capture 1-2-1 : SLAVE
  P2-3-0: PCM playback 2-3-0 : SLAVE

Entry: sending out sync
Date: Wed Mar 16 23:33:24 EDT 2016

Is this up-to-date?
http://www.alsa-project.org/~tiwai/alsa-sync.html

The core function seems to be snd_seq_alloc_sync_queue(), which is the only reference I find.  I need to look into the alsa sources...  Too much old stale documentation..

in seq.h:

  /** sequencer timer sources */
  typedef enum {
          SND_SEQ_TIMER_ALSA       = 0, /* ALSA timer */
          SND_SEQ_TIMER_MIDI_CLOCK = 1, /* Midi Clock (CLOCK event) */
          SND_SEQ_TIMER_MIDI_TICK  = 2  /* Midi Timer Tick (TICK event) */
  } snd_seq_queue_timer_type_t;

  void snd_seq_queue_timer_set_type(snd_seq_queue_timer_t *info,
                                    snd_seq_queue_timer_type_t type);

Yeah this is badly documented.  I'm annoyed.  Let's look at code from others.  Something to look for:

  #define SNDRV_SEQ_EVENT_CLOCK 36   /* midi Real Time Clock message */
  #define MIDI_CMD_COMMON_CLOCK 0xf8

Entry: hydrogen
Date: Thu Mar 17 00:43:18 EDT 2016

Looks like jack midi is the way to go.

http://www.hydrogen-music.org/hcms/node/1966
http://www.teuton.org/~gabriel/jack_midi_clock/
git clone git://gabe.is-a-geek.org/git/jack_midi_clock.git

Entry: sooperlooper
Date: Thu Mar 17 00:52:27 EDT 2016

http://essej.net/sooperlooper/

Entry: jack midi or alsa sequencer?
Date: Thu Mar 17 00:57:03 EDT 2016

It seems that people are standardizing on jack, and jack midi.  For new things it seems best to not use the alsa sequencer.

http://www.teuton.org/~gabriel/jack_midi_clock/
http://wiki.linuxaudio.org/faq/start#qwhat_is_the_difference_between_jack-midi_and_alsa-midi

Since jack is used for audio anyway, it seems best to use it for midi as well.  Jack can then provide abstraction of alsa sequencer or alsa hardware.
jack_midi_clock.c uses these calls:

  send_rt_message(port_buf, 0, MIDI_RT_CLOCK);
  send_rt_message(port_buf, next_tick_offset, MIDI_RT_CLOCK);

  static void send_rt_message(void* port_buf, jack_nframes_t time, uint8_t rt_msg) {
          uint8_t *buffer;
          buffer = jack_midi_event_reserve(port_buf, time, 1);
          if(buffer) {
                  buffer[0] = rt_msg;
          }
  }

  if ((mclk_output_port = jack_port_register(j_client, "mclk_out",
                                             JACK_DEFAULT_MIDI_TYPE,
                                             JackPortIsOutput, 0)) == 0) {

jack_midi_event_reserve()
- can be called from the process() method
- needs sorted input

Overall this seems like a much better approach to handling timer issues.  Synchronize to the audio sample (block) clock for logical time base, and have jack handle the event scheduling details.  For other things, alsa seq is probably ok.

One thing to figure out is how usable erlang would be for midi event processing.

Do I really need alsa sequencers for new code?  For new softsynths, best to stick with jack.

a2jmidid (separate app) is apparently better than seq and raw
http://home.gna.org/a2jmidid/

the reason to use a2jmidid is that it allows to run jack in -X raw mode, while still allowing to bridge jack midi ports to (wrapped) sequencer ports.

Yeah need to play with it.  Coexist.

Conclusions:

Entry: audio/midi Routing setup
Date: Thu Mar 17 01:55:41 EDT 2016

MIDI
- Use jack midi for anything that uses jack audio already
- Use jack midi for hi-rez timing (based off of sample clock) (ex: jack_midi_clock.c)
- Use ordinary alsa seq for non-timing critical bits (e.g. erlang hacking?)
- Create a jack midi interface to erlang (also for better timing).
- Run jack in raw mode, use a2jmidid to connect sequencers to raw ports

AUDIO
- jack
- pulseaudio jack bridge (optional)
- pd (no jack midi tho)
- rai soft synths with jack midi / osc or pd on stdin

Entry: jack midi api
Date: Thu Mar 17 02:19:23 EDT 2016

http://jack-audio-connection-kit.sourcearchive.com/documentation/0.116.1/group__MIDIAPI_g6037a3936f8d2ef063dbbd47d722f660.html
http://www.teuton.org/~gabriel/jack_midi_clock/

Entry: Next
Date: Thu Mar 17 02:25:32 EDT 2016

Try the drum machine sync.

Entry: Jack timebase
Date: Fri Mar 18 16:12:45 EDT 2016

Got it to work in exo.

Entry: poly-rhythm timebase.
Date: Fri Mar 18 16:15:41 EDT 2016

Discrete is simple.  Can also do on audio for looper recording of analogs.  Main problem here is interference between LFOs and loops.

Entry: jack midi ports
Date: Fri Mar 18 16:55:15 EDT 2016

  scan: added port hw:0,0,0 in-hw-0-0-0-M-Audio-Delta-1010-MIDI
  scan: added port hw:0,0,0 out-hw-0-0-0-M-Audio-Delta-1010-MIDI
  scan: added port hw:3,0,0 in-hw-3-0-0-USB-Midi-4i4o-MIDI-1
  scan: added port hw:3,0,1 in-hw-3-0-1-USB-Midi-4i4o-MIDI-2
  scan: added port hw:3,0,2 in-hw-3-0-2-USB-Midi-4i4o-MIDI-3
  scan: added port hw:3,0,3 in-hw-3-0-3-USB-Midi-4i4o-MIDI-4
  scan: added port hw:3,0,0 out-hw-3-0-0-USB-Midi-4i4o-MIDI-1
  scan: added port hw:3,0,1 out-hw-3-0-1-USB-Midi-4i4o-MIDI-2
  scan: added port hw:3,0,2 out-hw-3-0-2-USB-Midi-4i4o-MIDI-3
  scan: added port hw:3,0,3 out-hw-3-0-3-USB-Midi-4i4o-MIDI-4
  scan: added port hw:4,0,0 in-hw-4-0-0-MicroBrute-MIDI-1
  scan: added port hw:4,0,1 in-hw-4-0-1-MicroBrute-MIDI-2
  scan: added port hw:4,0,0 out-hw-4-0-0-MicroBrute-MIDI-1
  scan: added port hw:4,0,1 out-hw-4-0-1-MicroBrute-MIDI-2

Entry: separate exo and studio
Date: Sat Mar 19 23:26:00 EDT 2016

I'd like to build this into something dedicated, and handle all starting/stopping from Erlang as well.  If events need to be exchanged between exo and studio, it would likely not be such a huge deal..
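Putting the jack_midi_clock.c calls quoted above together, a minimal clock-emitting jack client could look like the sketch below (my own reduction, not the original source; BPM fixed, frame-counter wraparound ignored):

  #include <stdio.h>
  #include <stdint.h>
  #include <unistd.h>
  #include <jack/jack.h>
  #include <jack/midiport.h>

  static jack_client_t *client;
  static jack_port_t *mclk_out;
  static double bpm = 120.0;
  static double next_tick = -1;   /* absolute frame time of the next clock tick */

  static int process(jack_nframes_t nframes, void *arg) {
      void *buf = jack_port_get_buffer(mclk_out, nframes);
      jack_midi_clear_buffer(buf);

      /* 24 MIDI clock ticks per quarter note, expressed in audio frames. */
      double fpt = (60.0 * jack_get_sample_rate(client)) / (bpm * 24.0);
      double now = (double)jack_last_frame_time(client);
      if (next_tick < now) next_tick = now;

      while (next_tick < now + nframes) {
          uint8_t *b = jack_midi_event_reserve(buf, (jack_nframes_t)(next_tick - now), 1);
          if (b) b[0] = 0xF8;     /* MIDI real-time clock */
          next_tick += fpt;
      }
      return 0;
  }

  int main(void) {
      client = jack_client_open("mclk", JackNullOption, NULL);
      if (!client) return 1;
      mclk_out = jack_port_register(client, "mclk_out",
                                    JACK_DEFAULT_MIDI_TYPE, JackPortIsOutput, 0);
      jack_set_process_callback(client, process, NULL);
      jack_activate(client);
      for (;;) sleep(1);          /* runs until the process is killed */
  }

The point being illustrated: event times are expressed as frame offsets inside the current period, so the clock is locked to the audio sample clock rather than to a system timer.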
Entry: MIDI
Date: Sun Mar 20 13:50:23 EDT 2016

- use jack midi: -X raw, only using physical ports
- start it in Erlang, and hook up the stdout monitor to keep track of port names
- together with jackd, start a jack client erlang port that connects to all jack midi ports
- use naming resolution to build a generic, plug and play midi router

Entry: Startup
Date: Sun Mar 20 14:27:28 EDT 2016

Maybe good to start looking into OTP?  I still think that for what I want to do, OTP is overkill.  I'd rather implement servers at the function level (init,handle) as opposed to module level, especially since most of them are so simple.

Entry: M-Audio Delta1010 midi
Date: Sun Mar 20 19:50:34 EDT 2016

Is problematic.  When I connect it, all the others seem to skip timecodes.  Some echo thing?  Suspicious though that it's the first one, so it might be something else also.

Entry: jack changes
Date: Thu Mar 24 11:17:53 EDT 2016

separated midi and control clients so they can use different buffering schemes.  control: rpc, midi: packet switching, with possible drops.  still not happy with the possible drops (really a design problem) but for now we're fine.

Entry: erlang sequencer
Date: Thu Mar 24 11:19:54 EDT 2016

next is event sequencing.  trouble is in the "note off" events.  my idea had been to use a process for an event stream, but it's not clear if that's a good idea.  ordered queues are probably better, but more clumsy to work with.

yesterday i was thinking about the problem of interface vs. representation.  in my paid gig i've been dealing a lot with this problem: to build something that has a model quite different from the way it is implemented.

in any case it would be a nice experiment to see how far the erlang scheduler can be trusted with accurate sequencing.  what i really want though, is infinite (floating point) timing resolution.

a possible solution:
- build a model in erlang that can in principle be translated to a bare-bones implementation (state machines + priority queue).
- once it works, implement the state machine scheduler

Entry: Settings
Date: Sun Mar 27 21:47:43 EDT 2016

Storage is necessary.  I wonder if it makes sense to put this in a sqlite database, or to hard-code it in Erlang datastructures.  Once "learning" is added, the former might make sense.

EDIT: added erl_sqlite3 dependency and database.

Entry: Routing
Date: Thu Apr 7 21:40:30 EDT 2016

Everything is set up.  Next: routing.  Basic tool is serv:broadcaster (fan-out) and receive selection.

Basically, an event comes in at a port and based on what it is, it needs to be dispatched to one or more registered processes.  It doesn't seem like a good idea to route everything to every process.  So that gives a way to organize.  The router is a set of {Predicate,Process} rules.  Broadcaster could be extended to that.

Entry: Trigger & Learn
Date: Fri Apr 8 00:38:35 EDT 2016

Triggers are easy on top of routing.  Next: midi learn for rai code numbers.  The interesting part there is to keep the locations invariant.  Inserting forms will violate that.  This was not a problem using the emacs cursor.

Entry: jack and pulseaudio
Date: Fri Apr 8 19:17:02 EDT 2016

what i want is:
- keep studio launching jack, monitoring its stdout.  no dbus business.
- reserve system out for pulseaudio web browsing, skype etc..  keep pro card for music

additionally:
- have pulseaudio detect a running jack daemon and register as client (not dbus)

  # JACK: the dbus solution is not very robust.  plus i need more control
  # over starting jack.
  # Have pulse register as client instead (and kill pulse when jack
  # starts.  clients will restart pulse when needed)

  ### Automatically connect sink and source if JACK server is present
  #.ifexists module-jackdbus-detect.so
  #.nofail
  #load-module module-jackdbus-detect channels=2
  #.fail
  #.endif

  ## apt-get install pulseaudio-module-jack
  .ifexists module-jack-source.so
  load-module module-jack-source
  .endif
  .ifexists module-jack-sink.so
  load-module module-jack-sink
  .endif

Entry: pulse udev
Date: Sat Apr 9 00:39:20 EDT 2016

/lib/udev/rules.d/90-pulseaudio.rules

to disable a card, use the udev environment to influence that rule.
https://jamielinux.com/blog/tell-pulseaudio-to-ignore-a-usb-device-using-udev/

  ATTRS{idVendor}=="1852", ATTRS{idProduct}=="5110", ENV{PULSE_IGNORE}="1"

From default.pa:

  ### Automatically load driver modules depending on the hardware available
  .ifexists module-udev-detect.so
  load-module module-udev-detect
  .else
  ### Use the static hardware detection module (for systems that lack udev support)
  load-module module-detect
  .endif

Entry: jackd bug
Date: Tue Apr 19 09:41:46 EDT 2016

https://github.com/jackaudio/jack2/pull/153
2^31-1 Samples (~13.5h at 44.1kHz),

EDIT: Fix is not in the 1.9.10 release, so build from git.  How to turn this into a jackd2 debian package?  Not necessary:

- build in ~/git/jackd2 using standard options
- set ~/bin/jackd.zoo to use /home/tom/git/jack2/build/jackd

Entry: lmms
Date: Sat Nov 19 18:00:25 EST 2016

Not very useful to me atm, but useful as a pattern editor.  How to get a pattern from lmms into erlang?

  lmms -d hobble.mmpz   # gives an xml file

Maybe a good push for making something in Rust

Entry: rust
Date: Sat Nov 19 18:43:12 EST 2016

https://github.com/seriyps/rust-erl-ext

Entry: next
Date: Sun Nov 20 08:27:35 EST 2016

What to do next?  I like the romantic Erlang story, but currently have no idea on how to put it to use properly.  There is no point of increment.  What is needed?

- A pattern sequence player / editor
- A sample player / matcher

Maybe if the artistic inspiration is not there, focus on the dry, technical problems?

The problem has always been management of phrases as midi notes + synth configs, or recorded samples.  I've always felt the need to have to choose between synths and samples.  Is that still the case?

Also, reproducibility has been a problem.  But that conflicts with the lure of the analog knobs..

Entry: FLAC decoder in rust
Date: Thu Dec 1 08:19:31 EST 2016

https://github.com/ruuda/claxon
https://ruudvanasseldonk.com/2016/11/30/zero-cost-abstractions

Entry: UI revisited
Date: Sat Dec 17 17:55:50 EST 2016

The point here is to keep it fun, right?  And allow for some "job well done" gloating afterwards..

What about UI.  I want to build my own widgets in a way that makes sense, e.g. using constraints or some declarative approach.  As a language I'd go for either Rust or Haskell.  Former if this has to ever run on a low-end machine (unlikely).  Latter if it's ok to use a 2010-style laptop which can run reasonable 2D compositing and can host Haskell.

Really, thinking about synth networks, it might be best to completely separate UI and sound, so platform would not be such a problem.  The question would be: is Haskell responsive enough?

Some requirements:
- would have to be frame-synchronous: no tearing
- I'd like to avoid X11

So.. I have all these thinkpads.
They all have:

  lspci:
  00:02.0 VGA compatible controller: Intel Corporation Core Processor Integrated Graphics Controller (rev 02)

  cpuinfo:
  (zora, broom) model name : Intel(R) Core(TM) i5 CPU M 520 @ 2.40GHz
  (tp)          model name : Intel(R) Core(TM) i5 CPU M 540 @ 2.53GHz

  glxinfo:
  Device: Mesa DRI Intel(R) Ironlake Mobile (0x46)

https://en.wikipedia.org/wiki/List_of_Intel_graphics_processing_units#Fifth_generation
https://en.wikipedia.org/wiki/Arrandale
https://en.wikipedia.org/wiki/Intel_HD_and_Iris_Graphics

Entry: pick up?
Date: Mon Jul 31 15:27:49 EDT 2017

Still working, and a bit more Erlang experience.  What next?

- simple wave editor
- STM32F4 + RAI (Axoloti)

But first: organization.

- Going to have zoe as dedicated host for this since it has all the hardware wired permanently.  It makes no sense to set this up as an LXC container, so maybe delete the "studio" lxc.
- Set up backups.  Looks like this is a new one: zoe_20170731_1533_zoe  Probably safe to delete old ones.
- To run on laptop, install it locally there as well.  Maybe have a dedicated disk for it.  Laptop will use roland UA-30 USB.
- Studio will be always running.  Run it inside of emacs.
- Checked the fasttrack as main 2track recorder.  Seems to work ok.
- Do editing using tramp from core.  Figure out a way to merge the subprojects properly.
- Make a web interface?  E.g. drum computer, routing
- Set up ssh into studio image

Entry: Setting up emacs for new project
Date: Mon Jul 31 16:11:58 EDT 2017

- erl ssh access
- distel
- incremental build + load

Entry: ssh
Date: Mon Jul 31 20:24:31 EDT 2017

I really don't want to waste time on this.  It's not working well.  Maybe just use distribution on VPN, since distel needs that anyway.

Entry: A sender and a receiver
Date: Sun Aug 13 21:08:05 EDT 2017

So I want to think about making this easier.  I have a sender, say a midi knob plugged into a PC somewhere in the distributed network, and I want to use it as an input to some process.  How to tell the knob to send its data?

- Register the pid to the controller's parent
- Have the parent do filtering

how to make this work with "midi learn"?  this only works if all the controls can be left silent, or at least to filter out those controls that do not yet have an assignment.

where is this actually coming in?

EDIT: arch uses a hub to receive a port's messages, register a filter for its address to the midi_hub.  what this can't do is midi learn.  hub would need to implement that.

Entry: current jack arch
Date: Sun Aug 13 21:21:28 EDT 2017

jackd, wraps the daemon.  listens to stdout for connect events.

jack_control, wraps port process to do things like jack_connect().

jack_midi, wraps port process.  has currently 16 in, 16 out which are connected to the physical ports.  sends everything to midi hub.

midi hub is serv:hub

  %% Midi in. Translate to symbolic form.
  jack_midi_handle({Port,{data,<>}}, Port) ->
      lists:foreach(
        fun(Msg) -> serv:hub_send(midi_hub, {{jack,MidiPort},Msg}) end,
        decode(Data)),
      Port;

TODO: hub needs to register processes, so if they die it can remove them from the list.

It was determined to do filtering at the source -- that's always a good idea.

Entry: Implementing midi learn, improving hub
Date: Sun Aug 13 21:34:04 EDT 2017

Send message to hub: I want to subscribe a part

- add registry (probably already in newer erl_tools) to avoid stale pids.
- midi hub should use sets as filters, so a filter can easily be added.
- midi learn requires "quiet".  otherwise, do it for things that are not yet assigned.
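At the raw MIDI byte level, the "learn while quiet" mechanism amounts to capturing the first unassigned control message that comes in and binding it to a parameter.  A minimal sketch of that idea (my own illustration; names are hypothetical, not the studio code):

  #include <stdint.h>
  #include <stdbool.h>

  struct cc_binding { uint8_t chan, cc; bool bound; };

  /* Feed raw 3-byte MIDI messages while in learn mode; the first control
     change seen becomes the binding.  Assumes everything else is quiet,
     or already filtered out because it has an assignment. */
  static bool midi_learn(struct cc_binding *b, const uint8_t msg[3]) {
      if (b->bound) return true;
      if ((msg[0] & 0xF0) == 0xB0) {   /* control change status byte */
          b->chan  = msg[0] & 0x0F;
          b->cc    = msg[1];
          b->bound = true;
      }
      return b->bound;
  }

  /* After learning, route only the learned control to the parameter. */
  static bool matches(const struct cc_binding *b, const uint8_t msg[3]) {
      return b->bound
          && (msg[0] & 0xF0) == 0xB0
          && (msg[0] & 0x0F) == b->chan
          && msg[1] == b->cc;
  }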
Entry: So I had some time to practice GUI work
Date: Tue Sep 5 23:26:45 EDT 2017

Now it's time to make a sound editor!

Entry: Simplify channel codes
Date: Sat Sep 9 16:21:02 EDT 2017

  1 white
  2 red
  3 yellow
  4 green
  5 blue
  6 black
  7 grey
  8 browngrey

Entry: Next
Date: Sun Dec 10 18:12:39 EST 2017

- set up staapl synth and connect to midi + mixer channel 8
- looper controller
- midi control of 4 synths
- midi learn

Entry: Midi learn + patch
Date: Sun Dec 10 18:40:25 EST 2017

Simplest way to get this set up is to automate all the ad-hoc binding that needs to be done.  E.g. patch main midi controller to synth and record.

Entry: staapl synth
Date: Sun Dec 10 17:53:42 EST 2017

Just keep it as-is.  It's good enough.  It needs analog filtering, but it can go into the microbrute.  No extra filters needed.

Entry: Looper
Date: Sun Dec 10 18:01:49 EST 2017

That little USB mixer is the looper.

  Knob:   mix-in ratio
  Slider: playback volume
  Button: record

Ok, so:

  {{jack,8},{on,0,60,38}}

Needs to go to some synth.  I need an abstraction for a cable.  A cable is a pid that registers to the midi task, and sends it to another one.

This is for receiving:

  whereis(midi_hub).
  serv:hub_add(Hub, Pred, Pid).

This is for sending:

  %% Midi out
  jack_midi_handle({midi,Mask,Data}, Port) ->
      Bin = ?IF(is_binary(Data), Data, encode(Data)),
      Port ! {self(), {command, <>}},
      Port.

Which is more low level.

midi_jack doesn't seem to be registered to anything.  actually that's the daemon.  midi needs to be sent through a client.

i need to write down how the processes are set up.  I think the jack daemon is hard-linked to midi_hub.  That is bad practice.  Fix this to not use the registry, but use an application namespace.  Yeah this needs to be cleaned up a little.  Get dialyzer + typer up, then refactor.  But make it work first.  That is busywork.

Ok, now I get it.  To connect midi ports, use a jack client.  Routing through Erlang maybe has too much delay?

  midi:jack_midi(Client,NI,NO,ClockMask).

I believe the idea has always been to allow this to be an "os" for components that are tighter and can communicate at the jack level.

Ok.  Now get something working.  Lost appetite..  Time to go to bed probably.

Entry: MIDI timecode not coming out
Date: Wed Mar 21 18:54:52 EDT 2018

I forgot how it works.  Send out something on the console to indicate how it works.

  jackd_need_client(State) ->
      ...
      ClockMask = db:midiclock_mask(),
      ...

It's set to 14 from db:

  8 + 4 + 2
  3   2   1

  connect in-hw-2-0-0-MicroBrute-MIDI-1 studio:midi_in_0
  connect in-hw-2-0-1-MicroBrute-MIDI-2 studio:midi_in_0
  connect studio:midi_out_0 out-hw-2-0-0-MicroBrute-MIDI-1
  connect studio:midi_out_0 out-hw-2-0-1-MicroBrute-MIDI-2
  connect studio:midi_out_0 out-hw-5-0-0-WORLDE-easy-control-MIDI-1
  connect studio:midi_out_5 out-hw-4-0-0-USB-Midi-4i4o-MIDI-1
  connect studio:midi_out_6 out-hw-4-0-1-USB-Midi-4i4o-MIDI-2
  connect studio:midi_out_7 out-hw-4-0-2-USB-Midi-4i4o-MIDI-3
  connect studio:midi_out_8 out-hw-4-0-3-USB-Midi-4i4o-MIDI-4
  connect studio:midi_out_10 out-hw-3-0-0-M-Audio-Delta-1010-MIDI
  connect in-hw-3-0-0-M-Audio-Delta-1010-MIDI studio:midi_in_10
  connect in-hw-4-0-0-USB-Midi-4i4o-MIDI-1 studio:midi_in_5
  connect in-hw-4-0-1-USB-Midi-4i4o-MIDI-2 studio:midi_in_6
  connect in-hw-4-0-2-USB-Midi-4i4o-MIDI-3 studio:midi_in_7
  connect in-hw-4-0-3-USB-Midi-4i4o-MIDI-4 studio:midi_in_8
  connect in-hw-5-0-0-WORLDE-easy-control-MIDI-1 studio:midi_in_0

So if I understand correctly: studio:midi has 16 midi ports; 5,6,7,8 are connected to the analogs, and should have the time code.
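As an aside on how the mask reads (my interpretation of the log, with port indices assumed to line up with the db's port_id column): the clock mask is a bitmap over studio:midi output port indices, bit n set meaning "send MIDI clock on midi_out_n".

  #include <stdint.h>
  #include <stdio.h>

  /* Decode a midiclock mask into port indices. 16 ports as noted above. */
  static void print_clock_ports(uint32_t mask) {
      for (int n = 0; n < 16; n++)
          if (mask & (1u << n))
              printf("clock -> midi_out_%d\n", n);
  }

  int main(void) {
      print_clock_ports(14);      /* 8+4+2: ports 1, 2, 3 */
      print_clock_ports(16864);   /* 0x41E0: ports 5,6,7,8 and 14, see the db dump below */
      return 0;
  }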
  sqlite> .dump
  PRAGMA foreign_keys=OFF;
  BEGIN TRANSACTION;
  CREATE TABLE midiport (
         port_id   INTEGER PRIMARY KEY NOT NULL,
         port_name TEXT NOT NULL
  );
  INSERT INTO "midiport" VALUES(1,'BCR2000-MIDI-1');
  INSERT INTO "midiport" VALUES(2,'BCR2000-MIDI-2');
  INSERT INTO "midiport" VALUES(3,'BCR2000-MIDI-3');
  INSERT INTO "midiport" VALUES(5,'USB-Midi-4i4o-MIDI-1');
  INSERT INTO "midiport" VALUES(6,'USB-Midi-4i4o-MIDI-2');
  INSERT INTO "midiport" VALUES(7,'USB-Midi-4i4o-MIDI-3');
  INSERT INTO "midiport" VALUES(8,'USB-Midi-4i4o-MIDI-4');
  INSERT INTO "midiport" VALUES(9,'LPK25-MIDI-1');
  INSERT INTO "midiport" VALUES(10,'M-Audio-Delta-1010-MIDI');
  INSERT INTO "midiport" VALUES(11,'Axiom-25-MIDI-1');
  INSERT INTO "midiport" VALUES(12,'Axiom-25-MIDI-2');
  INSERT INTO "midiport" VALUES(13,'Axiom-25-MIDI-3');
  INSERT INTO "midiport" VALUES(14,'MicroBrute-MIDI-1');
  CREATE TABLE midiclock (
         port_name TEXT PRIMARY KEY NOT NULL
  );
  INSERT INTO "midiclock" VALUES('USB-Midi-4i4o-MIDI-1');
  INSERT INTO "midiclock" VALUES('USB-Midi-4i4o-MIDI-2');
  INSERT INTO "midiclock" VALUES('USB-Midi-4i4o-MIDI-3');
  INSERT INTO "midiclock" VALUES('USB-Midi-4i4o-MIDI-4');
  INSERT INTO "midiclock" VALUES('MicroBrute-MIDI-1');
  CREATE VIEW midiclock_mask as
         select sum(1 << port_id) from midiport
         where port_name in (select port_name from midiclock);
  COMMIT;

  sqlite> select * from midiclock_mask
     ...> ;
  16864

I think I might have pulled in erl_tools, which then breaks sql?

Ok, so it's starting the other one.

  ;;(defun my-start-studio () (interactive) (my-start-erl-prj "studio"))
  (defun my-start-studio () (interactive) (my-start-erl "studio" "/i/tom/studio/erl.sh"))

I don't even know which is the correct one.  The one on /i/tom/studio has a different database.

Resolved.  Needed to do db_init.sh again.  Maybe just get rid of the studio directory on zoe then?

Entry: Convert to rebar and incorporate in exo
Date: Sat Apr 7 15:41:56 EDT 2018

EDIT: 2019/4/18 this happened somewhere in the past.  See exo log.

Entry: Midi patterns
Date: Thu Apr 18 17:39:01 EDT 2019

So what if I have a pattern.  How do I play it back?  Now that I know how to properly do reloads, this might be a lot easier to use.

EDIT: I completely forgot.  This needs to be discoverable.  At startup, it needs to play a pattern on one of the synths.

EDIT: Reverse engineering.  jack_midi is not registered.  Get pid from studio_sup.  That process supports message: {midi,Mask,Data}

  P ! {midi, -1, midi:encode({on, 0, 64, 100})}.

That crashed it.  Probably the -1.  It also takes symbolic midi.  This now works:

  f(P),{ok,P}=studio_sup:find(midi_jack).
  obj:get(P,midi) ! {midi, -1, {on, 0, 64, 100}}.
  obj:get(P,midi) ! {midi, -1, {off, 0, 64, 100}}.

But it seems it is missing a whole lot of infrastructure.

Entry: Sqlite
Date: Thu Apr 18 17:51:14 EDT 2019

So basically, all that can be replaced by code, because it is integration code.  A module save is going to be just as fast as a DB save.

Entry: Review?
Date: Thu Apr 18 19:55:21 EDT 2019

This was written when I did not have a whole lot of experience yet with some Erlang corner cases.  Do a review, and clean it up a bit.

EDIT: I'm afraid that the timing won't be good enough.  To do something good, do it at the C end at Jack frame rate.  Or change the jackd daemon to Rust to have a less hacky dev experience.

Entry: Synths
Date: Thu Apr 18 21:35:12 EDT 2019

  64  bass
  128 keys

I can't find the drums though.  Ok they are just under different keys.

So it's clear what this needs: something that can do midi routing, midi recording and midi playback with perfect timing.  So run it inside jack, and run the soft synths, midi and sampler all in the same core.  This is not for Erlang.
What erlang is good for is to connect the midi to other shit on the network, or vice versa.  Act as a bridge, but don't let it do timing-sensitive things.

Entry: A recording format.
Date: Thu Apr 18 22:09:29 EDT 2019

It really needs to go just in RAM, so it doesn't matter all that much.  Saving it to disk will need time stamps for midi.  Maybe just midi files?

Entry: A recorder
Date: Mon Apr 22 11:50:19 EDT 2019

It's clear that doing midi in Erlang is not going to cut it, so do it in Jack instead.  Extend the current Jack code to do midi, and then replace the existing C code.

What I want: something that's continuously recording, but can take snapshots.  Basically, there is no reason not to record continuously these days.  Just use a circular buffer, and create an interface to go back and find things.

So I already have a format for this.  It just needs to be chunked to also do audio.  Let's reserve a drive for it.  This can just be USB.

EDIT: 34e6aa85-8440-4d19-ac76-5646a856443e 500GB Seagate USB2

Record jack in/out and all midi in/out.  Jack client can do the chunking and send a message to erlang.

This is 48kHz * 2 * 8 * 2 or (* 48000 2 8 2) 1.5MByte/sec + MIDI.  That should not be an issue.  The disk is 500GB, so about 90 hours:

  (/ (* 500.0 1000 1000 1000) (* 60 60 (* 48000 2 8 2)))
  90.42245370370371

Use a global start/stop knob.  Start a new file as with the video recorder.  Remount r/o on stop.

What is the next step?
- instantiate recorder
- create jack client to tap the busses

Entry: Recorder
Date: Mon Apr 22 13:45:26 EDT 2019

Bulk it up into one second chunks?  Jack runs with a frame size of 64, which is a lot of pressure on Erlang without any good reason for it.  Midi can be left as is, but the audio should probably be reduced.

Maybe pick something that can easily resolve 20Hz so FFTs can be used in place on the data to scan for frequency content.  Say 10Hz is the block rate, that is about 4K sample blocks, which is 128K bytes for 16x16bit channels.  This is reasonable.  On par with cameras.  Studio could also record a camera feed.

Entry: TODO
Date: Mon Apr 22 14:13:47 EDT 2019

- jack audio client, bulks up a number of channels into 4K time samples, and sends it out to the recorder.  this can be done inside a single Rust core application that ties into Erlang.  the recorder can be Erlang.  i already have this client somewhere, so next step is to put it in studio.
- the process that starts the jack daemon and sets up connections can just be kept as is for now.  later incorporate in Rust once it becomes easy to do.

Entry: Next
Date: Tue Apr 23 15:08:51 EDT 2019

Rust code is currently in exo.  It's fine there for now.  Once context is active, moving it will be easy.  It starts on the studio node:

  exo:start(exo_rs).
  exo_rs:call({open,{jack_client,<<"test">>}}).
  <0.617.0>: Unknown source port in attempted (dis)connection src_name [rust_jack_sine:sine_out] dst_name [system:playback_1]
  {ok,0}

First issue: connecting ports.  I've installed qjackctl but it doesn't see the running jack instance.

EDIT: It works.  I was using patchbay, not connect..

Wanted:
- Get a list of ports
- Connect command for ports (see the sketch below)
- A "connect daemon": restore connections on restart
- Store connections in a db?
- MIDI learn

Basically, I want it to remember how it was last set up, and possibly restore it as well.  Maybe also use version numbers of code.

This needs some refactoring.  The code is set up in weird ways.  It would fare better with a flat supervisor structure.  I don't find a good starting point.  Assume the list of ports is known.
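For the first two wishlist items, plain libjack is already enough; a minimal sketch (the client name "connector" and the two port names are taken from the log output above):

  #include <jack/jack.h>
  #include <stdio.h>

  int main(void) {
      jack_client_t *c = jack_client_open("connector", JackNullOption, NULL);
      if (!c) return 1;

      /* Get a list of ports (no name/type pattern, no flag filter). */
      const char **ports = jack_get_ports(c, NULL, NULL, 0);
      for (int i = 0; ports && ports[i]; i++)
          printf("%s\n", ports[i]);
      jack_free(ports);

      /* Connect command for ports. */
      jack_connect(c, "rust_jack_sine:sine_out", "system:playback_1");

      jack_client_close(c);
      return 0;
  }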
Add code to just do a connection?

Entry: Studio c build appears to be broken
Date: Tue Apr 23 18:47:07 EDT 2019

Change it to redo.  Then make it support fast reload, so the C code becomes easy to change.  For glue code Rust will likely be overkill.  Rust is for data-heavy applications that need speed and memory efficiency.

EDIT: Redo works and it produces c_src/studio.host.elf

Next step is to make it use the same upload mechanism as the other exo binaries.

EDIT: It already works:

  <0.30665.2>: update via erlang: {'exo@10.1.3.12',<<"studio.host.elf">>,"/home/tom/bin/studio.elf"}

EDIT: Fixed reloads for control and midi.

Entry: Next?
Date: Wed Apr 24 11:23:12 EDT 2019

I need a host for DSP code that's always on.  It seems best done as a jack application.  There's a choice here: go for C first, or Rust.  I think it's probably best to have both options available, so make a C host for simple DSP code that reloads and reconnects according to some state.  Then move the Rust jack code into studio.  Maybe do the latter first actually, to open the doors and allow better decisions between these two.

EDIT: How to manage dependencies?  studio will depend on some rust code.  Maybe the entire looper code can go into studio for now?  It doesn't really need to be separate.  And just manage dependencies with links.

EDIT: Rust code builds.

EDIT: Rust code is integrated into erlang code and supervisor.

  (exo@10.1.3.12)3> {ok,P}=studio_sup:find(studio_rs).
  {ok,<0.698.0>}
  (exo@10.1.3.12)5> studio_rs:call(P, count).
  {ok,1}
  (exo@10.1.3.12)6> studio_rs:call(P, count).
  {ok,2}

The Rust jack code can now just be moved maybe?

EDIT: Done.  It is part of studio_rs now.

Entry: Recorder
Date: Wed Apr 24 12:55:36 EDT 2019

C or Rust?  Let's assume it's ok to handle the saving in Erlang, then only audio channels need to be recorded, as the midi client can already dump its data to the recorder daemon.

Take jack_midi.c as a template.  It's just dumping the data on stdout in the jack loop.  Is that a good idea?  Maybe a condition variable together with double buffering is a better idea.  Actually, a semaphore is better suited.

https://www.geeksforgeeks.org/use-posix-semaphores-c/

EDIT: Basic skeleton is coded.  Test it only when integrated.  Next?  Automatically connect client?  Not needed.  For now it can be done manually.  I think it's all there now to make a recorder.  Create the start/stop functions.

EDIT: Get rid of the midi hub?  Use the broadcaster instead.

EDIT: Done: serv hub is deleted.

Entry: Buffering
Date: Thu Apr 25 08:08:18 EDT 2019

It's currently not done properly.

1. write() is called inside the realtime thread.  This is not a good idea.
2. is the update to the write_buf variable actually atomic?  Yes it probably is ok.

Entry: time stamps
Date: Thu Apr 25 08:56:27 EDT 2019

These are going to be rolling anyway, so let's just embrace that and encode the phase using only 8 bits.  This gives ambiguity of:

  (/ 48000.0 (* 64 256))     2.93 Hz for 8 bit
  (/ 48000.0 (* 64 256 256)) 0.01 Hz for 16 bit

Let's leave it at 8 bit for now.  It's probably enough.

  (/ 48000.0 64)  750 Hz is the control rate.

Midi is 31250 baud or 3125 bytes/sec, or about 1000 note messages per second.  The jack rate is in the same ballpark.

Problem is how to synchronize the clocks of the midi and audio clients?  The trick is to not do this at all maybe?  Use a single daemon for both audio and midi -- just extend the midi one, with the following constraints:

- send out midi once per jack frame (64 audio samples) to keep the latency low.
- bundle up audio, and send it out in a single block every 4096 audio frames.

Entry: Next?
Date: Fri Apr 26 17:12:07 EDT 2019

Unify jack audio and midi.  First clean up the jack_midi client?

What about this: use the semaphore to just send a "ping" to the low-priority thread for each message that needs to be sent out.

Packet size needs to change to {packet,4}  (* 8 2 4096)

So do that first.

EDIT: ok I managed to get stuck in some weird corner, likely a build system bug.  I'm too tired to debug this right now..  But the general idea should work: use {packet,4} messages so audio frames can be sent as well.  One write call for midi, and one for audio bundled to 4096 samples per channel.

EDIT: Still buggy, but it's already clear that a 132kbyte write is not going to work from the real-time thread.  It looks like keeping midi and audio separate might be a better idea.  But how to synchronize?  Is there a global jack time stamp?  Yes there is.  Read this with a clear head.  There are several routines.

EDIT: Using this:

  jack_nframes_t f = jack_last_frame_time(client);
  audio_buf[write_buf].stamp = f / nframes;

Entry: recorder
Date: Fri Apr 26 22:17:59 EDT 2019

So the midi and audio clients are working.  Recording is easy from this point on.  Start thinking about what to do with the recorded data.

Entry: Random samples
Date: Sat Apr 27 08:42:24 EDT 2019

So I have a bunch of midi keys accessible now.  Map those to random samples?

Entry: Looping the recording?
Date: Sat Apr 27 08:57:37 EDT 2019

But first, find a proper way to play back and loop a previous recording.  Input is the loop length.  It needs crossover at the audio stitch point.

The main problem is to create a data structure that is easily accessed.  It would be an index into a mmapped log file.  BTW does mlockall interfere with regular memory mapping?

Entry: Manifold music
Date: Sat Apr 27 09:07:13 EDT 2019

I have 24 knobs in incremental mode.  These could be used to make local excursions in a 24-dimensional control manifold.  How to make this idea concrete?

Entry: Recording midi
Date: Sat Apr 27 09:43:01 EDT 2019

So I've added midi clock F8 data.  I'm starting to worry that this is getting a little too verbose.  Probably best to store it in binary, and keep it chunked?  Chunking is likely not necessary, but binary would make it easier to handle in Rust.

Entry: Recorder is up
Date: Sat Apr 27 10:07:39 EDT 2019

What I want is basically to use the wall clock to find a particular sequence, or some other kind of marker.  I need:

- Winding and start/stop via midi controller
- Some graphical multi-level display

First, maybe solve some issues with the recorder.  Especially deleting old chunks based on disk free.

EDIT: Not using disk free, but using the sum of data and index files to trigger deletion.

Entry: mlockall
Date: Sat Apr 27 14:05:38 EDT 2019

Should loops be rendered into non-mmapped RAM?  I can't do mlockall.  So the architecture should definitely split the mmapped disk read process from the real-time sample playback process.

Entry: Jack: multiple clients?
Date: Sat Apr 27 14:07:59 EDT 2019

Should I rely on fine jack client granularity?  It seems the only reason to use jack is to use already existing applications.  For my own DSP code I can just as well run it all in one client, or use Pd as a host.

Entry: Connectivity
Date: Sat Apr 27 14:11:44 EDT 2019

So let's revisit the notification mechanism.
It might be easier now to:

- get notified when clients (dis)appear
- query port lists
- restore connections from db

Entry: registration callbacks
Date: Sat Apr 27 15:02:57 EDT 2019

These tap points are enough:

  ASSERT(0 == jack_set_port_registration_callback(client, port_register, NULL));
  ASSERT(0 == jack_set_port_connect_callback(client, port_connect, NULL));
  ASSERT(0 == jack_set_client_registration_callback(client, client_registration, NULL));

- port_connect
    studio:midi_out_12 system:midi_playback_14 1
  An external application can be used to connect ports, but we can keep track of connections in the database.

- client_registration
    studio 1

- port_register
    studio:midi_out_0 1

Client registration isn't so useful, but port registration can be used to look up connections in the database and restore them.

I do not see a way to get notifications of midi ports being plugged in, i.e. USB devices.  Actually, this might work but it would go via port aliases.

EDIT: ok, got it.  api is just weird.

This should now be placed in a db.  Also, the jack_daemon process should serve as a supervisor, restarting its clients.  Or this could be done using nested supervisors.

EDIT: This is for later.  Getting bored with it.  Maybe solve it at another time?

Entry: jack events
Date: Sat Apr 27 18:13:01 EDT 2019

It seems to work now:

  (exo@10.1.3.12)33> studio_rs:call(studio_rs, {open, {jack_client, <<"rust_jack_sine">>}}).
  studio_rs:call(studio_rs, {open, {jack_client, <<"rust_jack_sine">>}}).
  {jack_control,"studio_control"}: {client,true,<<"rust_jack_sine">>}
  {jack_control,"studio_control"}: {port,true,{<<"rust_jack_sine">>,<<"sine_out">>}}
  {jack_control,"studio_control"}: {client,true,<<"connector">>}
  {jack_control,"studio_control"}: {connect,true,{<<"rust_jack_sine">>,<<"sine_out">>},{<<"system">>,<<"playback_1">>}}
  {jack_control,"studio_control"}: {client,false,<<"connector">>}
  {ok,0}
  (exo@10.1.3.12)34> studio_rs:call(studio_rs, {close, 0}).
  studio_rs:call(studio_rs, {close, 0}).
  {jack_control,"studio_control"}: {connect,false,{<<"rust_jack_sine">>,<<"sine_out">>},{<<"system">>,<<"playback_1">>}}
  {jack_control,"studio_control"}: {port,false,{<<"rust_jack_sine">>,<<"sine_out">>}}
  {jack_control,"studio_control"}: {client,false,<<"rust_jack_sine">>}
  {ok,0}

This is the basic code reload infrastructure:

- wrap any DSP / synth code in a jack client
- keep track of connections
- restore connections when ports are created

Entry: Next
Date: Sat Apr 27 20:01:09 EDT 2019

DB support is there.  The rest should be straightforward.  It's time to start making synth code.

Entry: Gated MIDI recorder
Date: Sat Apr 27 20:43:36 EDT 2019

This is different from the permanent recorder, which is more for capturing "moments".  The gated recorder is intentional loop creation.

What I need for this is a data structure.  This needs to be something that is easy to "advance".  It seems more of a rust-like problem.  So next step is to create a Rust Jack client with midi inputs.

A simple structure would be a set of circular buffers, one for each cycle.  Just rebuild them on each cycle, merging incoming signals.  This avoids the need to work with linked lists.

So two approaches, per period:
- a linked list (C)
- a bouncing buffer (Rust)

In jack, align them to the 64-sample buffer.

To do these in C:
- record midi events in a (growable) array
- record the sequence in a circular list, referring to the events

It's actually more convenient to think of this as a t->[e] map: time mapping to a set of events that happen at that time.
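A minimal sketch of that t->[e] map, read in the simplest possible way as a fixed array of per-tick event lists with no allocation in the audio thread (sizes and names are arbitrary; the actual design described next uses the circular "bounce" buffer instead):

  #include <stdint.h>

  #define LOOP_TICKS   (16 * 24)   /* e.g. 16 beats at 24 ppqn */
  #define MAX_PER_TICK 8

  struct midi_ev { uint8_t bytes[3]; };

  struct pattern {
      struct midi_ev ev[LOOP_TICKS][MAX_PER_TICK];
      uint8_t        count[LOOP_TICKS];
  };

  /* Record: append to the list at tick t, creating it implicitly. */
  static void pattern_record(struct pattern *p, uint32_t t, struct midi_ev e) {
      t %= LOOP_TICKS;
      if (p->count[t] < MAX_PER_TICK)
          p->ev[t][p->count[t]++] = e;
  }

  /* Playback: hand all events at tick t to a callback. */
  static void pattern_play(const struct pattern *p, uint32_t t,
                           void (*out)(const struct midi_ev *)) {
      t %= LOOP_TICKS;
      for (uint8_t i = 0; i < p->count[t]; i++)
          out(&p->ev[t][i]);
  }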
Recording an event will create a new list at t if it doesn't exist yet, otherwise it will append to the list.

Linked lists are hard in Rust, but it seems that unless the circular "bounce" is used, it will be hard to do this properly.

Note on linked lists in Rust:
https://cglab.ca/~abeinges/blah/too-many-lists/book/README.html

I'm just going to use circular bounce.  This will be straightforward in C as well.  The core is just a growable vector with iteration and bouncing based on the record periods.

EDIT: It's pretty much just a delay line: a smaller circular buffer "rolled" inside a larger circular buffer of allocated space.  When events are added, the inner circle gets larger.  For audio it is similar, but there the buffer is actually circular.

So what are the primitives?

- peek event (to check timestamp)
- dequeue event
- enqueue event

then, optionally, if enqueue causes overrun, reallocate.  Design them as byte or word queues.

EDIT: Implemented the data structure draft in jack_midi.c

EDIT: Filled it in some more.  Basic structure seems to be there.  Gaps need to be filled in and control is not clear yet.

Entry: MIDI control
Date: Sun Apr 28 01:00:54 EDT 2019

Typical problem: how to send some more complex data structures over MIDI?  Maybe it is time for s-expressions.  Midi can send 0-127 in a sysex, so ASCII would be good.

Entry: Next on sequencer
Date: Sun Apr 28 08:11:16 EDT 2019

Make the iteration over the "now" events abstract.

EDIT: Got some sign of life.  So it basically works.  But there is a lot of "management shit" that still needs to be implemented.

- state marshalling
- midi routing matrix

The question is really to sysex or not.  It is not absolutely necessary, since there can be a more generic pipe between Erlang and the port program.  And it seems like a bit of a pain.  But once implemented, it might be useful for other things.

Basically that 7-bit thing is awkward.  Text seems to be the only non-awkward way to deal with it.  Yeah I don't want to do this with s-expressions.  Use binaries, and use a bit stream.

https://blogs.bl0rg.net/netzstaub/2008/08/14/encoding-8-bit-data-in-midi-sysex/

7 bytes + byte with msb.  Send the top bit byte first.  This avoids the need for adding a length spec (sketched below).

EDIT: I put some stub code there, but I need a test routine for these..  So assume the enc/dec problem is solved pending bugs.  The remainder is then to allow for larger messages to be transferred.  I can't quite do this kind of work right now.

Entry: Better control
Date: Mon Apr 29 08:40:21 EDT 2019

This needs some synchronization between high and low priority threads.  Maybe try a new architecture?

- low pri input thread
- high pri processing thread
- low pri output thread

This way any state snapshotting can be done in the high pri thread, and communication delays can be handled in the low pri output thread.

As for the queues: don't do any allocation.  For larger messages, add some form of round-trip acknowledgement.  This allows performing transactions over time.  This is possible because the larger messages do not have any low-latency requirements, i.e. they are state save/restore.

EDIT: Cleaned it up a bit, and made the buffers bigger to allow for 64*8 encoded / 64*7 decoded sysex chunks to be transferred.

Entry: Output thread not necessary?
Date: Mon Apr 29 10:17:11 EDT 2019

For simple midi it seems fine to just send a single write() call.  But it seems that at some point this will break down when bulkier messages are sent.  I can just try this to see how it goes.  E.g. fill up with a dummy sysex message.
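For reference, the 7-bit packing described in the sequencer entry above (one MSB byte followed by 7 stripped bytes) could look like this sketch; names are hypothetical, this is not the stub code mentioned in the log:

  #include <stdint.h>
  #include <stddef.h>

  /* For every group of up to 7 data bytes, first emit one byte holding
     their MSBs, then the 7 bytes with the MSB cleared.  Returns encoded
     size; dst must hold n + ceil(n/7) bytes. */
  static size_t sysex_enc(const uint8_t *src, size_t n, uint8_t *dst) {
      size_t o = 0;
      for (size_t i = 0; i < n; i += 7) {
          size_t chunk = (n - i < 7) ? (n - i) : 7;
          size_t msb_pos = o++;
          dst[msb_pos] = 0;
          for (size_t j = 0; j < chunk; j++) {
              dst[msb_pos] |= (src[i + j] >> 7) << j;
              dst[o++]      = src[i + j] & 0x7F;
          }
      }
      return o;
  }

  /* Inverse: reassemble 8-bit bytes from the 7-bit stream. */
  static size_t sysex_dec(const uint8_t *src, size_t n, uint8_t *dst) {
      size_t o = 0;
      for (size_t i = 0; i < n; ) {
          uint8_t msbs = src[i++];
          for (size_t j = 0; j < 7 && i < n; j++)
              dst[o++] = src[i++] | (((msbs >> j) & 1) << 7);
      }
      return o;
  }

Because a partial final group still carries its own MSB byte, the decoder needs no explicit length field, which is the property the entry is after.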
It is possible to poll from the real-time thread to see if the low-priority thread has actually sent out a buffer.  This way some kind of priority scheduling can be done.  E.g. don't send sysex if the previous buffer didn't go out yet.  This is actually an interesting problem.

EDIT: I've done some tests, and it doesn't really seem to be a problem to send out small chunks of data continuously via the process thread.

Entry: write() to an anonymous pipe
Date: Mon Apr 29 11:15:14 EDT 2019

I wonder, what are the guarantees that this will not block?  I.e. I just want it to keep buffering, assuming that eventually the low priority thread can do some bulk read to collect a bunch at once.

For now the assumption is that it is ok to do continuous writes from the real time thread as long as the chunk size is "small".

Entry: sysex tests
Date: Mon Apr 29 14:24:15 EDT 2019

Weird.  Simple function sysex_dec_size, not doing what I think it does.

EDIT: operator precedence

Entry: Protocol
Date: Mon Apr 29 21:48:45 EDT 2019

This is a pain in the ass really.  Requires a lot of ad hoc messages with dump & parse.  Actually dump isn't really the issue.  This could be done in text form actually, or ETF.

For restoring patterns: it might be simpler to use individual set commands for this.  These can then be added to a "pending" queue, which is added to the playback queue when it's time.  Maybe do that first.  That would make generated patterns possible.

Basic idea: keep the C code and data structures simple.  Put all data shuffling at the Erlang side.

That "edit queue" isn't such a bad idea.  It could also be used for removals.  Perform the queue edit right before playing the event.

Entry: Circular buffers
Date: Tue Apr 30 08:49:54 EDT 2019

It's possible to avoid copying, but not without a lot of effort.  The point of the copying is to have the write pointer be at a spot where there is space available without the need for moving any data.  I.e. the constant re-write ensures all operations are constant time, at an increased cost.  With hot caches I doubt it will be a problem.

EDIT: This seems to work well.

Entry: Queues
Date: Sat May 4 20:30:39 EDT 2019

I find myself in need of more than one type of queue, so maybe generalize some things such as:

- wrap a void* memory block and operate on void pointers.
- make it growable?
- move the write / grow part to the non-rt thread?

This seems to have a complexity explosion.  Let's just stick to two implementations of queues.  Or write the implementation as a macro.  What about defining an ML-style module as a header file?

EDIT: Yeah that's a nice side track!

EDIT: Yes, I like this.  Move ns_queue to uc_tools.

Entry: Dumping
Date: Sun May 5 13:17:17 EDT 2019

Edits are working.  But I want to do this differently.  It seems all a little too convoluted.  What about providing a mechanism to swap the whole state atomically, and have the low priority thread process the state?

Entry: Dump as text instead
Date: Sun May 5 14:41:53 EDT 2019

While it works, I think this binary format is going to give issues.  It doesn't take much more effort to just dump as text.  This way the sysex doesn't need to be decoded.

EDIT: Ok, generalized to ns_dump.h s-expression dump module.

EDIT: Still needs chunking.  How to do that?

EDIT: Done.  Chunks are reassembled based on paren balance.

EDIT: Using explicit more/done marker to avoid the paren balance count.

Entry: Sequencer on STM32
Date: Sun May 5 16:38:22 EDT 2019

The jack midi sequencer is written with full separation between core and editor.
The core will run on an STM32 without malloc.  This opens up an architecture of distributed sequencers, reducing MIDI bandwidth needs.

Entry: Pool the events
Date: Sun May 5 19:47:12 EDT 2019

Tracks will likely get out of balance (power law), so it makes sense to pool events.  Though that will break some cache locality, and will make it impossible to erase tracks separately.  Maybe not then?

Entry: Next?
Date: Sun May 5 21:04:17 EDT 2019

All the hard stuff is implemented.  Add some convenience, such as changing midi clock tempo, sequence tempo changes.

Entry: Relative time stamps
Date: Sun May 5 21:06:32 EDT 2019

I just realized there is no simple way to speed up the sequences, so maybe it would be good to represent the event time stamps fractionally, using some kind of highly composite denominator.  This is for polyrhythm.  What do we need?

  (* 2 2 2 2 2 2 2 3 3 3 3 5 5 5 7 7 11 13)
  9081072000

It might not be enough.  This might need to approximate fractions in a different way.  It's not actually necessary to have high resolution in the small time scales, so powers of two will do.  But it is necessary to have a bunch of prime numbers to make it possible to have exactly locked sequences.

  (* 2 2 2 2 2 3 5 7 11 13 17 19 23)
  3569485920

I'm looking at it the wrong way.  Subdivision doesn't matter.  It is only in the way that the period is represented that we need to ensure perfect locking.  But it is going to be necessary to use relative indexing.  I need to do this when I'm awake..

Entry: Relative event time?
Date: Tue May 7 13:21:55 EDT 2019

This is only necessary in the case that edits are too expensive.  They might not be.  Otoh loop speed change is something I would like to have midi-controlled since it is so basic, so let's do relative events.

One thing I worried about is that high resolution relative events take up more bits when dumping patches.  But maybe 16 bits is enough?  That will give exact positioning for the more common power-of-two forms.

EDIT: Phase resets are still difficult to do in that they would be O(N).  Maybe that is really not an issue, but it does show that the setup might be a little too simple.  Slow phase morphs are possible though, by changing the period.

Entry: Map all the synths
Date: Tue May 7 20:46:14 EDT 2019

Basically everything is there now to start making some pattern music from Erlang.  Put it all in one module.

Start doing this by making a port/channel/transpose knob set that will determine where the keyboard midi notes are going.  Then add some learn functionality that records a tag in a database.

Entry: Connecting midi
Date: Thu May 9 17:53:46 EDT 2019

This should be a single function.  I want a list of clients and ports.

Entry: RAI
Date: Fri May 10 13:10:33 EDT 2019

Is connected to exo.  The doodle.rkt is set up as a task on broom.  Changing params will send on the pd messages.  It's now set up such that:

- emacs uses udp send to localhost 12345 on panda
- panda has a forwarder to whichever doodle_pulse it finds
- exo:start(doodle_pulse) starts on node
- push_change will issue reload message

I think that this s-expression approach is more convenient.  Keep it.

Entry: Let's do a full set of "standard" effects and synths
Date: Fri May 10 15:51:13 EDT 2019

And compile them to WASM.

EDIT: Judgement is off.  Depressed state.

Entry: Midi controllers
Date: Sat May 11 15:16:25 EDT 2019

Let's see if this works: I want to run the synth on lroom, and plug the midi controller into tp.

  (exo@10.1.3.2)2>
  {exo_notify,{10,1,3,10}}: ignored: {add,dmmidi,{tp,1}}

Where does this come from?
  {exo_notify,{10,1,3,10}}: exo_notify: Data: <<"env /etc/net/udev/notify-exo.sh tp\n">>
  {exo_notify,{10,1,3,10}}: ignored: {add,dmmidi,{tp,1}}

It does handle midi, just not in a very transparent way.  What do I actually want it to do?  Assume a hub-and-spokes model maybe, with zoe always the central hub, and other audio and midi devs referring back to it?

Now what does it do?  It looks for midi_raw, which normally runs on 29, and sends it the 'add' message.

  midi_raw ! {add,midi,{tp,1}}.

Then 29 tries to start an erlang process to access that node.  This had a bug.  Then it tried to send to the midi_raw process, which now has a different protocol.

Got it now.  Let's move midi_raw to zoe where it belongs: on zoe.  Ok, done.  So it goes like this:

- udev notifies exo_notify on 2
- this forwards it to midi_raw
- midi_raw starts a socat process to the node with the device
- and sends midi data on to midi_hub

Next: how does a pulse synth get the data

EDIT: Simple: it subscribes to midi_hub

Entry: Virtual patch bay
Date: Sun May 12 20:38:35 EDT 2019

Use a breadboard or a patch bay as an input device to a digital modular synth.  E.g. take a breadboard and have the top layer be outputs and the bottom layer be inputs.  Send some scanning signal on the top and read out the bottom.

Thinking about this some more, it seems that a patch matrix would be more appropriate.  That would be easy to do with a touch screen gui, e.g. using a tablet or an old phone.  This can also be done in code, similar to the number input.

Entry: Some controller ideas
Date: Sun May 12 20:52:16 EDT 2019

- patch bay (source code, gui or physical "scanning" bay)
- code editing with rotary encoders: one to scan through parameters, one to set the value.  this doesn't even need a rotary encoder: two sets of keys will be enough.

Entry: RAI
Date: Sun May 12 21:23:30 EDT 2019

Instead of going through udp, it should have a mode where it uses distel to send it straight to exo.

Entry: Mobile
Date: Tue May 14 10:47:36 EDT 2019

So here's what I want on tp:

- work independently of exo network (mobile)
- optionally jack or pulse
- easycontrol grey knob set up for code param nav

So make it a bit more abstract.  Instead of a single hub, use many hubs that have the same interface.  Just chain them together.

This can be done later.  It's just procrastination.  Get the synth up first.

Entry: rai synth
Date: Tue May 14 11:27:29 EDT 2019

In main_pd there is mention of a proc instance.  In pulse there is no such thing.  The synth init code uses this instance information.  I likely don't need the instance, so how to init the proc?

I remember: pulse and jack use static data.  Pd uses dynamic data.

So it looks like the param_reader does not support the main gate/freq inputs.  Those will have to be done separately.

EDIT: I think I understand.  It will need to be special-cased though, just like it is in Pd.  There are a couple of possibilities.  Maybe raw midi is the most useful?  I need a default way to map gate, note and velocity to something.

EDIT: Note on/off via pd style messages.  Next is to get some sound out of it.  It's not doing anything, likely because of those "controls".  Maybe create a simpler synth first?

EDIT: Ok it works.  But pulse has too high latency to be playable.  I think the kmook archive has a jack midi implementation.  I actually did get quite far last time.  Pretty much finished it.

Entry: jack/pulse
Date: Tue May 14 18:42:18 EDT 2019

Make the studio daemons part of exo.  That way it is easier to start/stop.
EDIT: There are a couple of architectural issues in how all these daemons are connected.  The proper solution is to have a supervisor with dependencies.  Probably some nesting as well.

Entry: jack permission problem?
Date: Wed May 15 12:23:44 EDT 2019

There's a difference between starting it from a tp terminal, and starting it via ssh.  In the former there are no error messages about permissions, and the devices show up.  I don't quite understand what's going on here..  Anyways, for now there is a workaround: just start it in a terminal.

Entry: Running code standalone on wanda + tp?
Date: Wed May 15 12:29:19 EDT 2019

Basically, every node that runs jack should probably have a midi_hub.  Then when jack comes up, send midi events to all midi hubs in the exo network.

EDIT: Going forward with this.  Run the simple synth as a jack plugin.

EDIT: Ok synth works.  Voice allocator is a piece of shit though.

EDIT: Solved the "local hub on partition" issue in exo.

Entry: Voice allocator
Date: Thu May 16 14:46:32 EDT 2019

How to create a better voice allocator?

- push all voices on the stack
- to allocate, pop one, mark it as used
- to deallocate, check that it is used and push it back on the stack

Never dealloc a voice if there is another one that is not used.

Can a monophonic allocator be a special case?  I have one on the PIC that uses a stack.

Entry: LRU voice allocator
Date: Sun May 19 09:18:16 EDT 2019

The main data structure is a list with the following operations:

- Take the first N slots (= voice allocation)
- Add to the head (= note on)
- Remove from within the body (= note off)

Only the allocation step needs to be fast, because it is run on each loop.

EDIT: I think I have something based on queue rotation.  Next is to integrate it.

EDIT: Again.  There are two obvious next steps:

- accumulators in lta
- voice allocator

Entry: Starting a jack plugin
Date: Sat Nov 16 11:38:59 EST 2019

Two things are currently set up:

  {rai,ProcName} -> {rai, proc, [ProcName]};
  rai_udp        -> {rai, run_udp, [12345]};

This assumes the port is in ~/bin.

  (exo@10.1.3.12)6> f(Proc),{ok,Proc} = rai:proc(doodle_pulse).
  f(Proc),{ok,Proc} = rai:proc(doodle_pulse).

So change it to use the default port spawning mechanism?  Not so important atm.

Let's ask some questions.  How can I find out the protocol?  What messages does the thing support?  The main idea here was to send from emacs.  What actually happens when editing doodle.rkt?  Redo is not rebuilding anything.

EDIT: There is still something wrong with "redo install".  Removing the .*.install files seems to fix the problem.

It's uploading, but not reloading.  Should be easy to fix, but first make the rai interface go through another mechanism.  Currently it is sending UDP frames without properly tagging what file it is from.

Entry: fix midi learn
Date: Sat Nov 16 18:37:48 EST 2019

It's completely obvious now: go into emacs, set point to the value, hit the learn button, and wiggle it.

Entry: Resume
Date: Mon Apr 13 17:56:41 EDT 2020

I want to get back into writing music software.  I want to do it without the ego this time.  I think that was the main impediment.
Entry: Resume
Date: Mon Apr 13 17:56:41 EDT 2020

I want to get back into writing music software. I want to do it without the ego this time. I think that was the main impediment.

New tricks:
- no_std rust + async
- exo monolith + incremental system is up
- (soon) webassembly for gui bootstrapping
- erlang epid + distro db

New insights:
- Use Erlang only for "patching", not for transport
- Keep cores simple (state machines)
- Focus on delta1010 DSP effect on ECP5 (better FPGAs will follow)

Entry: fixed function everything
Date: Mon Apr 13 22:22:02 EDT 2020

To make the current setup usable, it needs to be made less programmable first, so I can then loosen it up.

Entry: Lower the "analog barrier"
Date: Mon Apr 13 22:26:09 EDT 2020

- mixer
- envelope

The rest can just be noise.

Entry: Ricky Tinez videos
Date: Mon Apr 13 22:30:43 EDT 2020

I like his mindset. Good way to get to know about recent gear too.

Entry: on FPGA work
Date: Mon Apr 13 22:41:13 EDT 2020

One other thing: it is completely trivial to add processing to an FPGA setup. So why not start out with the BBB, use the iCE40 board I already have, and add a bus to the ECP5 boards. Can I make ADCs in an FPGA?

Entry: Motley Crew DSP
Date: Tue Apr 14 08:27:22 EDT 2020

The mixed digital/analog might be simplest to do with cheap ADCs in a microcontroller. It doesn't really matter if it is 10 or 12 bits if it is just for a synth voice that still needs to go through envelopes and filters.

Entry: What is missing in the setup?
Date: Tue Apr 14 12:39:09 EDT 2020

It's not really playable. Currently it's ok to let zoe be part of it. Moving to a different setup is too much work, but new code should be written so it can run on minimalistic hardware, e.g. move to C or Rust. Erlang is mostly there for network management.

Entry: scale finder
Date: Tue Apr 14 21:53:03 EDT 2020

Go back to the idea of a log scale with coarse "infinite" adjustment knobs on the sides.

Entry: something not right
Date: Wed Apr 15 21:52:21 EDT 2020

(exo@10.1.3.12)1> jack_daemon: ALSA lib rawmidi_hw.c:133:(snd_rawmidi_hw_drain) SNDRV_RAWMIDI_IOCTL_DRAIN failed: Input/output error
jack_daemon: ALSA lib rawmidi_hw.c:133:(snd_rawmidi_hw_drain) SNDRV_RAWMIDI_IOCTL_DRAIN failed: Input/output error
jack_daemon: ALSA lib rawmidi_hw.c:133:(snd_rawmidi_hw_drain) SNDRV_RAWMIDI_IOCTL_DRAIN failed: Input/output error
jack_daemon: ALSA lib rawmidi_hw.c:133:(snd_rawmidi_hw_drain) SNDRV_RAWMIDI_IOCTL_DRAIN failed: Input/output error
jack_daemon: ALSA lib rawmidi_hw.c:133:(snd_rawmidi_hw_drain) SNDRV_RAWMIDI_IOCTL_DRAIN failed: Input/output error

Had to disable the midi hub for devices on zoe: they should use jack. It currently doesn't really check. Also ran into some issues with the cheapo 4-channel midi box. Power cycling helped, and then suddenly it all works. My guess: voltage issues somewhere. Probably best to replace that cheapo with some custom midi circuitry. Midi out is easy enough, and really one output for the 3 volcas is good enough.

Also ordered some isolation transformers. They do help quite a lot to get the power supply noise out.

Entry: dumb time
Date: Thu Apr 16 15:51:56 EDT 2020

I need something I can use when I'm really dumb.

Entry: Looper
Date: Thu Apr 16 16:27:18 EDT 2020

What would be a good architecture for a looper? I already have a midi looper. Maybe start there and do audio in the same app?

Entry: Recorder
Date: Thu Apr 16 17:29:20 EDT 2020

So I actually have this recorder. Maybe it's time to actually start using it and write a frontend?

Entry: Volca sync
Date: Thu Apr 16 17:37:33 EDT 2020

Spec says monaural 3.5mm. So it's only pulses, not start/stop.
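Since the sync input is pulse-only, start/stop has to stay on midi, but the pulses themselves can be derived from midi clock (24 ticks per quarter note). A small divider sketch; the 2 pulses per quarter note that the volcas reportedly expect is left as a parameter, and the module name is made up:

-module(sync_div).
-export([new/1, tick/1]).

%% MIDI clock runs at 24 ticks per quarter note; emit a sync pulse every
%% (24 div PulsesPerQuarter) ticks. State is {TickCount, Divider}.
new(PulsesPerQuarter) ->
    {0, 24 div PulsesPerQuarter}.

%% Call tick/1 for every incoming 16#F8 clock byte.
tick({Count, Div}) when Count rem Div =:= 0 ->
    {pulse, {(Count + 1) rem 24, Div}};
tick({Count, Div}) ->
    {no_pulse, {(Count + 1) rem 24, Div}}.

Emit a short pulse on the sync line whenever tick/1 returns pulse.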
Entry: next
Date: Thu Apr 16 19:01:33 EDT 2020

Okay, today is not the day, but what does this need to be useful?

- Proper recording & playback of audio and midi loops
- Flexible and usable midi routing
- Soft synths
- Soft effects

Pretty much everything! I should probably take a week to work on it.

Entry: volca hacks
Date: Thu Apr 16 19:04:23 EDT 2020

https://www.synthanatomy.com/2020/03/korg-volca-sample-hack-brings-polyphony-probability-triggering-more.html

Entry: stay idiosyncratic
Date: Thu Apr 16 20:24:18 EDT 2020

There is no point in trying to be everything to everybody. Let it flow.

Entry: Some notes
Date: Thu Apr 16 21:21:24 EDT 2020

Started playing a bit yesterday. It's clear that the analog setup is not going to be enough to make tracks. Especially the drum machine doesn't have enough control. It might be good enough to get an initial thing going, but then it should probably be sampled so that more control is possible for mixing.

Lessons:

1) Use a two-step approach:
- use the synths as a user interface to make loops
- mix the loops separately

The end mixing stage should be completely digital, so I need a controller just for mixing, eq and other things.

2) Don't focus on midi recording. Recording analog is going to be what enables many channels and layering. My analog bandwidth is too limited and should be used mostly for bussing. Also, from past experience it doesn't make sense to keep midi data without all the other synth parameters; the sound config will just get lost. So do that only for fully specified digital setups.

3) If we're focused on using the analog gear only for sound design, what's the best way to lay out the channels? It's not wired up in a bad way actually. Currently I have:

digital inputs:
1-8: direct out from mixer channels

digital outputs:
1-2: 2x stereo
3-4: 1x stereo bus
5-8: 4x mono bus

This means that effects sends from analog channels to the digital effects can be digital, and all the stereo returns can be as well.

4) Some bounce ideas. There are two main ways to combine analog and digital: analog synth -> digital effect -> recorder, or digital track -> analog effect -> recorder.

5) Full access mixing is going to be important. The easycontrol is fixed pot, so let those be fixed per-channel functions: volume for the slider and some other function for the top knob. Let's keep it simple and fix the number of tracks to 9. The button on the easycontrol can be used to select a channel parameter on the br control. This can be anything, but EQ and effects send are going to be the most important. This will need a panel display to show vu meters for the channels and the current function for the br. (A small mapping sketch follows below.)

6) Digital channel allocation optimized for bi-directional 2-bus changes:

spirit/solderstation mixer -> digital (these are all better than direct out)
- 2: Main mix
- 2: Sub mix
- 2: Solderstation mix (bench analog)
- 1: send 1 (pre)
- 1: send 3 (post)

digital -> spirit mixer
- the main mix is digital. its output should go via the ub mixer to the monitors
- 2x stereo busses
- 2x mono busses

What about this:

analog -> digital: two stereo busses
digital -> analog: two stereo busses

Using two busses makes it easy to pass the entire "background mix" in two directions, and still have a "foreground mix" to focus on. The 2 stereo busses can also be used as 4 mono busses by panning the channels.

7) Build the final mixer in pd. At least for now, until I have another interface.

Entry: The last 2 channels
Date: Thu Apr 16 23:53:00 EDT 2020

Probably effects sends are going to be most useful. One is definitely reverb; this can be the unused one. The other one could just be a duplicate of the send to the brute.
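The mapping sketch mentioned under point 5: fixed per-strip functions expressed as a pure CC-to-parameter function. The CC numbers, the 9-track count and the aux_send function for the top knob are assumptions; the point is only the "fixed function per control" shape.

-module(easyctl_map).
-export([map_cc/2]).

-define(SLIDER_BASE, 1).   %% sliders on CC 1..9   (assumed)
-define(KNOB_BASE,  11).   %% top knobs on CC 11..19 (assumed)

%% easycontrol CC -> {set, Track, Param, Value} | ignore
map_cc(CC, Val) when CC >= ?SLIDER_BASE, CC < ?SLIDER_BASE + 9 ->
    {set, CC - ?SLIDER_BASE + 1, volume, Val};
map_cc(CC, Val) when CC >= ?KNOB_BASE, CC < ?KNOB_BASE + 9 ->
    {set, CC - ?KNOB_BASE + 1, aux_send, Val};
map_cc(_CC, _Val) ->
    ignore.

The button-selected br parameter would just replace aux_send with whatever is currently selected for that strip.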
Entry: Ardour
Date: Thu Apr 16 23:56:47 EDT 2020

Can I actually use ardour as the mixer? Sure, why not. Let's give it a try. I'm not really all that into using existing software, but a mixer/tracker is probably a good idea. No need to reinvent that wheel.

EDIT: Yes, this is a good idea. Ardour is good for plumbing. That also makes it clear what the next step should be. Ardour should probably do timebase. Can it be looped? Anyway, the feel is back again for synth stuff. The need for a looper becomes apparent, also something that can sync to ardour's timebase.

Entry: Ardour replacement
Date: Fri Apr 17 09:53:27 EDT 2020

First: is it possible to reuse the ardour file format to get config?

.config/ardour5/templates/studiomix-template/

Yes, it probably is. But not for now. What should the main router do?

1. properly name I/O
2. map readable names to routable names
3. delegate connecting

Entry: Ports
Date: Fri Apr 17 09:57:40 EDT 2020

(exo@10.1.3.12)1> obj:dump(jack_daemon).
obj:dump(jack_daemon).
#{audio => <0.769.0>,control => <0.763.0>,
  hubs => #Fun,midi => <0.768.0>,
  port => #Port<0.1373>,
  {<<"in">>,<<"Axiom-25-MIDI-1">>} => <<"in-hw-6-0-0-Axiom-25-MIDI-1">>,
  {<<"in">>,<<"Axiom-25-MIDI-2">>} => <<"in-hw-6-0-1-Axiom-25-MIDI-2">>,
  {<<"in">>,<<"Axiom-25-MIDI-3">>} => <<"in-hw-6-0-2-Axiom-25-MIDI-3">>,
  {<<"in">>,<<"M-Audio-Delta-1010-MIDI">>} => <<"in-hw-1-0-0-M-Audio-Delta-1010-MIDI">>,
  {<<"in">>,<<"USB-Midi-4i4o-MIDI-1">>} => <<"in-hw-2-0-0-USB-Midi-4i4o-MIDI-1">>,
  {<<"in">>,<<"USB-Midi-4i4o-MIDI-2">>} => <<"in-hw-2-0-1-USB-Midi-4i4o-MIDI-2">>,
  {<<"in">>,<<"USB-Midi-4i4o-MIDI-3">>} => <<"in-hw-2-0-2-USB-Midi-4i4o-MIDI-3">>,
  {<<"in">>,<<"USB-Midi-4i4o-MIDI-4">>} => <<"in-hw-2-0-3-USB-Midi-4i4o-MIDI-4">>,
  {<<"out">>,<<"Axiom-25-MIDI-1">>} => <<"out-hw-6-0-0-Axiom-25-MIDI-1">>,
  {<<"out">>,<<"Axiom-25-MIDI-2">>} => <<"out-hw-6-0-1-Axiom-25-MIDI-2">>,
  {<<"out">>,<<"M-Audio-Delta-1010-MIDI">>} => <<"out-hw-1-0-0-M-Audio-Delta-1010-MIDI">>,
  {<<"out">>,<<"USB-Midi-4i4o-MIDI-1">>} => <<"out-hw-2-0-0-USB-Midi-4i4o-MIDI-1">>,
  {<<"out">>,<<"USB-Midi-4i4o-MIDI-2">>} => <<"out-hw-2-0-1-USB-Midi-4i4o-MIDI-2">>,
  {<<"out">>,<<"USB-Midi-4i4o-MIDI-3">>} => <<"out-hw-2-0-2-USB-Midi-4i4o-MIDI-3">>,
  {<<"out">>,<<"USB-Midi-4i4o-MIDI-4">>} => <<"out-hw-2-0-3-USB-Midi-4i4o-MIDI-4">>}

Small recap about how this works.

1. The jack daemon starts up and Erlang parses its output. This is translated into an event stream like this:

<0.764.0>: {port,true,"system:midi_playback_14"}
<0.764.0>: {alias,"system:midi_playback_14","out-hw-8-0-0-LPK25-MIDI-1"}
<0.764.0>: {connect,true,"system:midi_capture_1","studio_midi:midi_in_10"}

(A folding sketch follows a few entries down.)

Entry: Fixing connect
Date: Fri Apr 17 10:46:21 EDT 2020

Exo DB interface changed to support distributed mode:
- Reads are still the same, but schemas have some restrictions
- Writes need to go through exo_db:put

Entry: Optimized
Date: Fri Apr 17 13:32:50 EDT 2020

1. delta1010 9+10 spdif out can be used. I put them in the ua-30 input.
2. The ua-30 only needs power to move that signal to its line out with attenuation.
3. The ua-30 line out (main mix monitor) can now go into the spirit 2-track in, which allows a straight connection. Note that the isolation transformer does add some attenuation.
4. This allows the 1202 eurorack to be removed entirely!
5. The ua-30 could be connected to another PC which then provides digital sync, e.g. a beaglebone with FPGA.

Entry: spirit SX acting up
Date: Fri Apr 17 15:04:34 EDT 2020

I think it's the yellow right main mix slider.
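The folding sketch mentioned in the Ports recap: turning the {alias,...} events into the {<<"in">>|<<"out">>, ShortName} => Alias entries seen in the obj:dump output. The "in-hw-C-D-S-Name" alias format is inferred from the examples above; this is a sketch, not the actual jack_daemon.erl code.

-module(alias_map).
-export([handle_event/2]).

%% Fold one parsed jack event into the name map.
handle_event({alias, _PortName, Alias}, Map) when is_list(Alias) ->
    case re:run(Alias, "^(in|out)-hw-[0-9]+-[0-9]+-[0-9]+-(.*)$",
                [{capture, all_but_first, binary}]) of
        {match, [Dir, Name]} -> maps:put({Dir, Name}, list_to_binary(Alias), Map);
        nomatch              -> Map
    end;
handle_event(_Other, Map) ->
    Map.

Running lists:foldl(fun alias_map:handle_event/2, #{}, Events) over the event stream then reproduces the name part of that map.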
Entry: jack plugin gui
Date: Fri Apr 17 23:28:56 EDT 2020

How to create plugin guis? Following a youtube video I end up here as an example:

https://github.com/calf-studio-gear/calf/tree/master/gui

There are xml files:

https://github.com/calf-studio-gear/calf/blob/master/gui/gui/compressor.xml

LV2 is mentioned:

https://lv2plug.in/
https://tytel.org/helm/

Entry: fix connect
Date: Sat Apr 18 17:01:15 EDT 2020

Start with what I want:

- connect(keys, brute).

This involves a couple of things:

1. Determine if the connection is local to a jack instance.
2. If not, use the epid.
3. Else, delegate connecting to jack.

I only want to solve 3. atm. The rest should be a fairly trivial extension. But I want to be able to optimize, so let's use the convention that if the pids are the same in an epid, then connect is special-cased to a local connection. (A dispatch sketch follows a few entries down.)

Entry: Names
Date: Sat Apr 18 17:25:11 EDT 2020

To solve connect, solve names. Why are there 4 kinds of names?

1. main port name, e.g. "system:midi_capture_2"
2. alias, e.g. "in-hw-2-0-0-USB-Midi-4i4o-MIDI-1"
3. jack_audio port number

I'm going to add another level of user-defined epid names on top of this. Should those refer to numbers or names? Probably numbers are going to be most convenient. The names are ad-hoc anyway. Let's just keep it uniform.

EDIT: Working through the alsa midi connect. There is a degree of freedom here:

a. use alsa connect to connect the ports directly
b. implement a connect mechanism inside the jack_midi client

The latter requires more code, but could later be extended to provide more flexible routing or mapping. So we have to go with the former for now.

EDIT: It seems alsa naming is not stable. It looks like I'm doing some parsing to map in-hw-10-0-1-BCR2000-MIDI-2 to BCR2000-MIDI-2. Indeed: check the "scan:" handling in jack_daemon.erl. So there is no free lunch to look up the hardware ports, but what can be done is to look them up in the db by using the knowledge of what ports the studio_midi client is connected to. Some workaround: studio_db:system_port/2. But this exposes a problem: connections cannot persist if names are not stable. So don't save them that way. Use the functional relation of studio_midi port -> system port instead.

TODO: Remove the connection store/restore from jack and move it to exo:connect/disconnect.

Entry: mixer
Date: Sun Apr 19 10:09:38 EDT 2020

I do want a smaller digital setup with volume control and main out, so let's put the mixer back.

EDIT: Ok, it is getting convoluted.

1. ua30 output -> ub1202 tape in
2. ub1202: CD/TAPE to CTRL ROOM

This then frees up the bus for other things, so connect that back to the spirit. Additionally, I've set up a path from the spirit monitor out to 11/12 on the ub1202, so at least there is an analog path to the speakers. I probably should create a diagram for this.

Entry: So what's next?
Date: Sun Apr 19 11:07:21 EDT 2020

I have 'exo:connect' working. This should probably log.

Entry: Couple of wiring changes
Date: Sun Apr 19 23:17:44 EDT 2020

- UB1202 is available as a sub mixer
- The UA30 is wired to rackhub, which gives it audio sync
- rackhub will also be the panel for the drum machine

Entry: There might already be many midi/osc control apps in the android store
Date: Mon Apr 20 16:44:08 EDT 2020

And I have all those old phones still.
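The dispatch sketch mentioned in "fix connect": if the pids in both epids are the same, the connection is local to one jack instance and can be special-cased. The epid tuple shape and the helper functions are assumptions, not the actual exo code.

-module(connect_sketch).
-export([connect/2]).

%% Both endpoints handled by the same proxy process: the connection is local
%% to that jack instance, so delegate to jack directly.
connect({epid, Pid, Src}, {epid, Pid, Dst}) ->
    jack_connect(Pid, Src, Dst);
%% Otherwise fall back to the generic epid-to-epid connection.
connect(SrcEpid, DstEpid) ->
    epid_connect(SrcEpid, DstEpid).

jack_connect(Pid, Src, Dst) ->
    Pid ! {connect, Src, Dst},   %% placeholder protocol
    ok.

epid_connect(_Src, _Dst) ->
    not_implemented.             %% placeholder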
Entry: Time base
Date: Tue Apr 21 17:46:03 EDT 2020

So probably best to pick the time base in jack. How to do that? First, switch off the other one.

https://community.ardour.org/a3_features_midi_clock

EDIT > Preferences > MIDI > Sync
TimeCode > JACK Transport/Time Settings > Ardour is JACK Time Master

Let's turn off the support in studio_midi.

Entry: 909 kick in reactor
Date: Wed Apr 22 17:52:49 EDT 2020

https://www.youtube.com/watch?v=A_AWH3SgM84

Entry: Don't record kick & hh
Date: Wed Apr 22 18:47:41 EDT 2020

This is of course obvious, so why did I not do it?

Entry: waveforms: sampler and logic analyser
Date: Sun Apr 26 16:01:35 EDT 2020

I have two things that are really the same problem: navigating digital logic and oscilloscope traces, and editing music loops and samples. I want a fast and simple tool. Start with ideas from wvvw? The trick there, if I recall, was to use the mouse in the same way as in snd: left button is left selection, right button is right selection. Time zoom is done using the middle button, based on the location of the mouse pointer. Then just have a lot of those windows.

I need a substrate for this, and I really don't want to use the browser in a first attempt. I want to be able to navigate this with just a mouse, and as long as the mouse is in the window, keys have special meaning. No chords: just use the keyboard as buttons. Let's do this so it will also work on a microcontroller, where left/right mouse movement is one pot, the scroll wheel is another, and the left/right buttons are two dedicated buttons. Would SDL+Rust be a good first try?

So once it is possible to delineate loops, playing them back is fairly trivial.

That said, let's take a look at Ricky Tinez' MPC3000 routing setup:

https://www.youtube.com/watch?v=ngFK2YOmuCU

- 10 audio outputs
- drum group (7 pads / channels)
- daw tempo midi in (ERM multiclock)
- pro 3 synth in reach
- put pads on separate outputs
- don't overcomplicate things

Some notes: since I don't really need separate midi ports, why not put them all on one channel? Anyway, veering off. For now the setup is ok. Control is tight enough using jack midi. But having both audio and midi in the same client is becoming important, so maybe that is something to focus on first?

Entry: Midi filters
Date: Sun Apr 26 23:51:52 EDT 2020

So I have that connect mechanism. I can make filters that connect channels to channels. (A small channel filter sketch follows below.)

Entry: sequencer
Date: Mon Apr 27 00:14:45 EDT 2020

I think I'm done with these dinky sequencers on the volcas. Let's build something more centralized. Maybe that step sequencer... LMMS

Entry: Polyphony
Date: Mon Apr 27 00:26:19 EDT 2020

This could be done manually by chaining the 3 synths to give 5 voices.

Entry: Fix things
Date: Mon Apr 27 16:12:29 EDT 2020

There is plenty of stuff that just needs to be wired up into the new exo connect/redo mechanisms.
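The channel filter mentioned above, as a tiny process sketch. The {midi, Binary} message shape and the Sink destination are assumptions; the point is that a filter is just a process that rewrites and forwards.

-module(chan_filter).
-export([start/3]).

%% Forward everything to Sink, rewriting channel In to channel Out for
%% channel voice/mode messages (status 0x8n..0xEn); the rest passes unchanged.
start(Sink, In, Out) ->
    spawn(fun() -> loop(Sink, In, Out) end).

loop(Sink, In, Out) ->
    receive
        {midi, <<Cmd:4, In:4, Rest/binary>>} when Cmd >= 8, Cmd < 15 ->
            Sink ! {midi, <<Cmd:4, Out:4, Rest/binary>>};
        {midi, _} = Msg ->
            Sink ! Msg
    end,
    loop(Sink, In, Out).

Chaining several of these gives channel-to-channel routing on top of the existing connect mechanism.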
Entry: axoloti I2S interface
Date: Mon Apr 27 17:35:54 EDT 2020

Notes from chat with Johannes.

- I2S: multiple busses
- 3x clock 8MHz -> PLL in audio
- SPI controller configurable as I2S (STM32F4)
- SAI block (msb/lsb different)
- old axo: no SDRAM, STM32F407
- new axo: F427 (with SDRAM controller)

https://www.instructables.com/id/Garduino-Gardening-Arduino/

- Generate ELF for "headless" object.
- start with release: https://github.com/axoloti 2.0 has ELF
- this one has the ELF stuff: https://github.com/axoloti/axoloti/tree/master/api

cd platform_linux
make

Implement this class:
https://github.com/axoloti/axoloti/blob/master/api/patch_class.h

- enable packet communication debug:
axoloti/src/main/java/axoloti/connection/USBBulkConnection_v2.java
dump_tx_headers
dump_rx_headers

Entry: arturia microfreak
Date: Mon Apr 27 22:50:28 EDT 2020

digital: 1 knob algo select, 3 parameters per algo
analog SEM filter

https://www.youtube.com/watch?v=-ZmwOaWNmcs

I can do the digital oscillators in rai. All the other synths can probably be modified to take filter inputs. This is neat.

Another one: https://www.youtube.com/watch?v=aRTLlfDRzpU

What to learn from that? rai is _way_ too simple to do all this, so maybe start building two synths: one that is more classical analog style (rai), and one that has more structural variance, to be written in rust. Maybe find a combination?

Entry: filters
Date: Mon Apr 27 23:07:57 EDT 2020

What do I already have here?

- keys, bass: different, but both derived from the miniKORG700S, 12 dB/octave (2nd order)
- brute: Steiner-Parker filter
- monotron: MS-10/MS-20, 12 dB/octave
- fatman (needs a new controller + one unassembled)

https://yusynth.net/Modular/EN/STEINERVCF/index.html

I count 3 caps, but wikipedia says 12 dB/octave:
https://en.wikipedia.org/wiki/Arturia_MiniBrute

Entry: A synth in rust
Date: Tue Apr 28 01:41:28 EDT 2020

Basically, use an entity component architecture.

Entry: Refocus
Date: Fri May 1 17:07:46 EDT 2020

I want too much. It's important to first start playing, get in the groove, and then figure out requirements. Also, the routing is currently not really set up well: it is very complex and too manual. Go back to the midi learn idea.

Entry: Midi learn
Date: Fri May 1 17:09:49 EDT 2020

What is midi learn from the user interface pov?

- a button that enables learn mode
- the source controller (just play or turn)
- something that selects the target

So the problem is really selecting the target. Basically every target will need a learn button.

Entry: Change bus setup
Date: Sat May 2 08:56:14 EDT 2020

I don't use stereo tracks, so let's set up the busses such that they record mono only. Also I need to find a way to make recording simpler. What is missing?

- save/restore of connect across client restarts
- simple loop sampler
- 1-1 port epid naming

I'm not sure why I find the connection thing so difficult. Maybe I want to do too much at once?

jack_control: {port,true,"20200502:midi_in"}
jack_control: {port,true,"20200502:r95"}
jack_control: {connect,true,"20200502:r95","system:playback_1"}

Let's do some simpler things first.

EDIT: Changed names to "out_1" etc. So... an output port appears. This should lead to a connect operation. The question is then: how to make that query "natural".

jack_control: {port,true,"20200502:out_1"}

First, is the name stable? Yes: it's the only thing to go by. Second: this needs to survive jack restarts, so the name cannot be a pid. Let's start with removing the connect from main_jack.c.

Now what should happen when the port comes up? Who is responsible for maintaining the connections? It is important to let sqlite do the scan over the table. We're storing a pterm, which could just have this form:

{jack_port,<<"doodle:midi_in">>}

I can't assume there is only one jack daemon in the system, but for now it is probably ok. (A sketch of the port-up handler follows after this entry.)

EDIT: Got something working with some hardcoding.
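A minimal sketch of that port-up handler, assuming the stored pterms look like {jack_port, Peer} as above. db_peers/1 and jack_connect/2 stand in for the sqlite lookup and the actual jack client call.

-module(port_restore).
-export([on_event/2]).

%% On a {port,true,Name} event, look up the stored peers for that port name
%% and re-establish the connections.
on_event({port, true, Name}, State) ->
    lists:foreach(
      fun({jack_port, Peer}) -> jack_connect(Name, binary_to_list(Peer)) end,
      db_peers(Name)),
    State;
on_event(_Other, State) ->
    State.

%% Placeholders: a real version would query the sqlite table and talk to the
%% jack client process.
db_peers(_Name) -> [].
jack_connect(_A, _B) -> ok.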
Entry: Behringer NR300 Noise Reducer
Date: Sat May 2 18:12:21 EDT 2020

I want to mod it to make the attack time much shorter. Also, wtf is mute anyway? Is it inverted?

https://thetoneking.com/behringer-stomp-box-cross-comparison-chart/

It is derived from the NS-2 noise suppressor.

attack knob: left = 0 Ohm, mid = 150 Ohm, right = 850k Ohm, in series with R22 (4.7k)

EDIT: bridging that doesn't do a whole lot...

EDIT: I've wired the distortion chain into the effects loop. Found schematics. See library/link/boss-ns2-*

https://www.freestompboxes.org/viewtopic.php?t=11953&p=128917
http://obrazki.elektroda.net/47_1279571553.jpg

Entry: figure out how that patch bay actually works
Date: Mon May 4 14:07:27 EDT 2020

EDIT: Printed out the schematic from the manual and put it with the boxes.

Entry: code, next
Date: Mon May 4 15:58:43 EDT 2020

Start out with a mixer, maybe for the looper?

EDIT: To move this in the Rust direction, it might make more sense to start working with midi instead. Maybe translate the existing recorder?

MIDI:
- think state machine os
- start/stop filter (ardour loops)
- time doubler
- arpeggiator
- filter (e.g. filter time from big)

LOOPER:
- record is linear record with wrap-around (a wrap-around buffer sketch follows a few entries down)
- playback will need to fade
- keep originals, possibly cut
- auto line-up? not necessary when midi-synced

AUDIO:
- revive circular recorder + trigger?
- navigator for circular recorder

Entry: The idea is maturing
Date: Tue May 5 19:06:24 EDT 2020

And it is also tying into things I've learned that are just now coming together, such as rust async. Basically, I want an operating system for many small state machines. One important insight is that RAI can't be everything because it misses a certain flexibility in control flow. That's fine though: it can be a modular synth. So let's design it like that: machines that send messages.

Entry: Crackles
Date: Fri May 15 12:14:18 EDT 2020

It seems to be the +4/-10 buttons on the delta1010. I had taken out the Xenyx802. I've wired it back up, but now it just goes into channel 8 of the spirit. That is the "debug" mixer, so it can just go wherever. This frees up inputs 5/6 on the delta1010. Those could be effects sends?

EDIT: Yes. Jack now has AUX1 and AUX3 strips. The 1202 is now free, so I've put the snake there and filled it up with the pc and the spirit monitor.

Entry: Tone knobs
Date: Sat May 16 20:01:17 EDT 2020

So find out the pre/post eq thing on both pedals.
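The wrap-around buffer mentioned in the LOOPER list of "code, next". A trivial Erlang version just to pin down the data structure; a real one would live in the C/Rust audio client, and all names here are made up.

-module(ring_rec).
-export([new/1, write/2, contents/1]).

%% Fixed-size buffer; writing wraps around modulo the buffer length.
new(Size) ->
    {array:new(Size, {default, 0.0}), Size, 0}.

write(Samples, State) ->
    lists:foldl(
      fun(S, {Buf, Size, Pos}) ->
              {array:set(Pos, S, Buf), Size, (Pos + 1) rem Size}
      end,
      State, Samples).

%% Read back the whole buffer, oldest sample first (assumes it has wrapped).
contents({Buf, _Size, Pos}) ->
    {Newer, Older} = lists:split(Pos, array:to_list(Buf)),
    Older ++ Newer.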