I started “january” the same way I start all of my pieces. I got a sequence stuck in my head, so I turned on the synthesizer and programmed the notes into a sequencer, then added accompaniment on two more sequencers and supplemented those melody lines with some keyboard.
I record EVERYTHING into my digital audio workstation, Reaper. These recording sessions can go on for the better part of an hour: I grab individual lines from each of the synth voices separately and record the MIDI data coming out of the sequencers and the keyboard.
The first draft of “january” is here: https://www.youtube.com/watch?v=DkfCn… So, with all the info I needed to make this song, I started fiddling with the tons of MIDI tracks I had: duplicating parts, redoing others, and tweaking the keyboard lines. I also sped the tempo up from 120 bpm to 140.
When I finished rewriting the melodies in the DAW, I recorded them separately. The cool thing about my method is that I can preview how the synth voices will sound and can experiment easily.
The modular voices are as follows:
Sequence 1 (lowest on screen) is a Braids Macro Oscillator going into a VCA with a second ADSR sweeping the Timbre parameter
Sequence 2 (the lines in the middle) is an E350 Morphing Terrarium and a Z3000 going into an E440 Filter and then a VCA, with a second ADSR changing the filter cutoff
Sequence 3 (top line of the video) is a vintage ARP 2600 (1973) playing the main bass line, while a Rings simultaneously receives the same control voltage. Because of the way Rings resonates, its pitch only changes when there’s a NEW note, so the same note hit over and over only makes a sound the first time it gets the signal.
Sequence 4 (also appearing on the bottom) is a Braids and an Erica Varishape VCO going into an SVVCF filter, with cutoff controlled by both velocity coming from Reaper and my hand moving over a Koma Kommander.
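That Rings behavior in Sequence 3 is easy to sketch in code, if it helps to see it: a note only excites the resonator when it differs from the note before it. This is just a hypothetical Python illustration of the idea, not anything from my actual patch.

```python
def rings_strikes(notes):
    """Return which incoming notes would actually sound.

    As patched here, Rings re-excites its resonator only when the
    incoming pitch CV changes, so a repeated identical note is silent
    after the first hit.
    """
    struck = []
    prev = None
    for note in notes:
        if note != prev:
            struck.append(note)  # new pitch: resonator rings out
        prev = note
    return struck

# The bass line hammers the same note; Rings only answers pitch changes.
print(rings_strikes([36, 36, 36, 43, 43, 36]))  # [36, 43, 36]
```

So even though the ARP 2600 plays every note of the bass line, Rings only chimes in on the changes, which thins its part out nicely.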
So, the whole piece is four lines of music at a time. I figured this thing would have more appeal with a video. With a very handy piece of ExtendScript from http://omino.com/pixelblog/2011/12/26…, I was able to import the MIDI tracks into After Effects and derive pitch and velocity information on a note-by-note basis. I also imported the individual audio tracks and used their amplitude information to drive things like particle size. Trapcode Particular did most of the heavy lifting, plus more keyframe tweaking to compensate for the delays in particle generation.
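The omino script does all of this in ExtendScript inside After Effects, but the core idea, turning note events into keyframes, is simple. Here’s a rough Python sketch with made-up note data; the function name and the (time, pitch, velocity) event format are mine, not the script’s.

```python
def notes_to_keyframes(events, fps=30.0):
    """Map MIDI-style note events to video keyframes.

    events: list of (time_seconds, pitch, velocity) tuples.
    Returns (frame_number, pitch, velocity) tuples, snapping each
    note onset to the nearest video frame at the comp's frame rate.
    """
    keyframes = []
    for t, pitch, velocity in events:
        frame = round(t * fps)  # seconds -> nearest whole frame
        keyframes.append((frame, pitch, velocity))
    return keyframes

# Hypothetical events: three notes half a second apart.
events = [(0.0, 60, 100), (0.5, 64, 90), (1.0, 67, 110)]
print(notes_to_keyframes(events))  # [(0, 60, 100), (15, 64, 90), (30, 67, 110)]
```

In After Effects, each of those keyframes can then drive a layer property (scale, opacity, emitter rate) with pitch and velocity as the values, which is essentially what I did per track.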
So, it’s done. Whew.