I arrived at class and was glad to be told that we would be given time to make some final adjustments to our patches. I needed this, as there were still some sensitivity issues I wanted to fix.
When I opened my patch and connected the Kinect, I was horrified to find that, for some reason, everything had been reversed: positive values were negative, up meant down, left meant right, and so on.
I couldn't tell which hand was controlling what, so I immediately started working around it. I had to change the entire order of my samples, as well as add the scale object to the ones that previously hadn't needed it. It took me twenty minutes, but I fixed it, and was relatively happy until I realised that the movement I'd set for the high C was working the opposite way, and there was no way for me to fix it. I tried and tried, but nothing worked - it would only be triggered if I moved my left hand towards the inside rather than the outside of my body, which complicated things. In the end, I simply changed its sensitivity so that I didn't have to move quite so much.
The sensitivities of my various notes had changed too. For example, if I wanted to play my low C followed by the Eb, I'd end up triggering the C again, because I couldn't desensitise the Kinect enough to stop the two from clipping into each other. If I desensitised it that far, you'd have to crouch down and leap up to trigger the notes, which, while fun, wasn't what I was after. If I had prepared my patch earlier, I would have loved to spend weeks adding little things to it, constantly adjusting it so that the parameters I'd set stayed constant. I'd have made it much more accessible, and easier to solo on properly. I'd also have liked to sort out some of the latency issues - sometimes, if you moved too fast, Max wouldn't pick up your movement, or there would be a half-second delay on certain samples, which made playing in time very difficult.
I did explain to Ross that the purpose of my patch was to have fun, and that he shouldn't expect a virtuoso performance. If anyone was able to make their solo sound like Dr. John I'd probably walk out in jealous disgust.
After making some final adjustments to my patch (those damn sensitivities...) I was ready to perform.
I started off by showing everyone how in control I was by playing all of my notes cleanly and without clipping. I had, of course, practised this, and knew exactly where to position my hands so that the notes would play themselves. Unfortunately, as I played my final C, having made sure it all sounded so clean, Jordan played something on the computer next to me, and it sounded like my final note was connected to a bunch of other notes and noises. This frustrated me, as you can see on the video, but I knew that it was Minton, and not my oh-so-perfect patch, that made the noise...
After demonstrating a brief solo to the class along to a slow blues backing track in C, I let others come up to have a shot, as per my original idea.
Steve, Ross (our lecturer) and Dean each had a shot, and as I watched them I suddenly felt very pleased. My original idea had been to create a patch where anyone could come up and solo over a blues backing track using just their hands; at first I'd thought it was too ambitious, but here I was, watching them do what I'd envisioned from the start. I felt a real sense of accomplishment.
People seemed to really enjoy it, and everyone had fun. There were some minor niggles I'd have liked to fix - the latency, for example - but I was very pleased with my end result. Ideally, I'd have liked it to be much smoother; if I had been able to make it so smooth that whoever was in control could play along with a live blues band in time, playing a thought-out solo rather than just random notes, that would have been fantastic.
Still, this was my first time working with Max, and with technology and music together. I am very pleased with the end result, and feel that I have gained a huge amount of experience and confidence from this module as a whole. I look forward to taking the Tech module again next year, where I aim to push myself and come up with something even better. Hopefully my breakthrough won't arrive so late in the year next time, so that when I go to perform, it will be mind-blowing.
Though to be fair, looking back at the video, it is pretty mind-blowing seeing people play a blues solo using just their hands. If I may say so myself.
VIDEO TO COME
Friday, 7 June 2013
Wednesday, 5 June 2013
Leading to the final project...
About two weeks before our project was due, I was at my wits' end. I had no idea how to get the patch I had in mind to work, and was fresh out of ideas.
However, someone on the cycling74 forums suggested a very simple idea: break it into two patches - one with bangs that trigger the samples, and another that lets me control Max with the Kinect. Then I'd just connect them together. It was stupidly simple, but I tried it.
I used the samples I'd created in Logic earlier - seven split-second notes from Logic's basic MIDI electric keyboard sound, one for each note of the C blues scale.
I then successfully created the first patch: whenever I triggered a bang, the sample attached to it would play. That was the easy part; the hard part was getting the Kinect to respond to it.
To do this, I downloaded Kinect-Via-Synapse, a ready-made Max patch built to work with Synapse, which I explained in a previous post. I thought I would have to spend days programming my own patch to use with Synapse, but much to my amazement, after I set up the parameters in K-V-S, it did the rest of the work for me. Whenever I moved my hand (or whichever limb I'd assigned), the values in K-V-S would change.
I spent a good deal of time trying to set up K-V-S in a way that would suit what I wanted, but it never seemed to work. I was worried about this until I connected my patch to Synapse to do some early troubleshooting, when I realised that, without me having to do anything, Synapse was automatically working with the first patch I'd created: if I assigned a limb to a sample, it triggered the bang that played it. I was very pleased and impressed at how simple it was to connect - all I had to do was create a new synapse object in my patch and assign which limb it would follow, and over which axis.
My next problem was a major one, though: the sample would be triggered the second the Kinect picked me up, and it would play non-stop until I turned it off. I didn't want this; I wanted it to play only once, when a certain value was reached. After a lot of trial and error, I found that the onebang, kslider and clip objects were the ones I needed. The onebang would stop my sample playing over and over, the slider and its attached number box would let me see where and when my samples were being triggered, and the clip object would let me set the value at which each sample would be triggered.
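The patch itself is built from Max objects rather than code, but the gating idea is simple enough to sketch outside Max. Here's a rough Python version of the logic, with hypothetical names (NoteGate, update) - in the real patch this job is done by the onebang and clip objects:

```python
class NoteGate:
    """Fires a sample once when a joint value crosses a threshold,
    then stays quiet until the value drops back below it."""

    def __init__(self, threshold, lo, hi):
        self.threshold = threshold
        self.lo = lo        # clamp range, roughly what Max's clip object does
        self.hi = hi
        self.armed = True   # roughly what onebang does: one trigger per crossing

    def update(self, value):
        # Clamp the incoming joint value into range.
        value = max(self.lo, min(self.hi, value))
        if self.armed and value >= self.threshold:
            self.armed = False
            return True     # "bang": play the sample exactly once
        if value < self.threshold:
            self.armed = True  # re-arm once the hand moves back out
        return False

# Example: the note fires on each upward crossing, not continuously.
gate = NoteGate(threshold=0.6, lo=0.0, hi=1.0)
fired = [gate.update(v) for v in [0.1, 0.7, 0.9, 0.3, 0.8]]
# → [False, True, False, False, True]
```

Without the re-arming step, the note would retrigger on every frame the hand sat above the threshold - which is exactly the "plays non-stop" problem described above.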
I set a rough number for each sample and, again after much trial and error assigning various samples to different limbs and axes, I found that it would be much easier to use just my hands, as per my original idea. Max and the Kinect didn't respond so well to my knee, torso and head joints. This also meant that whoever was soloing would have limited access to what they could play and when.
So instead, I assigned the samples to both hands, set to different points on the x and y axes. Right hand up would trigger a note, right hand down would trigger a note, left hand up and down would trigger different notes, right hand side to side would trigger different notes, and one left-hand movement to one side would trigger the final one.
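The post doesn't say which gesture plays which note, so the assignment below is purely a guessed illustration; only the note set itself is fixed, since the seven samples cover the C blues scale (C, Eb, F, Gb, G, Bb, high C). The layout might look something like:

```python
# Hypothetical gesture-to-note layout; the actual assignments in the patch
# aren't recorded, only that two hands over the x and y axes cover 7 notes.
GESTURE_TO_NOTE = {
    ("right", "y", "up"):   "C",
    ("right", "y", "down"): "Eb",
    ("left",  "y", "up"):   "F",
    ("left",  "y", "down"): "Gb",
    ("right", "x", "out"):  "G",
    ("right", "x", "in"):   "Bb",
    ("left",  "x", "out"):  "high C",
}
```

Seven gestures, seven notes - enough to solo with, but few enough that two hands can reach everything.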
Again, after much trial and error dancing about in front of the Macs in the computer room, I found a combination of movements that would be simplest to pick up.
I had to spend a lot of time adjusting when and where the notes would be triggered, as they could clip very easily. I also had to overcome a problem with how notes were triggered: some would only fire if you moved your hand down and back up again, for example, which was in no way ideal. I fixed this with the scale object, reversing its output range so that values ran from top to bottom rather than from bottom to top, meaning the notes could be triggered by moving my hand in the opposite direction.
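The range-flipping trick is just a linear mapping with the output range reversed. A small Python sketch of the idea (the helper name is mine; in the patch it's Max's scale object doing this):

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi]."""
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# Normal mapping: raising the hand raises the value.
up = scale(0.75, 0.0, 1.0, 0, 100)    # → 75.0
# Reversed mapping: the same hand position now gives a low value,
# so the threshold is crossed by the opposite direction of travel.
down = scale(0.75, 0.0, 1.0, 100, 0)  # → 25.0
```

Swapping out_lo and out_hi is all it takes - the note that previously fired on "down and back up" can then fire on a single movement the other way.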
I spent an entire day working out the values at which the notes would be triggered, and focused on keeping it simple. When performance day came, however, I found a lot of my work had been in vain, as the Kinect decided to work a little differently with my patch after I'd saved and closed it - which I will discuss in my next post, where I'll talk about the performance itself...