Friday, 7 June 2013

My Final Project - The Performance!

I arrived at class and was glad to be told that we would be given time to make some final adjustments to our patches.  I needed this, as there were still some sensitivity issues I wanted to fix.

When I opened my patch and connected the Kinect, I was horrified to find that, for some reason, everything had been reversed. Positive values were negative, up meant down, left meant right, etc...
I couldn't tell which hand was controlling what. Immediately I started working around this. I had to change the entire order of my samples, as well as move the scale tool to the ones that previously hadn't needed it. It took me twenty minutes, but I fixed it, and was relatively happy - until I realised that the movement I'd set to the high C was working the opposite way, and there was no way for me to fix it. I tried and tried, but nothing worked - it would only be triggered if I moved my left hand inwards rather than outwards from my body, which complicated things. To work around this, I simply changed its sensitivity, so that I didn't have to move quite so much.

The sensitivities of my various notes had changed too. For example, if I wanted to play my low C followed by the Eb, moving to the Eb would trigger the C again, because I couldn't desensitise the Kinect enough to stop the two from clipping into each other. If I did, you'd have to crouch down and leap up to trigger the notes, which, while fun, wasn't what I was after. If I had prepared my patch earlier, I would have loved to spend weeks adding little things to it, constantly adjusting it so that the parameters I'd set stayed constant. I'd have made it much more accessible, and easier to solo on properly. I'd also have liked to sort out some of the latency issues - sometimes, if you moved too fast, Max wouldn't pick up your movement, or there would be a half-second delay on certain samples, meaning playing in time became very difficult.
I did explain to Ross that the purpose of my patch was to have fun, and that he shouldn't expect a virtuoso performance. If anyone was able to make their solo sound like Dr. John, I'd probably walk out in jealous disgust.

After making some final adjustments to my patch (those damn sensitivities...) I was ready to perform.

I started off by showing everyone how in control I was by playing all of my notes cleanly and without clipping. I had of course practiced this, and knew exactly where to position my hands so that the notes would play themselves. Unfortunately, as I played my final C, having made sure it all sounded so clean, Jordan played something on the computer next to me, and it sounded like my final note was connected to a bunch of other notes and noises. This frustrated me, as you can see on the video, but I knew that it was Minton, and not my oh-so-perfect patch, that made the noise...

After demonstrating a brief solo to the class along to a slow blues backing track in C, I let others come up to have a shot, as per my original idea.

Steve, Ross (our lecturer) and Dean had a shot, and as I watched them, I suddenly felt very pleased; my original idea had been to create a patch where anyone could come up and solo over a blues backing track using just their hands; at first I thought the idea was too ambitious, but here I was watching them do what I'd envisioned from the start.  I felt a real sense of accomplishment.

People seemed to really enjoy it, and everyone had fun.  There were some minor niggles that I'd have liked to fix - the latency, for example, but I was very pleased with my end result.  Ideally, I'd have liked it to have been much smoother - if I was able to make it so smooth that whoever was in control would be able to play along with a live blues band in time, playing a thought out solo rather than just random notes, that would have been fantastic.
Alas, this was my first time working with Max, and with technology and music together. I am very pleased with the end result, and feel that I have gained a huge amount of experience and confidence from this module as a whole. I look forward to taking the Tech module again next year, where I aim to push myself and come up with something even better. Hopefully my breakthrough won't arrive so late in the year next time, so that when I go to perform, it will be mind-blowing.

Though to be fair, looking back at the video, it is pretty mind-blowing seeing people play a blues solo using just their hands. If I may say so myself.

VIDEO TO COME

Wednesday, 5 June 2013

Leading to the final project...

About two weeks before our project was due, I was at my wits' end. I had no idea how to get the patch I had in mind to work, and was fresh out of ideas.

However, someone on the Cycling '74 forums posted a very simple idea: break it into two patches - one with a bang that triggers the samples, and another that lets me control Max with the Kinect. Then I'd just connect them together. It was stupidly simple, but I tried it.

I used the samples I'd created in Logic earlier - seven split-second notes from Logic's basic MIDI electric keyboard sound, one for each note in the C blues scale.
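
For anyone curious, the seven notes are C, Eb, F, Gb, G, Bb and the C an octave up. Here's a quick Python sketch of those notes as MIDI numbers - nothing to do with the Max patch itself (which plays pre-rendered samples), just a reference for the scale:

```python
# Intervals of the blues scale, in semitones above the root:
# root, minor 3rd, 4th, flat 5th, 5th, minor 7th, octave.
BLUES_INTERVALS = [0, 3, 5, 6, 7, 10, 12]
NOTE_NAMES = ["C", "Eb", "F", "Gb", "G", "Bb", "C"]

def c_blues_scale(root=60):
    """Return the seven MIDI note numbers of the blues scale on `root`
    (60 = middle C)."""
    return [root + i for i in BLUES_INTERVALS]

print(list(zip(NOTE_NAMES, c_blues_scale())))
```

Seven notes, hence the seven samples - one bang per note in the patch.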

I then successfully created the first patch. Whenever I triggered a bang, the sample attached to it would play. That was the easy part; the hard part was getting the Kinect to respond to it.

To do this, I downloaded and used Kinect-Via-Synapse, a pre-made Max patch created to be used with Synapse, which I explained in a previous post. I thought I would have to spend days programming my own patch to work with Synapse, but much to my amazement, after I set up the parameters in K-V-S, it did the rest of the work for me. Whenever I moved my hand (or whatever limb I assigned to it), the values in K-V-S would change.
I spent a good deal of time trying to set up K-V-S in a way that would suit what I wanted, but it never seemed to work. I was worried about this until I decided to connect my patch to Synapse to do some early troubleshooting, when I realised that, without me having to do anything, Synapse was automatically working with the first patch I'd created. If I assigned a limb to a sample, it triggered the bang that played it. I was very pleased and impressed that Synapse had done this, and at how simple it was to connect - all I had to do was create a new Synapse object in my patch and assign which limb it would follow, and over which axis.

My next problem was a major one, though: the sample would be triggered the second the Kinect picked me up, and it would play non-stop until I turned it off. I didn't want this; I wanted it to play only once, when a certain value was reached. After a lot of trial and error, I found that the "onebang", "kslider" and "clip" objects were the ones I needed. The onebang would stop my sample playing over and over, the slider and its attached number box would let me see where and when my samples were being triggered, and the clip object would let me set the range of values at which my samples would be triggered.
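
Outside Max, the trigger-once behaviour those objects gave me looks roughly like this in Python - a sketch of the logic, not the actual patch (the threshold value here is made up):

```python
class NoteTrigger:
    """Fire a note once when a tracked value crosses a threshold,
    then stay quiet until the value drops back below it -
    roughly what the clip + onebang combination did in the patch."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.armed = True  # like onebang waiting to pass its next bang

    def update(self, value):
        # clamp the incoming joint value to a sane range first (the clip object)
        value = max(0.0, min(1.0, value))
        if self.armed and value >= self.threshold:
            self.armed = False
            return True   # bang: play the sample once
        if value < self.threshold:
            self.armed = True  # re-arm once the hand moves away again
        return False

trig = NoteTrigger(0.7)
hand_heights = [0.2, 0.5, 0.8, 0.9, 0.6, 0.85]
fired = [trig.update(v) for v in hand_heights]
print(fired)  # fires at 0.8, stays quiet at 0.9, re-fires at 0.85
```

Without the re-arming step, the note would just keep firing for as long as the hand stayed above the threshold - exactly the non-stop playback problem I had at first.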

I set a rough number for each sample and, again after much trial and error assigning various samples to different limbs and axes, I found that it would be much easier to just use my hands, as per my original idea. Max and the Kinect didn't respond so well to my knee, torso and head joints. It also meant that whoever was soloing would have a limited range of what they could play and when.
So instead, I assigned the samples to both hands, set to different points on the X and Y axes. Right hand up would trigger a note, right hand down would trigger a note, left hand up and down would trigger different notes, right hand side to side would trigger different notes, and one left-hand movement to one side would trigger the final one.
Again, after much trial and error dancing about in front of the Macs in the computer room, I found a combination of movements that would be simplest to pick up.

I had to spend a lot of time adjusting when and where the notes would be triggered, as notes could clip very easily. I also had to overcome a problem with how notes were triggered: some would only fire if you moved your hand down and back up again, for example, which was in no way ideal. I fixed this by using the scale tool, reversing the scale so that values ran from top to bottom rather than from bottom to top, meaning notes could be triggered by moving my hand in the opposite direction.
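
Max's scale object is just a linear remap, and reversing a mapping means nothing more than swapping the output bounds. A quick Python sketch of the trick (the numbers are illustrative, not the patch's real values):

```python
def scale(x, in_lo, in_hi, out_lo, out_hi):
    """Linear remap, like Max's [scale] object:
    maps the range in_lo..in_hi onto out_lo..out_hi."""
    return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# Normal: hand moving up (0 -> 1) gives rising values.
print(scale(0.25, 0.0, 1.0, 0.0, 100.0))   # 25.0
# Reversed: same movement now gives falling values, so a note set to
# trigger near 100 fires on the opposite motion instead.
print(scale(0.25, 0.0, 1.0, 100.0, 0.0))   # 75.0
```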

I spent an entire day working out the values at which the notes would be triggered, and focused on keeping it simple. When performance day came, however, I found a lot of my work had been in vain, as the Kinect decided to work a little differently with my patch after I'd saved and closed it - which I'll discuss in my next post, where I'll talk about the performance itself...

Tuesday, 12 March 2013

Slowly getting there...

Slowly but surely, I'm getting close to the patch I'm after for my project.

What I want to do is far too complicated for me to simply start from an empty Max patch, so I'm taking ideas from various patches and programmes to get my desired results.  

Instead of using a webcam, I'm using Synapse, a programme that lets you use the Microsoft Kinect to create music using audio/visual/text programmes such as MaxMSP and Ableton.

A very brief overview and demo of synapse can be found here;



At the end of the video, he uses his left hand to manipulate pitch and downsampling using the Y and X axes respectively. This isn't in Max, but in Ableton. The programme looks very simple to use (for basic things like that...) and it is a perfect example of what I want to do. Using this, I could let my left hand control pitch bending/volume, while my right controls the notes themselves. Since the Kinect uses 3D sensing, I could assign which octave the performer plays in to the Z axis.
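
I haven't built any of this yet, but here's a rough Python sketch of the Z-axis idea - depth range, octave count and the direction (closer = higher) are all invented for illustration:

```python
def octave_from_depth(z, z_near=1.0, z_far=3.0, octaves=3):
    """Map Kinect depth (metres; these ranges are made up) to a
    semitone offset, so standing closer plays a higher octave."""
    z = max(z_near, min(z_far, z))              # clamp to the sensing range
    fraction = (z - z_near) / (z_far - z_near)  # 0.0 near .. 1.0 far
    octave = int((1.0 - fraction) * (octaves - 1) + 0.5)  # nearest octave
    return octave * 12

print(octave_from_depth(1.0))  # 24 -> two octaves up, close to the sensor
print(octave_from_depth(3.0))  # 0  -> base octave at the back of the range
```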

The main problem is that I can't test it out at college, because the Macs here won't let me install new programmes without admin permission. So I'm going to have to download Max, Ableton and Synapse onto my own laptop - which is actually good, because it means I'll be able to work from home.

Another thing I've been toying around with is the Kinect Beatwheel, for Max and Synapse.  The beatwheel lets me control samples using the Kinect again, but in a much simpler way. 


The only problem with this is that it only lets me control a sample loop that's already playing; that would be handy if I wanted to manipulate my backing track, but not for triggering individual notes.

I've joined the Cycling '74 community, and have begun posting on the forums. Since I can't use any of the programmes on the computers at college, though, I'm a bit restricted in what I can do there - until I install all the programmes on my own laptop.

Therein lies another problem, though - I'll only be getting 30 day trials for everything.  The final performance is more than 30 days away.  Meaning I won't be able to use what I create (or 'edit'.  Saying 'create' is a bit cheeky...) on the actual day where I need it.  But we'll cross that bridge when we come to it...

Tuesday, 19 February 2013

Ideas for Final Project

I have decided that I'm going to use a webcam for my final MAX MSP project.

The idea currently is that I will record a basic blues backing track on my keyboard at home, and using a combination of MAX and a webcam, play an improvised blues solo over it simply by waving my hands and moving.  Audience interaction will be possible.

There are other things I'd like to be able to do, but this is the basic idea for now.  I've signed up to the Cycling 74 forums and am seeking help from users there.  I plan to build my own patch, using elements and ideas from other patches I find online.

There are a lot of patches and things from other users I can take or use in my own patch; for example, I would like to use something similar to what this man has done.  In the video, he uses the motion picked up by the webcam to adjust the sound, at first using it like a theremin, and eventually, using it to control an entire array of sequences and sounds.

Another example of something I'd like to use can be found here.  In the video, the artist has set the webcam up so that his hands control different things; his left controls the pitch, and his right hand controls the volume.  He also uses a handheld android device to fine tune the notes.  Using a webcam, he is controlling three variables.  He shows how he's done this by splitting up the webcam's feed into three different sections.  If I can work something like that into my own patch, it will mean more freedom in my final project and performance.



Tuesday, 11 December 2012

Project ideas and composing with MAX



Right now, being told to compose something musical with max is like asking a toddler to paint your portrait using fingerpaints; the child will love every second and probably be very pleased with the result, even if it looks like --

I can't even be bothered to finish that simile. Composing using Max is hard. Last week I had fun playing around with the function box, using a combination of that, number boxes and the preset tool to make a piece of music inspired by this;

Basically, any horrible sound or attempt at music I got from Max, no matter what it was, sounded just as disgusting as that piece of notation looks.
I now find myself stuck for ideas. I don't want to just add in phasors, saws and things I remember from past lessons and hope it all comes together into something that sounds kinda cool; as fun as that would be, I want my 'writing' process to have some kind of structure. Or I at least want to REMOTELY know what I'm doing, rather than just typing random words into the object tool and linking the results together to see what I get.

In my search for inspiration I came upon this page.  Containing this video;



Which is kinda cool, until you realise what months of working with MAX does to you;




Tuesday, 30 October 2012

Tuesday 30 October Class - Looping

Today in class we listened to Come Out by Steve Reich.  It was composed using an audio recording of an interview with one of the youths involved in the Harlem riot of 1964.  It is heavily glitch based, with the words 'come out to show them' repeated over and over, eventually overlapping with the other playbacks of it, leading to various layers of a single voice.  It, like most things we hear in this class, was 'interesting'.

The song is composed of just the same line, 'come out to show them', looped over and over again, but every time it's repeated, another copy is added a little bit later, until they completely overlap, to the point where it just sounds like noise at the end. This is a prime example of process music.

We have been asked to start thinking about what we want to do for our final project, but I still have no idea what to do.

Tuesday, 16 October 2012

A Recap - What we've been doing in classes

The Public Performance Technology class is an interesting one.  Interesting in the way that it is different from most of the other practical performance modules I've taken.

Whereas other performance-based classes rely on technical ability and on knowledge and implementation of theory, this one is more like a maths class. Or a physics class. Science; any kind of class that made me think too hard in high school - something I have forgotten how to do, having been primarily focused on playing the keyboard and moving my fingers for the past three years. In order to make the music, I have to remember numbers, learn functions, study, and revise. Basically, making my brain think in a way it hasn't been accustomed to in years.

Let me explain.

The class is about mixing technology with music.  Using electronics and machinery, mixed with a programme known as MaxMSP, we can make music throg--you know what, I'll just link videos to show you what I mean in future posts.  For now, let me explain Max.

MaxMSP [http://cycling74.com/products/max/] is an audio/visual programme that relies on a certain level of computing code to create music. It's difficult to explain, but I'll do the best I can. I am still very much learning how to use it.
In simpleton terms, you create circuits - similar to those seen in a physics class - on a computer. Using various commands and numerical values, you can send MIDI signals, affecting velocity and pitch with various tools and buttons and... I have no idea what I'm talking about. The past few weeks in class have been spent both seeing examples of music performance technology in action (using old, turned-inside-out electrical toys, or the Kinect hardware from the Xbox 360, etc.) and working through the Max MSP tutorials, in what currently seems like a vain effort to get to grips with the rather complicated programme (for us mere music students, that is).

I have only just begun to get into the nitty-gritty of the programme, having gone over the basics of it. Future blogs will focus on a combination of interesting examples of the technology in action and my (probably vague) attempts to explain how Max MSP works. By the end of the year, the plan is that my final blog post will contrast with this one in every way, and I'll be typing, explaining and performing like a musical Einstein.