Wednesday, 8 June 2011

Rudimentology

Rudimentology was the Performance Technology students' final end-of-year performance, held at Stevenson College.

The aim of the show was to present the experiments and projects the class had come up with using programs such as Max/MSP and Pure Data.

Video Footage By Ivor Blair


The fundamental idea of the show was based upon an original drum composition written by Tam Dickson in Logic Pro. Once this composition was made, it was decided that the piece would be played by numerous people around both an acoustic drum kit and a set of electric drum pads. The acoustic kit was positioned in the center of the room with the pads arranged around it. Each pad was connected to Max/MSP and triggered a sample when hit; each pad had a different sample allocated to it, and together the samples made up the sounds used in Tam's original composition, so the piece could be played across the pads. In the final performance, five people played around the drums and pads to meet the requirements of the composition. Throughout the piece, members swapped pads, and live drumming was used to highlight areas of the composition both on its own and alongside the sampled sounds.
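
For anyone curious, the pad-to-sample mapping itself is simple. Here is a rough sketch of the idea in Processing, using the themidibus and Minim libraries rather than the actual Max patch; the note numbers and file names are placeholders, not Tam's real setup:

```processing
// Hypothetical sketch: each pad sends a MIDI note, and each note
// number fires the one sample allocated to that pad.
import themidibus.*;  // MIDI input from the pads
import ddf.minim.*;   // sample playback

MidiBus bus;
Minim minim;
HashMap<Integer, AudioSample> samples = new HashMap<Integer, AudioSample>();

void setup() {
  size(200, 200);
  bus = new MidiBus(this, 0, -1);  // first MIDI input, no output (index assumed)
  minim = new Minim(this);
  // placeholder note numbers and file names
  samples.put(36, minim.loadSample("kick.wav", 512));
  samples.put(38, minim.loadSample("snare.wav", 512));
  samples.put(42, minim.loadSample("hat.wav", 512));
}

// themidibus calls this whenever a pad is hit
void noteOn(int channel, int pitch, int velocity) {
  AudioSample s = samples.get(pitch);
  if (s != null) s.trigger();  // play the sample allocated to that pad
}

void draw() {
  background(0);  // audio only; nothing to render
}
```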




Ivor created a sound manipulation device using the RjDj software on his iPhone, which is built on Pure Data. Through the software, he created four separate four-bar scenes that each did something different; one scene, for example, manipulated the live audio coming into the iPhone so that it sounded as if you were talking in bubbles. In the final performance, RjDj opened the show, manipulating the live input it was hearing from Sean's Wii Remote and Mike's theremin.



Sean created a Max/MSP patch that worked with a Nintendo Wii Remote. The Wii Remote connects wirelessly over Bluetooth to Sean's Max patch, and it was placed inside a hamster ball. The remote sent motion data for Max to process; depending on the ball's movement, the patch would trigger samples from Tam's composition and alter their pitch. The ball was thrown, spun, rolled and moved in different directions to create different variations on Tam's composition.



Fearn created two devices. The first was a keyboard that played back notes depending on where your hand was positioned inside a box. The notes made up a major scale, giving a keyboard that did not need to be touched to be played. Her second device was a musical pencil: a circuit was created on a piece of paper by drawing a graphite line, and once a finger was placed on one end of the line and the pencil on the other, the circuit became active. Drawing along that line then created notes that varied in pitch as you drew closer to or further from your finger.



Mike manipulated a Moog theremin. The theremin is another instrument that does not need to be touched to control pitch and volume. Mike created a Max/MSP patch based upon a sample of Tam's drum composition; moving your hands changed where in the sample you were looping, and the length of the loop.



Everything on show was a huge success. The final performance saw Mike, Sean and Ivor manipulating the sounds and samples of Tam's composition. As these were all DI'd through the PA system, they could then be faded out into the full composition, played out on the acoustic and electric drums.
I thoroughly enjoyed the event, though I wish I could have used my own project in the show. As written in the previous post, I wanted to integrate Tam's drum composition into my pong game. I feel this would have been another great piece of user interaction that could have fit in alongside Ivor's, Mike's and Sean's pieces. On the day I helped set up the venue, moving drums around and gathering anything that anyone needed to present their projects.

*All photos by Lisa Wickstead 

My Failed Experiment

As touched upon in earlier blog posts, I have wanted to create a pitch-detection game of pong. The idea is that the paddle is controlled by pitch recognition, moving up and down, and that when the ball strikes the paddle, a random sample is played, taken from the drum pattern that our Performance Technology project is based upon.
I started working on this project after being inspired by a similar project, seen below:


In this video, the author created a game of pong using Max/MSP, Jitter and Flash, with pitch detection controlling one paddle and colour detection controlling the second.

For my experiment I used Max 5 to handle the pitch elements and decided to use a program called Processing to create my game of pong.
The reason for choosing Processing is that I don't have a wide knowledge of computer programming or any Flash skills, and Processing's command-based style seemed relatively straightforward compared with a language such as HTML. Being open source, it also has a huge community creating new things on the platform, which made it more appealing to someone new to the field.
As pong was used in one of the software's demonstrations, there are plenty of example sketches already made that give the general structure of the game. I used one of these and modified it to suit my needs: slowing down the paddle and making it slightly bigger, and changing the speed of the ball, as I felt it was too fast when you are trying to play with pitch. I also decided to make it a one-player game; I had originally planned it for two players but went against that idea, as I thought one microphone might pick up too many pitches at once. I was not completely sure how to write the code so that the ball bounced off the walls rather than only off the paddle, so I took a second sketch, copied and pasted sections, and modified the code until I understood how the ball should bounce, without rebounding into the side or too early.
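
For reference, here is a minimal sketch of the bounce-and-paddle logic I was aiming for, a reconstruction rather than my exact code: the ball rebounds off the top, bottom and left walls, and only bounces at the right edge when the paddle is in the way:

```processing
// Minimal one-player pong: the ball rebounds off the top, bottom and
// left walls; the paddle guards the right edge, and a miss resets the ball.
float ballX, ballY;
float ballDX = 2.5, ballDY = 2;              // slowed down for voice control
float paddleY;
final float PADDLE_W = 12, PADDLE_H = 120;   // slightly bigger paddle

void setup() {
  size(640, 480);
  ballX = width / 2;
  ballY = height / 2;
}

void draw() {
  background(0);
  ballX += ballDX;
  ballY += ballDY;

  // walls: top, bottom and left always bounce
  if (ballY < 0) ballDY = abs(ballDY);
  if (ballY > height) ballDY = -abs(ballDY);
  if (ballX < 0) ballDX = abs(ballDX);

  // right edge: bounce only if the paddle is in the way
  if (ballDX > 0 && ballX > width - PADDLE_W
      && ballY > paddleY && ballY < paddleY + PADDLE_H) {
    ballDX = -abs(ballDX);
  } else if (ballX > width) {
    ballX = width / 2;   // missed: reset the ball
    ballY = height / 2;
  }

  paddleY = mouseY;      // stand-in control; pitch data would replace this
  rect(width - PADDLE_W, paddleY, PADDLE_W, PADDLE_H);
  ellipse(ballX, ballY, 12, 12);
}
```
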
Here is a screenshot of my game of pong in action:


The next stage was to get pitch detection working in Max 5. Having never used pitch detection before, I had to find out how to send live audio into Max and have it read the pitch of my voice.
On the Cycling '74 website, I came across a tutorial on how to use live inputs in Max:
From there I researched pitch detection and found that I'd need an external object for Max named sigmund~, which gathers pitch data from the selected audio input device. From here I managed to create a simple patch, based upon the sigmund~ help file, that detected the pitch of my voice received through my computer's microphone.


The first problem I came across was filtering out what I did not need in my Max patch. I knew that to control my pong paddle efficiently, the pitch detection would have to span accurately from the lowest note I can make to the highest. In the patch that I modified, my slider only sat around the middle when I made my lowest note, meaning I needed a cut-off so that the bottom of the slider corresponded to my lowest possible note. However, I couldn't figure out where in the patch to add the numeric cut-off.
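
In Processing terms, the fix I was after amounts to clamping and rescaling the detected pitch before it drives the paddle. A minimal sketch, assuming the detector reports MIDI note numbers and that my usable vocal range runs from roughly MIDI note 40 to 70 (both figures are guesses):

```processing
// Map a detected pitch onto the paddle's vertical position.
// LOW_NOTE and HIGH_NOTE are guesses at my usable vocal range.
final float LOW_NOTE = 40;    // lowest note I can reliably sing (MIDI)
final float HIGH_NOTE = 70;   // highest note (MIDI)

float pitchToPaddleY(float midiNote, float windowHeight, float paddleH) {
  // clamp first, so out-of-range notes can't push the paddle off-screen
  float clamped = constrain(midiNote, LOW_NOTE, HIGH_NOTE);
  // low notes sit at the bottom of the window, high notes at the top
  return map(clamped, LOW_NOTE, HIGH_NOTE, windowHeight - paddleH, 0);
}
```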

My second and worst problem was the integration between my Processing pong game and my Max 5 patch. With both pieces almost ready to go and only slight modifications needed, I had to figure out how to connect them. I looked all over the internet for similar projects; the best I could find was a patch that used a MIDI controller in Max to control faders in Processing. Unfortunately, Processing doesn't come with the ability to link to Max out of the box, and external libraries need to be added to both programs. My searching led me to believe that I needed to send OSC messages between the two programs, and possibly GUI controls to aid my pitch-controlled pong.
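
For what it's worth, the receiving side in Processing would look something like the sketch below, using the oscP5 library; Max would send the pitch out with a [udpsend] object. The /pitch address and port 12000 here are assumptions for the sketch, not anything from my actual patch:

```processing
// Receive pitch values from Max over OSC and steer the paddle.
import oscP5.*;

OscP5 osc;
float paddleY;

void setup() {
  size(640, 480);
  // listen on port 12000; Max would send with [udpsend 127.0.0.1 12000]
  osc = new OscP5(this, 12000);
}

// called by oscP5 whenever a message arrives
void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/pitch")) {
    float midiNote = msg.get(0).floatValue();
    // clamp and rescale as in the earlier snippet (range is a guess)
    paddleY = map(constrain(midiNote, 40, 70), 40, 70, height - 120, 0);
  }
}

void draw() {
  background(0);
  rect(width - 12, paddleY, 12, 120);  // the pitch-driven paddle
}
```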

This is unfortunately where I got in a bit over my head; I couldn't figure out how to actually link the two pieces, or how to program each side so that it worked with the other, and I had to call it a day on the project.

This is a project that I would like to continue and research further. I believe that if I can get the two sides working with each other, it will open up a lot of new possibilities with the skills I have learnt on each platform.

Tuesday, 31 May 2011

The world's biggest musical installation?

Technology is changing faster than ever before, and musical installations no longer need to be fixed in one place for someone to interact with or enjoy, as Bluebrain have shown us. Bluebrain is a two-man band/composition team made up of Hays and Ryan Holladay, who set out to rethink the divide between an album and a live performance. They have created an album that you must be in the right place to enjoy. Set in Washington D.C.'s National Mall, Bluebrain released the album as an iPhone app, soon to be available on Android. The listener can only enjoy the album in the National Mall itself, where they have to interact with the space to hear everything. Using the smartphone's GPS, the app traces exactly where you are and where you are walking; as you move around and get closer to certain landmarks, the album grows. Approach one spot in the Mall and you will hear a keyboard section sweep into the composition; approach another and a cello will enter. Instruments fade in and out depending on exactly where you are situated, and some parts you will only hear if you are right up against an area, touching it. This is a huge step in how music can reach an audience. Bluebrain wanted to push the idea of localising music, and having people come to your album to interact with it could be the future that drives some huge pieces of musical work!


Here is a video hosted on Vimeo that shows the project in use:


http://vimeo.com/bluebrain/thenationalmall

Under the bridge downtown

John Morton is an American composer best known for his manipulation of music boxes and their sounds, work for which he is often compared to the likes of John Cage, who is known for his prepared-piano technique. Morton approached the public art program with the idea of installing a device in the famous Central Park that captures the feel and motions of the park. He recorded in the park over 40 days in the course of a year, capturing the sounds heard there daily, including the cracking of leaves, ball games, kids singing, poetry recitals, conversations, bells, musical instruments and more, ending up with hundreds of hours of recordings, which he then edited down to short samples. The installation was set up in 2009. When the Delacorte Clock in Central Park chimes its bells, Morton's installed computer program records the chimes, plays them back disjointedly and starts a 20-minute composition built from the samples recorded within the park, played back at random. The installation sits in a tunnel in a busy section of Central Park, on the walkway to the zoo.







Friday, 15 April 2011

Pong

So this is a project that I have started working on after watching this video:


Basically, in this video the author has used Max/MSP, Jitter and Flash to create a game of Pong. He uses the volume of his voice to control one of the paddles, while the second paddle is controlled by motion tracking that picks up the yellow scissors in his live camera feed.

I have started working on a four-paddle game of pong using a program called Processing. Processing is a free, open-source environment with its own language for programming animation, graphics, digital art, games and much more.

I have used some basic code to get the backbone of my game of pong. I wanted it to be a four-paddle game to make it faster and more fluid. My thought is to have four microphones, each controlling a paddle by either pitch or volume. Every time a player's paddle strikes the ball, I want a random sample specific to that paddle to trigger; if no paddle strikes the ball, a losing sample will be played. Basically, the paddles are triggers that set off random samples, manipulated and controlled within Max/MSP. As well as the paddle strikes, a constant musical extract or ambient bed would be playing in-game to keep the music fluent, with the loops adding to the music and making it new.
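
As a rough sketch of that trigger logic, here it is in Processing with the Minim library standing in for Max/MSP, and with hypothetical file names:

```processing
// On each paddle hit, trigger one random sample tied to that paddle;
// on a miss, trigger a "losing" sample. File names are placeholders.
import ddf.minim.*;

Minim minim;
AudioSample[] paddleSamples = new AudioSample[3];
AudioSample loseSample;

void setup() {
  size(640, 480);
  minim = new Minim(this);
  for (int i = 0; i < paddleSamples.length; i++) {
    paddleSamples[i] = minim.loadSample("paddle_" + i + ".wav", 512);
  }
  loseSample = minim.loadSample("lose.wav", 512);
}

void onPaddleHit() {
  // pick a random sample allocated to this paddle
  paddleSamples[int(random(paddleSamples.length))].trigger();
}

void onMiss() {
  loseSample.trigger();
}

void draw() {
  // the game loop would go here; collision detection calls the two
  // functions above when the ball meets (or passes) a paddle
  background(0);
}
```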

I want this to be an interactive experiment as well as a compositional one.

iPod Touchings

I just got a brand new 4th-generation iPod touch and have been looking into ways of using it with Max 5. Its features include two cameras (one on the front and one on the back, the rear one filming in HD), a speaker, a multi-touch screen, motion sensors, a light sensor (to adjust the screen's brightness to the ambient light), four push buttons and a USB connection. This gives many avenues for the iPod to be turned into a musical instrument or some sort of trigger or control pad.

This is just one video I've found of someone using their iPod touch to manipulate Max. The user transmits MIDI information to Max using the motion sensors, creating a new composition out of the loops and notes at his disposal.


I have been thinking of new ways I could use the iPod myself.

At first I thought of something practical that could benefit me. As a working DJ who uses Traktor, I have seen that you can download iPod apps that give you a quick touchscreen cue, play control and waveform display. I was thinking that, alongside the laptop, if you know your set inside out you could use the iPod's motion sensors to trigger a loop you have cued up on it. Flicking your wrist along to a beat would play the cued loop, with every flick stopping and restarting it; the main menu button could play the next waveform from its cue; and the screen itself could show the waveforms and faders so you can easily fade one track into the next, touch to position your cues, or use manual play and stop buttons.

Composition-wise, the possibilities are endless. The built-in light sensor could be used as a trigger, as could the HD video camera. Perhaps when a change in light is detected, the iPod could automatically record whatever is in front of it; that information could then be transferred out, and the video, of whatever length, played back on the computer.

The motion sensors could be used to manipulate waveforms, velocity and pitch, much like in the video shown earlier, as well as acting as triggers for loops. The multi-touch screen could also be ideal for manipulating waveforms or as a bank of trigger pads, able to turn several samples on and off without doing one at a time, or to manipulate a wave in several ways at once.

I'm going to hook my iPod up to Max soon and work out exactly what can be played with, and to what extent.

Wednesday, 24 November 2010

Max 5 Patch - Manipulating Waveforms With Akai LPD8


This is a Max 5 patch that I made in class to use with my Akai LPD8 MIDI controller. I have two identical patches running side by side, each reading a different waveform. The first is an extract from one of my own compositions, a short ambient piece making the most of electronic effects over a piano section; the second is a continuous, fast drum loop. I use two of the eight pads on the LPD8 to switch each sample on or off, and the knobs (read with ctlin) control what happens to the waveforms: the 1st scrolls through the waveform, the 2nd scrolls a second point so that a smaller section can be looped, the 3rd controls the playback speed from 1 to 15 to give the effect of a higher pitch, and the 4th goes from -1 to -15 to give lower pitches. As there are eight pads, the first row controls the drum-sample patch and the second row controls the ambient extract. Playing the two pieces together and manipulating smaller sections at different speeds and times creates a whole new, original piece of music.
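
The knob-to-parameter mapping could be sketched outside Max too. Here is roughly the same logic in Processing with the themidibus library; the CC and note numbers are assumptions about how the LPD8 is configured, not taken from my patch:

```processing
// Read the LPD8's pads and knobs and derive playback parameters,
// mirroring the Max patch's pad on/off switches and ctlin speed mapping.
import themidibus.*;

MidiBus bus;
boolean drumOn = false;
float playbackRate = 1;   // 1..15 for higher pitch, -1..-15 for lower

void setup() {
  size(200, 200);
  bus = new MidiBus(this, 0, -1);  // first MIDI input; index is an assumption
}

void noteOn(int channel, int pitch, int velocity) {
  // a pad hit toggles the sample on or off (pad note number assumed)
  if (pitch == 36) drumOn = !drumOn;
}

void controllerChange(int channel, int number, int value) {
  // knob 3: scale CC 0..127 to a playback speed of 1..15 (higher pitch)
  if (number == 3) playbackRate = map(value, 0, 127, 1, 15);
  // knob 4: scale to -1..-15 (the patch's lower-pitch range)
  if (number == 4) playbackRate = map(value, 0, 127, -1, -15);
}

void draw() {
  background(0);
  text("drum: " + drumOn + "  rate: " + playbackRate, 10, 100);
}
```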



To download the patch and samples used, Click Here