This project used a Wii Remote and Nunchuk as an audio controller, allowing a user to generate and manipulate sounds in 3D space. Real-time visual feedback was provided through a screen-based interface. This novel musical interface was designed for the Listening Room at Queen Mary University of London, which is equipped with a 16-speaker ambisonic system; the visualisation displays an aerial view of the space.
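For background, spatialisation in a system like this rests on first-order ambisonic (B-format) encoding, in which a mono source is weighted onto four channels according to its azimuth and elevation. Below is a minimal Python sketch of the standard encoding equations; it is illustrative only (the actual patch does this inside Max/MSP, and all names here are my own):

```python
import numpy as np

def encode_bformat(signal, azimuth, elevation):
    """Encode a mono signal into first-order ambisonic B-format (W, X, Y, Z).

    Angles are in radians; azimuth 0 is straight ahead, positive to the
    left, following the usual ambisonic convention.
    """
    w = signal * (1.0 / np.sqrt(2.0))                  # omnidirectional component
    x = signal * np.cos(azimuth) * np.cos(elevation)   # front-back
    y = signal * np.sin(azimuth) * np.cos(elevation)   # left-right
    z = signal * np.sin(elevation)                     # up-down
    return w, x, y, z

# Example: a 440 Hz tone placed 90 degrees to the listener's left
sr = 44100
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)
w, x, y, z = encode_bformat(tone, azimuth=np.pi / 2, elevation=0.0)
```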
All of the software for this project was written in Max/MSP/Jitter and made use of several external objects: aka.wiiremote (Masayuki Akamatsu) for reading the controller, jit.boids (Wesley Smith), which is based on Craig Reynolds' Boids flocking algorithm, and the ICST ambisonic externals for spatialisation. To record sounds that could be listened to outside of an ambisonic setup, the ambisonic signal had to be converted to a binaural signal; the ambi2bin~ external (Miguel Negrão) was used to implement this conversion.
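The jit.boids object simulates flocking behaviour using Reynolds' three steering rules: separation (avoid crowding neighbours), alignment (match neighbours' heading), and cohesion (move toward the neighbours' centre). As a rough illustration of those rules, not of jit.boids' actual implementation, a minimal 2D version in Python might look like this (the parameter names, weights, and radius are illustrative):

```python
import numpy as np

def boids_step(pos, vel, dt=0.1, radius=2.0,
               w_sep=1.5, w_ali=1.0, w_coh=1.0, max_speed=1.0):
    """One update of Reynolds' three flocking rules for N boids.

    pos, vel: (N, 2) arrays of positions and velocities.
    """
    n = len(pos)
    new_vel = vel.copy()
    for i in range(n):
        offsets = pos - pos[i]
        dists = np.linalg.norm(offsets, axis=1)
        near = (dists < radius) & (dists > 0)
        if not near.any():
            continue
        sep = -(offsets[near] / dists[near, None] ** 2).sum(axis=0)  # steer away from close neighbours
        ali = vel[near].mean(axis=0) - vel[i]                        # match neighbours' average heading
        coh = pos[near].mean(axis=0) - pos[i]                        # steer toward neighbours' centre
        new_vel[i] += dt * (w_sep * sep + w_ali * ali + w_coh * coh)
        speed = np.linalg.norm(new_vel[i])
        if speed > max_speed:                                        # clamp speed
            new_vel[i] *= max_speed / speed
    return pos + dt * new_vel, new_vel

# Example: simulate 20 boids from a random initial state
rng = np.random.default_rng(0)
pos = rng.uniform(-5, 5, (20, 2))
vel = rng.uniform(-1, 1, (20, 2))
for _ in range(100):
    pos, vel = boids_step(pos, vel)
```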
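As for the binaural conversion, a common approach (I have not verified ambi2bin~'s internals, so this is a sketch of the general technique rather than of that external) is to decode the B-format to a ring of virtual loudspeakers and convolve each virtual feed with a head-related impulse response (HRIR) for that direction. A simplified horizontal-only sketch, with placeholder HRIRs rather than a measured set:

```python
import numpy as np

def bformat_to_binaural(w, x, y, hrirs_left, hrirs_right, azimuths):
    """Decode horizontal B-format to binaural via virtual loudspeakers.

    Each virtual speaker at azimuth a gets a cardioid virtual-microphone
    decode, sqrt(2)*W + X*cos(a) + Y*sin(a), and its feed is convolved
    with the left/right HRIR pair for that direction. hrirs_left and
    hrirs_right are (num_speakers, hrir_length) arrays; a real
    implementation would use measured HRIRs, e.g. from a KEMAR database.
    """
    n = len(azimuths)
    out_len = len(w) + hrirs_left.shape[1] - 1
    left = np.zeros(out_len)
    right = np.zeros(out_len)
    for a, hl, hr in zip(azimuths, hrirs_left, hrirs_right):
        feed = (np.sqrt(2) * w + x * np.cos(a) + y * np.sin(a)) / (2 * n)
        left += np.convolve(feed, hl)    # spatialise this speaker feed for the left ear
        right += np.convolve(feed, hr)   # ...and for the right ear
    return left, right
```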
This project was carried out as part of the Media & Arts Technology PhD Programme at Queen Mary University of London.