The piece of cruft I repurposed for this project was an Etch-a-Sketch. My intention was to let a user control the drawing on the Etch-a-Sketch just by moving their head. The hardware I used was an Arduino Uno, two stepper motor driver carriers, and two stepper motors mounted onto the Etch-a-Sketch in place of its drawing knobs.
I wrote a patch in Max/MSP/Jitter that allowed the user to draw on the Etch-a-Sketch by changing their position in front of a camera. The patch uses face tracking from Jean-Marc Pelletier's cv.jit library to follow the position of the user's face, and then sends messages that tell the motors which direction to draw in.
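To give a sense of the Arduino side of this setup, here is a minimal sketch of how direction messages from a Max patch could be turned into step pulses on two step/dir driver carriers. The pin numbers, the single-character serial protocol ('L', 'R', 'U', 'D'), and the assumption of A4988-style carriers are all illustrative guesses, not the wiring or protocol actually used in the project.

```cpp
// Hypothetical sketch: read one-byte direction commands from the Max patch
// over USB serial and pulse the corresponding stepper driver carrier.
// Pins, step counts, and the command protocol are assumptions for illustration.

const int X_STEP_PIN = 2;   // motor on the horizontal knob
const int X_DIR_PIN  = 3;
const int Y_STEP_PIN = 4;   // motor on the vertical knob
const int Y_DIR_PIN  = 5;

const int STEPS_PER_COMMAND = 10;   // how far one command moves the stylus
const int STEP_PULSE_US     = 800;  // pulse width and gap between steps

void setup() {
  pinMode(X_STEP_PIN, OUTPUT);
  pinMode(X_DIR_PIN,  OUTPUT);
  pinMode(Y_STEP_PIN, OUTPUT);
  pinMode(Y_DIR_PIN,  OUTPUT);
  Serial.begin(9600);               // Max patch sends bytes over USB serial
}

// Pulse one driver's STEP pin a fixed number of times in the given direction.
void stepMotor(int stepPin, int dirPin, bool forward) {
  digitalWrite(dirPin, forward ? HIGH : LOW);
  for (int i = 0; i < STEPS_PER_COMMAND; i++) {
    digitalWrite(stepPin, HIGH);
    delayMicroseconds(STEP_PULSE_US);
    digitalWrite(stepPin, LOW);
    delayMicroseconds(STEP_PULSE_US);
  }
}

void loop() {
  if (Serial.available() > 0) {
    switch (Serial.read()) {        // one byte per direction command
      case 'L': stepMotor(X_STEP_PIN, X_DIR_PIN, false); break;
      case 'R': stepMotor(X_STEP_PIN, X_DIR_PIN, true);  break;
      case 'U': stepMotor(Y_STEP_PIN, Y_DIR_PIN, true);  break;
      case 'D': stepMotor(Y_STEP_PIN, Y_DIR_PIN, false); break;
    }
  }
}
```

In a setup like this, the Max patch would only need a [serial] object sending a byte whenever the tracked face moves past a threshold, keeping all of the motion logic on the computer and leaving the Arduino as a simple pulse generator.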
This project was done in the Interactive Digital Multimedia Techniques Course (part of the Media & Arts Technology PhD Programme) at Queen Mary, University of London.