Monday, January 19, 2009


this project/instrument is still a work in progress as part of the 220b and 220c projects. my paper on this project has been accepted to HCI 2009.
master's thesis at vienna university of technology, human computer interaction research group. available on the tuwien library's website.

this paper was published in the chi 2006 extended abstracts on human factors in computing systems.

this project was a cooperation between me and my colleagues at tu wien and the akademie der bildende kuenste in vienna. i implemented the audio components in c++ using the bass library: analyzing the sound samples with fft and sending messages to the video components to control the visuals.
256 final project: simulation of human movement using opengl and sonification of the simulation using rtaudio.

250a final project: used max/msp's jitter to track the location and motion of the eyes, nose, and mouth, controlled news audio tracks through granular synthesis, and mapped the facial-movement parameters to grain parameters, using silence detection and large grains.
