Sound Installation

Real-Time Music Composition with Color and Movement Tracking

Dr. Shai Cohen
Department of Music
Bar-Ilan University
Israel

   Introduction
    The installation invites players to interact with sculptural electronic instruments
    that track colors, movements, and body gestures, and map them via an internet
    connection into a real-time music composition.
    The installation premiered at the Board of Trustees conference at Bar-Ilan
    University in 2008.

   Mapping
    The main idea of the project is to use a video camera as a sensor, mapping Jitter
    data to manipulate MIDI and audio live and interactively. The music composition
    includes four elements: 16 changeable frequencies for additive synthesis; a random
    poly player for sampled cello, clarinet, and flute phrases; a collision interface
    with four animated balls that play bell sounds when they hit the walls and each
    other in a spatialized surround environment; and triggered narration (read by
    Yossi Banai) from an Alterman poem.
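The 16-partial additive-synthesis element can be illustrated with a rough sketch (this is not the actual Max/MSP patch; the 110 Hz fundamental, 1/n amplitude rolloff, and buffer length here are hypothetical choices for demonstration):

```python
import math

SR = 44100        # sample rate in Hz
N_PARTIALS = 16   # the piece uses 16 changeable frequencies

def additive(freqs, amps, duration=0.1, sr=SR):
    """Sum sine partials into one buffer (basic additive synthesis),
    then normalize the result to avoid clipping."""
    n = int(sr * duration)
    out = [0.0] * n
    for f, a in zip(freqs, amps):
        w = 2 * math.pi * f / sr  # phase increment per sample
        for i in range(n):
            out[i] += a * math.sin(w * i)
    peak = max(abs(s) for s in out) or 1.0
    return [s / peak for s in out]

# Example: 16 harmonics of a 110 Hz fundamental with 1/n amplitudes.
# In the installation these frequencies change in response to tracking data.
freqs = [110 * (k + 1) for k in range(N_PARTIALS)]
amps = [1.0 / (k + 1) for k in range(N_PARTIALS)]
buf = additive(freqs, amps)
```

In the installation the frequency list would be updated live from the tracking data rather than fixed to a harmonic series.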

   Technique
    A wearable wireless sensor platform is used for interactive hand performance. The
    player wears colored gloves and moves them in front of a digital camera. The X-Y
    positions are tracked and sent via the internet to another computer, which analyzes
    the data, responds to the movements, and activates the music composition
    accordingly.
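The tracking step can be sketched as finding the centroid of glove-colored pixels in a frame and forwarding it over the network. This is a minimal illustration, not the Jitter implementation: the per-channel tolerance value, the frame representation as rows of RGB tuples, and the UDP packing format are all assumptions:

```python
import socket
import struct

TOLERANCE = 40  # per-channel color match tolerance (hypothetical value)

def track_color(frame, target):
    """Return the (x, y) centroid of pixels near `target` (an R, G, B tuple),
    or None if the color is not found. `frame` is rows of RGB tuples."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if (abs(r - target[0]) < TOLERANCE and
                    abs(g - target[1]) < TOLERANCE and
                    abs(b - target[2]) < TOLERANCE):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def send_position(sock, addr, pos):
    """Send the tracked X-Y pair to the analysis computer as two
    network-order floats over UDP."""
    sock.sendto(struct.pack("!2f", *pos), addr)

# Tiny synthetic frame: two reddish pixels on a black background.
frame = [[(0, 0, 0)] * 4 for _ in range(4)]
frame[1][2] = (250, 10, 10)
frame[2][2] = (240, 5, 5)
pos = track_color(frame, (255, 0, 0))  # -> (2.0, 1.5)
```

On the receiving computer, the analysis patch would unpack the pair and map it onto the synthesis parameters described above.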

   Setup
    2 computers running Max/MSP/Jitter software and 4 patches (see Projects).
    4 loudspeakers.
    MOTU UltraLite FireWire audio interface.
    Digital camera with FireWire connection.
    2 gloves with 4 colors.
    Internet connection.

   
 


   Projects main music screen

   Instrument main screen

   Loop-based main screen

   Projects main video screen

   Photos-Video-Sounds