MultiPerception - an Original Individual Project
Our perception of the world is built on multiple senses rather than a single one. When playing music, sound is the primary feedback, and music visualization can enrich our perception and understanding of it. MultiPerception generates real-time animation as participants play a MIDI keyboard.
In addition, with machine learning, the computer can now 'perceive' and respond to the music played by participants. MultiPerception also generates visual effects for the music the computer creates, giving participants simultaneous visual feedback.
The result is a dialogue between sound and visuals, and between human and computer.
Triggered when AI music is being generated
Triggered when the pitch bend wheel is tilted
Triggered when the participant plays very fast
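The last two triggers can be read directly from the incoming MIDI stream. As a minimal sketch (not the project's actual implementation), the thresholds, function names, and windowing below are all illustrative assumptions: a pitch bend message carries a 14-bit value centered at 8192, so a tilt is a large deviation from center, and "playing very fast" can be approximated as the note-on rate over a recent time window.

```python
# Hypothetical sketch of the trigger logic; all names and thresholds
# are assumptions for illustration, not taken from MultiPerception.

PITCH_BEND_CENTER = 8192   # resting value of the 14-bit MIDI pitch bend
BEND_THRESHOLD = 1000      # deviation from center treated as "tilted"
FAST_NOTES_PER_SEC = 8     # note-on rate treated as "very fast"
WINDOW_SECONDS = 1.0       # how far back to look when measuring the rate

def pitch_bend_triggered(bend_value):
    """True when the pitch bend wheel is tilted away from its center."""
    return abs(bend_value - PITCH_BEND_CENTER) > BEND_THRESHOLD

def fast_playing_triggered(note_on_times, now):
    """True when recent note-on events exceed the 'very fast' rate.

    note_on_times: timestamps (seconds) of received note-on messages.
    now: current time in the same clock.
    """
    recent = [t for t in note_on_times if now - t <= WINDOW_SECONDS]
    return len(recent) / WINDOW_SECONDS >= FAST_NOTES_PER_SEC
```

In a live setup, these checks would run inside the MIDI callback, and a True result would start or intensify the corresponding animation.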