





Our perception of the world is built from multiple senses rather than a single one. When playing music, we hear the sound first, and music visualization can enrich our perception and understanding of that music. 'MultiPerception' generates real-time particle systems as users play a MIDI keyboard.

Moreover, with machine learning techniques, the computer can now 'perceive' and respond to music played by a human. 'MultiPerception' also generates visual effects for the music created by the computer, giving users simultaneous visual feedback.

The result is a dialogue between image and sound, between human and computer.




Max is used for music generation and for signal conversion among the MIDI keyboard, Unity, and Magenta.
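As a rough illustration of the signal-conversion step, the sketch below shows the kind of mapping Max performs in the real pipeline: raw MIDI Note On bytes become normalized parameters a visual engine can consume. The function and parameter names are hypothetical, not taken from the project.

```python
def midi_note_to_params(status, note, velocity):
    """Map a MIDI Note On message to normalized visual parameters.

    Returns (pitch01, intensity01), each in [0, 1], or None for messages
    that should not spawn particles. Illustrative sketch only; the real
    project does this conversion inside Max.
    """
    NOTE_ON = 0x90
    # Note On with velocity 0 is conventionally treated as Note Off.
    if status & 0xF0 != NOTE_ON or velocity == 0:
        return None
    pitch01 = note / 127.0        # keyboard position, e.g. drives color/height
    intensity01 = velocity / 127.0  # strike strength, e.g. drives particle count
    return pitch01, intensity01
```

Normalizing to [0, 1] keeps the visual side independent of MIDI's 7-bit ranges.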


I developed a real-time music visualization with Unity's Visual Effect Graph.

1. Particle Systems: Stack / Node Based

2. Interaction: Script

When users play quickly, an 'explosion' visual effect is triggered.
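The "play fast" trigger can be sketched as a sliding-window note-rate detector: count recent note-on events and fire the effect when the rate crosses a threshold. This is a hypothetical Python illustration of the idea, not the project's actual Unity C# script; the window size and threshold are made-up defaults.

```python
from collections import deque


class OnsetRateDetector:
    """Flag 'fast playing' when note-on events in a sliding time window
    exceed a threshold. Illustrative sketch of the trigger logic."""

    def __init__(self, window_s=1.0, notes_per_window=4):
        self.window_s = window_s
        self.threshold = notes_per_window
        self.times = deque()  # timestamps of recent note-on events

    def note_on(self, t):
        """Record a note-on at time t (seconds); return True if the
        'explosion' effect should fire."""
        self.times.append(t)
        # Drop events that have fallen out of the window.
        while self.times and t - self.times[0] > self.window_s:
            self.times.popleft()
        return len(self.times) >= self.threshold
```

The same logic ports directly to a C# `Update`/MIDI-callback pattern in Unity.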

3. Visual Effects

Pitch Bend Wheel
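For the pitch bend wheel, the controller sends a 14-bit value split across two 7-bit data bytes, centered at 8192. A minimal decoding sketch (function name and default bend range are assumptions, not from the project):

```python
def pitch_bend_to_semitones(lsb, msb, bend_range=2.0):
    """Decode a MIDI pitch-bend message into semitones.

    lsb, msb: the two 7-bit data bytes of the message.
    bend_range: the synth's configured range (commonly +/-2 semitones).
    Illustrative sketch of how the wheel could drive a visual parameter.
    """
    value = (msb << 7) | lsb            # 14-bit value, 0..16383
    return (value - 8192) / 8192.0 * bend_range  # 8192 = wheel at rest
```

The resulting signed value can then warp or shift the particle field continuously as the wheel moves.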

AI Music


I enabled the computer to respond to the human player with music using Google Magenta (a toolkit for making music and art with machine learning).
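The call-and-response idea can be illustrated without Magenta's learned models. The stand-in below, a first-order Markov chain over the human's melody, is only a minimal sketch of "listen, then reply"; the actual project uses Magenta's machine-learning models, and all names here are hypothetical.

```python
import random


def respond(melody, length=8, seed=0):
    """Generate a short reply to a human melody (list of MIDI note numbers)
    by sampling a first-order Markov chain learned from that melody.

    Stand-in for the Magenta call-and-response step; illustrative only.
    """
    rng = random.Random(seed)
    # Learn which note tends to follow which.
    transitions = {}
    for a, b in zip(melody, melody[1:]):
        transitions.setdefault(a, []).append(b)
    note = melody[-1]  # continue from where the human stopped
    reply = []
    for _ in range(length):
        # Fall back to any input note if the current one has no successor.
        note = rng.choice(transitions.get(note, melody))
        reply.append(note)
    return reply
```

The reply's notes stay within the human's material, which is roughly why even this crude responder sounds like a conversation.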



2019 Shanghai Science and Technology Festival

We gave a performance on AI music at the 2019 Shanghai Science and Technology Festival using MultiPerception. I collaborated with faculty from Tongji University, NYU Shanghai, the Shanghai Conservatory of Music, and SIVA on this project.

Shanghai Science and Technology Museum

Opening Show of 2019 Tongji Design Week

It was also recently performed by my teammates as the opening show of Tongji University's Design Week. (I received my bachelor's degree in industrial design from Tongji University in Shanghai, and I'm now studying HCI in the US.)