

San Diego Undergraduate Tech Conference Project

Project Type

Tech Conference

Date

October 2025

Skills

Audiovisual Composition, Human-Computer Interaction, Digital Instrument Design, Machine Learning

Tools Used

Software: PureData (FluCoMa + GEM Libraries), Ableton, Audacity
Hardware: Zoom H4essential

Credits

Programmed/Composed/Designed by Keene Cheung

Earlier this fall, I presented a project at the San Diego Undergraduate Tech Conference that explored visual-reactive audio. I wanted to find a way to extract features of a work of visual art and use them to control a musical composition. My solution was a set of visuals, programmed in PureData with the GEM library, that a performer controls live through a 3-dimensional RGB vector, almost like an analog lighting board. I then used FluCoMa, a machine learning library for PureData, to train a Multi-Layer Perceptron regressor that translates this vector into a 10-dimensional MIDI CC vector, which I mapped to various samples and synths in Ableton as part of a soundscape composition I created for the event. This let me effectively use color as an instrument to control my music live.

The project culminated in a research paper I authored on human-computer interaction for music and digital instrument design, and a presentation and live demo at the conference. The novelty of this interface for multimedia performance earned me the Best Demo award for the live showcase of my work.
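For anyone curious what that color-to-CC mapping looks like in code, here is a minimal Python sketch of the idea using scikit-learn's MLPRegressor and mido in place of FluCoMa's MLP inside PureData. The training pairs, network size, and CC numbers below are illustrative placeholders, not the values from my actual patch.

```python
import mido
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical training pairs: a few RGB fader positions (normalized 0-1)
# paired with the 10 MIDI CC values (0-127) each color should produce.
rgb_points = np.array([
    [1.0, 0.0, 0.0],  # red
    [0.0, 1.0, 0.0],  # green
    [0.0, 0.0, 1.0],  # blue
    [1.0, 1.0, 1.0],  # white
])
cc_points = np.array([
    [127,   0,  64,  10,  90,   0,  32, 127,   5,  70],
    [  0, 127,  32,  80,  10,  64,   0,  45, 127,  20],
    [ 64,  32, 127,   0,  50, 100,  90,   0,  60, 127],
    [ 64,  64,  64, 127, 127, 127,  64,  64,  64,  64],
])

# A small multi-layer perceptron regressing 3-D color onto 10-D CC space;
# scikit-learn's MLPRegressor handles multi-output regression natively.
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
mlp.fit(rgb_points, cc_points)

def color_to_cc(r, g, b):
    """Predict ten clamped MIDI CC values for a live RGB fader position."""
    raw = mlp.predict([[r, g, b]])[0]
    return np.clip(np.rint(raw), 0, 127).astype(int)

# Wrap one prediction as MIDI control-change messages; CC numbers 20-29
# here are placeholders for whatever parameters they are mapped to.
cc_values = color_to_cc(0.8, 0.2, 0.6)  # a magenta-ish fader position
messages = [mido.Message('control_change', control=20 + i, value=int(v))
            for i, v in enumerate(cc_values)]
print(messages)
```

In the actual system, all of this lives inside the PureData patch: FluCoMa trains and runs the regressor, and the predicted values leave Pd as MIDI CC messages that Ableton routes to the samples and synths in the soundscape.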
