Works

  • Graduate Research

    My work and research at the GTCMT Music Intelligence Group focused on creating percussive models for human-machine musical interaction. I participated in building TablaGyan, a user-based program for automatic tabla improvisation. As part of my Master's project, I developed percUs, an interactive system for artificial rhythm accompaniment. An overview of the research conducted at the GTCMT can be found here.

__________________________________________

  • NeuroTune – A Brain-Computer Interface

    NeuroTune combines electroencephalography (EEG), digital signal processing, and music synthesis. The device aims to offer an automated system that transforms biological activity (e.g., brain-wave signals) into music, in order to help people reduce their stress level and/or provide them with a creative mode of expression through sound creation and biofeedback.
    Poster
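
    A minimal sketch of the kind of mapping NeuroTune relies on (not the actual device code): estimate the alpha/beta band-power ratio of an EEG window and map it to a synthesis parameter such as tempo. The sampling rate, band edges, and tempo range below are illustrative assumptions.

    import numpy as np

    FS = 256  # assumed EEG sampling rate (Hz)

    def band_power(window, fs, lo, hi):
        """Power of `window` in the [lo, hi] Hz band via the FFT magnitude spectrum."""
        spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window)))) ** 2
        freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
        mask = (freqs >= lo) & (freqs <= hi)
        return spectrum[mask].mean()

    def relaxation_index(window):
        """Ratio of alpha (8-12 Hz) to beta (13-30 Hz) power; higher ~ more relaxed."""
        return band_power(window, FS, 8, 12) / (band_power(window, FS, 13, 30) + 1e-12)

    def map_to_tempo(index, lo_bpm=50, hi_bpm=120):
        """Squash the unbounded relaxation index into a bounded tempo: more relaxed -> slower music."""
        squashed = index / (1.0 + index)
        return hi_bpm - squashed * (hi_bpm - lo_bpm)

    # Example: one second of synthetic EEG
    print(map_to_tempo(relaxation_index(np.random.randn(FS))))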

__________________________________________

  • pod~ – A Pure Data external object

    pod~ is a Pd external for real-time onset detection that follows perceptual models of the outer, middle, and inner ear. I am implementing the inner-ear filtering model based on the Bark scale, weighting each Bark band according to equal-loudness contours (ISO 226).
    github project
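
    To illustrate the approach (a sketch under my own simplifying assumptions, not the pod~ source): FFT bin energies are grouped into Bark critical bands, and a per-band spectral flux is weighted and summed into an onset detection function. The uniform weights below stand in for the ISO 226 equal-loudness weighting.

    import numpy as np

    def hz_to_bark(f):
        """Zwicker's approximation of the Bark scale."""
        return 13.0 * np.arctan(0.00076 * f) + 3.5 * np.arctan((f / 7500.0) ** 2)

    def bark_band_energies(magnitudes, fs):
        """Group FFT bin energies into Bark critical bands (indices 0..23)."""
        freqs = np.fft.rfftfreq(2 * (len(magnitudes) - 1), d=1.0 / fs)
        bands = np.minimum(hz_to_bark(freqs).astype(int), 23)
        energies = np.zeros(24)
        np.add.at(energies, bands, magnitudes ** 2)
        return energies

    def onset_detection_function(frames, fs, weights=None):
        """Half-wave-rectified spectral flux computed per Bark band.

        `weights` is a placeholder for the ISO 226 equal-loudness weighting
        of each band (uniform here)."""
        if weights is None:
            weights = np.ones(24)
        prev = np.zeros(24)
        odf = []
        for frame in frames:
            mags = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
            bands = bark_band_energies(mags, fs)
            flux = np.maximum(bands - prev, 0.0)   # keep only energy increases
            odf.append(np.sum(weights * flux))
            prev = bands
        return np.array(odf)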

__________________________________________

  • SMASH – Music and art performance project for the 2012 Listening Machine concert

    SMASH is a piece that repurposes everyday surfaces such as chairs, tables, and CD racks as electroacoustic percussive surfaces. Four percussionists generate an array of electronically synthesized sounds driven by real-time computer analysis of the surface vibrations. The result is an exciting exploration of the connection between acoustic physical surfaces and electronic synthesis.
    github project
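
    As a hypothetical illustration of the analysis-to-synthesis link (not the SMASH code): each detected hit on a surface can be reduced to a couple of features, e.g. RMS level and spectral centroid, which then drive the amplitude and pitch of a synthesized voice. The feature choices and pitch mapping below are assumptions.

    import numpy as np

    def spectral_centroid(frame, fs):
        """Brightness of a detected hit: amplitude-weighted mean frequency."""
        mags = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
        freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
        return np.sum(freqs * mags) / (np.sum(mags) + 1e-12)

    def hit_to_synth_params(frame, fs):
        """Map a contact-mic hit to (amplitude, MIDI pitch) for a hypothetical synth voice."""
        amp = np.sqrt(np.mean(frame ** 2))             # RMS of the hit -> loudness
        centroid = spectral_centroid(frame, fs)        # brightness -> pitch
        midi_pitch = 36 + 48 * np.clip(centroid / 8000.0, 0.0, 1.0)
        return amp, midi_pitch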

__________________________________________

  • PULSES – iOS application for generative music synthesis

    Pulses combines minimalist graphics with stochastic music sonification. While providing visual entertainment by capturing the theme of outer space through objects such as black holes (static) and stars (dynamic), it also proposes a unique approach to meta-composition. Pulses integrates Objective-C (main client code), openFrameworks (graphics), and libpd (sound synthesis).
    Presentation


[I’ll post the video again soon]
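
    A toy sketch of stochastic note generation in the spirit of Pulses (the app's actual rules differ): pitches are drawn from a scale with probability weights that drift over time, so the output is generative rather than fixed. The scale, weights, and drift rate are illustrative.

    import numpy as np

    SCALE = [60, 62, 65, 67, 70]   # MIDI pitches of a pentatonic scale (assumption)

    def generate_events(n_events, seed=0, drift=0.05):
        """Draw (pitch, duration) pairs from slowly drifting probability weights."""
        rng = np.random.default_rng(seed)
        weights = np.ones(len(SCALE)) / len(SCALE)
        events = []
        for _ in range(n_events):
            weights = np.abs(weights + drift * rng.standard_normal(len(SCALE)))
            weights /= weights.sum()                   # keep a valid distribution
            pitch = rng.choice(SCALE, p=weights)
            duration = rng.exponential(0.5)            # seconds between events
            events.append((int(pitch), float(duration)))
        return events

    print(generate_events(8))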

__________________________________________

  • TOK! – A collaborative acoustic instrument application for iOS devices

    Our application takes advantage of the iPhone's mobility and transforms it into a drumstick for tapping on flat surfaces to produce acoustic sounds. Tok! is also peer-to-peer networked and built around a shared interactive music score to which the players respond by rhythmically tapping their phones, creating a percussion ensemble. Through social interaction, real-time modifications of the musical score, and creative selection of tapping surfaces, players can collaborate and dynamically fabricate interesting rhythmic music with a variety of acoustic timbres.
    Tok! Paper accepted at the 2012 International Conference on New Interfaces for Musical Expression (NIME)
    – Paper
    github project
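
    One plausible building block, sketched here under my own assumptions (not the Tok! implementation): detecting tap onsets from the phone's accelerometer stream with a simple adaptive threshold and a short refractory period.

    import numpy as np

    def detect_taps(accel, fs, threshold=2.5, refractory=0.08):
        """Return tap times (s) where the jerk (first difference of the
        acceleration magnitude) exceeds `threshold` times its median, with a
        refractory period to avoid double triggers. Constants are illustrative.

        accel: (N, 3) array of accelerometer samples, fs: sampling rate (Hz)."""
        mag = np.linalg.norm(accel, axis=1)
        jerk = np.abs(np.diff(mag))
        floor = np.median(jerk) + 1e-9
        taps, last = [], -np.inf
        for i, j in enumerate(jerk):
            t = i / fs
            if j > threshold * floor and t - last > refractory:
                taps.append(t)
                last = t
        return taps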

__________________________________________

  • MIR Paper: Towards a Complete Rhythm Description of Music by the Estimation of Sub-Beat and Supra-Beat Rhythmicity

    In this paper, we present an algorithm based on a beat similarity matrix and an onset interval histogram that automatically extracts the sub-beat and supra-beat rhythmicity of a musical piece. From this information, we then obtain a rank-ordered set of candidates for the time signature and meter of a given song.
    – Paper
    – Presentation
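
    A toy sketch of the onset-interval-histogram half of the idea (greatly simplified with respect to the paper): histogram all inter-onset intervals, then score candidate beats-per-measure by how much histogram mass falls near integer multiples of the beat period. Tolerances and the candidate set are assumptions.

    import numpy as np

    def ioi_histogram(onset_times, resolution=0.01, max_interval=4.0):
        """Histogram of all pairwise inter-onset intervals up to `max_interval` seconds."""
        onsets = np.asarray(onset_times)
        iois = np.abs(onsets[:, None] - onsets[None, :]).ravel()
        iois = iois[(iois > 0) & (iois <= max_interval)]
        bins = np.arange(0, max_interval + resolution, resolution)
        hist, edges = np.histogram(iois, bins=bins)
        return hist, edges[:-1]

    def rank_meter_candidates(hist, lags, beat_period, candidates=(2, 3, 4, 6)):
        """Score each candidate beats-per-measure by the histogram mass near
        integer multiples of the beat period (a stand-in for combining
        sub-beat and supra-beat evidence)."""
        scores = {}
        for n in candidates:
            near = np.abs(lags - n * beat_period) < 0.05   # 50 ms tolerance
            scores[n] = hist[near].sum()
        return sorted(scores.items(), key=lambda kv: -kv[1])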

__________________________________________

  • Near-Field Acoustic Holography: The Frame Drum

    My research at the University of Illinois at Urbana-Champaign investigated the modal vibrations of a Remo frame drum using near-field acoustic holography techniques.
    – Paper
    – Presentation
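
    For context, a minimal sketch of planar NAH back-propagation (not the processing used in the paper): the measured pressure field on the hologram plane is taken into k-space with a 2-D FFT, multiplied by the inverse propagator, crudely filtered, and inverse-transformed to estimate the field just above the drum head. The grid, spacing, and cutoff are assumptions.

    import numpy as np

    def backpropagate_pressure(p_holo, dx, freq, d, c=343.0, k_cutoff_ratio=1.5):
        """Back-propagate a complex pressure field measured on the hologram
        plane to a parallel plane `d` metres closer to the source.

        p_holo : (N, N) complex pressure samples on the hologram plane
        dx     : microphone grid spacing (m), freq: analysis frequency (Hz)
        The hard k-space cutoff is a crude stand-in for proper regularization."""
        k = 2 * np.pi * freq / c
        n = p_holo.shape[0]
        kx = 2 * np.pi * np.fft.fftfreq(n, d=dx)
        kx, ky = np.meshgrid(kx, kx, indexing="ij")
        kz = np.sqrt((k ** 2 - kx ** 2 - ky ** 2).astype(complex))  # imaginary for evanescent waves

        P = np.fft.fft2(p_holo)                   # angular spectrum on the hologram plane
        P *= np.exp(-1j * kz * d)                 # inverse propagator (amplifies evanescent part)
        P[np.sqrt(kx ** 2 + ky ** 2) > k_cutoff_ratio * k] = 0.0  # suppress amplified noise
        return np.fft.ifft2(P)                    # reconstructed pressure near the drum head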

__________________________________________