Motion Capture and AI-Driven Interaction in Performance
Integrating motion capture and AI for interactive, multisensory performance environments.
I am exploring the integration of motion capture technology with AI-driven systems to enhance the interactivity of live performances, with a focus on workflows that bridge movement data and digital media. In collaboration with Stephen Lucas and David Stout of iARTA and the Hybrid Arts Laboratory, I am developing a piece built around Rokoko motion capture suits, which combine full-body tracking, gloves for capturing fine motor movements, and facial expression sensors. The data, processed through Rokoko Studio, can be streamed to software such as Blender and Unity. My research focuses on creating a TOX (a reusable TouchDesigner component) that interfaces the Rokoko system directly with TouchDesigner, allowing seamless integration into live performance environments.
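
Rokoko Studio's custom streaming option sends pose data as JSON packets over a network socket, so the receiving side of the planned TOX can be prototyped in plain Python before being wrapped as a component. The sketch below is a minimal assumption-laden starting point: the port number and the `scene → actors → body → joint → position` field names reflect one version of Rokoko's streaming schema and should be verified against the Studio version actually in use.

```python
import json
import socket

def parse_frame(packet: bytes) -> dict:
    """Extract joint name -> (x, y, z) from one Rokoko-style JSON frame.

    The schema walked here (scene -> actors -> body -> joint -> position)
    is an assumption; check it against the JSON your Studio version emits.
    """
    frame = json.loads(packet)
    joints = {}
    for actor in frame.get("scene", {}).get("actors", []):
        for name, joint in actor.get("body", {}).items():
            pos = joint.get("position", {})
            joints[name] = (pos.get("x", 0.0), pos.get("y", 0.0), pos.get("z", 0.0))
    return joints

def listen(port: int = 14043) -> None:
    """Blocking receive loop for UDP frames from Rokoko Studio.

    The port is a placeholder; use whatever port Studio is configured
    to stream to. Inside TouchDesigner this loop would be replaced by
    a UDP In DAT feeding parse_frame() from a callback.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        packet, _ = sock.recvfrom(65535)
        joints = parse_frame(packet)
        # hand the joint snapshot off to the rest of the network here
        print(f"received {len(joints)} joints")
```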
My goal for this work is to create a dynamic point-cloud human form using instancing techniques in TouchDesigner. This form will be animated in real time by data streamed from the motion capture suit, providing a visually striking representation of the performer’s movements. Beyond the visuals, the same data can modulate audio and lighting parameters dynamically, creating a multisensory, interactive performance environment.
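
One way to drive that instancing is to resample each frame's joint positions into many more points than the skeleton itself provides, scattering instances along the bones. A rough sketch, assuming joint data in the form produced above; the `BONES` list is a hypothetical two-segment stand-in for the full Rokoko skeleton:

```python
import random

# Hypothetical bone list: pairs of joint names to connect.
# The real Rokoko skeleton has many more segments.
BONES = [("hip", "spine"), ("spine", "head")]

def instance_points(joints, per_bone=100, jitter=0.01, seed=None):
    """Turn one joint snapshot into a flat list of (x, y, z) positions.

    Samples evenly along each bone and adds slight random jitter so the
    instanced geometry reads as a shimmering point-cloud body rather
    than a stick figure.
    """
    rng = random.Random(seed)
    pts = []
    for a, b in BONES:
        if a not in joints or b not in joints:
            continue  # skip bones whose joints were not tracked this frame
        ax, ay, az = joints[a]
        bx, by, bz = joints[b]
        for i in range(per_bone):
            t = i / max(per_bone - 1, 1)  # 0..1 along the bone
            pts.append((
                ax + (bx - ax) * t + rng.uniform(-jitter, jitter),
                ay + (by - ay) * t + rng.uniform(-jitter, jitter),
                az + (bz - az) * t + rng.uniform(-jitter, jitter),
            ))
    return pts
```

In TouchDesigner, a list like this would be written into a Table DAT or CHOP each frame and assigned to a Geometry COMP's instancing translate parameters.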
I envision this work evolving through multiple iterations. For instance, motion capture gloves could be worn by instrumentalists, such as pianists or cellists, to animate digital figures or to live-modulate audio effects based on their performance gestures. By combining physical performance with AI-enhanced interaction, this research explores how digital technologies can deepen the relationship between a performer’s movements and the audience’s experience.
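
The glove-to-audio idea reduces to a mapping problem: a normalized gesture value (say, finger curl from the glove sensors) scaled onto an audio parameter range. A minimal sketch, assuming curl arrives normalized to [0, 1]; the 200–8000 Hz cutoff range is an arbitrary starting point, not a prescription:

```python
import math

def curl_to_cutoff(curl, lo=200.0, hi=8000.0):
    """Map a normalized finger-curl value in [0, 1] to a filter cutoff in Hz.

    Uses exponential interpolation so equal changes in curl feel like
    equal changes in pitch, which tends to sound more natural than a
    linear sweep. Clamps first, since live sensor data is noisy.
    """
    curl = min(max(curl, 0.0), 1.0)
    return lo * math.exp(curl * math.log(hi / lo))
```

The same shape of function works for any gesture-to-parameter route: reverb mix, light intensity, or instance jitter in the point cloud above.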
Beyond live performance, this project has potential commercial applications in fields like electronic music visualization. The workflows I am designing could be adopted by VJs or multimedia artists seeking tools for dynamic, real-time interaction between human movement and digital media. This cross-disciplinary approach underscores my commitment to developing tools that are not only artistically innovative but also practically useful in broader creative industries.