Technology

Author/Title | Research Type | Related Fields
Zhuodi Cai. 2026. How I Perceive It. CVPR 2026 (Conference on Computer Vision and Pattern Recognition) Art Gallery, Denver, CO, USA, and online. Creative Work, Technology
Rose Xu, ViiVAI Labs. Interactive Haptics Demos at the Intersection of Sensor Technology and Embodied Design. CoMotion Makerspace, University of Washington, Summer 2025. Technology, Projects
Zhuodi (Zoe) Cai, Ziyu (Rose) Xu, Juan Pampin. Human-Machine Ritual: Synergic Performance through Real-Time Motion Recognition. In Proceedings of the Thirty-Ninth Conference on Neural Information Processing Systems (NeurIPS 2025). Creative Work, Technology, Publications, Articles, Essays, Projects
Rose Xu, Git for Everyone!, OSC Community Fellows Workshop Series, co-led by UW Libraries and the eScience Institute, May 8, 2025. Technology
Musical auditory feedback BCI: clinical pilot study of the Encephalophone. Frontiers in Human Neuroscience, Sec. Brain-Computer Interfaces, Volume 19, June 15, 2025. Technology, Publications, Articles
Low-cost Ambisonic Mic, Rihards Vitols, 2019. Technology
J. Anderson. 2016. Ambisonics & the ATK @ DXARTS. Presented at the Pacific Northwest AES Section Meeting, Seattle, Nov 2017. Technology
The 9e2 Festival performance of the Encephalophone Ensemble. Creative Work, Technology
Fetz, E.E. Volitional control of neural activity: implications for brain-computer interfaces. Journal of Physiology (Lond.), 579.3: 571-579, 2007. Technology
Ambisonics and 3-D Audio Playback Systems Technology
Ultrasonic Beamforming Technology
Ambisonic Toolkit (ATK) Technology
Analysis, Transformation and Synthesis (ATS) Technology
Ongoing
Ambisonics and 3-D Audio Playback Systems (image: DXARTS Sound Lab layout mockup)
Ultrasonic Beamforming (image: an early prototype of an ultrasonic speaker array)
Ambisonic Toolkit (ATK)
Analysis, Transformation and Synthesis (ATS) (image: snapshot of the ATSH editor)
Forthcoming
How I Perceive It (System Application in Art)
Human-Machine Ritual: Synergic Performance through Real-Time Motion Recognition. (Image: real-time human-machine interaction during rehearsal. A dance artist performs in front of a projection controlled by the system described in the paper; the laptop screen shows the multimedia interface, while wearable IMU sensors and a smartphone stream and monitor real-time data.)
Completed/published
Summer Internship with ViiVAI Labs and CoMotion Makerspace. Using IMU-based motion recognition, Python pipelines, and real-time integration with TouchDesigner and SuperCollider, I designed a wand interface that lets users cast spells through gestures while experiencing synchronized visuals, sounds, and tactile feedback.
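A pipeline of that shape can be sketched roughly as follows: classify a short window of IMU accelerometer samples by peak magnitude, then map the gesture label to an OSC-style address that a sound or visual engine (such as SuperCollider or TouchDesigner) could listen for. All function names, thresholds, and the `/wand/...` address here are illustrative assumptions, not the project's actual code.

```python
import math

def magnitude(sample):
    """Euclidean norm of one (x, y, z) accelerometer sample."""
    return math.sqrt(sum(a * a for a in sample))

def classify_gesture(window, flick_threshold=2.5):
    """Label a window of IMU samples as 'flick' (sharp peak) or 'hold'.

    The threshold is a made-up value in g-units; a real system would
    calibrate it per sensor and likely use a trained classifier instead.
    """
    peak = max(magnitude(s) for s in window)
    return "flick" if peak >= flick_threshold else "hold"

def to_osc_address(label):
    """Map a gesture label to a hypothetical OSC-style address string."""
    return f"/wand/{label}"

if __name__ == "__main__":
    still = [(0.0, 0.0, 1.0)] * 10            # gravity only: resting hand
    flick = still + [(2.0, 2.0, 1.0)]         # one sharp spike in the window
    print(to_osc_address(classify_gesture(still)))  # -> /wand/hold
    print(to_osc_address(classify_gesture(flick)))  # -> /wand/flick
```

In a live setup the address/label pair would be sent over UDP (for example with an OSC client library) to SuperCollider's default port, which is the kind of glue the entry's "real-time integration" refers to.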
Git for Everyone! (image: Rose teaching the workshop)
Musical auditory feedback BCI (image: Encephalophone block diagram)
Low-cost Ambisonic Mic (image: the microphone in parts)
Ambisonics & the ATK @ DXARTS
Encephalophone Ensemble at 9e2 Festival
Closed-loop Brain-Computer Interface