Neuroscientist blends dance with science at UN AI summit

University of Houston neuroscientist Jose Luis Contreras-Vidal, a pioneer in brain-machine interfaces, will present at the United Nations AI for Good Global Summit in Geneva, Switzerland, on May 31. Known for his work on brain-controlled wearable exoskeletons for rehabilitation and on mapping art-evoked brain activity, Contreras-Vidal will not only deliver a talk but also showcase “Meeting of Minds,” a blend of artistic performance and scientific experiment.

Created in collaboration with Rice University and Sam Houston State University, the presentation features dancers wearing EEG skull caps while they perform; the University of Houston team will record their brainwaves throughout the performance. "We created an elegant, engaging, and aesthetic approach to observing the creative brain in a dynamic state," said Contreras-Vidal.

Contreras-Vidal’s research focuses on reverse engineering the brain to develop new interfaces that enable direct communication between the brain and external devices such as robots or prosthetics. Anthony Brandt of Rice University emphasized the significance of the arts in understanding pattern recognition and memory.

The invitation to speak at the summit stems from Contreras-Vidal’s pioneering research at the UH BRAIN Center (Building Reliable Advances and Innovations in Neurotechnology), funded by the National Science Foundation. His work has extended into examining brain activity during expressive movement in social contexts.

In 2022, Contreras-Vidal began collaborating with choreographers Andy and Dionne Noble from Sam Houston State University’s Noble Motion Dance company. Their joint project “Live Wire” was successfully performed in Houston and later reprised at an international workshop in Virginia.

The project evolved into “Diabelli 200,” where performers wore neuroimaging equipment during live performances. UH graduate students Maxine Annel Pacheco Ramírez and Aime Aguilar-Herrera recorded mobile brain imaging data during these performances. Shepherd School violinist Nanki Chugh analyzed this data and won first place for her presentation on neural dynamics between conductor and pianist.

“Meeting of Minds” addresses social division through a dance featuring Lauren Serrano and Tyler Orcutt, who transition from conflict to cooperation using choreographic elements known to trigger neural synchrony. Projections designed by Shepherd School doctoral candidate Badie Khaleghian allow audiences to observe the dancers’ brain responses in real time.

Brandt highlighted that the performance marks progress toward studying human behavior in natural settings using mobile brain imaging techniques. Contreras-Vidal added that the research could lead to personalized art prescriptions, such as music-based interventions, aimed at improving health and wellbeing.

The May 31 performance of “Meeting of Minds” will be streamed live on the AI for Good Global Summit website starting at 2 a.m. Central time.
