Enjoy a series of one-minute samples of our experiments and performances from 2014 to 2020.

For full descriptions and credits of performances, please visit our Performances page. There you will find longer videos of our major works.

 

Electromyograph, 2020 - detects the electrical activity of muscle tissue and represents it as sound, driving the vibration of large metal panels.

Excerpt of AI Sensorium, Act III. Choreography by Daiane Lopes da Silva, Soundscape by Patricia Alessandrini. Performance by Samuel Melecio-Zambrano and Julia Rubies Subiros, who wear electromyography sensors to trigger sound. The device is connected wirelessly to the large panels designed by Patricia and created by Michael Koehle.
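For the curious, here is a minimal sketch of how an electromyography signal can become a control value for sound: the raw signal is rectified and smoothed into an amplitude envelope, which can then drive the gain of a transducer. The smoothing factor, the gain mapping, and the stand-in samples are assumptions for illustration, not the actual system built for AI Sensorium.

```cpp
#include <algorithm>
#include <cmath>
#include <iostream>
#include <vector>

// Minimal sketch: turn a raw EMG signal into a smooth amplitude envelope
// that could drive the gain of a transducer attached to a metal panel.
// The smoothing constant and the gain mapping are illustrative assumptions.
int main() {
    const double alpha = 0.05;   // smoothing factor of the envelope follower
    double envelope = 0.0;

    // Stand-in for samples arriving from an EMG sensor (normalized -1..1).
    std::vector<double> emg = {0.02, -0.4, 0.6, -0.1, 0.8, -0.7, 0.05};

    for (double sample : emg) {
        double rectified = std::fabs(sample);         // full-wave rectification
        envelope += alpha * (rectified - envelope);   // one-pole low-pass filter
        double gain = std::min(1.0, envelope * 2.0);  // map envelope to output gain
        std::cout << "gain driving panel transducer: " << gain << "\n";
    }
    return 0;
}
```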

E.C.G. - collects biometric data from the dancer's heart. Weidong Yang translated the data into visual art, and Tim Russell into music.

Nominated for Outstanding Visual Design at the Izzies (Isadora Duncan Dance Awards).

Choreography: Daiane Lopes da Silva in collaboration with dancers. Dancers: Hien Huynh, Daiane Lopes da Silva, Nathaniel Moore, Juliet Paramor, Hannah Wasielewski. Visual art: Weidong Yang. Composer: Tim Russell. Installation: Tanja London.
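As a rough illustration of how heartbeat data can drive art, the sketch below detects R-peaks (beats) in an ECG stream with a simple threshold and refractory period, so that each beat could trigger a visual or musical event. The threshold, refractory window, and sample values are illustrative assumptions, not the pipeline used in the piece.

```cpp
#include <iostream>
#include <vector>

// Minimal sketch: detect heartbeats (R-peaks) in an ECG stream with a
// threshold plus a refractory period, so each beat can trigger an event.
// Threshold and refractory length are illustrative assumptions; in practice
// the refractory window would cover roughly 200 ms of samples.
int main() {
    const double threshold = 0.6;  // amplitude above which we call a beat
    const int refractory = 4;      // samples to wait before accepting the next beat

    // Stand-in for normalized ECG samples from the sensor.
    std::vector<double> ecg = {0.1, 0.2, 0.9, 0.3, 0.1, 0.05, 0.1, 0.85, 0.2, 0.1};

    int sinceLastBeat = refractory;
    for (std::size_t i = 0; i < ecg.size(); ++i) {
        ++sinceLastBeat;
        if (ecg[i] > threshold && sinceLastBeat >= refractory) {
            std::cout << "beat at sample " << i << " -> trigger visual/sound event\n";
            sinceLastBeat = 0;
        }
    }
    return 0;
}
```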

MOCAP - Motion Capture - A full-body suit that digitally records the movement of people, in this case the movement of a dancer.

Excerpt of MESH, 2017. Choreography by Daiane Lopes da Silva in collaboration with dancers. System design and digital art by Travis Bennett. Performance by Juliet Paramor. Original music by Ben Juodwalkis. For full credits and description of MESH, please visit our Performances page.

“What Does the Bot Say to the Human?”, 2017. Installation by Jyiai Young, Weidong Yang and Shih Wen. Across its multiple phases, this work transforms Twitter data from the 2016 United States presidential election into a large-scale installation, probing how artificial intelligence (AI), working through social media, assumes form and shapes the future of a nation.

Digital Painting - Raymond Larrett and Shoshana Green brought Butoh and visual art together during our Dance Hack in 2016.

Dance Hack happens every December at CounterPulse, SF. Please visit our Dance Hack page for more information about this event.

Citizen Dance - Our incredible team of scientists and designers (Weidong Yang, Sonny Green, Travis Bennett and Raymond Larrett) created this app for UC Berkeley in 2016. Citizen Dance shines the spotlight on the thriving dance culture on campus and celebrates dance as an essential part of public life and culture at UC Berkeley. In each of the three performance locations, the app released content unique to that performance, with options for audience engagement.

Touch Screen, 2015 - developed by scientist and visual artist Weidong Yang. A wall is transformed into a giant touch screen using a 3D depth sensor, and dancers interact with a fluid-dynamics simulation to create a live visualization.

Improvisation by Daiane Lopes da Silva and Anastasia Kostner at the Open Lab.
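As a rough sketch of the depth-sensor idea behind Touch Screen: pixels whose measured depth falls just in front of the wall plane can be treated as touch points, and their centroids can feed forces into a simulation. The example below uses OpenCV with a synthetic depth frame; the wall distance, touch band, and overall pipeline are assumptions, not the actual implementation.

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

// Minimal sketch of the depth-camera "touch wall" idea: pixels whose depth
// falls just in front of the wall plane count as touches, and their centroids
// could then inject forces into a fluid simulation. The wall distance, band
// width, and synthetic depth frame are assumptions.
int main() {
    const float wallDepthMm = 2000.0f;  // assumed distance from sensor to wall
    const float bandMm = 60.0f;         // how far in front of the wall counts as a touch

    // Synthetic depth frame (millimetres); a real sensor would supply this.
    cv::Mat depth(480, 640, CV_32F, cv::Scalar(wallDepthMm));
    cv::circle(depth, {320, 240}, 25, cv::Scalar(wallDepthMm - 30.0f), cv::FILLED); // a "hand"

    // Keep only pixels in the band just in front of the wall.
    cv::Mat touchMask;
    cv::inRange(depth, cv::Scalar(wallDepthMm - bandMm), cv::Scalar(wallDepthMm - 5.0f), touchMask);

    // Each blob in the mask is a touch; its centroid is a candidate force position.
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(touchMask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    for (const auto& c : contours) {
        cv::Moments m = cv::moments(c);
        if (m.m00 > 0)
            std::cout << "touch at (" << m.m10 / m.m00 << ", " << m.m01 / m.m00 << ")\n";
    }
    return 0;
}
```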

Time Lapse, 2015 - experiments at the Kinetech Arts Open Lab.

Participants: Raymond Larrett, Daiane Lopes da Silva, Esha Nambiar, Erin Alexi Huestis and Megan Meyers.

Slit Scan, 2016 - developed by Weidong Yang.

A real slit-scan camera is an instrument for measuring ultrafast events in physics laboratories. We adapted the technique to distort reality.
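The principle is easy to sketch: copy one vertical strip of pixels from every incoming video frame and lay the strips side by side, so the horizontal axis of the output becomes time. The example below uses OpenCV; the file names and slit position are placeholders, and this is only an illustration of the idea, not Weidong Yang's implementation.

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Minimal sketch of the slit-scan principle: copy one vertical strip of
// pixels from every frame and concatenate the strips, so the horizontal
// axis of the output image becomes time. File names are placeholders.
int main() {
    cv::VideoCapture cap("dance.mp4");          // placeholder input video
    if (!cap.isOpened()) return 1;

    std::vector<cv::Mat> slits;
    cv::Mat frame;
    while (cap.read(frame)) {
        int x = frame.cols / 2;                 // sample the centre column
        slits.push_back(frame.col(x).clone());  // clone so the data outlives `frame`
    }
    if (slits.empty()) return 1;

    cv::Mat scan;
    cv::hconcat(slits, scan);                   // stack the columns left to right
    cv::imwrite("slitscan.png", scan);
    return 0;
}
```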

Interactive Remote Ink, 2016 - a prototype in which remote movement triggers an ink simulation.

Dancer: Anastasia Kostner. Prototype by Weidong Yang, implemented in openFrameworks with the ofxFluid addon.
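As a toy illustration of the idea (not the ofxFluid prototype itself), the sketch below lets incoming (x, y) positions, standing in for movement data received from a remote dancer, deposit ink into a small grid that is then spread by a simple diffusion step. The grid size, deposit amount, and diffusion rate are illustrative assumptions.

```cpp
#include <array>
#include <iostream>

// Toy sketch of "remote movement triggers ink": received positions deposit
// ink into a density grid, and a diffusion step spreads it out each frame.
constexpr int N = 16;
using Grid = std::array<std::array<float, N>, N>;

void deposit(Grid& g, int x, int y, float amount) {
    if (x >= 0 && x < N && y >= 0 && y < N) g[y][x] += amount;
}

void diffuse(Grid& g, float rate) {
    Grid next = g;
    for (int y = 1; y < N - 1; ++y)
        for (int x = 1; x < N - 1; ++x)
            next[y][x] = g[y][x] + rate * (g[y][x - 1] + g[y][x + 1] +
                                           g[y - 1][x] + g[y + 1][x] - 4 * g[y][x]);
    g = next;
}

int main() {
    Grid ink{};  // zero-initialized ink density field
    // Stand-in for positions received over the network from the remote dancer.
    int path[][2] = {{4, 4}, {5, 5}, {6, 5}, {7, 6}};
    for (auto& p : path) {
        deposit(ink, p[0], p[1], 1.0f);
        diffuse(ink, 0.2f);
    }
    std::cout << "ink density at (5, 5): " << ink[5][5] << "\n";
    return 0;
}
```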

Time Bubble, 2015 - the dancers create fictitious environments in which they ponder multiple modes of reality, segmenting and reconnecting space and time. Their movements and actions activate a “digital bubble” that acts as a portal to the past, present, and future.


Slit Camera, 2015.

The following section is in progress… we are updating this page. Thanks for your patience, and please check back soon!

Streak Camera, 2015

Markov Chain, 2014

Fractal Machine, 2014

Mosaic, 2014

Silkworm, 2014

AP Camera, 2014