In collaboration with UC Davis we are developing technology that allows geoscientists at spatially distributed locations to jointly explore shared 3D data sets. The system is built on three key principles: interactive visualization, individual exploration, and collaborative analysis. Users can explore the data independently while interacting in real time with their physically distant collaborators. Discussion between remote users is supported by the system's communication channels, such as group audio, webcam video, or 3D video. We have performed several experiments between UC Davis and UC Berkeley in which two geoscientists remotely interacted with a tomographic model of the Earth's interior beneath the western U.S.
->Video (Nov 2009)
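The three principles above can be sketched in code. The following is a minimal illustration, not the lab's actual software: class and method names are hypothetical. Each client keeps its own camera (individual exploration) while annotations on the shared model are serialized and replicated to every remote site (collaborative analysis).

```python
import json

class Client:
    """Hypothetical sketch of one collaborating site (not the lab's API)."""

    def __init__(self, name):
        self.name = name
        # Local-only state: each site navigates the model independently.
        self.camera = {"position": [0.0, 0.0, 5.0], "target": [0.0, 0.0, 0.0]}
        # Shared state: replicated across all sites.
        self.shared_annotations = []

    def move_camera(self, position):
        # Purely local; remote collaborators are unaffected.
        self.camera["position"] = position

    def annotate(self, point, label, peers):
        # Shared: serialized and broadcast to every remote site.
        msg = json.dumps({"point": point, "label": label, "by": self.name})
        for peer in peers:
            peer.receive(msg)
        self.receive(msg)

    def receive(self, msg):
        self.shared_annotations.append(json.loads(msg))

davis = Client("UC Davis")
berkeley = Client("UC Berkeley")
davis.move_camera([10.0, 0.0, 0.0])  # independent viewpoint, not shared
davis.annotate([1.2, -0.5, 3.0], "low-velocity anomaly", [berkeley])
```

After the call, both sites hold the same annotation list, while Berkeley's camera is untouched by Davis's navigation.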
Tele-immersion allows a joint performance by two dancers at two remote sites. Our lab is collaborating with the Department of Dance and Theater at UC Berkeley and the University of Illinois at Urbana-Champaign (UIUC). Each dancer's body motion is captured by several stereo cameras and streamed in real time to the remote site. The two streams are joined in a 3D virtual space where the dancers see their rendered bodies. Based on this visual feedback, the dancers can synchronize their movements.
- Portable Dance Lab - demonstrating performance of the portable teleimmersion system with 4 stereo cameras (Nov 2010)
- Two dancers interacting through a portable system (Feb 2009)
- Two dancers in virtual space (with music) (Dec 2006)
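Joining the two streams amounts to placing each site's captured points into a common coordinate frame. The sketch below is illustrative only (the lab's real pipeline is not shown here): each site's point cloud gets a per-site rigid transform, here a yaw rotation plus a translation so the two dancers face each other on a shared stage.

```python
import math

def place_in_stage(points, yaw, offset):
    """Rotate points about the vertical (y) axis, then translate them."""
    c, s = math.cos(yaw), math.sin(yaw)
    placed = []
    for x, y, z in points:
        placed.append((c * x + s * z + offset[0],
                       y + offset[1],
                       -s * x + c * z + offset[2]))
    return placed

# One frame from each site (toy data standing in for stereo-camera output).
site_a = [(0.0, 1.7, 0.0), (0.1, 1.0, 0.0)]
site_b = [(0.0, 1.7, 0.0), (-0.1, 1.0, 0.0)]

# Site A stands at x = -1 facing forward; site B at x = +1, turned around.
scene = (place_in_stage(site_a, yaw=0.0, offset=(-1.0, 0.0, 0.0)) +
         place_in_stage(site_b, yaw=math.pi, offset=(1.0, 0.0, 0.0)))
# 'scene' now holds both dancers in one coordinate frame, ready to render.
```

Rendering this merged scene at both sites is what gives each dancer the visual feedback needed to synchronize with the remote partner.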
Live dance, robotic cameras, and audio technologies, juxtaposed with large- and small-scale projections, draw audience members into a 360° performance that they can enter at any time. Inspired by 9 Evenings: Theatre and Engineering, staged in New York's 69th Regiment Armory by artist Robert Rauschenberg and Bell Laboratories engineer Billy Klüver, Panorama brings together a multidisciplinary cast of dance makers, artists, scientists, engineers, roboticists, and digital game makers to create an evening of interactive and technologically alive theater, honoring the cutting-edge collaborations and technological explorations that are the hallmark of the Merce Cunningham and John Cage legacy.
Co-sponsored by Cal Performances, the Department of Theater, Dance, and Performance Studies (TDPS), the Center for Technology Research in the Interest of Society (CITRIS), and the Berkeley Center for New Media (BCNM). Artistic Director: Lisa Wymore
Using our tele-immersion system we have successfully performed several remote experiments with UIUC. On Dec 8, 2006, the tele-immersive dance was presented as part of "The Resonance Project" at the CITRIS Holiday Gala 2006. This was the first public performance of a remote collaborative dance using a tele-immersive environment. View the archived webcast of "The Resonance Project" live performance from the CITRIS Holiday Gala 2006:
->CITRIS Holiday gala, part I
->CITRIS Holiday gala, part II
Remote Dance Experiments (2007)
Another live performance was featured in the BERKELEY DANCE PROJECT in April 2007, streaming two dancers from the Teleimmersion Lab to the Zellerbach theater on the UC Berkeley campus. In addition, a demonstration of live bi-located dance between UC Berkeley and the University of Illinois at Urbana-Champaign was also presented.
In collaboration with the W.M. Keck Center for Active Visualization in the Earth Sciences at UC Davis we are exploring new rendering capabilities, including rendering in the KeckCAVE environment. The KeckCAVE framework allows exploration, manipulation, and creation of 3D datasets and models.
View video of teleimmersion dance data in Cave environment -> Click here
Immersive virtual settings allow integration of real-time full-body 3D capture with pre-recorded data. Such environments offer a new approach to learning physical activities such as tai chi. In this work, participants are taught several tai chi moves using either a 2D video system or a 3D immersive system. In contrast to traditional 2D video, the 3D immersive environment allows the scene to be viewed from any angle. The aim of the research is to demonstrate that people learn the movements faster when using immersive virtual reality. The Teleimmersion Lab collaborates with the Virtual Human Interaction Lab at Stanford University.
View 3D video of the experiment -> Click here
Motion capture with multiple stereo cameras produces a large amount of 3D point data that must be sent over the network in real time. This work investigates compression algorithms based on skeletonization of the human body.
View video of the skeletonization -> Click here
Read technical report on "Model Driven Compression of 3-D Tele-Immersion Data" -> Click here
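The idea of model-driven compression can be sketched as follows. The details below are assumptions for illustration, not taken from the report: each captured point is assigned to its nearest skeleton joint, and instead of raw coordinates, only the joint positions plus coarsely quantized per-point offsets are transmitted, bounding the reconstruction error by the quantization step.

```python
import math

def compress(points, joints, step=0.05):
    """Encode each point as (nearest-joint index, quantized offset)."""
    encoded = []
    for p in points:
        j = min(range(len(joints)), key=lambda i: math.dist(p, joints[i]))
        offset = tuple(round((p[k] - joints[j][k]) / step) for k in range(3))
        encoded.append((j, offset))  # small integers, cheap to pack and send
    return joints, encoded

def decompress(joints, encoded, step=0.05):
    """Rebuild approximate points from joint positions and offsets."""
    return [tuple(joints[j][k] + o[k] * step for k in range(3))
            for j, o in encoded]

joints = [(0.0, 1.0, 0.0), (0.0, 1.5, 0.0)]      # toy two-joint skeleton
points = [(0.02, 1.04, 0.0), (0.01, 1.52, 0.03)]  # toy captured points
_, enc = compress(points, joints)
rec = decompress(joints, enc)
# Per-axis reconstruction error is at most step / 2 = 0.025.
```

A real system would additionally track the skeleton over time so that only joint motion and changed offsets need to be retransmitted per frame.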