Virtual & Mixed Reality + Experiential Design

Wake : Exploring Co-Located Co-Presence in Virtual Reality

This page is a work in progress for my Master of Science in Computational Design thesis project, “Wake : A Guided Mixed Reality Experience,” at Carnegie Mellon University.

Created in collaboration with the multi-disciplinary performance duo slowdanger (Anna Thompson and Taylor Knight), this work continues our research into embodied interaction, co-presence, and the shifting boundary between the physical and the digital in virtual and mixed reality experiences.

Wake is a Guided Mixed Reality Experience for one participant in-headset, one dancer, and one facilitator. The experience is built in Unity 3D for the HTC Vive, with Vive trackers and an Intel RealSense depth camera placed at the position of the participant’s eyes on the headset. The project uses a custom algorithm to stream and render, in real time, high-resolution video (synced depth + RGB feeds) of a physically and virtually co-present dancer.
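For illustration only, the sketch below shows the basic capture pattern the renderer depends on: pulling synchronized, spatially aligned depth and RGB frames from a RealSense camera. It is written in Python with Intel’s pyrealsense2 SDK rather than the project’s actual Unity/C# pipeline, and the stream resolutions and frame rates are assumptions.

```python
# Minimal sketch (not the project's Unity/C# code): grab synchronized,
# spatially aligned depth + RGB frames from an Intel RealSense camera
# using the official pyrealsense2 SDK. Resolutions/frame rates are
# illustrative assumptions.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)    # depth, millimeters
config.enable_stream(rs.stream.color, 1280, 720, rs.format.rgb8, 30)  # RGB feed

pipeline.start(config)
align = rs.align(rs.stream.color)  # resample depth into the color camera's viewpoint

try:
    while True:
        frames = pipeline.wait_for_frames()
        aligned = align.process(frames)        # depth + RGB now share one viewpoint
        depth_frame = aligned.get_depth_frame()
        color_frame = aligned.get_color_frame()
        if not depth_frame or not color_frame:
            continue

        depth = np.asanyarray(depth_frame.get_data())  # (720, 1280) uint16
        color = np.asanyarray(color_frame.get_data())  # (720, 1280, 3) uint8

        # A real-time renderer (e.g. a Unity point-cloud shader fed through a
        # native plugin or socket) would deproject each depth pixel to a 3D
        # point and texture it with the matching RGB value.
finally:
    pipeline.stop()
```

Aligning depth into the color frame is what lets each rendered point carry the correct RGB sample, which is the essence of the synced depth + RGB rendering described above.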

VR Headset Prototype

Vive Pro headset with an Intel RealSense depth camera (infrared sensors + RGB) mounted to the front. The RealSense camera signal is extended via fiber optic cable, with power from an external battery pack.

Wake Prototype 1

The following images are from Wake Prototype 1, performed at the Carnegie Museum of Art, September 2018. Ru Emmons performed the dancer’s role.

Wake : Prototype 2

The following documentation is from Wake Prototype 2, performed at the Humans X Tech event at Thrival Festival. slowdanger (Anna Thompson and Taylor Knight) performed.
