CT + VXD Intern Project
Ekemini Nkanta & Gabriel Drozdov
Summer 2022
Watch our demo! »
Final Presentation: Keynote · PDF · Zoom Recording
For our summer intern project, we challenged ourselves to make augmented reality more accessible, engaging, & group-friendly. The result was Reality Check: a prototype for room-scale AR experiences that offers seamless, body-driven interaction between people and the virtual world. Visitors are immersed via a large-format "mirror" display, rather than a headset or phone screen, and can walk around the space (or reach out and "touch" virtual objects) in order to trigger music/sounds, environmental changes, immersive cutscenes, and other events.
Because it removes most technical limitations, as well as AR's main barrier to entry, this approach lets exhibits use the medium to its fullest potential.
Reality Check is powered by the ZED 2 stereo camera, which uses AI to replicate human vision.
This, combined with the Unity game engine, offers exciting capabilities:
- Body & object detection that uniquely identifies & tracks entities within 40m, without the need for markers, bodysuits, or tracking devices
- Proper occlusion between virtual & real objects
- 3D interaction using depth sensing & spatial triggers
- Dynamic scene elements, such as adaptive audio and lighting
- Support for large groups both indoors and outdoors, with no hard limits on capacity
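The spatial-trigger interaction above can be sketched as an ordinary Unity trigger volume: the ZED plugin drives a tracked GameObject through the room, and when it enters the zone an event fires. The "Visitor" tag, the serialized fields, and the wiring below are illustrative assumptions, not the project's actual code.

```csharp
using UnityEngine;

// Minimal sketch of a body-driven spatial trigger, assuming the ZED
// plugin moves a collider-bearing, "Visitor"-tagged object through
// the room as the real person walks.
[RequireComponent(typeof(Collider))]
public class SpatialTrigger : MonoBehaviour
{
    [SerializeField] AudioSource zoneAudio;        // sound cue for this zone
    [SerializeField] Light zoneLight;              // virtual light to change
    [SerializeField] Color activeColor = Color.cyan;

    void Reset()
    {
        // Trigger volumes detect overlaps without physically blocking anyone.
        GetComponent<Collider>().isTrigger = true;
    }

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Visitor")) return;  // hypothetical tag
        if (zoneAudio != null && !zoneAudio.isPlaying) zoneAudio.Play();
        if (zoneLight != null) zoneLight.color = activeColor;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Visitor") && zoneAudio != null) zoneAudio.Stop();
    }
}
```

Because the volume is just a collider, the same component works whether the "Visitor" object is driven by skeleton tracking, object detection, or a keyboard during desk testing.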
- Integrate ZED 2 stereo camera with the Unity game engine
- Prototype a "mirror" display that can detect and augment 1-3 objects in 3D, as well as reveal info / animated surprises for each one
- Pull in creative experimentation with sound & light from our Dynamic Soundscape and Light as Narrator proposals
- Conjure up bigger, mind-bending concepts for a room-scale mirror (puzzles, illusions, 3D minigames to play with trackable objects...)
- Run ZED tests with pre-built Unity scenes included in SDK
- Skeleton tracking
  - 3D Avatar (rigged mannequin is included!)
  - 2D Skeleton
- Object detection (note: bodies register as objects too; see the full list of detectable objects)
  - 2D bounding box and mask
  - 3D bounding box
- Virtual object placement
  - Planetarium (good example for mixed reality & virtual lights)
  - Movie Screen (use it as a test backdrop to see how sharp the silhouette mask is)
- Skeleton tracking
- Map a custom avatar to the user’s body
- Place virtual objects/scenery in a physical room
- Create interactions between the visitor and the virtual objects using trigger volumes
- Add adaptive audio
  - ...using FMOD?
- Simulate Unity lights illuminating real objects
- Add dynamic light source(s): changing color, intensity, direction…
- Create a short AR cinematic that respects the geometry of the physical room using Unity Timeline
- Detect a custom object by training our own machine learning model (a.k.a. not a person, vehicle, bag, animal, electronic, fruit/vegetable, or sports ball)
- Map a virtual object (3D model) to an artifact
  *Edit: I discovered from the tests that we can't track a detected object's rotation at any given time, only its position in the scene and which pixels on screen belong to it (2D mask). But! We can still try this with larger/standing objects whose real-world counterparts wouldn't rotate during a session, like statues and columns.*
- Display annotations/pop-ups over it
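The "adaptive audio ... using FMOD?" milestone could take a shape like this: a looping FMOD Studio event exposes a parameter that Unity updates as people enter or leave the space, so the mix reacts in real time. The event path, the "Visitors" parameter, and the source of the count are all assumptions for illustration.

```csharp
using UnityEngine;
using FMODUnity;
using FMOD.Studio;

// Sketch of adaptive audio via FMOD Studio, assuming an authored
// looping event "event:/Ambience" with a "Visitors" parameter that
// crossfades layers as the crowd grows. All names are hypothetical.
public class AdaptiveAmbience : MonoBehaviour
{
    EventInstance ambience;
    public int visitorCount;   // would be fed by the ZED body tracker

    void Start()
    {
        ambience = RuntimeManager.CreateInstance("event:/Ambience");
        ambience.start();
    }

    void Update()
    {
        // FMOD interpolates the parameter change, so layers fade smoothly
        // instead of snapping when the count jumps.
        ambience.setParameterByName("Visitors", visitorCount);
    }

    void OnDestroy()
    {
        ambience.stop(FMOD.Studio.STOP_MODE.ALLOWFADEOUT);
        ambience.release();
    }
}
```

Keeping the crowd-to-sound mapping inside the FMOD project (rather than in code) would let a sound designer retune the layers without touching Unity.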
- "Fantasy plants Set - Hand-painted" by Victoria is licensed under Creative Commons Attribution.
- "The Altar (object №1)" by salinaforr is licensed under Creative Commons Attribution.
- "Runestones" by Athikar is licensed under Creative Commons Attribution-NonCommercial.
- "Stylized Foliage" by soidev is licensed under Creative Commons Attribution.
- "low poly Ivy" by spicybamer is licensed under Creative Commons Attribution.
- "Western Cowboy (Rigged)" by human being is licensed under Creative Commons Attribution.
- "Low Poly Western Saloon" by Infima Games is licensed under Standard Unity Asset Store EULA.
- "Worldskies Free Skybox Pack" by PULSAR BYTES is licensed under Standard Unity Asset Store EULA.
- "Reflect" icon is by creative_designer from Flaticon.