Reality Check

CT + VXD Intern Project
Ekemini Nkanta & Gabriel Drozdov
Summer 2022

Watch our demo! »

Final Presentation: Keynote · PDF · Zoom Recording

Introduction

For our summer intern project, we challenged ourselves to make augmented reality more accessible, engaging, & group-friendly. The result was Reality Check: a prototype for room-scale AR experiences that offers seamless, body-driven interaction between people and the virtual world. Visitors are immersed via a large-format "mirror" display rather than a headset or phone screen, and can walk around the space (or reach out and "touch" virtual objects) to trigger music/sounds, environmental changes, immersive cutscenes, and other events.

By removing AR's main barrier to entry (the headset) along with most of its technical limitations, Reality Check lets exhibits use AR to its fullest potential.

Behind the Scenes

Reality Check is powered by the ZED 2 stereo camera, which uses stereo imaging and neural networks to perceive depth much like a pair of human eyes.

This, combined with the Unity game engine, offers exciting capabilities:

  • Body & object detection that uniquely identifies & tracks entities within 40m, without the need for markers, bodysuits, or tracking devices
  • Proper occlusion between virtual & real objects
  • 3D interaction using depth sensing & spatial triggers
  • Dynamic scene elements, such as adaptive audio and lighting (a lighting sketch follows this list)
  • Support for large groups both indoors and outdoors, with no hard limits on capacity
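
For a sense of how the adaptive-lighting piece might look in Unity, here is a minimal C# sketch. The `visitor` Transform is an assumption: in practice it would come from the ZED plugin's tracked body, and the distances and colors are placeholders.

```csharp
using UnityEngine;

// Minimal sketch: a Unity Light that brightens and warms as a visitor
// approaches. `visitor` is assumed to be the root Transform of the
// ZED-tracked body (e.g. the plugin's avatar instance); exactly where
// that Transform comes from depends on the scene setup.
public class ProximityLight : MonoBehaviour
{
    public Transform visitor;        // tracked person (assumption: supplied by the ZED rig)
    public Light sceneLight;         // the virtual light to drive
    public float maxDistance = 5f;   // beyond this, the light rests at its dim state

    void Update()
    {
        if (visitor == null || sceneLight == null) return;

        // 0 when the visitor is far away, 1 when they reach the light
        float t = 1f - Mathf.Clamp01(
            Vector3.Distance(visitor.position, transform.position) / maxDistance);

        sceneLight.intensity = Mathf.Lerp(0.2f, 2f, t);
        sceneLight.color = Color.Lerp(Color.blue, Color.yellow, t);
    }
}
```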

Goals & Objectives

  • Integrate ZED 2 stereo camera with the Unity game engine
  • Prototype a "mirror" display that can detect and augment 1-3 objects in 3D, as well as reveal info / animated surprises for each one
  • Pull in creative experimentation with sound & light from our Dynamic Soundscape and Light as Narrator proposals
  • Conjure up bigger, mind-bending concepts for a room-scale mirror (puzzles, illusions, 3D minigames to play with trackable objects...)

Main Objectives

  • Run ZED tests with the pre-built Unity scenes included in the SDK
    • Skeleton tracking
      • 3D Avatar (rigged mannequin is included!)
      • 2D Skeleton
    • Object detection (note: bodies register as objects too! See the full list of detectable objects)
      • 2D bounding box and mask
      • 3D bounding box
    • Virtual object placement
    • Planetarium: a good example of mixed reality & virtual lights
    • Movie Screen: use it as a test backdrop to see how sharp the silhouette mask is
  • Map a custom avatar to the user’s body
  • Place virtual objects/scenery in a physical room
  • Create interactions between the visitor and the virtual objects using trigger volumes (see the sketch after this list)
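
To make the trigger-volume idea concrete, here is a minimal Unity C# sketch. The "Visitor" tag and the attached effects are our own assumptions about scene setup, not anything prescribed by the ZED SDK.

```csharp
using UnityEngine;

// Minimal sketch of a spatial trigger: an invisible volume (a Collider
// with "Is Trigger" checked) that fires when the visitor's collider
// enters it. Unity only sends trigger events if at least one of the two
// objects carries a Rigidbody, so the tracked avatar needs one.
[RequireComponent(typeof(Collider))]
public class VirtualObjectTrigger : MonoBehaviour
{
    public AudioSource chime;          // sound to play on "touch"
    public GameObject revealEffect;    // e.g. a particle burst or pop-up

    void OnTriggerEnter(Collider other)
    {
        // "Visitor" is a tag we assume is assigned to the tracked body
        if (!other.CompareTag("Visitor")) return;

        if (chime != null) chime.Play();
        if (revealEffect != null) revealEffect.SetActive(true);
    }
}
```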

Reach Objectives

  • Add adaptive audio
    • ...using FMOD? (an FMOD sketch follows this list)
  • Simulate Unity lights illuminating real objects
  • Add dynamic light source(s): changing color, intensity, direction…
  • Create a short AR cinematic that respects the geometry of the physical room using Unity Timeline
  • Detect a custom object by training our own machine learning model
    (i.e., not a person, vehicle, bag, animal, electronic device, fruit/vegetable, or sports ball)
  • Map a virtual object (3D model) to an artifact
    *Edit: I discovered from the tests that we can't track a detected object's rotation - only its position in the scene and which pixels on screen belong to it (2D mask). But! We can still try this with larger/standing objects whose real-world counterparts wouldn't rotate during a session - like statues, columns… (see the sketch after this list)
    • Display annotations/pop-ups over it
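
For the FMOD route, a minimal sketch of adaptive audio might look like the following. The event path `event:/Soundscape` and the `Crowd` parameter are hypothetical names that would be authored in the FMOD Studio project; only the RuntimeManager/EventInstance calls come from the FMOD for Unity integration.

```csharp
using UnityEngine;

// Minimal sketch of adaptive audio via FMOD Studio's Unity integration:
// a continuous parameter ("Crowd" -- our own name, defined in the FMOD
// project) is driven by how many people are currently tracked.
public class AdaptiveSoundscape : MonoBehaviour
{
    FMOD.Studio.EventInstance music;

    void Start()
    {
        // "event:/Soundscape" is a placeholder event path
        music = FMODUnity.RuntimeManager.CreateInstance("event:/Soundscape");
        music.start();
    }

    // Call this whenever the ZED body tracker reports a new person count
    // (that wiring is ours, not part of either SDK).
    public void SetCrowdSize(int people)
    {
        music.setParameterByName("Crowd", people);
    }

    void OnDestroy()
    {
        music.stop(FMOD.Studio.STOP_MODE.ALLOWFADEOUT);
        music.release();
    }
}
```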
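
And for mapping a model to an artifact despite the rotation limitation noted above, one workable approach is to smooth the detected position and leave rotation fixed. Everything here is a sketch: `UpdateDetection` would be called from whatever script reads the ZED object-detection results, and the annotation offset is a placeholder.

```csharp
using UnityEngine;

// Minimal sketch of anchoring a virtual model to a detected artifact when
// only its position (not rotation) is available: ease toward the latest
// detection to hide per-frame jitter, and keep a fixed, hand-tuned
// rotation -- workable for statues, columns, and other objects that
// won't rotate mid-session.
public class ArtifactAnchor : MonoBehaviour
{
    public GameObject annotation;     // pop-up/info panel shown over the artifact
    public float smoothing = 8f;      // higher = snappier tracking
    public float labelHeight = 1.5f;  // metres above the artifact's centre

    Vector3 target;

    // Fed by the object-detection reader (an assumption, not SDK code)
    public void UpdateDetection(Vector3 detectedPosition)
    {
        target = detectedPosition;
    }

    void Update()
    {
        // Smooth the position; rotation is deliberately left untouched.
        transform.position = Vector3.Lerp(
            transform.position, target, smoothing * Time.deltaTime);

        if (annotation != null)
            annotation.transform.position = target + Vector3.up * labelHeight;
    }
}
```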

Credits
