By Cam Lunt and Ryan Reede
Goal
To illustrate what goes on behind the scenes of Virtual and Augmented Reality applications. To do so, a JavaScript visualization shows the transformations (rotations represented as quaternions) taking place in real time, revealing what portion of a 3D environment a GPU will render (and at what quality) so that a user in VR can look around and shift their viewpoint at any moment and still have frames prepared for them at over 75 Hz.
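As a rough sketch of the idea (THREE.js is assumed to be loaded, and the pose numbers below are invented for illustration): the head pose arrives as a quaternion, is applied to the camera, and a frustum test tells us which meshes fall inside the rendered view.

```javascript
// Sketch only: apply a head-pose quaternion to the camera and test the view frustum.
// Assumes a global THREE and a `scene` already populated with meshes.
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);

// Hypothetical pose sample: ~30 degrees of yaw plus a small positional shift.
const pose = {
  rotation: { x: 0, y: 0.259, z: 0, w: 0.966 }, // quaternion from the gyroscope
  position: { x: 0.1, y: 1.6, z: 0.0 },         // metres, from the accelerometer
};

camera.quaternion.set(pose.rotation.x, pose.rotation.y, pose.rotation.z, pose.rotation.w);
camera.position.set(pose.position.x, pose.position.y, pose.position.z);
camera.updateMatrixWorld(); // on cameras this also refreshes matrixWorldInverse

// Which parts of the scene land inside the current view frustum?
const frustum = new THREE.Frustum();
frustum.setFromProjectionMatrix(
  new THREE.Matrix4().multiplyMatrices(camera.projectionMatrix, camera.matrixWorldInverse)
);
scene.children.forEach((mesh) => {
  const inView = frustum.intersectsObject(mesh);
  // In the real pipeline this decides what gets rendered, and at what quality.
});
```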
Process
- Translational (accelerometer) and rotational (gyroscope) data generated by a Project Tango device
- Data sent via socket to a Kafka Producer
- Data queued and delivered to a Kafka Consumer (Jetty server)
- Logic performed in the Jetty backend and results sent via HTML5 WebSockets to the browser
- THREE.js visualization of the data (a sketch of this step follows the list)
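A minimal sketch of the last two steps, reusing the `camera` from the snippet above. The WebSocket path, port, and message shape are assumptions, not the project's actual protocol:

```javascript
// Hypothetical endpoint exposed by the Jetty backend; path and port are assumptions.
const socket = new WebSocket('ws://localhost:8080/pose');

socket.onmessage = (event) => {
  // Assumed message shape: { position: {x, y, z}, rotation: {x, y, z, w} }
  const pose = JSON.parse(event.data);
  camera.position.set(pose.position.x, pose.position.y, pose.position.z);
  camera.quaternion.set(pose.rotation.x, pose.rotation.y, pose.rotation.z, pose.rotation.w);
};
```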
Resources:
- Data generated from a Project Tango device
- Then sent to the server via the PubNub Android API
- Processed with Scala/Python in Apache Spark
- Visualized in a web browser with D3.js (a minimal sketch follows)
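Purely as an illustration of the last step in this pipeline (the data values and chart layout are invented, and the real visualization may look nothing like this), a minimal D3.js line chart of incoming accelerometer magnitudes could be sketched as:

```javascript
// Minimal D3 (v4+) sketch: draw a handful of hypothetical accelerometer magnitudes as a line.
const svg = d3.select('body').append('svg').attr('width', 600).attr('height', 200);

const samples = [0.1, 0.4, 0.2, 0.8, 0.5]; // invented sample values
const x = d3.scaleLinear().domain([0, samples.length - 1]).range([0, 600]);
const y = d3.scaleLinear().domain([0, 1]).range([200, 0]);

svg.append('path')
  .datum(samples)
  .attr('fill', 'none')
  .attr('stroke', 'steelblue')
  .attr('d', d3.line().x((d, i) => x(i)).y((d) => y(d)));
```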