RealiType

Natural Gesture Controlled VR

I had seen all kinds of clips of people using VR in applications ranging from medicine to engineering. At the time, every one of those VR applications required a clunky controller, or required the user to rely on a mouse and keyboard. But I had recently seen the Microsoft Kinect in action and knew that in-application avatar movement could instead be driven by natural body gestures. For context: Oculus didn't ship the Touch controllers until December 2016, and the HoloLens Development Edition didn't ship until March 2016.

While at the Gatton Academy, I became more familiar with software systems and wrote a proposal for the school to fund my idea. I led a team of 3 other students over 3 weeks, none of us with any prior Kinect, Oculus, or Unity experience. In the end, we built a demo application combining the Oculus Rift and the Kinect into a single software package, before any comparable project we knew of had been released.

This project allowed free movement in the VR environment using just your body. You could walk around, and pick up and interact with objects in the virtual environment with just your hands (no controller needed). We planned to eventually reconstruct the physical environment around you using multiple Kinects, but we never got that far!
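The actual demo was built in Unity with the Kinect SDK, but the core of controller-free grabbing boils down to a simple per-frame check: is the tracked hand joint closed (a fist), and is it close enough to an object to pick it up? Here is a minimal, language-agnostic sketch of that logic in Python; the function name, `grab_radius` value, and data shapes are hypothetical, not from the original codebase.

```python
import math

def distance(a, b):
    """Euclidean distance between two (x, y, z) points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def update_grab(hand_pos, hand_closed, objects, held, grab_radius=0.15):
    """Decide which object (if any) the hand holds this frame.

    hand_pos    -- (x, y, z) of the Kinect hand joint, in meters
    hand_closed -- True when the tracker reports a closed-fist hand state
    objects     -- dict mapping object name -> (x, y, z) position
    held        -- name of the currently held object, or None
    """
    if not hand_closed:
        return None          # opening the hand releases anything held
    if held is not None:
        return held          # keep holding while the fist stays closed
    # Otherwise, try to grab the nearest object within reach.
    nearest = min(objects, key=lambda n: distance(hand_pos, objects[n]),
                  default=None)
    if nearest is not None and distance(hand_pos, objects[nearest]) <= grab_radius:
        return nearest
    return None
```

In the real system, walking was handled the same way: the Kinect's skeleton joints were mapped onto the avatar's rig each frame, so the user's physical movement translated directly into avatar movement without any controller input.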