Hand Gesture Navigation in Virtual Reality

Research and implementation of intuitive hand gesture-based locomotion methods for virtual reality environments, focusing on natural movement without controllers.

Virtual Reality
Unity
C#
Meta Quest 2
Hand Tracking
Gesture Recognition
Oculus SDK
MRTK
Human-Computer Interaction
Movement Systems

This research project explores and implements various hand gesture-based navigation methods for virtual reality environments, eliminating the need for traditional controllers. The system utilizes the Meta Quest 2's hand tracking capabilities and implements three distinct movement techniques: teleportation, continuous locomotion, and rope-pulling locomotion.
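Of the three techniques, rope-pulling is the least conventional: the user pinches an imaginary rope and pulls, and the rig moves opposite to the hand's displacement. A minimal Unity sketch of that idea is below; the `rig`/`handAnchor` references, the `SetGrabbing` hook, and the `pullGain` parameter are illustrative assumptions, not the project's actual implementation.

```csharp
using UnityEngine;

// Hypothetical sketch of rope-pulling locomotion: while a pinch is held,
// the XR rig translates opposite to the hand's frame-to-frame movement,
// as if the player were pulling the world toward themselves.
public class RopePullLocomotion : MonoBehaviour
{
    [SerializeField] private Transform rig;        // XR rig root (assumed reference)
    [SerializeField] private Transform handAnchor; // tracked hand position (assumed reference)
    [SerializeField] private float pullGain = 1.5f; // scales pull distance (assumed value)

    private bool grabbing;
    private Vector3 lastHandLocal;

    // Called by the gesture layer when a pinch starts or ends (assumed hook).
    public void SetGrabbing(bool value)
    {
        grabbing = value;
        if (value) lastHandLocal = rig.InverseTransformPoint(handAnchor.position);
    }

    private void Update()
    {
        if (!grabbing) return;

        // Work in rig-local space so the comparison stays valid as the rig moves.
        Vector3 handLocal = rig.InverseTransformPoint(handAnchor.position);
        Vector3 delta = handLocal - lastHandLocal;

        // Pull horizontally, opposite to hand motion; ignore vertical movement.
        rig.position -= rig.TransformVector(new Vector3(delta.x, 0f, delta.z)) * pullGain;

        // Rig just moved, so recompute the hand's local position for next frame.
        lastHandLocal = rig.InverseTransformPoint(handAnchor.position);
    }
}
```

Computing the delta in rig-local space matters because the hand anchor is typically parented to the rig: once the rig moves, the hand's world position moves with it, and a naive world-space delta would feed back into itself.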

# Project Highlights


  • Gesture Recognition System: Developed a dynamic gesture recognition system using invisible collision objects for precise hand movement detection
  • Multiple Navigation Methods: Implemented three different locomotion techniques to accommodate various user preferences and use cases
  • Natural Interaction: Created intuitive gestures based on real-world movements to enhance immersion
  • Speed Control: Integrated variable movement speeds based on hand position and gesture intensity
  • Unity Integration: Built using Unity with Oculus Integration SDK and Mixed Reality Toolkit (MRTK) components
  • Usability Testing: Designed comprehensive evaluation methods using System Usability Scale (SUS) and NASA Task Load Index (NASA-TLX)
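The "invisible collision objects" mentioned above map naturally onto Unity trigger volumes: a collider with its renderer disabled that fires events when a hand collider passes through it. The sketch below illustrates that pattern under stated assumptions; the `HandTip` tag and the event names are hypothetical, not taken from the project.

```csharp
using UnityEngine;

// Hypothetical sketch of an invisible gesture zone: a trigger collider
// (no visible mesh) that raises events when a tagged hand collider
// enters or leaves it, e.g. to detect a hand crossing a "push" region.
[RequireComponent(typeof(Collider))]
public class GestureZone : MonoBehaviour
{
    [SerializeField] private string handTag = "HandTip"; // assumed tag on fingertip colliders

    public event System.Action<GestureZone> HandEntered;
    public event System.Action<GestureZone> HandExited;

    private void Reset()
    {
        // Trigger mode: the zone detects overlap without a physics response.
        GetComponent<Collider>().isTrigger = true;
    }

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag(handTag)) HandEntered?.Invoke(this);
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag(handTag)) HandExited?.Invoke(this);
    }
}
```

A locomotion script can subscribe to several such zones arranged around the user (for example, forward, back, left, right) and derive both direction and, from how deep the hand reaches, a variable movement speed.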

The project demonstrates the potential for more natural and immersive VR interaction, focusing on navigation methods that feel intuitive to users while preserving precise control over movement in virtual environments.
