Meghal Dani, Gaurav Garg, Ramakrishna Perla, and Ramya Hebbalaguppe. Mid-air fingertip-based user interaction in mixed reality. In Adjunct Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2018. To appear.

Abstract

With data growing at an unprecedented rate, there is a need for advanced data visualization techniques. Visualizing such data sets in Mixed Reality (MR) gives the user an immersive experience in the context of real-world applications. Most existing work requires expensive devices such as the Microsoft HoloLens or Meta glasses, which rely on proprietary hardware for data visualization and hand-gesture interaction. In this paper, we demonstrate a cost-effective solution for data visualization in MR using frugal devices such as the Google Cardboard and VR Box. However, these devices offer only primitive interaction modes, such as a magnetic trigger or a conductive lever, and therefore have limited user-input capability. To enable interaction with visualizations and a rich user experience, we propose an intuitive pointing-fingertip gestural interface operating in the user's Field of View (FoV). The proposed pointing-gesture recognition framework is a cascade of deep learning models: the state-of-the-art Faster R-CNN localizes the hand, followed by a proposed regression CNN that localizes the fingertip. We conducted both objective and subjective evaluations of the proposed method: the objective metrics are fingertip-recognition accuracy and computational time, while the subjective evaluation covers user comfort and the effectiveness of the proposed fingertip interaction.
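
The two-stage cascade described above can be sketched in a few lines of PyTorch. This is a minimal illustration, not the authors' implementation: torchvision's COCO-pretrained Faster R-CNN stands in for the paper's trained hand detector, and the small FingertipRegressor head, the 96x96 crop size, and the dummy camera frame are all assumptions made for the sketch.

```python
import torch
import torch.nn as nn
import torchvision

class FingertipRegressor(nn.Module):
    """Hypothetical stage-2 head: regresses a normalized (x, y)
    fingertip coordinate from a cropped hand patch."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 2), nn.Sigmoid(),  # (x, y) in [0, 1], relative to the crop
        )

    def forward(self, x):
        return self.head(self.features(x))

# Stage 1: off-the-shelf Faster R-CNN as a stand-in for the hand detector
# (pretrained on COCO, not on hands -- assumption for this sketch).
detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()
regressor = FingertipRegressor().eval()

frame = torch.rand(3, 480, 640)  # dummy RGB frame from the headset camera
with torch.no_grad():
    boxes = detector([frame])[0]["boxes"]  # candidate boxes, sorted by score
    if len(boxes):
        x1, y1, x2, y2 = boxes[0].int().tolist()
        crop = frame[:, y1:y2, x1:x2].unsqueeze(0)
        crop = nn.functional.interpolate(crop, size=(96, 96))
        fx, fy = regressor(crop)[0]
        # Map the normalized prediction back to frame coordinates.
        tip = (x1 + float(fx) * (x2 - x1), y1 + float(fy) * (y2 - y1))
```

In such a cascade, the detector runs on the full frame while the lightweight regressor only sees the cropped hand region, which keeps the per-frame cost low enough for interactive use on modest hardware.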