Multi-scale gestural interaction for augmented reality

We present a multi-scale gestural interface for augmented reality applications. With virtual objects, gestural interactions such as pointing and grasping can be convenient and intuitive; however, they are imprecise, socially awkward, and susceptible to fatigue. Our prototype application uses multiple sensors to detect gestures from arm and hand motions (macro-scale) and finger movements (micro-scale). Microgestures can provide precise input through a belt-worn sensor configuration, with the hand in a relaxed posture. We present an application that combines direct manipulation with microgestures for precise interaction, beyond the capabilities of direct manipulation alone.

https://doi.org/10.1145/3132787.3132808

Barrett Ens, Aaron Quigley, Hui-Shyong Yeo, Pourang Irani, and Mark Billinghurst. 2017. Multi-scale gestural interaction for augmented reality. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (SA '17). Association for Computing Machinery, New York, NY, USA, Article 11, 1–2. DOI: https://doi.org/10.1145/3132787.3132808

BibTeX Entry

@inproceedings{10.1145/3132787.3132808,
author = {Ens, Barrett and Quigley, Aaron and Yeo, Hui-Shyong and Irani, Pourang and Billinghurst, Mark},
title = {Multi-Scale Gestural Interaction for Augmented Reality},
year = {2017},
isbn = {9781450354103},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3132787.3132808},
doi = {10.1145/3132787.3132808},
abstract = {We present a multi-scale gestural interface for augmented reality applications. With virtual objects, gestural interactions such as pointing and grasping can be convenient and intuitive, however they are imprecise, socially awkward, and susceptible to fatigue. Our prototype application uses multiple sensors to detect gestures from both arm and hand motions (macro-scale), and finger gestures (micro-scale). Micro-gestures can provide precise input through a belt-worn sensor configuration, with the hand in a relaxed posture. We present an application that combines direct manipulation with microgestures for precise interaction, beyond the capabilities of direct manipulation alone.},
booktitle = {SIGGRAPH Asia 2017 Mobile Graphics \& Interactive Applications},
articleno = {11},
numpages = {2},
keywords = {gesture interaction, augmented reality, microgestures},
location = {Bangkok, Thailand},
series = {SA '17}
}

Authors

Barrett Ens

Alumni
Pourang Irani

Professor
Canada Research Chair