Object-based touch manipulation for remote guidance of physical tasks
Date
2014
Authors
Adcock, Matt
Ranatunga, Dulitha
Smith, Ross
Thomas, Bruce
Publisher
Association for Computing Machinery (ACM)
Abstract
This paper presents a spatial multi-touch system for the remote guidance of physical tasks that uses semantic information about the physical properties of the environment. It enables a remote expert to observe a video feed of the local worker's environment and directly specify object movements via a touch display. Visual feedback for the gestures is displayed directly in the local worker's physical environment with Spatial Augmented Reality and observed by the remote expert through the video feed. A virtual representation of the physical environment, captured with a Kinect, facilitates the context-based interactions. We evaluate two methods of remote expert interaction, object-based and sketch-based, and also investigate the impact of two camera positions, top and side, on task performance. Our results indicate that translation and aggregate tasks could be performed more accurately via the object-based technique when the top-down camera feed was used, whereas with the side-on camera view, sketching was faster and rotations were more accurate. We also found that for object-based interactions the top view was better on all four of our measured criteria, while for sketching no significant difference was found between camera views.
Source
SUI 2014 - Proceedings of the 2nd ACM Symposium on Spatial User Interaction
Type
Conference paper
Restricted until
2037-12-31