@conference{Le2021,
title = {VXSlate: Combining Head Movement and Mobile Touch for Large Virtual Display Interaction},
author = {Khanh-Duy Le and Tanh Quang Tran and Karol Chlasta and Krzysztof Krejtz and Morten Fjeld and Andreas Kunz},
url = {https://mediafutures.no/vxslate_ieee_vr_2021-2/
https://www.youtube.com/watch?v=N8ZJlKWj4mk&ab_channel=DuyL%C3%AAKh%C3%A1nh},
year = {2021},
date = {2021-02-12},
pages = {1-2},
organization = {IEEE VR 2021},
series = {SFI MediaFutures},
abstract = {Virtual Reality (VR) headsets can open opportunities for users to accomplish complex tasks on large virtual displays, using compact setups. However, interacting with large virtual displays using existing interaction techniques might cause fatigue, especially for precise manipulations, due to the lack of physical surfaces. We designed VXSlate, an interaction technique that uses a large virtual display as an expansion of a tablet. VXSlate combines a user's head movements, as tracked by the VR headset, and touch interaction on the tablet. The user's head movements position both a virtual representation of the tablet and of the user's hand on the large virtual display. The user's multi-touch interactions perform fine-tuned content manipulations.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}