Ing. Tomáš Nováček

Publications

Overview of Controllers of User Interface for Virtual Reality

Year
2022
Published in
PRESENCE: Virtual and Augmented Reality. 2022, 29, 37-90. ISSN 1054-7460.
Type
Article
Abstract
Virtual reality has been with us for several decades already, but we are still trying to find the right ways to control it. There are many controllers with various purposes and means of input, each with its own advantages, disadvantages, and specific handling requirements. For a long time, our hands were the primary means of input for human-computer interaction. Now, however, we can use movements of our eyes, our feet, or even our whole body to control the virtual environment, interact with it, or move from one place to another. We can achieve this with various controllers and wearable interfaces, such as eye trackers, haptic suits, or treadmills. There are numerous devices to choose from in every category, but it can be hard to pick the one that best suits our intentions. This article summarises all types of user interface controllers for virtual reality, together with their main pros and cons and a comparison between them.

Project MultiLeap: Fusing Data from Multiple Leap Motion Sensors

Authors
Nováček, T.; Marty, Ch.; Jiřina, M.
Year
2021
Published in
Proceedings of 7th IEEE International Conference on Virtual Reality. Beijing: IEEE, 2021. p. 19-25. ISSN 2331-9569. ISBN 9781665423090.
Type
Conference paper
Abstract
Finding a simple and precise way to control the virtual environment is one of the goals of much human-computer interaction research. One approach is the Leap Motion optical sensor, which provides hand and finger tracking without the need for any hand-held device. However, the Leap Motion system currently supports only one sensor at a time. To overcome this limitation, we proposed a set of algorithms that combine the data from multiple Leap Motion sensors to increase the precision and usability of hand tracking. First, we suggested a way to improve the calibration of the current hand pose alignment proposed by Leap Motion. Then, we proposed an approach to fuse the tracking data from multiple Leap Motion sensors to provide more precise interaction with the virtual world. For this, we implemented our own algorithm for computing the confidence level of the tracking data, which can be used to determine which Leap Motion sensor detects the tracked hands best. We implemented these algorithms in our MultiLeap library. We also created two demo scenes to validate the correctness of our work - one for evaluating the fusion algorithms and one for mimicking interaction with control panels in a helicopter cockpit.
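The confidence-based fusion described in the abstract could be sketched roughly as follows. This is an illustrative reconstruction, not the MultiLeap implementation itself: the `HandFrame` structure, the joint layout, and the confidence-weighted averaging scheme are all assumptions made for the example.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class HandFrame:
    """Tracking data for one hand from one sensor (hypothetical structure)."""
    sensor_id: int
    joints: List[Tuple[float, float, float]]  # joint positions in a shared frame
    confidence: float                          # 0.0-1.0 tracking confidence

def fuse_hands(frames: List[HandFrame]) -> Optional[List[Tuple[float, float, float]]]:
    """Fuse per-sensor hand data via a confidence-weighted average of joints."""
    visible = [f for f in frames if f.confidence > 0.0]
    if not visible:
        return None  # no sensor currently sees the hand
    total = sum(f.confidence for f in visible)
    n_joints = len(visible[0].joints)
    fused = []
    for j in range(n_joints):
        # weight each sensor's joint position by its confidence
        x = sum(f.confidence * f.joints[j][0] for f in visible) / total
        y = sum(f.confidence * f.joints[j][1] for f in visible) / total
        z = sum(f.confidence * f.joints[j][2] for f in visible) / total
        fused.append((x, y, z))
    return fused
```

A weighted average lets a well-placed sensor dominate the result while still smoothing over noise from the others; the paper's actual fusion and calibration steps are more involved.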

Project MultiLeap: Making Multiple Hand Tracking Sensors to Act Like One

Year
2021
Published in
Proceedings of 2021 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR). Beijing: IEEE, 2021. p. 77-83. ISBN 978-1-6654-3225-2.
Type
Conference paper
Abstract
We present a concept that provides hand tracking for virtual and extended reality using only optical sensors, without the need for the user to hold any physical controller. In this article, we propose five new algorithms to further extend our previous research and the possibilities of the hand tracking system whilst also making it more precise. The first algorithm deals with the need to calibrate the tracking system; thanks to the new approach, we improved tracking precision by 37% over our previous solution. The second algorithm allows us to compute the precision of the hand tracking data when multiple sensors are used. The third algorithm further improves the computation of hand tracking data confidence by correctly handling edge cases, for example, when the tracked hand is at the edge of a sensor's field of view. The fourth algorithm provides a new way to fuse the hand tracking data by using only the data with the highest confidence. The fifth algorithm deals with cases where an optical sensor misclassifies the hand's chirality.
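The third and fourth algorithms above could be combined along these lines: penalise the confidence of hands near a sensor's field-of-view edge, then keep only the highest-confidence data. The `HandSample` type, the 140° field of view, and the linear edge penalty are illustrative assumptions for this sketch, not the paper's actual method.

```python
import math
from dataclasses import dataclass
from typing import Optional, Sequence, Tuple

@dataclass
class HandSample:
    """One sensor's view of a hand (hypothetical structure)."""
    sensor_id: int
    palm_pos: Tuple[float, float, float]  # position relative to the sensor
    raw_confidence: float                  # sensor-reported tracking confidence

SENSOR_FOV_DEG = 140.0  # assumed horizontal field of view

def effective_confidence(s: HandSample) -> float:
    """Scale down confidence as the hand approaches the field-of-view edge."""
    x, y, _ = s.palm_pos
    # angle from the sensor's central axis; assume +y points away from the device
    angle = math.degrees(math.atan2(abs(x), y)) if y > 0 else 90.0
    half_fov = SENSOR_FOV_DEG / 2.0
    edge_factor = max(0.0, 1.0 - angle / half_fov)  # 1 at centre, 0 at the edge
    return s.raw_confidence * edge_factor

def pick_best(samples: Sequence[HandSample]) -> Optional[HandSample]:
    """Keep only the single highest-confidence sample, as in the fourth algorithm."""
    return max(samples, key=effective_confidence, default=None)
```

For example, a centred hand with moderate raw confidence would win over a hand at the very edge of another sensor's view even if the latter reports higher raw confidence.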