Please use this identifier to cite or link to this item: http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/16171
Title: VOEDHgesture: A Multi-Purpose Visual Odometry/Simultaneous Localization and Mapping and Egocentric Dynamic Hand Gesture Data-Set for Virtual Object Manipulations in Wearable Mixed Reality
Authors: Rohil, Mukesh Kumar
Keywords: Computer Science
Visual Odometry
Wearable Computing
Augmented reality (AR)
Mixed Reality
Pose Estimation
Issue Date: 2024
Publisher: SciTech
Abstract: Visual Odometry/Simultaneous Localization and Mapping (VO/SLAM) and egocentric hand gesture recognition are two major technologies for wearable computing devices such as AR (Augmented Reality)/MR (Mixed Reality) glasses. However, the AR/MR community lacks a suitable dataset for developing both hand gesture recognition and RGB-D SLAM methods. In this work, we use a ZED Mini camera to develop challenging benchmarks for RGB-D VO/SLAM tasks and dynamic hand gesture recognition. For our dataset, VOEDHgesture, we collected 264 sequences using a ZED Mini camera, along with precisely measured, time-synchronized ground-truth camera poses, and manually annotated bounding boxes for the hand region of interest. The sequences comprise both RGB and depth images, captured at HD resolution (1920 × 1080) and recorded at a video frame rate of 30 Hz. To resemble an Augmented Reality environment, the sequences are captured using a head-mounted ZED Mini camera, with unrestricted
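The abstract describes time-synchronized ground-truth camera poses alongside 30 Hz RGB-D frames. The record does not specify the on-disk layout, so as a minimal sketch (all timestamps and the matching tolerance below are illustrative assumptions, not taken from the dataset), associating each frame timestamp with the nearest ground-truth pose timestamp could look like:

```python
from bisect import bisect_left

def associate(frame_ts, pose_ts, max_dt=1.0 / 60):
    """Match each frame timestamp to the nearest ground-truth pose
    timestamp, keeping only pairs closer than max_dt (here half the
    30 Hz frame period). pose_ts must be sorted ascending."""
    pairs = []
    for t in frame_ts:
        i = bisect_left(pose_ts, t)
        # Candidates: the pose just before and just after t.
        best = min(
            (j for j in (i - 1, i) if 0 <= j < len(pose_ts)),
            key=lambda j: abs(pose_ts[j] - t),
        )
        if abs(pose_ts[best] - t) <= max_dt:
            pairs.append((t, pose_ts[best]))
    return pairs

# Frames at 30 Hz, poses at a higher rate (hypothetical example values).
frames = [0.000, 0.033, 0.066]
poses = [0.00, 0.01, 0.02, 0.03, 0.04, 0.05, 0.06, 0.07]
print(associate(frames, poses))  # → [(0.0, 0.0), (0.033, 0.03), (0.066, 0.07)]
```

This nearest-timestamp association is the standard way RGB-D SLAM benchmarks pair sensor streams recorded at different rates; the tolerance keeps a frame unmatched rather than paired with a stale pose.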
URI: https://www.scitepress.org/Link.aspx?doi=10.5220/0012473900003636
http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/16171
Appears in Collections:Department of Computer Science and Information Systems

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.