Human Gender Classification Based on Hand Images Using Deep Learning

dc.contributor.author Bera, Asish
dc.date.accessioned 2023-01-16T05:42:06Z
dc.date.available 2023-01-16T05:42:06Z
dc.date.issued 2023-01
dc.identifier.uri https://link.springer.com/chapter/10.1007/978-3-031-22485-0_29
dc.identifier.uri http://dspace.bits-pilani.ac.in:8080/xmlui/handle/123456789/8484
dc.description.abstract Soft biometric traits (e.g., gender, age) can convey highly relevant personal information. Hand-based traits have been studied for traditional (hard) biometric recognition in diverse applications, but little attention has been paid to soft biometrics using hand images. In this paper, human gender classification is addressed using frontal and dorsal hand images. A new hand dataset, denoted JU-HD, is created at Jadavpur University, India, for the experiments. It exhibits significant posture variations captured in an uncontrolled laboratory environment. Sample hand images of 57 persons are collected, allowing greater user flexibility in posing the hands, which adds further challenges to discriminating a person's gender. Five backbone CNNs are used to develop a deep model for gender classification. The method achieves 90.49% accuracy on JU-HD using Inception-v3. en_US
dc.language.iso en en_US
dc.publisher Springer en_US
dc.subject Computer Science en_US
dc.subject Convolutional Neural Networks (CNN) en_US
dc.subject Gender classification en_US
dc.subject Hand biometrics en_US
dc.subject Soft biometrics en_US
dc.title Human Gender Classification Based on Hand Images Using Deep Learning en_US
dc.type Article en_US
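
The abstract describes fine-tuning backbone CNNs, with Inception-v3 giving the best result on JU-HD. The following is a minimal, illustrative sketch (not the authors' code) of how such a transfer-learning setup could look in PyTorch: an ImageNet-pretrained Inception-v3 with its classifier heads replaced for two-way gender prediction. The dataset folder layout, batch size, learning rate, and auxiliary-loss weight are assumptions for illustration only.

```python
# Sketch: fine-tuning an Inception-v3 backbone for binary (male/female)
# gender classification from hand images. Paths and hyper-parameters are
# hypothetical, not taken from the paper.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Inception-v3 expects 299x299 RGB inputs normalized with ImageNet statistics.
preprocess = transforms.Compose([
    transforms.Resize((299, 299)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Hypothetical folder layout: ju_hd/train/{male,female}/*.jpg
train_set = datasets.ImageFolder("ju_hd/train", transform=preprocess)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Load an ImageNet-pretrained backbone and replace both classifier heads
# (main and auxiliary) with 2-way outputs.
model = models.inception_v3(weights=models.Inception_V3_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 2)
model.AuxLogits.fc = nn.Linear(model.AuxLogits.fc.in_features, 2)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for images, labels in train_loader:
    images, labels = images.to(device), labels.to(device)
    optimizer.zero_grad()
    # In training mode Inception-v3 returns main and auxiliary logits.
    outputs, aux_outputs = model(images)
    loss = criterion(outputs, labels) + 0.4 * criterion(aux_outputs, labels)
    loss.backward()
    optimizer.step()
```

Any of the other backbone CNNs mentioned in the abstract could be swapped in the same way by replacing the model's final fully connected layer with a two-class head.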


