
Please use this identifier to cite or link to this item: http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/8484
Full metadata record
DC Field | Value | Language
dc.contributor.author | Bera, Asish | -
dc.date.accessioned | 2023-01-16T05:42:06Z | -
dc.date.available | 2023-01-16T05:42:06Z | -
dc.date.issued | 2023-01 | -
dc.identifier.uri | https://link.springer.com/chapter/10.1007/978-3-031-22485-0_29 | -
dc.identifier.uri | http://dspace.bits-pilani.ac.in:8080/xmlui/handle/123456789/8484 | -
dc.description.abstract | Soft biometric traits (e.g., gender and age) can characterize very relevant personal information. Hand-based traits have been studied for traditional (hard) biometric recognition in diverse applications. However, little attention has been paid to soft biometrics using hand images. In this paper, human gender classification is addressed using frontal and dorsal hand images. A new hand dataset, denoted JU-HD, is created at Jadavpur University, India, for the experiments. It represents significant posture variations in an uncontrolled laboratory environment. Sample hand images of 57 persons are collected with greater user flexibility in posing the hands, which incurs additional challenges in discriminating a person's gender. Five backbone CNNs are used to develop a deep model for gender classification. The method achieves 90.49% accuracy on JU-HD using Inception-v3. | en_US
dc.language.iso | en | en_US
dc.publisher | Springer | en_US
dc.subject | Computer Science | en_US
dc.subject | Convolutional Neural Networks (CNN) | en_US
dc.subject | Gender classification | en_US
dc.subject | Hand biometrics | en_US
dc.subject | Soft biometrics | en_US
dc.title | Human Gender Classification Based on Hand Images Using Deep Learning | en_US
dc.type | Article | en_US
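
The abstract above describes a transfer-learning approach in which pretrained CNN backbones such as Inception-v3 are fine-tuned for two-class gender prediction from hand images. As a rough illustration only, the PyTorch sketch below fine-tunes an ImageNet-pretrained Inception-v3 on a two-class image folder; the JU-HD directory layout, class names, and all hyperparameters are placeholder assumptions and are not taken from the paper or this record.

    # Illustrative sketch only: a binary gender classifier on an Inception-v3
    # backbone. Paths, class folders, and hyperparameters are hypothetical.
    import torch
    import torch.nn as nn
    from torchvision import datasets, models, transforms

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # Inception-v3 expects 299x299 inputs; use standard ImageNet normalization.
    tfm = transforms.Compose([
        transforms.Resize((299, 299)),
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ])

    # Hypothetical layout: JU-HD/train/<class>/*.jpg with two class folders.
    train_ds = datasets.ImageFolder("JU-HD/train", transform=tfm)
    train_dl = torch.utils.data.DataLoader(train_ds, batch_size=32, shuffle=True)

    # ImageNet-pretrained backbone; replace the final layer for 2 classes and
    # drop the auxiliary head to keep the training loop simple.
    model = models.inception_v3(weights=models.Inception_V3_Weights.IMAGENET1K_V1)
    model.aux_logits = False
    model.AuxLogits = None
    model.fc = nn.Linear(model.fc.in_features, 2)
    model = model.to(device)

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    model.train()
    for epoch in range(10):                      # placeholder epoch count
        for images, labels in train_dl:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            logits = model(images)               # plain logits (aux head disabled)
            loss = criterion(logits, labels)
            loss.backward()
            optimizer.step()

The same loop could be repeated for the other backbones mentioned in the abstract by swapping the model constructor and its final classification layer; evaluation on a held-out split would then report the per-backbone accuracy.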
Appears in Collections: Department of Computer Science and Information Systems

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.