dc.description.abstract |
Soft biometric traits (e.g., gender and age) can reveal highly relevant personal information. Hand-based traits have been studied for traditional/hard biometric recognition in diverse applications; however, little attention has been paid to soft biometrics derived from hand images. In this paper, human gender classification is addressed using frontal and dorsal hand images. A new hand dataset, denoted JU-HD, is created at Jadavpur University, India for the experiments. It exhibits significant posture variations captured in an uncontrolled laboratory environment. Sample hand images of 57 persons are collected with greater user flexibility in posing the hands, which introduces additional challenges in discriminating a person's gender. Five backbone CNNs are used to develop a deep model for gender classification. The method achieves 90.49% accuracy on JU-HD using Inception-v3. |
en_US |