dc.description.abstract |
Advances in Deep Learning (DL) have led to the development of Convolutional Neural Network (CNN) and Recurrent Neural Network (RNN) based Online Signature Verification (OSV) frameworks. The main drawback of LSTM-based networks is the limited parallelization of model training. CNN-based frameworks are efficient at learning local feature dependencies but fail to capture long-term feature dependencies. Recent works have confirmed the success of Transformer-based models on long-term time series classification (LTTSC) problems, owing to their efficient capture of context-dependent global feature interactions. Hence, to achieve higher classification accuracy, we propose a first-of-its-kind framework, named OSVConTramer, that combines a CNN and a Transformer for Online Signature Verification. The proposed OSVConTramer efficiently learns optimal local and global dependencies of an input signature feature vector and outperforms previous CNN- and LSTM-based OSV frameworks, achieving state-of-the-art classification accuracy. On the widely used MCYT-100, SVC, and SUSIG datasets, under one-shot learning, our model achieves state-of-the-art EERs of 10.85%, 5.45%, and 6.32%, respectively. The experimental analysis confirms that the accuracy of OSV frameworks improves significantly through optimal learning of the relationships between local and global feature dependencies. |
en_US |