
Please use this identifier to cite or link to this item:
http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/19195
Title: | Women sport actions dataset for visual classification using small-scale training data |
Authors: | Bera, Asish |
Keywords: | Computer Science; Women’s sports dataset; Deep learning; Convolutional neural network (CNN); Image-based sports analysis |
Issue Date: | Jul-2025 |
Publisher: | Sage |
Abstract: | Sports action classification, representing complex body postures and player-object interactions, is an emerging area in image-based sports analysis. Over the past decades, several works have contributed to automated sports action recognition using machine learning techniques. However, image datasets representing women’s sports actions with sufficient intra- and inter-class variations are not available to researchers. To overcome this limitation, this work presents a new dataset, named WomenSports, for women’s sports classification using small-scale training data. The dataset includes a variety of sports activities, covering wide variations in movements, environments, and interactions among players. In addition, this study proposes a convolutional neural network (CNN) for deep feature extraction. A channel attention scheme over local contextual regions is applied to refine and enhance the feature representation. Experiments are carried out on three sports datasets and one dance dataset to assess the generalization of the proposed algorithm, and the performance on these datasets is noteworthy. The deep learning method achieves 89.15% top-1 classification accuracy using ResNet-50 on the proposed WomenSports dataset, which is publicly available for research at Mendeley Data. |
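The abstract does not detail the channel attention formulation. As a rough illustration only, a squeeze-and-excitation-style channel attention step (all function names, weight shapes, and the reduction structure here are assumptions, not the paper's actual method) can be sketched in NumPy as:

```python
import numpy as np

def channel_attention(feat, w1, w2):
    """Hypothetical SE-style channel attention sketch.

    feat: feature map of shape (C, H, W)
    w1:   weight of shape (C_reduced, C)  -- squeeze FC layer (assumed)
    w2:   weight of shape (C, C_reduced)  -- excitation FC layer (assumed)
    """
    # Squeeze: global average pool each channel -> vector of shape (C,)
    squeeze = feat.mean(axis=(1, 2))
    # Excitation: FC + ReLU, then FC + sigmoid -> per-channel weights in (0, 1)
    hidden = np.maximum(w1 @ squeeze, 0.0)
    weights = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))
    # Rescale: multiply every channel of the feature map by its learned weight
    return feat * weights[:, None, None]
```

Because the sigmoid outputs lie in (0, 1), the refined features are channel-wise attenuated copies of the input; in the paper this kind of reweighting is applied over local contextual regions before classification.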
URI: | https://journals.sagepub.com/doi/abs/10.1177/17543371251353662 http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/19195 |
Appears in Collections: | Department of Computer Science and Information Systems |
Files in This Item:
There are no files associated with this item.