
Please use this identifier to cite or link to this item: http://dspace.bits-pilani.ac.in:8080/jspui/xmlui/handle/123456789/9857
Full metadata record
DC Field | Value | Language
dc.contributor.author | Chamola, Vinay | -
dc.date.accessioned | 2023-03-20T09:27:01Z | -
dc.date.available | 2023-03-20T09:27:01Z | -
dc.date.issued | 2021-05 | -
dc.identifier.uri | https://onlinelibrary.wiley.com/doi/full/10.1111/coin.12451 | -
dc.identifier.uri | http://dspace.bits-pilani.ac.in:8080/xmlui/handle/123456789/9857 | -
dc.description.abstract | Traditional target recognition and classification are mostly done manually, with low efficiency and high cost, so raising the level of automatic target recognition has become an important research topic. This paper proposes a target recognition method based on transfer learning to effectively complete the classification and recognition of targets using a brain–computer interface (BCI) model. Building on the faster-RCNN deep learning model, pre-training is achieved with VGG-16 and Inception-v2, and a transfer learning algorithm is used to optimize the faster-RCNN model based on the kinematics model. Experiments are carried out with the aim of detecting tableware by persons whose brain-signal recognition rate has been substantially improved using faster-RCNN. Compared with traditional recognition methods, the lab-scale results illustrate that the proposed algorithm can effectively improve the speed and accuracy of target recognition by using the BCI model to classify tableware of different colors and shapes against a complex background. | en_US
dc.language.iso | en | en_US
dc.publisher | Wiley | en_US
dc.subject | EEE | en_US
dc.subject | Brain–computer interface | en_US
dc.title | Brain–computer interface-based target recognition system using transfer learning: A deep learning approach | en_US
dc.type | Article | en_US
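
The abstract above describes fine-tuning a pre-trained Faster R-CNN detector via transfer learning for tableware recognition. The following is a minimal sketch of that transfer-learning step, not the authors' implementation: it uses torchvision's Faster R-CNN with a COCO-pretrained ResNet-50 FPN backbone standing in for the VGG-16/Inception-v2 backbones used in the paper, and the class count and training data are placeholders.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Load a Faster R-CNN detector pre-trained on COCO (torchvision >= 0.13 API).
# The paper pre-trains with VGG-16 / Inception-v2; ResNet-50 FPN is used here
# only because it ships ready-made with torchvision.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")

# Replace the classification head for the target (tableware) classes.
num_classes = 5  # hypothetical: background + 4 tableware categories
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

# Freeze the pre-trained backbone and fine-tune only the new head,
# a common transfer-learning schedule.
for p in model.backbone.parameters():
    p.requires_grad = False

params = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(params, lr=0.005, momentum=0.9, weight_decay=5e-4)

# One illustrative training step on dummy data.
model.train()
images = [torch.rand(3, 480, 640)]  # placeholder image, values in [0, 1]
targets = [{"boxes": torch.tensor([[50.0, 60.0, 200.0, 220.0]]),
            "labels": torch.tensor([1])}]

optimizer.zero_grad()
loss_dict = model(images, targets)   # dict of RPN and ROI-head losses
loss = sum(loss_dict.values())
loss.backward()
optimizer.step()
```

In a real fine-tuning run the dummy image and box would be replaced by a labeled tableware dataset, and the frozen backbone could later be unfrozen at a lower learning rate once the new head has converged.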
Appears in Collections: Department of Electrical and Electronics Engineering

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.