
Please use this identifier to cite or link to this item: http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/16353
Full metadata record
DC Field | Value | Language
dc.contributor.author | Sharma, Yashvardhan | -
dc.date.accessioned | 2024-11-12T11:07:55Z | -
dc.date.available | 2024-11-12T11:07:55Z | -
dc.date.issued | 2023 | -
dc.identifier.uri | https://www.scitepress.org/Papers/2023/116560/116560.pdf | -
dc.identifier.uri | http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/16353 | -
dc.description.abstract | The evaluation of answer scripts is vital for assessing a student's performance. Manual evaluation of answers can sometimes be biased, since the assessment depends on various factors, including the evaluator's mental state, their relationship with the student, and their level of expertise in the subject matter. These factors make evaluating descriptive answers a tedious and time-consuming task. Automatic scoring approaches can be used to simplify the evaluation process. This paper presents an automated answer-script evaluation model that intends to reduce the need for human intervention, minimize bias arising from the evaluator's psychological state, save time, keep track of evaluations, and simplify extraction. The proposed method can automatically weigh the assessing elements and produce results nearly identical to an instructor's. To evaluate the developed model, we compared its grades with the teacher's grades, as well as with the results of several keyword-matching and similarity-check techniques. | en_US
dc.language.iso | en | en_US
dc.publisher | ICPRAM | en_US
dc.subject | Computer Science | en_US
dc.subject | Natural Language Processing (NLP) | en_US
dc.subject | Machine learning (ML) | en_US
dc.subject | Subjective Answer Evaluation | en_US
dc.subject | Learning Assessments | en_US
dc.title | Automatic Subjective Answer Evaluation | en_US
dc.type | Article | en_US
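The abstract compares the model against keyword-matching and similarity-check baselines. As a minimal illustrative sketch only (not the paper's actual model), the snippet below shows how such a baseline could score a student answer against a reference answer; the function names, the bag-of-words cosine similarity, and the 50/50 weighting of the two signals are all assumptions for illustration.

```python
# Illustrative baseline: keyword matching plus cosine similarity over
# simple bag-of-words vectors. Not taken from the paper.
import math
import re
from collections import Counter


def tokenize(text):
    """Lowercase the text and split it into word tokens."""
    return re.findall(r"[a-z]+", text.lower())


def keyword_match_score(answer, keywords):
    """Fraction of the expected keywords that appear in the answer."""
    tokens = set(tokenize(answer))
    hits = sum(1 for k in keywords if k.lower() in tokens)
    return hits / len(keywords) if keywords else 0.0


def cosine_similarity(a, b):
    """Cosine similarity of two texts using raw term counts."""
    va, vb = Counter(tokenize(a)), Counter(tokenize(b))
    dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0


def grade(answer, reference, keywords, max_marks=10):
    """Combine both signals (equal weights, an assumption) into a mark."""
    score = (0.5 * keyword_match_score(answer, keywords)
             + 0.5 * cosine_similarity(answer, reference))
    return round(score * max_marks, 1)
```

A real system of the kind the abstract describes would replace the bag-of-words vectors with stronger semantic representations; this sketch only shows the shape of the baseline comparison.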
Appears in Collections: Department of Computer Science and Information Systems

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.