Please use this identifier to cite or link to this item:
http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/16353
Title: Automatic Subjective Answer Evaluation
Authors: Sharma, Yashvardhan
Keywords: Computer Science; Natural Language Processing (NLP); Machine Learning (ML); Subjective Answer Evaluation; Learning Assessments
Issue Date: 2023
Publisher: ICPRAM
Abstract: The evaluation of answer scripts is vital for assessing a student's performance. Manual evaluation of answers can sometimes be biased: the assessment depends on various factors, including the evaluator's mental state, their relationship with the student, and their level of expertise in the subject matter. These factors make evaluating descriptive answers a tedious and time-consuming task. Automatic scoring approaches can be used to simplify the evaluation process. This paper presents an automated answer script evaluation model that aims to reduce the need for human intervention, minimize bias caused by changes in the evaluator's psychological state, save time, keep track of evaluations, and simplify extraction. The proposed method can automatically weigh the assessed elements and produce results nearly identical to an instructor's. To evaluate the developed model, we compared its grades with the teacher's grades as well as with the results of several keyword-matching and similarity-check techniques.
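The abstract mentions keyword-matching and similarity-check techniques as points of comparison. As an illustration only, and not the paper's actual model, the sketch below shows one common similarity-check baseline: scoring a student answer by its TF-IDF cosine similarity to a reference answer. The reference text, the sample answers, and the 10-mark scale are assumptions made for this example.

# Illustrative similarity-check baseline (a sketch, not the paper's method):
# score a student answer by TF-IDF cosine similarity to a reference answer,
# then scale the similarity to an assumed 10-mark maximum.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def similarity_score(reference_answer: str, student_answer: str, max_marks: float = 10.0) -> float:
    vectorizer = TfidfVectorizer(stop_words="english")
    tfidf = vectorizer.fit_transform([reference_answer, student_answer])
    # cosine_similarity returns a 1x1 matrix for two single-row inputs
    similarity = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
    return round(similarity * max_marks, 2)

# Example usage with hypothetical answers:
reference = "Photosynthesis converts light energy into chemical energy stored in glucose."
student = "Plants use light to make glucose, storing chemical energy through photosynthesis."
print(similarity_score(reference, student))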
URI: https://www.scitepress.org/Papers/2023/116560/116560.pdf
http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/16353
Appears in Collections: Department of Computer Science and Information Systems
Files in This Item:
There are no files associated with this item.