DSpace Repository

Automatic Subjective Answer Evaluation

dc.contributor.author Sharma, Yashvardhan
dc.date.accessioned 2024-11-12T11:07:55Z
dc.date.available 2024-11-12T11:07:55Z
dc.date.issued 2023
dc.identifier.uri https://www.scitepress.org/Papers/2023/116560/116560.pdf
dc.identifier.uri http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/16353
dc.description.abstract The evaluation of answer scripts is vital for assessing a student’s performance. Manual evaluation of answers can sometimes be biased: the assessment depends on various factors, including the evaluator’s mental state, their relationship with the student, and their level of expertise in the subject matter. These factors make evaluating descriptive answers a tedious and time-consuming task. Automatic scoring approaches can simplify the evaluation process. This paper presents an automated answer-script evaluation model that intends to reduce the need for human intervention, minimize bias caused by changes in the evaluator’s psychological state, save time, maintain a record of evaluations, and simplify extraction. The proposed method can automatically weigh the assessed elements and produce results nearly identical to an instructor’s. To evaluate the developed model, we compared its grades with the teacher’s grades, as well as with the results of several keyword-matching and similarity-check techniques. en_US
dc.language.iso en en_US
dc.publisher ICPRAM en_US
dc.subject Computer Science en_US
dc.subject Natural Language Processing (NLP) en_US
dc.subject Machine learning (ML) en_US
dc.subject Subjective Answer Evaluation en_US
dc.subject Learning Assessments en_US
dc.title Automatic Subjective Answer Evaluation en_US
dc.type Article en_US
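
The abstract mentions comparing the model’s grades against keyword-matching and similarity-check techniques. The record itself contains no code, so the following is only an illustrative sketch of such baseline techniques, not the paper’s actual model: a bag-of-words cosine similarity between a reference answer and a student answer, combined with a simple keyword-coverage score. The function names, the 0.6/0.4 weighting, and the 10-mark scale are all assumptions made for this example.

```python
import math
import re
from collections import Counter


def tokenize(text):
    # Lowercase and split into alphabetic tokens.
    return re.findall(r"[a-z]+", text.lower())


def cosine_similarity(a, b):
    # Bag-of-words cosine similarity between two texts (0.0 .. 1.0).
    va, vb = Counter(tokenize(a)), Counter(tokenize(b))
    dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    if na == 0 or nb == 0:
        return 0.0
    return dot / (na * nb)


def keyword_score(answer, keywords):
    # Fraction of expected keywords that appear in the answer.
    tokens = set(tokenize(answer))
    return sum(1 for k in keywords if k.lower() in tokens) / len(keywords)


def grade(reference, answer, keywords, max_marks=10):
    # Hypothetical weighting of similarity vs. keyword coverage.
    sim = cosine_similarity(reference, answer)
    kw = keyword_score(answer, keywords)
    return round(max_marks * (0.6 * sim + 0.4 * kw), 1)
```

A grader along these lines could then be compared against a teacher’s marks, as the abstract describes; more sophisticated variants would replace the bag-of-words vectors with learned sentence embeddings.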


Files in this item

There are no files associated with this item.
