Voice based answer evaluation system for physically disabled students using natural language processing and machine learning
https://doi.org/10.17586/2226-1494-2023-23-2-299-303
Abstract
In the modern educational process there is a need to automate answer assessment. The reviewer's task becomes more difficult when analyzing theoretical answers, because online assessment is typically available only for multiple-choice questions: a teacher must carefully examine each free-text answer before assigning a mark, and this existing approach requires additional staff and time. This article introduces an answer evaluation application based on natural language processing and machine learning that includes a voice prompt for visually impaired students. The application automates the checking of subjective answers through text extraction, feature extraction, and score classification. Evaluation measures such as Term Frequency-Inverse Document Frequency (TF-IDF) similarity, vector similarity, keyword similarity, and grammar similarity are combined to determine the overall agreement between the teacher's mark and the system's evaluation. The conducted experiments showed that the system evaluates answers with an accuracy of 95 %. The proposed methodology is designed to assess exam results of students who cannot write but can speak; the developed application reduces the teacher's labor costs and time by eliminating manual checking.
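The TF-IDF and keyword similarity measures named in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the tokenization, the smoothed IDF formula, and the 0.6/0.4 weighting of the combined score are assumptions, and the vector- and grammar-similarity measures are omitted. The example answers are invented for demonstration.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """TF-IDF vectors (as dicts) for a small corpus of tokenized documents."""
    n = len(docs)
    df = Counter()                       # document frequency of each term
    for doc in docs:
        df.update(set(doc))
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}  # smoothed IDF
    return [{t: c / len(doc) * idf[t] for t, c in Counter(doc).items()}
            for doc in docs]

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    norm = (math.sqrt(sum(w * w for w in u.values()))
            * math.sqrt(sum(w * w for w in v.values())))
    return dot / norm if norm else 0.0

def keyword_score(keywords, answer):
    """Fraction of the expected keywords that appear in the student's answer."""
    present = keywords & set(answer.lower().split())
    return len(present) / len(keywords) if keywords else 0.0

teacher = "photosynthesis converts light energy into chemical energy in plants"
student = "plants use photosynthesis to change light energy into chemical energy"

vecs = tfidf_vectors([teacher.lower().split(), student.lower().split()])
tfidf_sim = cosine(vecs[0], vecs[1])
kw_sim = keyword_score({"photosynthesis", "light", "energy"}, student)

# Hypothetical weighting; the paper does not state its exact combination rule.
overall = 0.6 * tfidf_sim + 0.4 * kw_sim
print(f"TF-IDF similarity: {tfidf_sim:.2f}, keyword similarity: {kw_sim:.2f}")
```

In a full pipeline the student's spoken answer would first pass through a speech-to-text stage before these text-similarity measures are applied.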
About the Authors
M. Thalor
India
Thalor Meenakshi — PhD, Associate Professor
Pune, 411001
Scopus ID: 57190340673
P. Mane
India
Mane Pradeep — PhD, Head
Pune, 411001
Scopus ID: 36450914400
References
1. Girkar A., Khambayat M., Waghmare A., Chaudhary S. Subjective answer evaluation using natural language processing and machine learning. International Research Journal of Engineering and Technology (IRJET), 2021, vol. 8, no. 4, pp. 5040–5043.
2. Patil S.M., Patil S. Evaluating student descriptive answers using natural language processing. International Journal of Engineering Research & Technology (IJERT), 2014, vol. 3, no. 3, pp. 1716–1718.
3. Mercy R.A. Automated explanatory answer evaluation using machine learning approach. Design Engineering (Toronto), July 2021.
4. Lakshmi V., Ramesh V. Evaluating students’ descriptive answers using natural language processing and artificial neural networks. IJCRT, 2017, vol. 5, no. 4, pp. 3168–3173.
5. Mittal H., Devi M.S. Computerized evaluation of subjective answers using hybrid technique. Advances in Intelligent Systems and Computing, 2016, vol. 413, pp. 295–303. https://doi.org/10.1007/978-981-10-0419-3_35
6. Sakhapara A., Pawade D., Chaudhari B., Gada R., Mishra A., Bhanushali S. Subjective answer grader system based on machine learning. Advances in Intelligent Systems and Computing, 2019, vol. 898, pp. 347–355. https://doi.org/10.1007/978-981-13-3393-4_36
7. Mahmud T.A., Hussain M.G., Kabir S., Ahmad H., Sobhan M. A keyword based technique to evaluate broad question answer script. Proc. of the 9th International Conference on Software and Computer Applications (ICSCA), 2020, pp. 167–171. https://doi.org/10.1145/3384544.3384604
8. Bano S., Jithendra P., Niharika G.L., Sikhi Y. Speech to text translation enabling multilingualism. Proc. of the IEEE International Conference for Innovation in Technology (INOCON), 2020. https://doi.org/10.1109/inocon50539.2020.9298280
9. Bhatia M.S., Aggarwal A., Kumar N. Speech-to-text conversion using GRU and one hot vector encodings. Palarch’s Journal of Archaeology of Egypt/Egyptology, 2020, vol. 17, no. 9, pp. 8513–8524.
10. Johri E., Dedhia N., Bohra K., Chandak P., Adhikari H. ASSESS-automated subjective answer evaluation using semantic learning. Proc. of the 4th International Conference on Advances in Science & Technology (ICAST2021), 2021. http://dx.doi.org/10.2139/ssrn.3861851
11. Deotare S., Khan R.A. Automatic online subjective text evaluation using text mining. International Journal of Recent Technology and Engineering (IJRTE), 2019, vol. 8, no. 2, pp. 1238–1242. https://doi.org/10.35940/ijrte.I8725.078219
12. Bahel V., Thomas A. Text similarity analysis for evaluation of descriptive answers. arXiv, 2021, arXiv:2105.02935. https://doi.org/10.48550/arXiv.2105.02935
13. Meenakshi Anurag T., Pradeep B.M., Vishaka M. Web app for quick evaluation of subjective answers using natural language processing. Scientific and Technical Journal of Information Technologies, Mechanics and Optics, 2022, vol. 22, no. 3, pp. 594–599. https://doi.org/10.17586/2226-1494-2022-22-3-594-599
14. Mandge V.A., Thalor M.A. Revolutionize cosine answer matching technique for question answering system. Proc. of the International Conference on Emerging Smart Computing and Informatics (ESCI), 2021, pp. 335–339. https://doi.org/10.1109/ESCI50559.2021.9396864
15. Rish I. An empirical study of the naive Bayes classifier. Proc. of the IJCAI-01 Workshop on Empirical Methods in Artificial Intelligence, 2001, vol. 3.
For citations:
Thalor M., Mane P. Voice based answer evaluation system for physically disabled students using natural language processing and machine learning. Scientific and Technical Journal of Information Technologies, Mechanics and Optics. 2023;23(2):299-303. https://doi.org/10.17586/2226-1494-2023-23-2-299-303