Classification of human motor activity based on multisensory data analysis
https://doi.org/10.17586/2226-1494-2025-25-5-833-843
Abstract
Multisensor data obtained from an electromyograph, inertial measurement devices, a computer-vision system, and virtual-reality trackers were analyzed to solve the problem of classifying human motor activity. The problem is relevant because human motor activity must be analyzed and recognized in various hardware-software complexes, for example, rehabilitation and training systems. To recognize the type of hand movement with the highest accuracy, the contribution of each signal source is evaluated and several machine-learning models are compared. The proposed approach to processing multisensor data includes: synchronized acquisition of streams from different sources; labeling of the raw data; signal filtering; dual alignment of the time series by frequency and duration to a common constant; formation of a unified dataset; and training and selection of a machine-learning model for recognizing hand motor activity. Nine machine-learning models are considered: logistic regression, k-nearest neighbors, the naïve Bayes classifier, the decision tree, and ensembles based on them (Random Forest, AdaBoost, Extreme Gradient Boosting, Voting, and Stacking classifiers). The developed approach of synchronizing, filtering, and dually aligning the data streams makes it possible to form a unified multisensor dataset for model training. An experiment on classifying nine categories of hand movements from multisensor data was carried out on 629 recordings collected from 15 participants. Models were trained on 80 % of the collected data with five-fold cross-validation. The AdaBoost ensemble achieves a classification accuracy of 98.8 % on the dataset combining information from all four sources.
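The dual alignment step — resampling every stream to a common sampling frequency and stretching each recording to a common duration so that streams of different lengths can be concatenated into one dataset row — can be sketched with simple linear interpolation. The rates, durations, and array sizes below are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

def align_stream(t, x, rate_hz=100.0, duration_s=3.0):
    """Resample one sensor stream onto a common time grid:
    a fixed sampling rate (frequency alignment) and a fixed
    total duration (duration alignment)."""
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float)
    # Normalize time to start at zero, then stretch/compress
    # the recording onto the common duration.
    t = t - t[0]
    if t[-1] > 0:
        t = t * (duration_s / t[-1])
    grid = np.arange(0.0, duration_s, 1.0 / rate_hz)
    return np.interp(grid, t, x)

# Streams recorded at different rates and lengths end up identical
# in size, so they can be concatenated into a single feature row.
emg = align_stream(np.linspace(0.0, 2.0, 400), np.random.randn(400))
imu = align_stream(np.linspace(0.0, 2.5, 125), np.random.randn(125))
row = np.concatenate([emg, imu])
```

After alignment both streams contain the same number of samples regardless of their original rate, which is what allows rows from all four sources to be stacked into one training dataset.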
An ablation analysis comparing the data sources showed that information from the virtual-reality trackers has the greatest influence on the final classification accuracy (up to 98.73 % ± 1.78 % with the AdaBoost model), while the muscle-activity data from the electromyograph proved the least informative. High classification accuracy of motor activity can also be obtained using inertial measurement devices. The study formalizes a reproducible approach to processing multisensor data and makes it possible to objectively compare the contributions of different information sources and machine-learning models to classifying the motor activity of the user's hands in rehabilitation and virtual training systems. It is shown that, under resource constraints, some data sources can be omitted without significant loss of classification accuracy, simplifying the hardware configuration of tracking systems and enabling a transition from closed commercial systems (virtual-reality trackers) to more accessible and compact inertial measurement devices.
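The source-ablation procedure — retraining the same classifier on every subset of sensor sources and comparing cross-validated accuracy — can be sketched as follows. The feature blocks, their dimensions, and the sample count here are synthetic placeholders standing in for the paper's 629 recordings, not its actual data.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 126  # toy stand-in for the real recordings
# Hypothetical per-recording feature blocks, one per sensor source.
sources = {
    "emg":    rng.normal(size=(n, 8)),
    "imu":    rng.normal(size=(n, 6)),
    "vision": rng.normal(size=(n, 10)),
    "vr":     rng.normal(size=(n, 6)),
}
y = np.repeat(np.arange(9), n // 9)  # nine movement classes

def ablation_accuracy(keep):
    """Mean five-fold CV accuracy using only the named sources."""
    X = np.hstack([sources[s] for s in keep])
    clf = AdaBoostClassifier(n_estimators=50, random_state=0)
    return cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()

full_acc = ablation_accuracy(["emg", "imu", "vision", "vr"])
no_emg_acc = ablation_accuracy(["imu", "vision", "vr"])
```

Running `ablation_accuracy` over every source subset makes the contribution of each stream directly comparable, which is how a conclusion like "EMG can be dropped with little accuracy loss" is reached.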
Keywords
About the Author
A. D. Obukhov (Russian Federation)
Artem D. Obukhov — D.Sc., Professor, Leading Researcher
Scopus Author ID: 56104232400
Tambov, 392000
For citations:
Obukhov A.D. Classification of human motor activity based on multisensory data analysis. Scientific and Technical Journal of Information Technologies, Mechanics and Optics. 2025;25(5):833-843. (In Russ.) https://doi.org/10.17586/2226-1494-2025-25-5-833-843