Scientific and Technical Journal of Information Technologies, Mechanics and Optics

Preprocessing of skeletal keypoints trajectories in the task of laboratory animal behavior recording automation

https://doi.org/10.17586/2226-1494-2025-25-2-295-302

Abstract

The automation of action recognition in laboratory animals is a crucial step toward simplifying behavioral tests in pathophysiology and rehabilitation research. The most common approach to action recognition is to analyze the trajectories of skeletal keypoints. However, existing methods are strongly tied to a specific animal species, a selected set of skeletal points, and a fixed set of activities to be recognized. Furthermore, there is a dearth of mathematical formulations of this problem and of research on algorithms for filtering the obtained trajectories. The research task involves collecting a dataset for keypoint detection in Wistar rats and evaluating algorithms for filtering trajectories reconstructed from noisy measurements. In the considered skeletal model of the rat, thirteen points were selected for estimating behavior from trajectories. A mathematical description of the dynamics of point movement between frames, suitable for use in a Kalman filter, is provided. Four filtering algorithms are evaluated in terms of accuracy and curve smoothness. A technique for constructing the detector noise covariance matrix by analyzing keypoint detection errors is developed. The comparison of filtering algorithms shows that the Unscented Kalman filter with a nonlinear model and the moving average filter yield the best results in this task. The findings of this study make it possible to use a mathematical description of system dynamics to estimate the actual trajectory from noisy measurements. Furthermore, the described methodologies are not exclusive to laboratory animals and can also be applied to human subjects.
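As a minimal sketch of the filtering idea described in the abstract (not the authors' implementation), the Python fragment below smooths a noisy 2D keypoint trajectory with a constant-velocity Kalman filter and, for comparison, a simple moving-average filter. The transition and observation matrices, noise levels, window size, and the synthetic trajectory are illustrative assumptions; the paper additionally evaluates an Unscented Kalman filter with a nonlinear motion model and constructs the measurement-noise covariance from observed keypoint detection errors.

```python
# Illustrative sketch: constant-velocity Kalman filtering vs. moving average
# for a single noisy 2D keypoint trajectory. All parameters are assumptions.
import numpy as np

def kalman_smooth(measurements, dt=1.0, q=1e-2, r=4.0):
    """measurements: (T, 2) array of noisy (x, y) keypoint detections."""
    # State: [x, y, vx, vy]; constant-velocity transition model.
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)   # only position is observed
    Q = q * np.eye(4)                           # process noise covariance
    R = r * np.eye(2)                           # detector noise covariance
    x = np.array([*measurements[0], 0.0, 0.0])  # initial state
    P = 10.0 * np.eye(4)                        # initial state covariance
    out = []
    for z in measurements:
        # Predict step
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step with the new detection z
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
        out.append(x[:2].copy())
    return np.array(out)

def moving_average(measurements, window=5):
    """Simple moving average applied independently to each coordinate."""
    kernel = np.ones(window) / window
    return np.column_stack(
        [np.convolve(measurements[:, i], kernel, mode="same") for i in range(2)]
    )

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 2 * np.pi, 200)
    truth = np.column_stack([50 * np.cos(t), 30 * np.sin(t)])  # synthetic path
    noisy = truth + rng.normal(scale=2.0, size=truth.shape)    # detector noise
    print("Kalman MSE:        ", np.mean((kalman_smooth(noisy) - truth) ** 2))
    print("Moving-average MSE:", np.mean((moving_average(noisy) - truth) ** 2))
```

In a multi-point setting such as the thirteen-point rat skeleton discussed in the paper, a filter of this kind would be applied to each keypoint trajectory, with R ideally estimated from the detector's error statistics rather than set by hand.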

About the Authors

D. I. Krasnov
ITMO University
Russian Federation

Dmitrii I. Krasnov — PhD Student.

Saint Petersburg, 197101; Scopus ID: 59411982500



M. A. Volynsky
ITMO University
Russian Federation

Maxim A. Volynsky — PhD, Associate Professor, Director of the Technical Vision Laboratory.

Saint Petersburg, 197101; Scopus ID: 23006901100



A. A. Gusev
ITMO University
Russian Federation

Alexander A. Gusev — PhD, Leading Engineer.

Saint Petersburg, 197101; Scopus ID: 57207731147




For citations:


Krasnov D.I., Volynsky M.A., Gusev A.A. Preprocessing of skeletal keypoints trajectories in the task of laboratory animal behavior recording automation. Scientific and Technical Journal of Information Technologies, Mechanics and Optics. 2025;25(2):295-302. (In Russ.) https://doi.org/10.17586/2226-1494-2025-25-2-295-302

This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 2226-1494 (Print)
ISSN 2500-0373 (Online)