Recognition of dynamic Lithuanian language gestures

Abstract

This paper proposes a method for automated collection of Lithuanian hand gesture data and for the classification of Lithuanian hand gestures. A dataset of 1100 samples was collected for 10 different classes of Lithuanian hand gestures. Features of the hand gestures were extracted with a CNN, and classification was performed with an LSTM network. The trained LSTM network classified the Lithuanian hand gestures with 85% accuracy.


Article in Lithuanian.


Dinaminių lietuvių kalbos gestų atpažinimas


Santrauka (Summary)


Sign language is the primary tool with which people with hearing impairments convey their thoughts and knowledge. Few people without a hearing impairment understand sign language, so the development and improvement of hand gesture recognition systems is a relevant contemporary task, one that can expand the communication possibilities of people with disabilities. Hand gesture recognition also enables contactless control of various devices. The article reviews gesture recognition methods and proposes an algorithm for recognising dynamic Lithuanian language gestures. For the study, a dataset of dynamic gestures was created, consisting of video recordings of 3 seconds each. In total, 1100 video recordings were collected, and the dataset comprised 10 classes. A pre-trained Inception-v3 convolutional neural network was used to extract features from the video frames. The extracted features were then used to train an LSTM network. The trained network was tested with validation and test data and achieved 85% accuracy.


Keywords: dynamic Lithuanian hand gesture recognition, LSTM, CNN, neural networks.
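The pipeline described in the abstract (per-frame CNN features fed to an LSTM, with a 10-class output) can be sketched as follows. This is a minimal illustration with assumed dimensions, not the authors' implementation: the 2048-element feature length matches the pooled output of Inception-v3, while the hidden size, frame rate, and random weights are purely illustrative, and the CNN stage is stood in for by random feature vectors.

```python
import numpy as np

FEAT_DIM = 2048   # length of an Inception-v3 pooled feature vector (assumed)
HIDDEN = 128      # LSTM hidden size (illustrative choice)
N_CLASSES = 10    # gesture classes, as in the collected dataset

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Randomly initialised LSTM parameters: one weight matrix and bias per gate
# (input, forget, output, candidate), acting on [frame features, prev hidden].
W = {g: rng.standard_normal((HIDDEN, FEAT_DIM + HIDDEN)) * 0.01 for g in "ifoc"}
b = {g: np.zeros(HIDDEN) for g in "ifoc"}
W_out = rng.standard_normal((N_CLASSES, HIDDEN)) * 0.01

def lstm_classify(frames):
    """frames: (T, FEAT_DIM) array of per-frame CNN feature vectors."""
    h = np.zeros(HIDDEN)
    c = np.zeros(HIDDEN)
    for x in frames:
        z = np.concatenate([x, h])
        i = sigmoid(W["i"] @ z + b["i"])   # input gate
        f = sigmoid(W["f"] @ z + b["f"])   # forget gate
        o = sigmoid(W["o"] @ z + b["o"])   # output gate
        g = np.tanh(W["c"] @ z + b["c"])   # candidate cell state
        c = f * c + i * g
        h = o * np.tanh(c)
    logits = W_out @ h                     # classify from the final hidden state
    p = np.exp(logits - logits.max())      # numerically stable softmax
    return p / p.sum()

# A 3-second clip at an assumed 25 fps yields 75 per-frame feature vectors.
probs = lstm_classify(rng.standard_normal((75, FEAT_DIM)))
```

In practice the per-frame features would come from the pre-trained CNN rather than a random generator, and the LSTM and output weights would be learned on the labelled gesture videos; the sketch only shows how a variable-length frame sequence collapses into a single class distribution.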

How to Cite
Karmonas, A., & Katkevičius, A. (2023). Recognition of dynamic Lithuanian language gestures. Mokslas – Lietuvos Ateitis / Science – Future of Lithuania, 15. https://doi.org/10.3846/mla.2023.18834
Published in Issue
May 30, 2023

This work is licensed under a Creative Commons Attribution 4.0 International License.
