A real-time gesture recognition system using near-infrared imagery
Authors:
Tomás Mantecón, Carlos R. del-Blanco, Fernando Jaureguizar, Narciso García
Authors' affiliation:
Grupo de Tratamiento de Imágenes, Information Processing and Telecommunications Center and ETSI Telecomunicación, Universidad Politécnica de Madrid, Madrid, Spain
Published in:
PLoS ONE 14(10)
Category:
Research Article
DOI:
https://doi.org/10.1371/journal.pone.0223320
Abstract
Visual hand gesture recognition systems are promising technologies for Human-Computer Interaction, as they allow a more immersive and intuitive interaction. Most of these systems rely on skeleton information, which is in turn inferred from color, depth, or near-infrared imagery. However, robust extraction of skeleton information from images is only possible for a subset of hand poses, which restricts the range of gestures that can be recognized. In this paper, a real-time hand gesture recognition system based on a near-infrared device is presented. It analyzes the infrared imagery directly to infer static and dynamic gestures, without using skeleton information, so a much wider range of hand gestures can be recognized than with skeleton-based approaches. To validate the proposed system, a new dataset of near-infrared imagery has been created, on which the system achieves good results that outperform other state-of-the-art strategies.
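The abstract describes the system only at a high level. As a loose illustration of the core idea, classifying a gesture directly from a near-infrared frame without any skeleton model, the following minimal Python sketch pairs whole-frame HOG features with a linear SVM. This is an assumption-laden toy, not the authors' pipeline: the descriptor choice, window size, and training-data source are all hypothetical.

```python
# Minimal sketch (not the paper's implementation): recognizing a static hand
# gesture directly from a near-infrared frame, with no skeleton extraction.
# Assumes OpenCV and scikit-learn; the training data source is hypothetical.
import cv2
import numpy as np
from sklearn.svm import LinearSVC

# HOG descriptor computed over the whole resized NIR frame:
# (winSize, blockSize, blockStride, cellSize, nbins).
hog = cv2.HOGDescriptor((64, 64), (16, 16), (8, 8), (8, 8), 9)

def describe(frame_gray):
    """Resize a grayscale NIR frame and return one HOG feature vector."""
    patch = cv2.resize(frame_gray, (64, 64))
    return hog.compute(patch).ravel()

def train(frames, labels):
    """Fit a linear SVM on HOG features of labeled gesture frames."""
    X = np.stack([describe(f) for f in frames])
    clf = LinearSVC()
    clf.fit(X, labels)
    return clf

def predict_gesture(clf, frame_gray):
    """Predict the gesture label of a single NIR frame."""
    return clf.predict(describe(frame_gray)[None, :])[0]
```

Dynamic gestures would additionally require modeling the temporal dimension, for instance with a hidden Markov model over per-frame descriptors, consistent with the keywords listed below.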
Keywords:
Cameras – Fingers – Hidden Markov models – Nonverbal communication – Semiotics – k-means clustering