Towards robust ego-centric hand gesture analysis for robot control
- Source: 2016 IEEE International Conference on Signal and Image Processing (ICSIP)
- Publication Year: 2016
- Publisher: IEEE, 2016
Abstract
- Wearable devices with ego-centric cameras are a likely next-generation platform for human-computer interaction tasks such as robot control, and hand gestures are a natural way to interact from the ego-centric viewpoint. In this paper, we present an ego-centric, multi-stage hand gesture analysis pipeline for robot control that works robustly in unconstrained environments with varying illumination. In particular, we first propose an adaptive color- and contour-based hand segmentation method to extract the hand region from the ego-centric viewpoint. We then propose a convex U-shaped curve detection algorithm to precisely locate fingertip positions. In parallel, we use a convolutional neural network to recognize hand gestures. Combining these hand cues to control the robot, we build a hand gesture analysis system on an iPhone and a robot-arm platform to validate its effectiveness. Experimental results demonstrate that our method controls the robot arm by hand gesture reliably and in real time.
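For orientation, the sketch below illustrates the kind of segmentation-plus-fingertip step the abstract describes. It is a minimal Python/OpenCV stand-in, not the paper's method: the adaptive color-and-contour segmentation is replaced by a fixed HSV skin threshold, the convex U-shaped curve detection is replaced by standard convexity defects, and all thresholds, file names, and function names are assumptions for illustration.

```python
# Minimal sketch of hand segmentation and fingertip detection on an ego-centric frame.
# Assumptions: fixed HSV skin range and convexity-defect heuristics stand in for the
# paper's adaptive segmentation and convex U-shaped curve detection.
import cv2
import numpy as np

def segment_hand(frame_bgr):
    """Return a binary mask and contour of the largest skin-colored region (assumed hand)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Fixed skin-color range; the paper instead adapts the color model to the scene.
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 255, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, None
    hand = max(contours, key=cv2.contourArea)
    clean = np.zeros_like(mask)
    cv2.drawContours(clean, [hand], -1, 255, thickness=cv2.FILLED)
    return clean, hand

def detect_fingertips(hand_contour):
    """Approximate fingertip candidates from convexity defects of the hand contour."""
    hull = cv2.convexHull(hand_contour, returnPoints=False)
    defects = cv2.convexityDefects(hand_contour, hull)
    tips = []
    if defects is None:
        return tips
    for start, end, _, depth in defects[:, 0]:
        # Deep defects separate fingers; their start/end points lie near fingertips.
        if depth > 10000:  # defect depth is in 1/256-pixel units
            tips.append(tuple(hand_contour[start][0]))
            tips.append(tuple(hand_contour[end][0]))
    return tips

if __name__ == "__main__":
    frame = cv2.imread("egocentric_frame.jpg")  # hypothetical input frame
    mask, contour = segment_hand(frame)
    if contour is not None:
        print("fingertip candidates:", detect_fingertips(contour))
```

In the paper's full pipeline, the segmented hand would also be fed to a convolutional neural network for gesture classification, and the gesture label plus fingertip positions would drive the robot-arm commands.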
- Subjects: Computer science, Wearable computer, Image segmentation, Convolutional neural network, Robot control, Gesture recognition, Robot, Computer vision, Artificial intelligence, Robotic arm, Gesture
Details
- Database: OpenAIRE
- Journal: 2016 IEEE International Conference on Signal and Image Processing (ICSIP)
- Accession number: edsair.doi...........dbdaabcd98eea0f8d25aa075f6500460
- Full Text: https://doi.org/10.1109/siprocess.2016.7888345