Abstract
A prosthetic device, also known as a prosthesis, can aid rehabilitation when an arm or another limb is severed or lost. An upper-limb prosthesis aids the restoration of motor skills; however, the amputee is deprived of the tactile sensation used for grip control. It is therefore a common assumption that restoring force feedback will help with prosthetic gripping-force management. This paper presents a vision-based analysis that collates data using neural-network hand landmarks to distinguish between various grip patterns and thereby aid the prosthesis. The prosthetic arm is developed using 3D printing technology, making it an efficient, low-cost solution that can be personalized for each user in terms of size, shape, or color. The computer-vision hand-landmark technique uses a machine learning (ML) pipeline that encompasses models such as BlazePalm, used to deduce 21 3D landmarks within a bounding box of a hand from a single picture. Along with vision data, information about grasp motion is best relayed through tactile feedback. In this project, the hand-landmark comparison, together with pressure data from tactile sensors placed on the prosthetic arm under the same settings, revealed distinct patterns that can be used to distinguish different grasp motions.
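The grasp discrimination described in the abstract can be illustrated with a minimal sketch. It assumes the 21-landmark hand model with MediaPipe-style indexing (landmark 0 at the wrist, landmarks 4, 8, 12, 16, and 20 at the fingertips); the openness feature and the threshold value are illustrative assumptions, not details taken from the paper.

```python
import math

# Hypothetical sketch: distinguish an open from a closed grasp using
# 21 (x, y, z) hand landmarks. Index 0 is assumed to be the wrist and
# indices 4, 8, 12, 16, 20 the fingertips (MediaPipe-style ordering).
FINGERTIPS = [4, 8, 12, 16, 20]

def grip_openness(landmarks):
    """Mean Euclidean distance from each fingertip to the wrist."""
    wrist = landmarks[0]
    dists = [math.dist(wrist, landmarks[i]) for i in FINGERTIPS]
    return sum(dists) / len(dists)

def classify_grip(landmarks, threshold=0.25):
    """Label a pose 'open' or 'closed'; the threshold is illustrative."""
    return "open" if grip_openness(landmarks) >= threshold else "closed"
```

In practice such a feature would be compared against the tactile-sensor pressure readings collected under the same settings, with per-grip-pattern thresholds learned from the collected data rather than fixed by hand.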
Original language | English |
---|---|
Title of host publication | 2021 IEEE International Symposium on Robotic and Sensors Environments (ROSE) proceedings |
Place of Publication | Piscataway, NJ |
Publisher | Institute of Electrical and Electronics Engineers (IEEE) |
Pages | 1-7 |
Number of pages | 7 |
ISBN (Electronic) | 9781665440622 |
ISBN (Print) | 9781665411677 |
DOIs | |
Publication status | Published - 2021 |
Event | IEEE International Symposium on Robotic and Sensors Environments (ROSE) - Virtual, United States Duration: 28 Oct 2021 → 29 Oct 2021 |
Conference
Conference | IEEE International Symposium on Robotic and Sensors Environments (ROSE) |
---|---|
Country/Territory | United States |
Period | 28/10/21 → 29/10/21 |
Keywords
- prosthetic arm
- computer vision
- gesture recognition
- grip pattern
- neural network