Visual Cues to Improve Myoelectric Control of Upper Limb Prostheses
Type of publication: | Inproceedings |
Booktitle: | 7th IEEE International Conference on Biomedical Robotics and Biomechatronics (BioRob) |
Year: | 2018 |
Month: | August |
Pages: | 783-788 |
Publisher: | IEEE |
Location: | Enschede, The Netherlands |
URL: | http://ieeexplore.ieee.org/sta... |
DOI: | https://doi.org/10.1109/BIOROB.2018.8487923 |
Abstract: | The instability of myoelectric signals over time complicates their use in controlling poly-articulated prosthetic hands. To address this problem, studies have combined surface electromyography with modalities that are less affected by the amputation and by the environment, such as accelerometry and gaze information. In the latter case, the hypothesis is that a subject looks at the object he or she intends to manipulate, and that the visual characteristics of that object allow the desired hand posture to be predicted more reliably. The method we present in this paper automatically detects stable gaze fixations and uses the visual characteristics of the fixated objects to improve the performance of a multimodal grasp classifier. Specifically, the algorithm identifies online the onset of a prehension and the corresponding gaze fixations, obtains high-level feature representations of the fixated objects by means of a Convolutional Neural Network, and combines these with traditional surface electromyography in the classification stage. Tests were performed on data acquired from five intact subjects who performed ten types of grasps on various objects during both static and functional tasks. The results show that the addition of gaze information increases grasp classification accuracy, and that this improvement is consistent across all grasps and concentrated around movement onset and offset. |
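The abstract describes a fusion pipeline: CNN features of the object at the gaze fixation are concatenated with surface EMG features and fed to a grasp classifier. The sketch below illustrates one plausible reading of that pipeline in Python; the backbone choice (ResNet-18), the time-domain sEMG features (mean absolute value and waveform length), and fusion by simple concatenation are all illustrative assumptions, not the paper's exact method.

```python
# Hedged sketch of gaze/sEMG feature fusion for grasp classification.
# All architectural choices below are assumptions for illustration only.
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T

# Pretrained CNN used as a fixed high-level feature extractor (the paper
# only says "a Convolutional Neural Network"; ResNet-18 is an assumption).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # drop the ImageNet classification head
backbone.eval()

preprocess = T.Compose([
    T.ToPILImage(),
    T.Resize(224),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def object_features(crop_rgb: np.ndarray) -> np.ndarray:
    """CNN embedding of the image crop around the detected gaze fixation."""
    with torch.no_grad():
        x = preprocess(crop_rgb).unsqueeze(0)  # (1, 3, 224, 224)
        return backbone(x).squeeze(0).numpy()  # (512,) for ResNet-18

def emg_features(window: np.ndarray) -> np.ndarray:
    """Per-channel time-domain sEMG features (mean absolute value and
    waveform length) over a window of shape (samples, channels)."""
    mav = np.abs(window).mean(axis=0)
    wl = np.abs(np.diff(window, axis=0)).sum(axis=0)
    return np.concatenate([mav, wl])

def fused_sample(crop_rgb: np.ndarray, emg_window: np.ndarray) -> np.ndarray:
    """Concatenation fusion of the two modalities for the classifier."""
    return np.concatenate([object_features(crop_rgb), emg_features(emg_window)])

if __name__ == "__main__":
    # Toy demonstration with random stand-ins for real recordings.
    rng = np.random.default_rng(0)
    crop = rng.integers(0, 255, size=(240, 320, 3), dtype=np.uint8)
    emg = rng.standard_normal((400, 12))  # e.g. 12 sEMG channels
    print(fused_sample(crop, emg).shape)  # (512 + 2*12,) = (536,)
```

Any standard classifier (e.g., LDA or logistic regression) could consume the fused vector; the paper does not name the classifier here, only that the modalities are combined in the classification stage.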