TOWARD SEMI-AUTONOMOUS PROSTHETIC HAND CONTROL: APPLYING EMBEDDED NEURAL NETWORKS TO IMPROVE SENSOR FUSION IN PROSTHETIC FINGERTIP SENSORS
Abstract
We present an application of embedded, real-time neural network predictions to produce reliable sensing and enable semi-autonomous control of a prosthetic hand with tactile sensors embedded at the fingertips. We simultaneously predict the force magnitude and the position of contact, requiring 32.8 ms on average, thereby enabling real-time measurements. We demonstrate 97.2% offline classification accuracy on the contact position, and a root mean squared error of 1.38 N (mean absolute error of 0.68 N) in predicting the force magnitude. Neural network training is performed offline on a desktop computer using Keras, and the trained model is compiled into efficient C code for an nRF52840 microcontroller using the open-source tool “nn4mc.” The training model and the nn4mc compiler are available online, allowing prosthetic engineers to incorporate real-time, sensor-based inference into their prosthetic designs.
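The simultaneous prediction of a contact-position class and a scalar force magnitude suggests a shared trunk feeding two output heads: a softmax head for classification and a linear head for regression. The following is a minimal, hypothetical sketch of such a forward pass in plain Python; the layer sizes, weights, activation choice, and number of contact classes are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical two-head inference sketch: one shared hidden layer,
# a softmax head for contact position, a linear head for force (N).
# All weights and dimensions below are illustrative, not the paper's.
import math

def dense(x, w, b, act=None):
    """One fully connected layer: y = act(W x + b)."""
    y = [sum(wi * xi for wi, xi in zip(row, x)) + bi
         for row, bi in zip(w, b)]
    if act == "relu":
        y = [max(0.0, v) for v in y]
    return y

def softmax(z):
    """Numerically stable softmax over a list of logits."""
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def predict(x, trunk, head_pos, head_force):
    """Return (contact-position probabilities, force estimate)."""
    h = dense(x, *trunk, act="relu")       # shared features
    probs = softmax(dense(h, *head_pos))   # contact-position classes
    force = dense(h, *head_force)[0]       # scalar force magnitude
    return probs, force
```

A compiler such as nn4mc would translate the trained version of a model like this into loop-based C for the microcontroller; the sketch only illustrates the two-output structure, not the generated code.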
How to Cite
B. Pulver, “TOWARD SEMI-AUTONOMOUS PROSTHETIC HAND CONTROL: APPLYING EMBEDDED NEURAL NETWORKS TO IMPROVE SENSOR FUSION IN PROSTHETIC FINGERTIP SENSORS”, MEC Symposium, Aug. 2022.
Myo Control and Sensory Feedback Implementations