TOWARD SEMI-AUTONOMOUS PROSTHETIC HAND CONTROL: APPLYING EMBEDDED NEURAL NETWORKS TO IMPROVE SENSOR FUSION IN PROSTHETIC FINGERTIP SENSORS

Authors

  • Ben Pulver
  • Sarah Aguasvivas Manzano
  • Aaron Selnick
  • Serena Kishek
  • Levin Sliker
  • Nikolaus Correll
  • Jacob Segil

Abstract

We present an application of embedded, real-time neural network predictions to produce reliable sensing and enable semi-autonomous control of a prosthetic hand with embedded tactile sensors at the fingertips. We simultaneously predict the force magnitude and the position of contact, requiring 32.8 ms on average per prediction, thereby enabling real-time measurements. We demonstrate 97.2% offline classification accuracy on the contact position, and a root mean squared error of 1.38 N (mean absolute error of 0.68 N) in predicting the force magnitude. Neural network training is performed offline on a desktop computer using Keras, and the trained network is compiled into efficient C code for an nRF52840 microcontroller using the open-source tool “nn4mc.” The training model, as well as the nn4mc compiler, are available online, allowing prosthetic engineers to incorporate real-time, sensor-based inference into their prosthetic designs.

Published

2022-08-09

How to Cite

[1]
B. Pulver, “TOWARD SEMI-AUTONOMOUS PROSTHETIC HAND CONTROL: APPLYING EMBEDDED NEURAL NETWORKS TO IMPROVE SENSOR FUSION IN PROSTHETIC FINGERTIP SENSORS”, MEC Symposium, Aug. 2022.

Issue

Section

Myo Control and Sensory Feedback Implementations