A Virtual Reality Training Environment For Myoelectric Prosthesis Grasp Control With Sensory Feedback

Authors

  • Mitchell Dumba
  • Michael Dawson
  • Glyn Murgatroyd
  • Patrick Pilarski
  • Jacqueline Hebert
  • Ahmed Shehata

DOI:

https://doi.org/10.57922/mec.2478

Abstract

Upper limb myoelectric prosthesis control is difficult to learn. Virtual reality has seen increasing use for prosthesis training in recent years because it is repeatable, engaging, and can be deployed in the home. Most virtual reality prosthesis simulators, however, do not challenge grasp function. This paper presents the Virtual Prosthesis Emulator (ViPEr), a virtual reality environment for training prosthesis grasp control with sensory feedback. To provide sensory feedback in ViPEr, we derived data-driven transfer functions that approximate the force applied by a physical prosthesis and integrated them into a sensory feedback system. This system relays the interaction force through mechanotactile tactors and recreates realistic interactions, including object-specific lift, crush, and deformation characteristics. We will use ViPEr in an upcoming study to evaluate skill transfer to physical prosthesis performance and the effect of providing sensory feedback during virtual reality training.
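
For context, the sketch below shows one way such a data-driven transfer function could be realized in code: hypothetical calibration pairs of grip command versus force measured on a physical prosthesis are fit with a low-order polynomial, and the estimated force is scaled to a normalized tactor drive level. The calibration values, the polynomial model, and all names (grip_command, measured_force_n, tactor_drive) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation) of a data-driven
# transfer function for grasp-force feedback, assuming calibration
# pairs of normalized grip command vs. measured fingertip force were
# recorded from a physical prosthesis.

import numpy as np

# Hypothetical calibration data: normalized grip command (0-1) and
# the force (N) measured at the prosthesis fingertip for that command.
grip_command = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
measured_force_n = np.array([0.0, 1.1, 3.0, 5.8, 9.2, 13.5])

# Fit a low-order polynomial as the transfer function from command to force.
coeffs = np.polyfit(grip_command, measured_force_n, deg=2)
transfer_fn = np.poly1d(coeffs)

def tactor_drive(command: float, max_force_n: float = 13.5) -> float:
    """Map a grip command to a normalized mechanotactile tactor drive level.

    The estimated interaction force is clipped to the calibrated range and
    scaled to 0-1 so it can be rendered as pressure on the user's limb.
    """
    force = float(np.clip(transfer_fn(command), 0.0, max_force_n))
    return force / max_force_n

# Example: a 70% grip command maps to an estimated force and tactor level.
print(f"Estimated force: {transfer_fn(0.7):.2f} N, "
      f"tactor drive: {tactor_drive(0.7):.2f}")
```

In practice, the fitted curve would be derived per prosthesis and per grip pattern from recorded force data, and the tactor scaling would depend on the tactor hardware and placement; the sketch only illustrates the command-to-force-to-tactor mapping described above.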

Published

2024-08-15

How to Cite

[1]
M. Dumba, M. Dawson, G. Murgatroyd, P. Pilarski, J. Hebert, and A. Shehata, “A Virtual Reality Training Environment For Myoelectric Prosthesis Grasp Control With Sensory Feedback”, MEC Symposium, Aug. 2024.

Section

Myo Control and Sensory Feedback Implementations