Prosthetic Hand Capable of Learning and Mimicking Natural Hand Gestures

Limb loss, especially hand loss, strips away a person's ability to perform critical daily tasks and reduces an amputee's quality of life. The growing demand for affordable, accessible prosthetics is a global challenge that recent advances in AI, combined with 3D printing and EMG signal detection, can help address. This project presents the design of a 3D printed prosthetic hand capable of learning and mimicking natural hand gestures in a relatively affordable and accessible package. Gestures are classified from EMG signals acquired through a Myo armband and fed to a deep learning model (CNN + LSTM) running on a Raspberry Pi single-board computer. The actuation system consists of five servo motors, one per finger, wired to the output channels of a PWM (Pulse Width Modulation) driver. The fingers are rigid parts connected through flexible joints, enabling gestures including open hand, fist, pinch, phone grasp, and cup grasp. The mechanical framework is based on the InMoov model and 3D printed in PLA. Recognition performance using leave-one-session-out cross-validation achieved a mean gesture classification accuracy of 95.46%, indicating robust generalization across sessions. The fully integrated system combines gesture recognition, motor control, and mechanical actuation in a compact unit capable of low-latency, continuous operation, demonstrating the feasibility of AI-driven prosthetic hands as accessible assistive technologies.
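
The sketch below illustrates one way the CNN + LSTM classifier described above could be structured for windowed Myo EMG data. The layer sizes, window length, and sampling assumptions (8 EMG channels at roughly 200 Hz) are illustrative; the abstract does not specify the exact network topology.

```python
# Minimal sketch of a CNN + LSTM gesture classifier for windowed Myo EMG data.
# Layer sizes, window length, and channel count are illustrative assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CHANNELS = 8      # Myo armband surface EMG channels
WINDOW_SAMPLES = 200  # ~1 s at the Myo's 200 Hz sampling rate (assumed)
NUM_GESTURES = 5      # open hand, fist, pinch, phone grasp, cup grasp

def build_model():
    """1-D CNN front end for local EMG features, LSTM for temporal context."""
    inputs = layers.Input(shape=(WINDOW_SAMPLES, NUM_CHANNELS))
    x = layers.Conv1D(32, kernel_size=5, activation="relu", padding="same")(inputs)
    x = layers.MaxPooling1D(pool_size=2)(x)
    x = layers.Conv1D(64, kernel_size=3, activation="relu", padding="same")(x)
    x = layers.MaxPooling1D(pool_size=2)(x)
    x = layers.LSTM(64)(x)                      # summarize the feature sequence
    x = layers.Dropout(0.3)(x)
    outputs = layers.Dense(NUM_GESTURES, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_model()
    # Dummy batch: 16 windows of raw EMG, integer labels in [0, NUM_GESTURES)
    x = np.random.randn(16, WINDOW_SAMPLES, NUM_CHANNELS).astype("float32")
    y = np.random.randint(0, NUM_GESTURES, size=16)
    model.fit(x, y, epochs=1, verbose=0)
    print(model.predict(x[:1]).argmax(axis=-1))
```

A compact model of this kind can be trained off-device and deployed to the Raspberry Pi for inference, keeping latency low during continuous operation.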

Project Details

[photo]
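
As a companion to the actuation description in the abstract, the following sketch shows how a recognized gesture could be mapped to per-finger servo angles from the Raspberry Pi. The PCA9685 16-channel PWM driver and the Adafruit ServoKit library are assumptions (the abstract only states that five servos are wired to a PWM driver), and the pose angles are illustrative placeholders.

```python
# Minimal sketch: map a classified gesture to finger servo angles on a Raspberry Pi.
# Assumes a PCA9685 PWM driver on the I2C bus and the adafruit-circuitpython-servokit
# library; the specific driver and angle values are illustrative, not from the source.
from adafruit_servokit import ServoKit

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]  # servo channels 0-4

# Per-gesture finger flexion in degrees (0 = fully extended, 180 = fully flexed).
GESTURE_POSES = {
    "open_hand":   [0, 0, 0, 0, 0],
    "fist":        [180, 180, 180, 180, 180],
    "pinch":       [150, 150, 0, 0, 0],
    "phone_grasp": [120, 120, 120, 120, 120],
    "cup_grasp":   [90, 140, 140, 140, 140],
}

kit = ServoKit(channels=16)  # PCA9685-style driver connected to the Pi's I2C bus

def actuate(gesture: str) -> None:
    """Drive each finger servo to the angle defined for the recognized gesture."""
    for channel, angle in enumerate(GESTURE_POSES[gesture]):
        kit.servo[channel].angle = angle

if __name__ == "__main__":
    actuate("pinch")  # e.g., after the CNN + LSTM classifier outputs the 'pinch' class
```

Keeping the gesture-to-pose table in one place makes it straightforward to tune finger positions for the flexible-joint fingers without retraining the classifier.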