EasyArm is the prosthetic arm that can catch! Using a simple camera as its sensor and a handful of computer vision algorithms, EasyArm tracks a thrown object and predicts its trajectory so it can catch things just like a natural arm.
Having worked with special-education children in the past, I've seen how expensive and clunky prosthetic limbs can be. With this project I'd like to give people with physical disabilities the chance to play catch just like any other person can.
Frames from a simple web camera are sent to the image classifier, where a derivative (edge-detection) filter finds the contours of the ball. The ball's position is then calculated, and the predicted trajectory is passed to the arm controller, which sends commands to the individual servos in the arm.
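The edge-detection step can be sketched in pure Python (a stand-in for the OpenCV pipeline; the function names and the synthetic frame are illustrative, not the project's actual code): a forward-difference derivative filter marks strong edges, and the ball is located as the centroid of those edge pixels.

```python
def gradient_magnitude(img):
    """Forward-difference derivative filter: gradient magnitude per pixel."""
    h, w = len(img), len(img[0])
    grad = [[0.0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            dx = img[y][x + 1] - img[y][x]   # horizontal derivative
            dy = img[y + 1][x] - img[y][x]   # vertical derivative
            grad[y][x] = (dx * dx + dy * dy) ** 0.5
    return grad

def ball_centroid(img, thresh=0.5):
    """Locate the ball as the centroid of strong-edge pixels."""
    grad = gradient_magnitude(img)
    pts = [(x, y) for y, row in enumerate(grad)
           for x, v in enumerate(row) if v > thresh]
    if not pts:
        return None  # no ball in this frame
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    return cx, cy

# Synthetic 8x8 grayscale frame with a bright 3x3 "ball" centred at (4, 4).
frame = [[1.0 if 3 <= x <= 5 and 3 <= y <= 5 else 0.0 for x in range(8)]
         for y in range(8)]
centre = ball_centroid(frame)
```

On a real frame, OpenCV's built-in filters and contour finding replace these loops, but the idea is the same: edges first, position second.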
A 3D-printed manipulator arm, an Arduino Uno, and a web camera were the only hardware used. OpenCV and scikit-learn handled image processing and image classification in Python; C++ was used to interface with the arm and to calculate trajectories and motor instructions.
The largest constraint of the project was that it had to work in real time: the arm has to see the ball and move before it arrives. To do this, only computationally inexpensive machine-learning models were used, and the camera resolution was greatly reduced. A foam ball was used to make gripping easier, and running most computations on the computer instead of on the microcontroller also helped the arm run in real time.
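Reducing resolution pays off quadratically: halving the frame in each dimension cuts the pixel count, and hence per-frame work, by four. A minimal nearest-neighbour downscale sketch (illustrative only; the real pipeline would use the camera's or OpenCV's resize facilities):

```python
def downscale(img, factor):
    """Nearest-neighbour downscale: keep every `factor`-th pixel in x and y."""
    return [row[::factor] for row in img[::factor]]

# A 640x480 frame downscaled 4x leaves 160x120 pixels -- 1/16 of the work.
frame = [[0] * 640 for _ in range(480)]
small = downscale(frame, 4)
```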
I'm very happy that my project combines the worlds of prosthetic limbs and computer vision, and I'm glad that I can give back to a community that's been great to me for a long time.
I learned that making a real-time camera sensor is very hard without a high-end GPU; the arm's reaction time could have been greatly improved with a good graphics card. I also learned about calculating projectile motion and about image classification, which are definitely valuable skills for an embedded systems engineer.
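The projectile-motion step mentioned above can be sketched like this: given the ball's position and velocity from the tracker, solve the ballistic equation for when the ball reaches the arm's catch height, then move the gripper to that x. This is a hypothetical helper in Python, not the project's C++ code.

```python
G = 9.81  # gravitational acceleration, m/s^2

def predict_intercept(x0, y0, vx, vy, y_catch):
    """Solve y0 + vy*t - 0.5*G*t^2 = y_catch; return (x, t) at catch height."""
    a, b, c = -0.5 * G, vy, y0 - y_catch
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ball never reaches the catch height
    t = (-b - disc ** 0.5) / (2 * a)  # later root: the ball on its way down
    return x0 + vx * t, t

# Ball released 1.5 m up, moving 3 m/s forward and 2 m/s upward;
# the gripper waits at a height of 0.5 m.
hit = predict_intercept(0.0, 1.5, 3.0, 2.0, 0.5)
```

Taking the later quadratic root matters: the ball may pass the catch height once on the way up, but the arm wants the descending crossing.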
In the long term, I hope my project inspires people to use computer vision with prosthetic limbs: it has only become practical very recently, and it can make artificial motion much more robust. Even if my project doesn't take off, I'm glad that I can bring it to the kids I work with, and it's something cool that they can show their friends.
Hardware: 3D-printed arm, Arduino Uno, web camera. Software: Python, C++, OpenCV, scikit-learn.
Social Entrepreneurship Award