Description:

EasyArm is a prosthetic arm that can catch! Using a simple camera as its sensor and a handful of computer vision algorithms, EasyArm tracks objects and predicts their trajectories to catch things just like a natural arm.

Inspiration:

Having worked with special-education children in the past, I've seen how expensive and clunky prosthetic limbs can be. With this project I'd like to give people with physical disabilities a chance to play catch just like anyone else.

What it does:

Using a simple web camera, frames are captured and sent to the image classifier, where a derivative filter is passed over each image to find the contours of the ball. The ball's position is then calculated, and the resulting trajectory is passed to the arm controls, which send commands to the individual servos in the arm.
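Below is a minimal Python sketch of that detection step, assuming a standard OpenCV pipeline: Canny edge detection (which is built on Sobel derivatives) stands in for the derivative filter, and the largest contour is taken as the ball candidate. The classifier stage is omitted and the thresholds are illustrative, so treat this as a sketch rather than the project's exact code.

import cv2

cap = cv2.VideoCapture(0)  # default web camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Blur to suppress noise, then run a derivative-based edge filter.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)
    # Take the largest external contour as the ball candidate.
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        ball = max(contours, key=cv2.contourArea)
        (x, y), radius = cv2.minEnclosingCircle(ball)
        if radius > 5:  # ignore tiny spurious contours
            print(f"ball at ({x:.0f}, {y:.0f}), radius {radius:.0f} px")
    cv2.imshow("edges", edges)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()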

How we built it:

A 3D-printed manipulator arm, an Arduino Uno, and a web camera were the only hardware used. OpenCV and scikit-learn were used for image processing and image classification in Python. C++ was used to interface with the arm and to calculate trajectories and motor instructions.
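As a rough illustration of the computer-to-arm link, here is a Python sketch using pyserial. The real project did this layer in C++; the one-line "servo:angle" wire protocol, the port name, and the baud rate below are my own assumptions for the example, not the project's actual format.

import serial

arduino = serial.Serial("/dev/ttyACM0", 115200, timeout=1)  # assumed port and baud

def set_servo(servo_id: int, angle: int) -> None:
    """Send one servo to an angle in degrees, clamped to a safe range."""
    angle = max(0, min(180, angle))
    arduino.write(f"{servo_id}:{angle}\n".encode("ascii"))

set_servo(0, 45)  # e.g. swing the shoulder joint
set_servo(3, 90)  # e.g. open the gripper

On the Arduino side, a matching loop would read each line and call write() on the corresponding Servo object.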

Challenges we ran into:

The largest constraint of the project was that it had to work in real time: the arm has to see the ball and react before it lands. To achieve this, only computationally inexpensive machine learning models were used, and the camera's resolution was limited greatly. A foam ball was chosen to make gripping easier. Running most computations on the computer instead of on the microcontroller also helped the arm keep up in real time.
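As one concrete example of that resolution trade-off, the sketch below asks the camera for a small frame and times each iteration of the processing loop; the 320x240 size is illustrative rather than the project's exact setting.

import time
import cv2

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 320)   # a low resolution keeps the
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 240)  # per-frame processing cost down
for _ in range(100):
    start = time.perf_counter()
    ok, frame = cap.read()
    if not ok:
        break
    # ...detection and trajectory code would run here...
    print(f"frame handled in {(time.perf_counter() - start) * 1000:.1f} ms")
cap.release()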

Accomplishments that we're proud of:

I'm very happy that my project combines the worlds of prosthetic limbs and computer vision, and I'm glad that I can give back to a community that's been great to me for a long time.

What we learned:

I learned that building a truly real-time camera sensor is very hard without a high-end GPU; the reaction time of my arm could have been greatly improved with a good graphics card. I also learned about calculating projectile motion and about image classification, which are definitely valuable skills for an embedded systems engineer.
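As a worked example of the projectile-motion step, the sketch below fits a parabola to a few observed ball positions and solves for where the arc crosses the gripper's catch height. It assumes a ballistic path (height quadratic in horizontal position) and already-calibrated world coordinates; every number here is invented for illustration.

import numpy as np

# Observed ball centers from successive frames, in meters.
xs = np.array([0.10, 0.25, 0.40, 0.55])
ys = np.array([1.20, 1.45, 1.55, 1.50])
# Under gravity, height is quadratic in horizontal position: y = a*x**2 + b*x + c.
a, b, c = np.polyfit(xs, ys, 2)
# Solve a*x**2 + b*x + (c - y_catch) = 0 and keep the root ahead of the ball.
y_catch = 1.0  # height of the open gripper
roots = np.roots([a, b, c - y_catch])
x_catch = max(r.real for r in roots if abs(r.imag) < 1e-9 and r.real > xs[-1])
print(f"move the gripper to x = {x_catch:.2f} m at height {y_catch} m")

In practice the pixel coordinates would first have to be mapped into this world frame, and the fit refreshed every frame as new ball positions arrive.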

What's next:

In the long term, I hope my project inspires people to combine computer vision with prosthetic limbs, because it has only become possible very recently and can make artificial motion much more robust. Even if my project doesn't take off, I'm glad that I can bring it to the kids I work with; it's something cool that they can show their friends.

Built with:

A 3D-printed arm, an Arduino Uno, and a web camera were the hardware used. Python, C++, OpenCV, and scikit-learn were the software used.

Prizes we're going for:

Hexacopter Drone

Social Entrepreneurship Award

Raspberry Pi Arcade Gaming Kit

Intel® Movidius™ Neural Compute Stick

Team Members

Harrison Packer
View on GitHub