Description:

Our team is a group of people who are passionate about making life better with advanced machine learning technology. By capturing your fingertip's movement, we convert your motion into commands and control your device. Our interaction tech is powerful and opens up many possibilities. So please feel free to come and try our Finger Dancer!

Inspiration:

Better human-computer interaction is always engaging. Did you know your laptop is also waiting for your response? An immersive experience broadens what is possible, and your Mac's murmur may be better than your lover's complaints.

What it does:

Recognizes hand movement and simulates keyboard presses.
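
The sketch below illustrates the idea under some assumptions of ours: it tracks a colored fingertip marker with OpenCV and fires a simulated key press when the marker crosses a trigger line. The HSV color range, the trigger line position, and the use of pynput for key events are illustrative choices, not necessarily what our final code does.

```python
# Minimal sketch (not our exact pipeline): track a colored fingertip marker
# with OpenCV and simulate a key press when it crosses a trigger line.
# The HSV range, TRIGGER_Y, and pynput are illustrative assumptions.
import cv2
import numpy as np
from pynput.keyboard import Controller

keyboard = Controller()
LOWER_HSV = np.array([100, 120, 70])   # assumed blue marker range
UPPER_HSV = np.array([130, 255, 255])
TRIGGER_Y = 200                        # assumed trigger line (pixels from top)

cap = cv2.VideoCapture(0)
pressed = False
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    mask = cv2.medianBlur(mask, 5)     # suppress background noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        c = max(contours, key=cv2.contourArea)
        (x, y), _ = cv2.minEnclosingCircle(c)
        if y < TRIGGER_Y and not pressed:
            keyboard.press(' ')        # simulate a key press
            keyboard.release(' ')
            pressed = True
        elif y >= TRIGGER_Y:
            pressed = False
    cv2.imshow('finger dancer', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```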

How we built it:

- Search
- Improve
- Design
- Tweak
- Debug

Challenges we ran into:

* Our biggest challenge was damping hand shake (see the smoothing sketch after this list).
* Handling tracked lines that run off the edge of the screen.
* Some colors are difficult to track.
* Background noise.
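
One simple way to damp the shake (a sketch under our own assumptions, not necessarily the filter we shipped) is an exponential moving average over the tracked fingertip position; the ALPHA constant below is an assumed smoothing factor.

```python
# Hypothetical jitter filter: exponential moving average over the tracked point.
# Lower ALPHA damps hand shake more but adds lag.
ALPHA = 0.3

class SmoothedPoint:
    def __init__(self):
        self.x = None
        self.y = None

    def update(self, x, y):
        if self.x is None:                 # first observation: no history yet
            self.x, self.y = float(x), float(y)
        else:
            self.x = ALPHA * x + (1 - ALPHA) * self.x
            self.y = ALPHA * y + (1 - ALPHA) * self.y
        return self.x, self.y
```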

Accomplishments that we're proud of:

We made it, and along the way we dug into OpenCV, parallelism, and threading. The thinking itself was rewarding.
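
A minimal sketch of the threading idea, assuming a dedicated capture thread (the FrameGrabber class and its methods are hypothetical names of ours): reading frames on a background thread keeps the tracking loop and the simulated key presses responsive.

```python
# Hypothetical threaded frame grabber: keeps cv2.VideoCapture reads off the
# main loop so tracking and key simulation stay responsive.
import threading
import cv2

class FrameGrabber:
    def __init__(self, index=0):
        self.cap = cv2.VideoCapture(index)
        self.frame = None
        self.running = True
        self.lock = threading.Lock()
        threading.Thread(target=self._loop, daemon=True).start()

    def _loop(self):
        # Continuously read frames in the background.
        while self.running:
            ok, frame = self.cap.read()
            if ok:
                with self.lock:
                    self.frame = frame

    def latest(self):
        # Return the most recently captured frame (or None before the first read).
        with self.lock:
            return self.frame

    def stop(self):
        self.running = False
        self.cap.release()
```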

What we learned:

OpenCV, Python, Threading

What's next:

Build a more accurate color-detection algorithm and add gesture control.

Built with:

Mind.

Prizes we're going for:

Arteck HB030 Portable Keyboard

Call of Duty: Black OPS 4 (XBOX ONE)

$100 Amazon Gift Cards

Grand Prize

Misfit Shine 2

Team Members

Katharine Zhong, Jack Zhang, Xuanyu Chen, WPI Goats!