Description:

This project is a motion-activated music player that uses Leap Motion to detect hand gestures. These gestures are linked to media commands and can be used to control any music player.

Inspiration:

While listening to music during hands-on activities like cooking, we found it was impossible to issue simple media commands like pause with greasy hands. After hearing about Leap Motion, we realized we could use its hand-tracking technology to execute media commands without touching the keyboard.

What it does:

Our project maps three gestures to essential media commands: Play/Pause, track changes, and volume changes. A quick downward tap (a key tap) toggles Play/Pause, a side-to-side swipe changes the track, and a circular motion adjusts the volume.
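The mapping can be sketched as a simple dispatch table. This is a hedged illustration, not our actual source: the gesture names only mirror Leap Motion's built-in gesture types, and trigger_media_command is a hypothetical stand-in for whatever sends the key press to the OS.

```python
# Sketch of a gesture-to-command mapping (names are illustrative,
# not the real Leap Motion API identifiers).

def trigger_media_command(command):
    """Placeholder: in the real project this simulated an OSX media key."""
    print("media command: %s" % command)

GESTURE_COMMANDS = {
    "key_tap": "play_pause",   # quick downward tap
    "swipe": "next_track",     # side-to-side swipe
    "circle": "volume",        # circular motion adjusts volume
}

def handle_gesture(gesture_type):
    """Look up the gesture and fire its media command, if any."""
    command = GESTURE_COMMANDS.get(gesture_type)
    if command is not None:
        trigger_media_command(command)
    return command
```

Because the table maps gesture names to command names rather than hard-coding key presses, the same dispatch works with any backend that can simulate media keys.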

How we built it:

We built our project with Leap Motion's V2 Tracking software on OSX, using the Python SDK.

Challenges we ran into:

1. Leap Motion's Python SDK tutorial code would not run as provided; we had to fix it before we could use it.

2. Simulating keyboard presses was difficult to implement, especially since many libraries did not support OSX.

3. Two of us had Windows desktops while the other two had Macs. With different operating systems we faced compatibility issues, and we opted to develop on OSX instead of Windows.

4. Frames were processed too quickly for users to make distinct commands. We had to set a timer so that the user could complete a gesture before the next command fired.

5. To display the data retrieved from the Leap Motion hardware, we had to send a POST request to a Flask web server and then render the front-end through a web socket. This involved getting multiple libraries to work together to pass data, and learning how to connect a web socket from Flask to multiple clients.
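The timer fix in point 4 amounts to a cooldown between accepted gestures. A minimal sketch of the idea, with a class name and interval that are our own rather than anything from the SDK:

```python
import time

class GestureDebouncer:
    """Ignore gestures that arrive within `cooldown` seconds of the last
    accepted one, so a single hand motion is not read as many commands."""

    def __init__(self, cooldown=1.0, clock=time.time):
        self.cooldown = cooldown
        self.clock = clock          # injectable clock, handy for testing
        self.last_accepted = None

    def accept(self):
        """Return True if a gesture seen now should trigger a command."""
        now = self.clock()
        if self.last_accepted is None or now - self.last_accepted >= self.cooldown:
            self.last_accepted = now
            return True
        return False
```

The frame loop would call accept() once per detected gesture and only forward it to the media-command layer when it returns True.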

Accomplishments that we're proud of:

We corrected the Python SDK tutorial so it ran and recorded the positions of each finger's joints in a 3D Cartesian coordinate system. We also connected this code to OSX media keys with Python, so it works with any media player rather than relying on API calls to one specific app. Finally, we connected a web socket to a Flask server so we could render HTML from the backend.

What we learned:

We researched VR and learned how to work with Leap Motion; none of us had used its hardware before, and we had to fix the tutorial ourselves. We also learned the OSX keyboard shortcuts for media commands, and how to set up SocketIO connections for Flask so that multiple clients could connect to one server.
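The data flow from the tracking script to the browser can be sketched with a minimal Flask endpoint. The route name and payload shape below are our assumptions for illustration; in the real project a SocketIO emit (shown only as a comment) pushed the data on to every connected client.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

# Most recent frame data posted by the Leap Motion tracking process.
latest_frame = {}

@app.route("/frame", methods=["POST"])
def receive_frame():
    """Receive finger-position data from the tracking script.

    In our project, this is where a SocketIO broadcast would forward the
    frame to all connected front-end clients, e.g.:
        socketio.emit("frame", latest_frame)
    """
    latest_frame.update(request.get_json(force=True))
    return jsonify(status="ok")
```

Keeping the POST endpoint separate from the broadcast step means the tracking script only needs to speak plain HTTP, while the web-socket layer handles fan-out to multiple browsers.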

What's next:

Our project could use richer user interaction. The next step would be to use Unity to create a 3D menu for configuring personal settings. With Unity, we would also create interactive blocks for more complicated commands, such as choosing which song to play next.

Built with:

We built it with Leap Motion V2 Tracking, Python SDK, and Python 2.7.10.

Prizes we're going for:

Oculus Go (32 GB)

Grand Prize

HAVIT RGB Mechanical Keyboard

Google Home Mini

Team Members

Crystal Rhee, Ivan Chen, Jonathan Shee, Timothy Shee