Is there a way of using the Leap Motion as an input to an Android app? I know that the SDK currently only supports Windows and Mac, but is there a way (any open library/interface via Windows) to make the device talk to an Android phone (which could be rooted)? Are there any other depth-sensing alternatives for hand gestures on Android aside from Kinect?
It might be a bit of a late answer, but check out this link. It deals with connecting the Leap Motion and Android through a node.js server.
http://marctan.com/blog/2013/05/26/leap-motion-and-android-a-match-made-in-heaven/
Introduction
Thanks for the hint about using a node.js server as a proxy. I have come up with a smooth, acceptable solution that makes the Leap Motion work on Android (indirectly).
Pre-requisites
The basic idea of this solution is to use a PROXY to make the impossible (Android + Leap Motion) possible. The PROXY reads the data from the LEAP MOTION using the JavaScript SDK and streams (yes, it streams rather than posts) the data to the node.js instance running on it.

The DEVICE connects to the PROXY, streams (yes, again streaming rather than polling) the hand position data, and presents it as a red circle on the screen.
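To make the data flow concrete, here is a minimal sketch of the capture side of the PROXY, assuming the leapjs browser SDK and the binaryjs client library; the port number (9000), the variable names, and the message format are my own assumptions, and the actual project linked below may differ.

```javascript
// capture sketch - runs in the browser page served by the PROXY (assumed setup).
// Reads Leap Motion frames with the JavaScript SDK and streams palm positions
// to the node.js instance over a binaryjs WebSocket.

// Assumed: leap.js and binary.js are loaded by the page via <script> tags.
var client = new BinaryClient('ws://localhost:9000'); // port is an assumption

client.on('open', function () {
  // Leap.loop fires once per frame delivered by the Leap service.
  Leap.loop(function (frame) {
    if (frame.hands.length > 0) {
      var pos = frame.hands[0].palmPosition; // [x, y, z] in millimetres
      // Stream the position; the real project may use a different format.
      client.send(JSON.stringify({ x: pos[0], y: pos[1], z: pos[2] }));
    }
  });
});
```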
Step-by-step guide
1. Download the PROXY project here
2. Extract the project
3. Run it on your node.js instance on your PC or Mac (e.g. node index.js)
3a. You might need to install the binaryjs and sleep modules via npm (see here); a rough sketch of what the relay in index.js might look like follows this list
4. Mark down the IP address of the PROXY
5. Open your browser (Chrome and Safari are proven to work) and browse to http://localhost:5000 to verify it is running
6. Download the Android project for your DEVICE here
7. Import the project in ADT
8. Open strings.xml and change the IP address to the IP address of the PROXY
9. Run the project on your DEVICE
10. Move your hand above the LEAP MOTION and watch the red circle move according to your hand's movement
11. Enjoy!
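For reference, here is a minimal sketch of what the relay side of the PROXY might look like, assuming binaryjs is used for both the capture page and the Android client; the port, the broadcast logic, and the message format are assumptions, and the actual index.js in the linked project will differ.

```javascript
// index.js (sketch) - node.js relay between the Leap capture page and the DEVICE.
// Whatever the capture page sends is rebroadcast to every other connected client,
// so the Android app simply connects and reads the streamed palm positions.
var BinaryServer = require('binaryjs').BinaryServer;

var server = new BinaryServer({ port: 9000 }); // port is an assumption

server.on('connection', function (client) {
  client.on('stream', function (stream) {
    stream.on('data', function (data) {
      // Forward each chunk to all other clients (i.e. the Android DEVICE).
      for (var id in server.clients) {
        if (server.clients.hasOwnProperty(id) && server.clients[id] !== client) {
          server.clients[id].send(data);
        }
      }
    });
  });
});
```

On the DEVICE side, the Android project in the link then connects to this stream using the IP address set in strings.xml and draws the red circle from the received coordinates.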