The Verge has the technical details on the Leap Motion Controller:
The Leap uses a number of camera sensors to map out a workspace of sorts — it's a 3D space in which you operate as you normally would, with almost none of the Kinect's angle and distance restrictions. Currently the Leap uses VGA camera sensors, and the workspace is about three cubic feet; Holz told us that bigger, better sensors are the only thing required to make that number more like thirty feet, or three hundred. Leap's device tracks all movement inside its force field, and is remarkably accurate, down to 0.01mm. It tracks your fingers individually, and knows the difference between your fingers and the pencil you're holding between two of them.
Developers who take advantage of the Leap's SDK will be able to do much more, however, and the possibilities appear limited only by your imagination. All kinds of apps are being developed: some could improve remote surgery, others allow easier navigation through complex models and data, and still others might put you square in the middle of a first-person shooter. It's like holding the Mario Kart steering wheel, but on a whole new level.
Rather than mapping particular gestures (cross your arms to close the app, draw a circle to open a new window), Holz said developers are being encouraged to provide constant dynamic feedback. No one needed to be taught what pinch-to-zoom meant — it's the natural thing to try and do on a touchscreen, and as soon as you start pinching or spreading it becomes clear what happens. That's the paradigm for the Leap, Holz says: you should always be able to just do something, and the app or device should respond.
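That "constant dynamic feedback" paradigm is easy to see in miniature. A minimal sketch, assuming a hypothetical stream of fingertip positions (this is not the actual Leap SDK): instead of waiting to recognize a discrete "zoom gesture," the app maps the spread between two fingertips to a zoom level on every single frame, so the user gets a response the instant they start pinching.

```python
import math

def fingertip_distance(a, b):
    """Euclidean distance between two (x, y, z) fingertip positions, in mm."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def zoom_factor(initial_spread_mm, current_spread_mm, min_zoom=0.25, max_zoom=8.0):
    """Map the ratio of current to initial finger spread to a clamped zoom level.

    Continuous feedback: called once per tracking frame, not once per
    completed gesture, so zoom follows the fingers in real time.
    """
    ratio = current_spread_mm / initial_spread_mm
    return max(min_zoom, min(max_zoom, ratio))

# Simulated frames: fingertips start 40 mm apart, then spread to 80 mm.
frames = [((0, 0, 0), (40, 0, 0)), ((0, 0, 0), (60, 0, 0)), ((0, 0, 0), (80, 0, 0))]
initial = fingertip_distance(*frames[0])
for a, b in frames:
    print(round(zoom_factor(initial, fingertip_distance(a, b)), 2))  # 1.0, 1.5, 2.0
```

The frame format and function names here are illustrative assumptions; the point is only that the mapping runs continuously, so the interface teaches itself the way pinch-to-zoom did.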
Leap Motion's plans are huge (Holz mentioned a few times wanting to totally upend traditional computing methods) but the company is playing its cards close to the chest. The Leap will cost $70 when it's released — sometime between December and February — and Leap Motion is also working with OEMs to embed its technology into devices. The Leap is about the size of a USB drive, but Holz says it could easily be no larger than a dime, so adding it to a laptop or tablet shouldn't be difficult.
Developers are apparently beating down the company's doors for access to the technology — Holz said thousands of Leaps will be given away in the next few months, before it's released to the public.
Unlike a touchscreen interface, with the Leap there's no friction. That sounds trivial, but it isn't. It's the difference between conducting a symphony with a baton and conducting the same symphony by sketching out what the orchestra should do next in chalk on a blackboard.
Plus, the Leap operates in three dimensions rather than two. Forget pinch-to-zoom; imagine "push to scroll," rotating your flattened hand to control the orientation of an object with a full six degrees of freedom, or using both hands at once to control either end of a Bézier surface you're casually sculpting as part of an object you'll be sending to your 3D printer.
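"Six degrees of freedom" just means three position values plus three rotation axes. A rough sketch of how a flattened hand could drive an object's orientation, assuming the tracker reports a palm position, the direction the fingers point, and the palm's outward normal (names here are assumptions, not any real SDK): the two direction vectors define a rotation, and the palm position supplies the translation, together forming a standard 4×4 transform.

```python
def normalize(v):
    """Scale a 3-vector to unit length."""
    mag = sum(c * c for c in v) ** 0.5
    return tuple(c / mag for c in v)

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def hand_pose_matrix(palm_position, palm_direction, palm_normal):
    """Build a 4x4 row-major transform from hypothetical hand-tracking data.

    The rotation columns come from the hand's own basis vectors, and the
    translation comes from the palm position: six degrees of freedom in
    one matrix, updated every tracking frame.
    """
    z = normalize(palm_direction)   # where the fingers point
    y = normalize(palm_normal)      # out of the palm
    x = normalize(cross(y, z))      # completes a right-handed basis
    px, py, pz = palm_position
    return [
        [x[0], y[0], z[0], px],
        [x[1], y[1], z[1], py],
        [x[2], y[2], z[2], pz],
        [0.0,  0.0,  0.0,  1.0],
    ]

# A level hand 200 mm above the device, fingers pointing along +Z:
pose = hand_pose_matrix((10, 200, -30), (0, 0, 1), (0, 1, 0))
```

Tilting or turning the hand changes the two input vectors, which changes the rotation part of the matrix — the object on screen simply follows.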
The fact that the Leap can see almost any combination of objects (a pen, your fingers, all ten fingers at once) should make every interface designer on the planet giddy with anticipation. If you thought the touchscreen interface on the iPhone and subsequent tablets opened up a whole new way to interact with your device, imagine something that combines the intuitiveness of that experience with control fine-grained enough to do away with the trackpad or mouse entirely.