Well, this is interesting. A startup called Leap Motion has announced a new gestural controller that it claims is more accurate than the Kinect and sells at an even lower price point ($70!). From Engadget:
It’s about the size of a pack of gum, and once connected to your computer via USB, it creates a four-cubic-foot virtual workspace. Within that area, it tracks all ten of your fingers simultaneously to within 1/100 of a millimeter — that level of accuracy allows for rudimentary gestures like pinch-to-zoom and more complex actions like manipulating 3D-rendered objects. Naturally, the company isn’t telling much about the black magic making it happen, but Leap Motion claims that its software can be embedded in almost anything with an onboard computer, from phones to refrigerators. Users can customize it to suit their needs with custom gestures and sensitivity settings, in addition to chaining multiple Leap devices together to create a larger workspace. Plus, Leap Motion has created an SDK for devs to create Leap-compatible applications and an app discovery platform to distribute them to others. That means the Leap can work in a variety of use cases, from simply navigating your desktop to gaming and computer-aided design. The best part? Leap brings you this next-gen UX for a mere $69.99, and a select few can pre-order them now, with the full roll-out coming this winter. Full details follow in the PR below, and you can see the Leap in action in the videos after the break.
It’s nice to see them leading with an SDK (the Kinect, originally marketed as a game controller, did not have one at launch), but I haven’t yet been able to figure out what restrictions, if any, there are for developers. Hopefully the terms will be less restrictive than those of the Microsoft SDK.
Anyway, it sounds like it might be fun to play around with. Leap Motion claims the device “creates a 3D interaction space of 4 cubic feet to precisely interact with and control software on your laptop or desktop computer.” That’s what it was designed for, but I wonder what else you could make it do? Hmmm…
I started to get interested in what others were doing with the Xbox Kinect after reading many interesting blog posts and seeing what makers have recently done with it. Microsoft is, in my view, quietly building a robust community of developers who are hacking and creating in all sorts of powerful, useful, and fun ways, as you’ll read here.
We think Kinect hacking has been one of the best things that has happened to Microsoft, and they seem to be embracing it as well.
The original Kinect helped make the Xbox 360 last year’s bestselling game console; Microsoft has sold more than 18 million Kinects since November 2010. It’s also inspired tinkerers to put the device to unanticipated uses, such as guiding robots and doing 3D modeling. With Kinect for Windows, Microsoft aims to coax professional developers and big companies to create apps that make Kinect as essential in the home, office, and showroom as smartphones are to those on the go. “This is a turnaround chance for Microsoft,” says James McQuivey, an analyst at Forrester Research (FORR). “A chance for them to say this isn’t about video gaming, it isn’t about Windows, it’s about the future of everything.”
The open source community did a great job showing the possibilities once hardware is set “free”.
…we’re delighted to announce the general availability of Microsoft Robotics Developer Studio 4 (RDS 4) which can be downloaded for free from the Microsoft Robotics website. It was just over five months ago that we announced the availability of RDS 4 Beta and since then, the Microsoft Robotics team has been hard at work putting the final touches on RDS 4 to give developers access to the software they need to build robotics applications… our own team has been using RDS 4 for a while now and we’ve come up with a few cool and unique applications. Check out the video of the Kinect Follow Me robot which was created by our team.
“There is a rapidly expanding online community of people who have been able to use the Microsoft Kinect to do really amazing things,” Gould said in an e-mail. “Thanks to their hard work, we have been able to adapt what is essentially a toy to be a part of our video show.”
In this episode of Waterloo Labs we show you how we combined an Xbox Kinect, an Arduino, LabVIEW, and an off-the-shelf Etch-a-Sketch to make the Kinect-a-Sketch. This system allows you to control the Etch-a-Sketch just by standing in front of the Kinect. You can use a gigantic pencil or even just your hand.
Ever since Rosey the Robot took care of “The Jetsons” in the early 1960s, the promise of robots making everyday life easier has been a bit of a tease.
Rosey, a metallic maid with a frilly apron, “kind of set expectations that robots were the future,” said Colin M. Angle, the chief executive of the iRobot Corporation. “Then, 50 years passed.”
Now Mr. Angle’s company is trying to do Rosey one better — with Ava, a 5-foot-4 assistant with an iPad or an Android tablet for a brain and Xbox motion sensors to help her get around. But no apron, so far.
In a future of user complicity in surveillance, can we create a parallel narrative that allows those who are seen to abstract and enjoy their own image?
We intend for these images to represent a hint of the potential for play and experimentation in a world of advanced imaging technology.
The images depict fragments of candid photographs placed into 3-dimensional space. They use depth data captured from a Microsoft XBOX Kinect video game controller with hacked drivers, digital SLR images, and custom software.
Flying an S107 RC helicopter using the Microsoft Kinect and the Arduino Uno. The Kinect detects my hands, head, and hips. This information is translated into x, y, z coordinates, processed with some 7th-grade algebra, and then sent to the Arduino over the serial port. The Arduino receives the signal and converts it to a 38 kHz infrared signal that the S107 can understand.
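The “7th grade algebra” step in that pipeline is presumably a linear mapping from joint positions to control bytes. Here’s a minimal sketch of what that mapping might look like in Python; the function names, joint ranges, and the three-byte throttle/yaw/pitch command format are my assumptions for illustration, not the builder’s actual code or the real S107 protocol:

```python
# Hypothetical sketch of a Kinect-to-helicopter control mapping.
# Joint ranges and the command format are assumptions, not the
# project's actual protocol.

def map_range(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi],
    clamping to the input range first (the "algebra" step)."""
    value = max(in_lo, min(in_hi, value))
    t = (value - in_lo) / (in_hi - in_lo)
    return int(out_lo + t * (out_hi - out_lo))

def hands_to_command(left_y, right_y, head_x):
    """Turn tracked joint positions (assumed normalized camera-space
    units) into three throttle/yaw/pitch bytes, 0-127 each."""
    throttle = map_range(left_y, 0.0, 1.0, 0, 127)    # raise left hand to climb
    yaw      = map_range(head_x, -0.5, 0.5, 0, 127)   # lean sideways to turn
    pitch    = map_range(right_y, 0.0, 1.0, 0, 127)   # raise right hand to pitch
    return bytes([throttle, yaw, pitch])

# The bytes would then go to the Arduino over the serial port, e.g. with
# pyserial: serial.Serial('/dev/ttyUSB0', 9600).write(cmd), where the
# Arduino re-emits each command as a 38 kHz IR burst for the S107.
```

The clamping matters in practice: skeleton tracking jitters, and without it a momentary bad joint reading would send an out-of-range command to the helicopter.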
This is something I worked on over the summer last year, and it’s finally out from under wraps. The idea is to create a version of the Mirror Box: effectively copying the real limb onto the phantom limb in order to relieve the pain that amputees feel. This has been done once before with VR, but now that we have the Kinect, cheaper VR goggles, and XBee units from Adafruit, we can build a much cheaper rig and begin to investigate what works and what doesn’t.