Lost in the iPad hype is the news that Microsoft has sold the rather stunning figure of eight million Kinects in the sixty days since the device hit the market. In fact, more Kinects were sold in the fourth quarter than iPads. The product’s remarkable user experience is responsible in part for its success. But that isn’t the whole story behind the Kinect sales numbers: When Microsoft released the product on November 4th, my friends Phil Torrone and Limor Fried at Adafruit Industries offered $3,000 to the first person who could hack the Kinect and post the information to GitHub, a public repository for code. Eleven days later, when the hack appeared, officials at Microsoft didn’t go nuts. They actually went on NPR to embrace the deed.
Torrone, who is creative director at Adafruit as well as a senior editor of MAKE magazine, says that Microsoft is smart to embrace hacking’s benefit as a corporate development tool. “Microsoft quickly realized that user innovation was helping, not hurting, their biggest product launch in recent years,” he says. “Within weeks there were dozens of examples of makers, hackers, artists, engineers and tinkerers doing things that even Microsoft didn’t expect. I think we’ll see an entire industry of commercializing experiments into games and experiences for Kinect users.” (One of the best Kinect hacks, by the way, is this Minority Report demo.)
Matt decided to run his own Open Kinect contest after being inspired by the Adafruit Industries contest.
The first $1000 prize goes to the person or team that writes the coolest open-source app, demo, or program using the Kinect. The second prize goes to the person or team that does the most to make it easy to write programs that use the Kinect on Linux.
He was so inspired he ended up declaring seven $1000 winners!
The first two winners fell into the first category — coolest open-source app, demo, or program — and the last five were recognized for making it easier to write programs that use the Kinect under Linux.
“We are enthusiastic about the consumer response to our holiday lineup of products, including the launch of Kinect. The 8 million units of Kinect sensors sold in just 60 days far exceeded our expectations,” said Peter Klein, chief financial officer at Microsoft in a statement.
Congrats, Microsoft! We are glad this is working out. We are still being wowed by the vast range of uses for the device, and I am sure now Microsoft is too.
This video shows a GeckoSystems’ Carebot(tm), equipped with a pair of Microsoft Kinect sensors, navigating through a narrow passageway cluttered with various obstacles. This represents the worst case for in-home navigation.
Kinect processing is handled by a piece of software called GeckoImager, running on a dual core 1.66GHz Intel Atom motherboard. The navigation is handled by two other GeckoSavants(tm), GeckoNav and GeckoSuper, running on a separate dual core Atom machine. Both computers were located on the robot during the video.
I wonder if the motor drive uses Spacely Sprockets or Cogswell’s Cogs…
Pantomation was a very early tracking chromakey system from the 1970s. Originally intended for music scoring, the system was adapted to other styles of performance art. While crude by modern standards, the concept was decades ahead of its time; it can reasonably be considered an early forebear of systems like Microsoft’s Project Natal.
To Philip Torrone, a principal at Adafruit, and therefore one of the people with a legitimate claim to kicking off the entire field of Kinect hacking, this is a very exciting time to be trying to find, or stretch, the controller’s limits. “The Kinect as it is wasn’t meant to do anything it’s doing now,” Torrone said. “We’re only at version 1 [and] the best part about all these hacks will be all the things we cannot possibly imagine.”
That said, with his prognostication hat on, Torrone sees a rich future of new projects made with the device.
Robots and mo-cap
“A few things come to mind, since the Kinect — once hacked — is a great input device specifically [for] whole-body movements,” Torrone said. “I think we’ll see the Kinect hackers start out using it to control machinery in some manner. There are some small robotics and telepresence examples already [but] that’s just the start.”
Torrone explained that he imagines Kinect telepresence projects along the lines of people employing the device as “puppet strings for controlling robots with vision systems over small and large distances.” At the same time, he envisions the Kinect being used to run anything from industrial machinery to giant Burning Man projects. In each case, “the hacked Kinect allows the user to control extremely complicated machines just [by] using their bodies, distance, and shapes.”
But limiting the Kinect to controlling Earth-bound projects is small potatoes, Torrone suggested. He finds it easy to imagine hacked Kinects “being used to control planetary or moon probes as we land new rovers in our solar system.”
NASA, of course, might prefer to work within Microsoft-authorized uses.
At the same time, Torrone thinks that the Kinect could be a very attractive new tool for young robot enthusiasts, and predicted that FIRST, a large high school robotics organization, could start implementing the device in its competitions. And then there are the obvious military, industrial, and research uses. For one, Kinects could be employed to operate underwater robots, and for another, scientists might be able to use the controller to conduct microscopic studies. “I suppose we might see weaponized robots,” Torrone said. “‘Robot wars’ with human puppeteers [using Kinects] and giant robots fighting is certainly going to happen.”
The goal of this hack is to import the data from the Xbox Kinect, process it, and create various effects in the Minecraft universe. The project is currently in its third phase. The first phase aimed to take a snapshot from a Kinect and transform it into a Minecraft save file. The depth and color information was used to approximate the position and types of blocks that best represent the 3D point cloud in Minecraft. The second phase aimed to animate the data captured by the Kinect at 30 FPS by means of stop-motion animation.
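The core of that first phase — collapsing a depth-and-color point cloud onto a coarse voxel grid of block types — can be sketched in a few lines. The function name, voxel size, and color-to-block rules below are illustrative assumptions, not the project’s actual code:

```python
# Hypothetical sketch of phase one: quantize a Kinect-style depth/color
# point cloud into a coarse grid of Minecraft-like block IDs.
# point_to_block, voxel_mm, and the color thresholds are all made up
# for illustration; the real project's mapping will differ.

def point_to_block(x, y, depth_mm, rgb, voxel_mm=100):
    """Map one depth pixel to a block coordinate and a block type."""
    # Collapse millimetre coordinates onto a voxel grid
    # (one block covers roughly voxel_mm in each direction).
    bx, by, bz = x // voxel_mm, y // voxel_mm, depth_mm // voxel_mm
    r, g, b = rgb
    # Crude color -> block-type guess; real code would pick the block
    # whose texture color is nearest to the sampled pixel.
    if g > r and g > b:
        block = "grass"
    elif r > 200 and g > 200 and b > 200:
        block = "snow"
    else:
        block = "stone"
    return (bx, by, bz), block

# A tiny two-point "cloud": (x, y, depth in mm, (r, g, b))
cloud = [(500, 300, 1500, (40, 180, 60)), (520, 300, 1480, (90, 90, 90))]
world = {}
for x, y, d, rgb in cloud:
    pos, block = point_to_block(x, y, d, rgb)
    world[pos] = block  # later points in the same voxel overwrite earlier ones
```

Many depth pixels land in the same voxel, so the dictionary also acts as the down-sampling step; a real exporter would then serialize `world` into the Minecraft save format.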
The Kinect is a great add-on for the Xbox 360. Many people had their doubts, and although core gamers will decry the Kinect, it can be really good fun to play around with. Literally, you are the controller. There has been some interest in hacking the Kinect to work with computers other than the Xbox, and there are drivers available now for Windows, Mac OS X and Linux.
I became interested in coding Processing to work with the Kinect, but soon became disheartened when it appeared that bone data (joint positions) was not readily available; only 3D depth data (cool, but not enough) was readable.
Then I found the OpenNI drivers (see links below) and the OSC server from Sensebloom. They were able to send joint data encoded as OSC commands to Processing, and I wrote up my own little Processing receiver to test it. It worked so well, I delved into their C++ code to see if I could send the same data to Scratch.
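To make the “joint data encoded as OSC” part concrete, here is a minimal sketch of how one joint position gets packed into an OSC message: a padded address string, a padded type tag, then big-endian 32-bit floats. The `/joint/head` address and three-float layout are assumptions for illustration; the exact addresses and arguments Sensebloom’s server emits may differ:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """NUL-terminate and pad to a 4-byte boundary, as OSC requires."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_joint(address: str, x: float, y: float, z: float) -> bytes:
    """Encode one joint position as an OSC message with three float32 args.
    The address string here is hypothetical, not Sensebloom's actual scheme."""
    msg = osc_pad(address.encode())
    msg += osc_pad(b",fff")              # type tag: three floats
    msg += struct.pack(">fff", x, y, z)  # big-endian 32-bit floats
    return msg

packet = osc_joint("/joint/head", 0.1, 1.2, 2.4)
```

A Processing (or Python) receiver just reverses these steps: read the address up to the NUL, read the type tag, then unpack the floats.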
When I couldn’t see a way for Scratch to interpret the OSC code, I read about the remote sensing over the network that Scratch allows. This was perfect; I could send Scratch commands from a C program which was reading joint data from the Kinect.
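The remote sensing side is simple: Scratch 1.4 listens on TCP port 42001, and each message is a 4-byte big-endian length header followed by plain text such as `sensor-update "head-x" 120`. A minimal sketch of the bridge’s sending half (sensor names like `head-x` are my own illustration):

```python
import socket
import struct

def scratch_message(payload: str) -> bytes:
    """Frame one message for Scratch's remote sensor protocol:
    a 4-byte big-endian length header followed by the text payload."""
    data = payload.encode()
    return struct.pack(">I", len(data)) + data

def send_joint(sock, name: str, value: float) -> None:
    # Scratch 1.4 shows these as virtual sensors on its Sensing palette.
    sock.sendall(scratch_message('sensor-update "%s" %s' % (name, value)))

# Usage (assumes Scratch 1.4 is running locally with remote
# sensor connections enabled):
# sock = socket.create_connection(("127.0.0.1", 42001))
# send_joint(sock, "head-x", 120)
```

With that in place, the C program reading joint data from the Kinect only has to translate each OSC joint update into one `sensor-update` line.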
So after making a few Scratch proof-of-concept games to test it, I decided to release it all for you guys to test out.
Ladyada (Limor Fried) will be on CNBC’s FAST MONEY TV show tonight around 5 or 6pm talking about the hacked Kinect! Tune in! If it doesn’t run tonight it will run on Friday, same time. If anyone records it, we’d appreciate it (we do not have a TV).
Hacked Kinect turns you into Ultraman! hogehoge335 writes:
Ultra Seven is an old-time tokusatsu superhero in Japan. I developed a Kinect hack to create an Augmented Reality (AR) experience where you can be Ultra Seven and execute his powers. It’s merely a joke program, but I LOVE putting effort into this kind of stuff. I will make the code publicly available some time later.
The Kinect allows tracking of users without additional markers. We developed a magic mirror that generates an overlay of the video image with a volume visualization of a CT dataset. The Microsoft Kinect provides a color image and a depth image. Using OpenNI and PrimeSense NITE we can get the skeleton of a person standing in front of the Kinect in real time. We register a CT dataset to this skeleton and render an Augmented Reality overlay of the CT on the person. Using context and focus visualization, we show the CT only through a virtual window, while still showing the person everywhere else. The Kinect is positioned next to a big screen. When you stand in front of the system, it acts like a magic mirror that lets you see inside yourself, which makes it useful for teaching anatomy. The system is currently a prototype: it does not show the CT of the person in front of the monitor, but a CT of another person. We use the Visible Korean Human dataset, which has been provided to us by Prof. Min Suk Chung.
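The focus+context idea at the heart of the magic mirror is just a per-pixel choice: inside the virtual window show the registered CT, elsewhere show the live camera image. A toy sketch of that compositing step (the function name, rectangular window, and single-value “pixels” are simplifying assumptions; the real system uses a GPU overlay of volume-rendered CT):

```python
def composite(camera, ct, window):
    """Focus+context blend: inside the virtual window show the registered
    CT image, elsewhere show the live camera image.
    Images are lists of pixel rows; `window` is (x0, y0, x1, y1),
    with x1/y1 exclusive. All names here are illustrative."""
    x0, y0, x1, y1 = window
    out = []
    for y, row in enumerate(camera):
        out.append([ct[y][x] if (x0 <= x < x1 and y0 <= y < y1) else px
                    for x, px in enumerate(row)])
    return out

camera = [[0] * 4 for _ in range(4)]   # live video: all "0" pixels
ct = [[9] * 4 for _ in range(4)]       # CT overlay: all "9" pixels
mixed = composite(camera, ct, (1, 1, 3, 3))  # 2x2 window in the middle
```

In the real system the window would track the skeleton, and the CT branch would sample a volume rendering rather than a flat image, but the selection logic is the same.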
Hector Martin (aka Marcan) is a busy guy. Not long after winning the Adafruit OpenKinect Bounty back in November, he released another awesome project: OpenLase — an open laser graphics framework. Here he is at 27C3, giving one of the best lightning talks I’ve ever seen. It starts out good, and then it just gets better!
Happy New Year!
That infectious song at the end is “Bad Apple” by Masayoshi Minoshima & Nomico. You’re welcome.