The Open Kinect project – THE OK PRIZE – get $3,000 bounty for Kinect for Xbox 360 open source drivers


Hi from team Adafruit! We’re going to do our first-ever “X Prize”-type project: hack the Kinect for Xbox 360 and claim the $2,000 bounty, NOW $3,000!

What is Kinect?

Kinect for Xbox 360, or simply Kinect (originally known by the code name Project Natal, pronounced /nəˈtɒl/ nə-tahl), is a “controller-free gaming and entertainment experience” by Microsoft for the Xbox 360 video game platform, and may later be supported by PCs via Windows 8. Based around a webcam-style add-on peripheral for the Xbox 360 console, it enables users to control and interact with the Xbox 360 without touching a game controller, through a natural user interface that uses gestures, spoken commands, or presented objects and images. The project is aimed at broadening the Xbox 360’s audience beyond its typical gamer base. It will compete with the Wii Remote with Wii MotionPlus and the PlayStation Move motion control systems for the Wii and PlayStation 3 home consoles, respectively. Kinect is scheduled to launch worldwide starting with North America in November.

What is the hardware?

The Kinect sensor is a horizontal bar connected to a small base with a motorized pivot, and is designed to be positioned lengthwise below the video display. The device features an “RGB camera, depth sensor and multi-array microphone running proprietary software”, which provides full-body 3D motion capture, facial recognition, and voice recognition capabilities.

According to information supplied to retailers, the Kinect sensor outputs video at a frame rate of 30 Hz, with the RGB video stream at 32-bit color VGA resolution (640×480 pixels) and the monochrome video stream used for depth sensing at 16-bit QVGA resolution (320×240 pixels with 65,536 levels of sensitivity). The Kinect sensor has a practical ranging limit of 1.2–3.5 metres (3.9–11 ft). The sensor has an angular field of view of 57° horizontally and 43° vertically, while the motorized pivot is capable of tilting the sensor as much as 27° either up or down. The microphone array features four microphone capsules, with each channel processing 16-bit audio at a sampling rate of 16 kHz.
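For a sense of scale, here is a back-of-the-envelope calculation of the raw bandwidth those figures imply (a sketch based only on the numbers quoted above, not on anything measured from the hardware):

    #include <stdio.h>

    /* Stream parameters as quoted above (retail-channel figures). */
    #define RGB_W     640
    #define RGB_H     480
    #define RGB_BPP   4        /* 32-bit color */
    #define DEPTH_W   320
    #define DEPTH_H   240
    #define DEPTH_BPP 2        /* 16-bit, 65,536 levels */
    #define FPS       30

    int main(void)
    {
        double rgb   = (double)RGB_W * RGB_H * RGB_BPP * FPS;
        double depth = (double)DEPTH_W * DEPTH_H * DEPTH_BPP * FPS;

        printf("RGB stream:   %.1f MB/s\n", rgb / 1e6);    /* ~36.9 */
        printf("Depth stream: %.1f MB/s\n", depth / 1e6);  /* ~4.6  */
        printf("Total:        %.1f MB/s\n", (rgb + depth) / 1e6);
        return 0;
    }

That total (roughly 41 MB/s) sits right at the practical ceiling of USB 2.0 high speed, which hints that the on-wire formats are packed more tightly than the decoded ones.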

How does it work?
Wired has a great article about it!

[Image: Canesta “How it works” diagram]

Sound cool? Imagine being able to use this off-the-shelf Xbox camera on Mac, Linux, Windows, embedded systems, robotics, etc. We know Microsoft isn’t developing this device for FIRST Robotics, but we could! Let’s reverse engineer this together, get the RGB and distance data out of it, and make cool stuff! So…

What do we (all) want?
Open source drivers for this cool USB device. The drivers and/or application can run on any operating system, but must be completely documented and released under an open source license. To demonstrate the driver, you must also write an application with one “window” showing the video (640×480) and one window showing the depth. Upload all of this to GitHub.
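As a rough illustration of what that demo application could look like, here is a hypothetical skeleton in C using GLUT for the two windows; the kinect_poll() call is a stand-in for the driver you would be writing:

    #include <GL/glut.h>
    #include <stdint.h>

    static uint8_t  rgb_buf[640 * 480 * 3];   /* filled by the driver */
    static uint16_t depth_buf[320 * 240];     /* filled by the driver */
    static int rgb_win, depth_win;

    static void draw_rgb(void)
    {
        glClear(GL_COLOR_BUFFER_BIT);
        glRasterPos2f(-1.0f, 1.0f);
        glPixelZoom(1.0f, -1.0f);             /* image rows run top-down */
        glDrawPixels(640, 480, GL_RGB, GL_UNSIGNED_BYTE, rgb_buf);
        glutSwapBuffers();
    }

    static void draw_depth(void)
    {
        glClear(GL_COLOR_BUFFER_BIT);
        glRasterPos2f(-1.0f, 1.0f);
        glPixelZoom(1.0f, -1.0f);
        glDrawPixels(320, 240, GL_LUMINANCE, GL_UNSIGNED_SHORT, depth_buf);
        glutSwapBuffers();
    }

    static void idle(void)
    {
        /* kinect_poll(rgb_buf, depth_buf);   hypothetical driver call */
        glutSetWindow(rgb_win);   glutPostRedisplay();
        glutSetWindow(depth_win); glutPostRedisplay();
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);

        glutInitWindowSize(640, 480);
        rgb_win = glutCreateWindow("Kinect RGB");
        glutDisplayFunc(draw_rgb);

        glutInitWindowSize(320, 240);
        depth_win = glutCreateWindow("Kinect depth");
        glutDisplayFunc(draw_depth);

        glutIdleFunc(idle);
        glutMainLoop();
        return 0;
    }

Any toolkit would do; GLUT is shown only because two windows and a pixel blit are about all the demo needs.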

How to get the bounty ($3,000 USD)
Anyone around the world can work on this, including Microsoft 🙂 Upload your code, examples and documentation to GitHub. The first person or group to get RGB out, with the distance values being used, wins – you’re smart, you know what would be useful for the community out there. All the code needs to be open source and/or public domain. Email us a link to the repository; we and some “other” Kinect for Xbox 360 hackers will check it out. If it’s good to go, you’ll get the $3,000 bounty!

Update: We’ve increased it to $3,000 – why? We just read this at CNET:

But Microsoft isn’t taking kindly to the bounty offer. From “Bounty offered for open-source Kinect driver”: “Microsoft does not condone the modification of its products,” a company spokesperson told CNET. “With Kinect, Microsoft built in numerous hardware and software safeguards designed to reduce the chances of product tampering. Microsoft will continue to make advances in these types of safeguards and work closely with law enforcement and product safety groups to keep Kinect tamper-resistant.”


81 Comments

  1. I think the biggest win for the bounty isn’t the value to the person who creates the Kinect drivers, but the PR that it is getting which makes this a very visible challenge. Just Google “$1000 Kinect bounty” to see how many high-profile publications have written about this.

    This isn’t about adafruit, it’s about the open hardware movement that is gaining momentum. It’s a wakeup call to big companies to get on board and embrace it since there’s no putting this genie back in the bottle.

  2. I believe that Adafruit is doing a wonderful job by supporting these kinds of hacks, which ultimately are extremely useful for all DIY electronics and robotics enthusiasts.

  3. Incredible idea by Adafruit; every true hacker knows it’s not about the money but the challenge. Let the games begin and have fun 😀

  4. Good luck! And don’t worry about MS!

  5. I think Mark is right. It is unlikely that the Kinect itself is doing any of the 3D processing. A quick Google search shows several mentions of how the Xbox is using 10 to 15% of its processing capacity to do the depth map work.

    What would be interesting to know is what the projected image is and whether it changes over time. It may vary the structured-light dot array so that farther objects have a higher dot density.

    I think the dot system looks a lot like Livescribe’s, the digital pen that records what you write and uses special paper with a unique dot array.

    Cheers

    Diarmuid

  6. Why are people complaining about expensive logic analyzers? There are many cheap ones out there as well – like USBee, DigiView, GoLogic, etc. – most for less than $500, and some for even less than $200. It’s not tough to hook one of these up and start monitoring the USB traffic. Although I wouldn’t be surprised if there were some encryption handshaking involved in the initial device enumeration…

  7. If Microsoft tries to take legal action against you, you should protect yourself by invoking the DMCA exemption clauses:
    Chap 12, Section 1201 (f):
    “Reverse Engineering. — (1) Notwithstanding the provisions of subsection (a)(1)(A), a person who has lawfully obtained the right to use a copy of a computer program may circumvent a technological measure that effectively controls access to a particular portion of that program for the sole purpose of identifying and analyzing those elements of the program that are necessary to achieve interoperability of an independently created computer program with other programs, and that have not previously been readily available to the person engaging in the circumvention, to the extent any such acts of identification and analysis do not constitute infringement under this title.”

    Good luck!!!

  8. uhh ohh someone started something 😉 http://nuigc.com/johny5

  9. Why not post a "donation link" on your website? I’d contribute to the cause! The K-Prize sounds like a wonderful way to jumpstart open-source to a much wider audience.

  10. "Incredible idea by Adafruit, every true hacker knows its not about the money but the challenge. Let the games begin and have fun" – Comment by eCs8769 — November 6, 2010

    So true. A few friends of mine have decided to give it a shot, and they say that if they did it and won, they would give the money to charity or donate it to other projects (like OpenShot).

    It was about the challenge, but now it’s about giving the finger to MS too!

  11. I think Mark is right. It is unlikely that the Kinect itself is doing any of the 3D processing. A quick Google search shows several mentions of how the Xbox is using 10 to 15% of its processing capacity to do the depth map work.
    —-
    It is likely that the Kinect does preprocessing of the image data from both the color and IR cams and outputs RGB-D data, that is, point clouds with color information (or image data with depth, whatever you want to call it).
    10-15% can easily account for processing of this point cloud data that comes from the Kinect. We’re talking 320x240x30 points here, which is quite a lot. Interpreting and processing point cloud data is a hot topic in robotics research at the moment and is known to consume quite a lot of CPU (or GPU) resources.
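    As a rough sketch of what turning that depth image into a point cloud involves, here is a hypothetical back-projection in C using the 57°×43° field of view quoted in the article. The depth_to_meters() mapping is a placeholder, since the raw value encoding is exactly what remains to be reverse engineered:

        #include <math.h>
        #include <stdint.h>

        #define W 320
        #define H 240

        typedef struct { float x, y, z; } point3f;

        /* Placeholder: assumes the raw 16-bit value is millimeters.
         * The real encoding is what reverse engineering must establish. */
        static float depth_to_meters(uint16_t raw)
        {
            return raw / 1000.0f;
        }

        /* Back-project a depth image into 3D with a pinhole model and
         * the 57° x 43° field of view quoted above. */
        void depth_to_cloud(const uint16_t *depth, point3f *cloud)
        {
            const float deg2rad = 3.14159265f / 180.0f;
            const float fx = (W / 2.0f) / tanf(57.0f / 2.0f * deg2rad);
            const float fy = (H / 2.0f) / tanf(43.0f / 2.0f * deg2rad);

            for (int v = 0; v < H; v++) {
                for (int u = 0; u < W; u++) {
                    float z = depth_to_meters(depth[v * W + u]);
                    cloud[v * W + u].x = (u - W / 2.0f) * z / fx;
                    cloud[v * W + u].y = (v - H / 2.0f) * z / fy;
                    cloud[v * W + u].z = z;
                }
            }
        }

    At 320×240×30 that inner loop runs about 2.3 million times per second, which is where the quoted CPU budget goes.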

  12. If you do the drivers, I will do some Blender integration… Think about the fun to be had with an open game engine.

  13. I’d like to contribute $50 too. Please consider setting up a donation link.

    Yes, there are people doing this for free, but how is it bad to reward them for it?

  14. Hi, guys.

    If anyone is willing to send me some PCB photos of the Kinect innards and can give me remote access to a Linux machine with a plugged-in Kinect, then I’ll start development (I could buy myself a Kinect, but that would take an additional month).

    I’ll give out any needed tips on how to set up the Kinect box so that it can’t do any malicious stuff from the machine and will be easy to reboot/reset. It doesn’t have to be costly equipment – if you have a spare Pentium 4 that can run 2-3 hours a day, that will do.

    I currently have some experience with the Linux USB stack (ALSA USB, actually) and custom hardware, and I’m also currently working on in-kernel network stuff.

  16. OK…
    I’ve seen the photos (teardown at iFixit), so the chips are identified. I also found a libkinect project on GitHub.

    Are there any developers of libkinect reading this? If you need another Linux guy, ping @contextenable on Twitter.

  17. @Antony Merquis:
    We have similar ideas. You (and anyone interested in making this a community effort) should join my group:
    http://groups.google.com/group/openkinect
    We already have several people sharing there.

    We are also using the GitHub wiki that Kyle Machulis set up:
    http://www.github.com/qdot/libkinect
    Some info is on it already.

    Josh

  18. AlexP has hacked Kinect

    User AlexP from the NUI Group forums (former EyeToy hacker) has posted a quick video depicting the Kinect connected to a PC running Windows 7 and delivering more or less the same level of functionality as when connected to an Xbox 360.

    Microsoft Kinect for Xbox 360 Gets Hacked:
    http://news.softpedia.com/news/The-Microsoft-Kinect-Gets-Hacked-165189.shtml

  19. Microsoft has clarified its position on people writing drivers for Kinect to use it with other devices:
    http://www.gamespot.com/xbox360/sports/thebiggestloserultimateworkout/news.html?sid=6283696

    Microsoft has issued the following statement through a spokesperson: "Kinect for Xbox 360 has not been hacked–in any way–as the software and hardware that are part of Kinect for Xbox 360 have not been modified. What has happened is someone has created drivers that allow other devices to interface with the Kinect for Xbox 360. The creation of these drivers, and the use of Kinect for Xbox 360 with other devices, is unsupported. We strongly encourage customers to use Kinect for Xbox 360 with their Xbox 360 to get the best experience possible."

  20. Folks – we’ve been PrimeSense developers for a while – check out an early project we did over a year ago at http://www.youtube.com/watch?v=pUA3Z4fBLFo&sns=em

    I can confirm the PrimeSense chip IS doing the depth mapping and handing it off at high frame rates, despite the opinion of #37 mark. We’d done research on other z-depth options, including stereo cams, time-of-flight sensors, etc., and the work that the PrimeSense silicon is doing offloads a TON of crunching from the software side.

    As for the inverse kinematics and skeleton stuff, and additional bells and whistles, that is clearly coming from the MS side, but I wonder how far the PrimeSense in the toy is from the reference design that goes along with PS’s SDK?

  21. Hey guys,

    I did some work to reverse engineer the depth video frame format. Currently, I have the Kinect connected through a USB analyzer to the Xbox. I then source the data from the analyzer in real time and display it in a little Qt app. You may need to install the Qt libraries for your OS.

    I wasn’t able to get GitHub to accept my SSH key, so I posted the code on Google Code.

    This shows how to parse the video format. If you want to run the code as is, you will need the Beagle USB 480 analyzer:

    http://www.totalphase.com/products/beagle_usb480/

    You’ll also need to download the appropriate API for your OS and place beagle.so/beagle.dll into the output folder.

    Here’s the code:
    http://code.google.com/p/libkinect/
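    For anyone wanting to render the captured frames, a minimal conversion from 16-bit depth samples to an 8-bit grayscale image might look like the sketch below. It assumes the plain little-endian 16-bit QVGA layout from the retail spec; the actual on-wire framing (headers, packing) is what the analyzer capture code above reconstructs:

        #include <stddef.h>
        #include <stdint.h>

        /* Scale 16-bit depth samples down to 8-bit grayscale for display,
         * assuming a plain little-endian 16-bit-per-pixel layout. */
        void depth16_to_gray8(const uint8_t *raw, uint8_t *gray, size_t npix)
        {
            for (size_t i = 0; i < npix; i++) {
                uint16_t d = (uint16_t)(raw[2 * i] | (raw[2 * i + 1] << 8));
                gray[i] = (uint8_t)(d >> 8);   /* keep the top 8 bits */
            }
        }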

  22. l45256, you’re really close. Just use libusb as a driver to get the USB data straight from the device and you might be the winner!
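    For the curious, opening the device with libusb-1.0 might look roughly like this. 0x045e is Microsoft’s vendor ID; 0x02ae is the product ID reported for the Kinect camera in early teardowns and is treated here as an unverified assumption:

        #include <libusb-1.0/libusb.h>
        #include <stdio.h>

        int main(void)
        {
            libusb_context *ctx;
            libusb_device_handle *dev;

            if (libusb_init(&ctx) < 0)
                return 1;

            /* 0x045e = Microsoft; 0x02ae = Kinect camera product ID as
             * reported in early teardowns (unverified assumption). */
            dev = libusb_open_device_with_vid_pid(ctx, 0x045e, 0x02ae);
            if (!dev) {
                fprintf(stderr, "Kinect camera not found\n");
                libusb_exit(ctx);
                return 1;
            }

            libusb_claim_interface(dev, 0);
            /* ...issue the init control transfers and start reading the
             * isochronous endpoints here: the reverse-engineered part... */
            libusb_release_interface(dev, 0);
            libusb_close(dev);
            libusb_exit(ctx);
            return 0;
        }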

  23. The first time I saw a video of the Kinect IR dot pattern, it made me think of the Anoto dot pattern. Is there a link between the two?

  24. @l45256:

    I wanted to invite you to join the OpenKinect mailing list I started:
    http://groups.google.com/group/openkinect
    We have over 100 members already who are interested in making and supporting a true open source community around Kinect. We’re not after the bounty, but if we do happen to get it, we plan to use it for community/charity rather than personal gain. We have USB experts and driver experts, as well as people interested in using Kinect for everything from robotics to interaction design.

    I hope you’ll consider joining the group, sharing what you know, and taking advantage of the experts in the group. A real community effort will produce much better results than scattered individuals.

    You can also email me at joshblake at gmail dotcom.

    Thanks,
    Josh

  25. Check out Code Labs; they have got the camera working with Windows 7.

  26. Here’s my take on it:

    http://git.marcansoft.com/?p=libfreenect.git

    Supports depth and RGB images and displays them in an OpenGL window. It’s very hacky right now, but it does prove the concept 🙂

    http://marcansoft.com/transf/libfreenect.png
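    For context, the callback-driven shape this code settled into looks roughly like the following sketch, written against the libfreenect API as it later stabilized (the tree linked above was still in flux when this comment was posted):

        #include <stdio.h>
        #include <stdint.h>
        #include "libfreenect.h"

        /* Fires from freenect_process_events() when a depth frame arrives. */
        static void depth_cb(freenect_device *dev, void *depth, uint32_t timestamp)
        {
            printf("got a depth frame, timestamp %u\n", timestamp);
        }

        int main(void)
        {
            freenect_context *ctx;
            freenect_device *dev;

            if (freenect_init(&ctx, NULL) < 0 ||
                freenect_open_device(ctx, &dev, 0) < 0)
                return 1;

            freenect_set_depth_callback(dev, depth_cb);
            /* 11-bit packed depth turned out to be the native format */
            freenect_set_depth_mode(dev, freenect_find_depth_mode(
                FREENECT_RESOLUTION_MEDIUM, FREENECT_DEPTH_11BIT));
            freenect_start_depth(dev);

            while (freenect_process_events(ctx) >= 0)
                ;   /* pump USB events; the callback fires from here */

            return 0;
        }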

  27. Looks like we have a libusb winner: https://twitter.com/marcan42/status/2329407779774464 http://marcansoft.com/transf/libfreenect.png http://git.marcansoft.com/?p=libfreenect.git;a=summary

  28. Now that people are seeing the raw output of the Kinect, perhaps they can stop referring to the non-existent depth camera. The Kinect uses the crudest of light scattering techniques to do what is essentially outline extraction, combined with some extremely inaccurate (but sometimes useful) z-depth motion vectors. The tech from PrimeSense looks like a gigantic con compared to their claims.

    Remember, for the year or so that Kinect was being designed, PrimeSense allowed publicists to make the laughable assertion that they were measuring the speed of their projected light in order to create a z-depth map.

    Given that the Kinect monochrome camera has a far lower resolution than the projected IR dot grid, it looks as if their so-called depth map is no more than the brightness and contrast adjusted image output from that camera. The mono camera can probably output images at 4 times the speed of the colour one, so maybe some kind of averaging of 2 or 4 consecutive frames is being done.

    Here is a lesson, guys: if it sounds too good to be true, it is. The same applied to Sony’s Move if you research the motion tracking method they thought they could use, versus the Wii-identical system they ended up releasing when the fancy tech they had paid a fortune for turned out to be useless junk (or Sony’s hilarious attempt to use the ‘Cell’ CPU as their graphics chip, before they went crawling on their knees to Nvidia).

    The faux depth camera on the Kinect reminds me of a ‘voice recognition’ peripheral I bought for my Sinclair Spectrum so many years ago. That device used zero crossing to measure a sequence of average frequencies, and hoped that the words targeted might have different enough ‘signatures’ for a statistical pattern match. Did it work? If you understand anything about engineering, you will already know the long answer to that question.

    In the end, Microsoft wanted a Human outline, and some kind of indication of gross movement toward or away from the screen. Add a massive dose of assumption and imagination in the minds of the ‘player’ and there is your so-called next-generation motion control system.

    In reality, Move, and the Wii with the gyroscope add-on, are infinitely better for accuracy and user-selected input. Microsoft’s innovation is the body skeleton derived from outline data at a lower processing cost and somewhat greater accuracy than, say, what Sony could do with the EyeToy camera alone. However, I would expect improvements in visual processing methods to allow future use of a single colour webcam to replicate most of the tricks of Kinect.

    The real advantage of the two camera system comes when the player wears clothes very similar in colour response to the background, but how likely is that problem? Everything else is just a matter of how cleverly you can image process, and how quickly. When the xbox + Kinect does the clever stuff (full skeleton), it is extremely laggy, so even Microsoft’s expensive hardware assist doesn’t do well on the quickly side of the equation.

    I will say again that despite the disappointing reality of Kinect, open source docs and drivers are most welcome because every peripheral has some uses, even that sad little ‘voice recognition’ Spectrum device. Just ensure your expectations are based on a real understanding of how Kinect actually works, not the outrageous hyperbole of Microsoft and Primesense. Likewise, know that Kinect is really an advanced software package on the console, and without that software, your Kinect is pretty much the world’s most expensive console webcam.

  29. Here’s a very recent paper from Andy Wilson at Microsoft Research about using the Kinect as a touch sensor; it also contains some details about the hardware.

    http://www.dfki.de/~jschoen/dropbox/ITS2010/proceedings/papers/sp239.html

  30. Sounds great. I’m quite impressed that it was cracked this fast. Does this mean I can finally throw away my mouse? How about a really good, gesture-based interface for Linux? Preferably one that can distinguish the cat from me doing a “delete all” gesture…

  31. I’d like to see a two-KINECT system. The first KINECT would do the typical full-body scanning in order to accurately locate the position of your hands. The position would be supplied to the second KINECT, which would be “zoomed in” on your hands to perform very fine position tracking. This would allow for “fine grain” hand commands.

    A possible application is a virtual keyboard. The second KINECT would use simple geometry to position a larger-than-normal virtual keyboard as if it were floating in the air in front of you. As your hand enters the plane of the virtual keyboard, a letter would be “selected” and highlighted on-screen; re-positioning the hand while still within the plane would adjust the selection of the key (similar to what smartphone keyboards do). Removing your hand from the plane of the keyboard would select a letter. This would be more pecking than typing, but should still be effective.

    A more sophisticated implementation would determine the position of your fingers as your hand enters the plane in order to “double check” the intended key. For example, if your left hand enters the plane with the left pinky “extended”, an “A” would be “verified”. If your hand enters the plane slightly elevated from the initial or “home” position, a “Q” would be verified, etc. So come on, let’s see it, so we can find out if there’s any real-world benefit!
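    The plane-crossing test described above is simple geometry. Here is a hypothetical sketch of the hit test (every name and threshold is invented for illustration):

        #include <stdbool.h>

        typedef struct { float x, y, z; } vec3;

        /* Hypothetical virtual keyboard: a rectangle floating at distance
         * plane_z from the sensor. A key is "selected" while the tracked
         * hand is within `thickness` of that plane; its row/column comes
         * from where the hand sits inside the rectangle. */
        typedef struct {
            float plane_z, thickness;      /* plane position, selection depth */
            float x0, y0, width, height;   /* keyboard rectangle in meters */
            int cols, rows;
        } vkbd;

        bool vkbd_hit(const vkbd *k, vec3 hand, int *col, int *row)
        {
            if (hand.z < k->plane_z - k->thickness ||
                hand.z > k->plane_z + k->thickness)
                return false;              /* hand not in the keyboard plane */

            float u = (hand.x - k->x0) / k->width;
            float v = (hand.y - k->y0) / k->height;
            if (u < 0 || u >= 1 || v < 0 || v >= 1)
                return false;              /* outside the rectangle */

            *col = (int)(u * k->cols);
            *row = (int)(v * k->rows);
            return true;
        }

    Smoothing the hand position over a few frames before running the test would keep sensor noise from toggling the selection.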
