Shape-It-Up: Hand Gesture Based Creative Expression of 3D Shapes Using Intelligent Generalized Cylinders
We present a novel interaction system, “Shape-It-Up”, for creative expression of 3D shapes through the naturalistic integration of human hand gestures with a modeling scheme dubbed intelligent generalized cylinders (IGC). To achieve this naturalistic integration, we propose a novel paradigm of shape-gesture-context interplay (SGCI) wherein the interpretation of gestures in the spatial context of a 3D shape directly deduces the designers’ intent and the subsequent modeling operations. Our key contributions towards SGCI are three-fold. Firstly, we introduce a novel representation (IGC) of generalized cylinders as a function of the spatial hand gestures (postures and motion) during the creation process. This representation allows for fast creation of shapes while retaining their aesthetic features like symmetry and smoothness. Secondly, we define the spatial contexts of IGCs as proximity functions of their representational components, namely cross-sections and skeleton with respect to the hands. Finally, we define a natural association of modification and manipulation of the IGCs by combining the hand gestures with the spatial context. Using SGCI, we implement intuitive hand-driven shape modifications through skeletal bending, sectional deformation and sectional scaling schemes. The implemented prototype involves human skeletal tracking and hand-posture classification using the depth data provided by a low-cost depth sensing camera (Microsoft Kinect). With Shape-It-Up, our goal is to make the designer an integral part of the shape modeling process during early design, in contrast to current CAD tools which segregate 3D sweep geometries into procedural 2D inputs in a non-intuitive and cumbersome process requiring extensive training. We conclusively demonstrate the modeling of a wide variety of 3D shapes within a few seconds….
While Hollywood often fails to accurately portray hacking, one researcher has made the art of exploitation look more like the big screen.
Security researcher and creator of p0wnlabs, Jeff Bryner, showcased the Kinectasploit game at Defcon 20. The game is a product of the improbable melding of Microsoft’s Kinect gaming motion-sensor with hacking tools such as Metasploit.
Together with the Blender 3D environment toolkit, Kinectasploit allows hackers to break wireless networks, launch web attacks and run forensics using body gestures in the style of a first person shooter.
Players are represented as an avatar within a series of three-dimensional rooms, each one housing different hacking tools which materialise from the walls in an event inspired by a scene in The Matrix.
Hacked Kinect – Skill badge, iron-on patch – You made a cool project using the (hacked) Kinect! Adafruit offers fun and exciting “badges” of achievement for electronics, science and engineering. We believe everyone should be able to be rewarded for learning a useful skill, and a badge is just one of the many ways to show and share. (read more)
Total Phase makes a high-speed USB 2.0 protocol analyzer for $1200, or a regular-speed USB protocol analyzer for $400. Here’s a trick someone mentioned: if you get the cheaper protocol analyzer and need to work with a high-speed USB device, you may be able to plug the high-speed device into a full-speed (USB 1.1) hub to force it down to the slower signaling rate.
I decided to start with ladyada’s excellent guide to hacking a Kinect by reverse engineering USB packets. So here’s what I did.
Universal Everything created an installation to promote Nike’s latest technology, Flyknit. The four sides of a video cube stream a slightly altered version of the visitor’s reflection. Kinect cameras capture the visitor’s presence and translate it into a swarm of colorful particles that follow the visitor’s movement. The installation was presented during the Milano Design Week and will be touring the world throughout October.
AMUSEMENT RATE: Nike is always great at surrounding itself with the best creative people in order to promote its products. This Universal Everything installation is no exception. The studio did a great job at combining Nike’s promotion needs with an artistic and interactive approach that would entertain the visitors. The freedom of movement brought by the Flyknit technology finds a direct translation in the generative design piece created by Universal Everything.
Made exclusively using Kinect and RGBDToolkit, Sugarkane created an incredible music video for the new single “Quand’ero Giovane” from Franco Battiato, an Italian songwriter recognized for his enduring commitment to experimentation. To understand the intricate design details of the innovative project, we recently spoke with Manuel Emede.
The debut of Coney Island Scan-A-Rama at Westport Makerfaire on Saturday was a huge success! We had a continuous line of people waiting to be scanned all day, and managed to scan well over 100 people! I was a little nervous about how things would go, but things were very smooth and the scans all seem quite good. I’m getting file cleanup down to a science and am getting faster at processing them for printing. So far I’ve processed about a third of the files and will aim to get them all done and posted by next weekend. In the meantime I have a couple screenshots of some of the scans. Some of the family shots were quite touching. I even ran off one as a test print today and did an acetone polish on it. Looks great! I’m really excited about it.
With Skanect, capturing a full color 3d model of an object, a person or a room has never been so easy and affordable. Skanect transforms your Microsoft Kinect or Asus Xtion camera into an ultra-low cost scanner able to create 3D meshes out of real scenes in a few minutes. Enter the world of 3D scanning now!
Skanect leverages consumer-grade 3D cameras, thereby limiting the hardware cost to a fraction of the cost of previous scanning solutions. For a personal and hobbyist use, you can even download a free version of Skanect!
Unlike existing technologies, Skanect can acquire dense 3D information about a scene at up to 30 frames per second. Just move around your camera to capture a full set of viewpoints, and you will get a mesh at interactive speeds.
Skanect makes it easy to scan different kinds of scenes by providing a set of predefined scenarios suitable for most use cases. Then share your models online in a few clicks – no need to be a trained professional to start scanning! …
Every Thursday is #3dthursday here at Adafruit! The DIY 3D printing community has passion and dedication for making solid objects from digital models. Recently, we have noticed electronics projects integrated with 3D printed enclosures, brackets, and sculptures, so each Thursday we celebrate and highlight these bold pioneers!
Have you considered building a 3D project around an Arduino or other microcontroller? How about printing a bracket to mount your Raspberry Pi to the back of your HD monitor? And don’t forget the countless LED projects that are possible when you are modeling your projects in 3D!
The Adafruit Learning System has dozens of great tools to get you well on your way to creating incredible works of engineering, interactive art, and design with your 3D printer! If you’ve made a cool project that combines 3D printing and electronics, be sure to let us know, and we’ll feature it here!
With attention turning to the 3D scanners appearing on crowdfunding sites such as Kickstarter and Indiegogo, and MakerBot’s announcement of their scanner development project, here is a fun introduction to one form of DIY 3D scanning already starting to reach wide circulation: 3D scanning with the Kinect. From Open Electronics:
There are a lot of well-known approaches based on lasers, video projectors and cameras to create “point clouds” of a 3D surface (thanks to partially open software). Now we have cheap dedicated hardware that is ready to provide a 3D representation of what’s in front of it. The Microsoft Kinect, hacked thanks to some tenacious developers, is now emerging as a simple and effective tool to acquire three-dimensional models. From desktop-sized objects up to furniture or a whole person, the Kinect can be miraculous on its own.
This is not the right solution for duplicating small objects: its resolution, and the volume in which the Kinect works best, make it unsuitable for small, detailed items such as figurines.
You need a Kinect (the Xbox or PC version, or the “compatible” Asus Xtion Pro), a personal computer with an ATI or NVIDIA graphics accelerator card, and the ReconstructMe software with the appropriate driver.
Unfortunately, this is not a fully open source solution: it’s a free SDK for non-commercial use (a license fee applies for professional purposes)….
Screens as they exist today are flat, 2D and rigid; even the 3D displays we have today are not true 3D – they are optical illusions. We created a 2.5D display that changes shape with the help of actuators, depth cameras, a projector and a silicone screen. ‘Obake’ (o-baa-keh), as we lovingly call it, imagines how we would interact with an elastic display. We could literally pinch and pull it!
Create mountains by pulling them out of the screen, draw rivers with your fingers, elevate an entire terrain to see a cut section view. Make your data come alive. The video shows our working prototype.
Hardware built with – Wood, linear actuators, liquid rubber casted into a screen, Kinect, Projector
Software written in – openFrameworks
with Rob Hemsley
thanks Hiroshi ISHII, Rahul Budhiraja
Here’s another cool project that was on display at SXSW. Two Kinects and a projector sit above a pool table, adding a graphic overlay and sound effects to your game in real time. And the whole thing is open source!
Just like a photobooth, you can memorialize silly faces, candid moments or fabulous poses of you and your crew. But instead of a flat picture of yourself, we put together the contemporary successor to the photobooth, where participants can get a 3-dimensional replica of their poses from all angles! Add a little Valentine’s spirit, and the 3D Printing Kissing Booth was born.
We set up the 3-day pop-up event at the fun and fashionable Untitled & Co storefront on Toronto’s Queen Street West, and collaborated with Draft Print 3D for a 3D printing and 3D scanning extravaganza! With the same equipment used for our Candy Hacking Project, we brought something that was done with our custom projects to the public for the first time. Participants got a chance to check out some 3D printers in action, get scanned for free, and choose to get their scans 3D printed as a statuette, bobble head, necklace or ring! …
We are happy to announce we are releasing the Kinect for Windows samples under an open source license. You can find everything on CodePlex: http://kinectforwindows.codeplex.com/. We have posted a total of 22 unique samples in C#, C++, and Visual Basic.
With a little duct tape, a touch screen tablet, and their new Kinect API, the Microsoft Research Cambridge team built an augmented reality system to help brain surgeons visualize 3D brain scans. Kinect Fusion supplies 3D modeling of anything, which could fuel some seriously neat medical innovations. (The Cambridge team also built KinEtre, which lets you possess anything.) At the 13th annual Microsoft TechFest, Ben Glocker demoed a prototype system that would allow neurosurgeons to prepare for surgery by looking inside a patient’s brain before they cut it open. Doctors could see the skeleton, brain, blood vessels, and the targeted tumor on a tablet—which they can move around the patient’s head—helping them to plot the best brain surgery path.
The company is called Leap Motion, and if you want to get an idea of how much everyone in San Francisco is buzzing about them, consider this: A few weeks ago I was visiting a different hot new startup in San Francisco, and in the middle of their demo the executives said, “By the way, have you heard about Leap Motion?” Then they interrupted their own demo to show me a video showing what Leap Motion’s software does.
That mindblowing video has been viewed more than 7 million times since Leap Motion put it on YouTube last May. Basically the engineers at Leap Motion have invented the 3D user interface of the future. You don’t use a keyboard and mouse; you don’t even use a touch screen. You just move your fingers in the air, and, as if by magic, with zero latency and pinpoint accuracy, stuff happens on your screen. Think of Microsoft’s Kinect controller, but way better. Leap Motion claims its device is 200 times more accurate than anything on the market and can track your finger movements down to 1/100th of a millimeter.
The first version of the product will be delivered in a little plastic hockey puck that you connect to your laptop or desktop computer. This little device will start shipping early in 2013 at an incredible price — only $69.99 gives you a new toy that isn’t quite like the user interface from Minority Report, but it’s not that far off, either.
I know that Adafruit was a big supporter of the Microsoft Kinect when it first came out. I just wanted to share with you a piece of software for the Kinect that we have developed called Gesture Studio. It allows anyone to record, edit and recognize gestures using the Kinect, and it is completely free for non-commercial use. Using Gesture Studio Lite, users can then take these gestures and bind them to key presses for use with other applications. I know that Adafruit has been focusing on the Raspberry Pi and more electronics than the Kinect, but I thought people might want to dig out their Kinect units and try this software out. Thanks for your time!
Welcome to our new weekly feature Time travel Tuesday #timetravel – It’s a look back at the Adafruit, maker, science, technology and engineering world. Each week we’ll pick what was happening in the world of making – from what Adafruit was up to 1,2,3,4,5+ years ago, to stories around the web of yesteryear, to historic moments in science and beyond. As new team members join Adafruit they’ll be working on this feature so they can see where we’ve all been, with an eye to where we are going. Don’t worry – We’ll avoid paradoxes and if our calculations are correct, when this baby hits 88 miles per hour… you’re gonna see some serious… info
This week Ladyada & pt wind back the clock, let’s go BACK IN TIME – enjoy!
Stop in Sunday at 6pm for the Closing Party, open to the public!
Ever wanted to make a game with a Kinect, Wii, Playstation Move, mobile, Arduino or teddy bear for the controller? Games can be more than console experiences: Come and make a game that explores different and fun ways to play. Ideas using everything from Kinects to robots are encouraged. The more inventive and crazy the game concept, the better! Participants are competing in a manic 48-hour race against the clock to create an experience exploring alternative game controllers.
Groups are judged based on how creative, usable and downright awesome their games are. Top secret prizes will be awarded. Prizes donated by Make Magazine and Adafruit.
Closing event is open to the public on Sunday at 6pm. All events are at the Game Innovation Lab. 5 Metrotech Place. Dibner Building, Room 102. Brooklyn, NY 11201.
What is NYU Poly’s Game Innovation Lab?
The Lab’s emphasis is on the technical/engineering/science side of games and simulations. Sample projects include user interface innovation (sensor-based tracking, multi-touch), network and video quality research, and research on games for learning.
Work-in-progress prototype for an upcoming project involving volumetric slitscanning using a Kinect (should it be called surface-scanning?). Similar to traditional slitscanning (see flong.com/texts/lists/slit_scan for more on traditional photographic slitscanning), but instead of working with 2D images + time, this technique uses spatial + temporal data stored in a 4D space-time continuum, and 3-dimensional temporal gradients (i.e. not just slitscanning on the depth/RGB images, but surface-scanning on the animated 3D point cloud).
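The core idea described above — sampling each point from a different moment in a stored history of depth frames, according to a temporal gradient — can be sketched in a few lines of NumPy. This is a minimal illustration of the technique, not the project's actual code; the array shapes and the linear gradient are assumptions.

```python
import numpy as np

def volumetric_slitscan(frames, gradient):
    """Sample each pixel from a different moment in a stack of depth frames.

    frames:   (T, H, W) array -- a short history of depth frames
    gradient: (H, W) array of values in [0, 1] -- the temporal gradient
              (0 = oldest frame in the buffer, 1 = newest frame)
    """
    t, h, w = frames.shape
    # Map each pixel's gradient value to a frame index in the time buffer.
    idx = np.clip((gradient * (t - 1)).astype(int), 0, t - 1)
    rows, cols = np.mgrid[0:h, 0:w]
    # Fancy indexing: pixel (r, c) is read from frame idx[r, c].
    return frames[idx, rows, cols]

# Toy example: 4 frames of constant depth 0..3, with a left-to-right gradient,
# so each column of the output is sampled from a different instant.
frames = np.stack([np.full((2, 4), i, dtype=float) for i in range(4)])
gradient = np.tile(np.linspace(0.0, 1.0, 4), (2, 1))
out = volumetric_slitscan(frames, gradient)
print(out[0])  # [0. 1. 2. 3.] -- each column from a different frame
```

Running the same lookup per frame of a live depth stream (with a ring buffer as the history) gives the smearing-through-time effect; doing it on the reprojected 3D point cloud rather than the 2D depth image is what the author calls surface-scanning.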
We have developed a mobile, battery-powered, wireless depth camera based on (and compatible with) Microsoft’s Kinect. In order to promote the use of this device across a wide range of domains, we are making the circuit diagrams and PCB layouts for the additional circuitry available. Our design uses only the front ‘camera’ circuit board of the Kinect, plus a second bespoke board of the same small size that plugs onto its back in place of the standard large Kinect board; that board in turn connects via USB to a Gumstix embedded Linux computer running an open source driver and streaming over an 802.11n dongle. The design would work equally well with a Raspberry Pi or other SBCs with a bit of hacking.
This work is part of the RCUK Digital Economy PATINA project, for more information see www.patina.ac.uk.
Below is a video of an early version we built last year.
My mom has lived with aphasia ever since she suffered a serious stroke twelve years ago. In the meantime, there’s been a revolution in communication – powered by social media. Like a lot of people, I use the phone less. One of my areas of interest has been bridging the digital “keyboard gap” for people like my mom.
If anyone asks why hack a Kinect, here’s a good video to show them.
Kinect@Home is a place where you can help robotics and computer vision researchers around the world and get 3D models of your room, office or whatever you want in return, right in your browser!
Kinect@Home aims to use your powers to make robots more awesome than ever. Robotics and computer vision researchers need vast amounts of images from everyday environments such as homes and offices to improve their algorithms.
Limor Fried is an engineer best known for her work at Adafruit Industries, the company she founded in 2005. Adafruit’s goals are simple: create the best place online for learning electronics and make the best designed products for makers of all ages and skill levels. The company sells all kinds of DIY kits, from phone chargers and learning toys to power monitoring systems and art robots. Limor has emerged as a leader of the open-source hardware movement and is a goddess (LadyAda, as they call her) among makers. She was the first to encourage hacking Microsoft’s Kinect (even offering a $3,000 prize), and she’s fiercely passionate about getting kids to explore science and engineering. Limor talked to us about Adafruit’s workshop in New York City, her current projects, and her best productivity tricks. Want to know more?
The installation demonstrates how the city surface continuously gathers information about people’s movements, allowing vehicles to interact with the environment. Kollision developed a real-time graphics engine and the tracking software, which receives live input from 11 Xbox Kinect cameras mounted above the visitors’ heads. Through the cameras, the movements of the visitors are processed into patterns of movement displayed on the LED surface.
Inside a spartan garage in an industrial neighborhood in Palo Alto, Calif., a robot armed with electronic “eyes” and a small scoop and suction cups repeatedly picks up boxes and drops them onto a conveyor belt.
It is doing what low-wage workers do every day around the world.
Older robots could not do such work because computer vision systems were costly and limited to carefully controlled environments where the lighting was just right. But thanks to an inexpensive stereo camera and software that lets the system see shapes with the same ease as humans, this robot can quickly discern the irregular dimensions of randomly placed objects.
The robot uses a technology pioneered in Microsoft’s Kinect motion sensing system for its Xbox video game system.
This project combines the collective talents of musicians, dancers, programmers, designers and animators to create an amazing visual instrument. Creating music through motion is at the heart of this creation and uses the power of the Kinect to capture movement and translate it into music which is performed live and projected on a huge wall. v.co.nz/#the-motion-project
We created and designed the live visual spectacle with a music video being produced from the results. We wanted it to be clear that the technology was real and actually being played live. The interface plays a key role in illustrating the idea of the instrument and we designed it to highlight the audio being controlled by the dancer. Design elements like real time tracking and samples being drawn on as they are played all add to authenticity of the performance. The visuals are all created live and the music video is essentially a real document of the night. assemblyltd.com/
USB complexity got you down? Need a hand with enumeration? Reverse engineering a USB device? You will fall in love with the Beagle USB 12 Protocol Analyzer. This hardware analyzer is completely non-intrusive, and is much better than flaky software analyzers. Perfect for when a problem is bad enough that it crashes the USB host, or for real-time data capture and analysis. We used the big-sister version (the Beagle USB 480) for our famous Open Kinect reverse-engineering bounty! It worked so well we decided to carry these in the shop.
This particular model is good for Low-Speed & Full-Speed USB debugging. If you’re debugging something that runs at High Speed (such as video or other high speed data transport devices), check out the Beagle 480.
Updated tutorial: Hacking the Kinect – Reverse engineering the Microsoft Kinect. Everyone has seen the Xbox 360 Kinect hacked in a matter of days after our “open source driver” bounty - here’s how we helped the winner and here’s how you can reverse engineer USB devices as well!
USB is a very complex protocol, much more complicated than Serial or Parallel, SPI, and even I2C. USB uses only two wires, but they are not used as ‘receive’ and ‘transmit’ like serial. Rather, the data is bidirectional and differential – that is, the data sent depends on the difference in voltage between the two data lines, D+ and D-. If you want to do more USB hacking, you’ll need to read Jan Axelson’s USB Complete books; they’re easy to follow and discuss USB in both depth and breadth.
USB is also very structured. This is good for reverse engineering because it means that at least the format of packets is agreed upon and you won’t have to deal with check-sums. The bad news is it means you have to have software assistance to decode the complex packet structure. The good news is that every computer now made has a USB host core, that does a lot of the tough work for you, and there are many software libraries to assist.
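As a taste of that agreed-upon structure, here is a short sketch that decodes the standard 18-byte USB device descriptor — the first thing a host reads during enumeration, and usually the first thing you see in a capture. The field layout comes from the USB 2.0 specification; the example bytes are illustrative (vendor 0x045e is Microsoft’s, and the product id shown is the one commonly reported for the Kinect motor subdevice, hedged here as an assumption).

```python
import struct

# Layout of a standard 18-byte USB device descriptor (USB 2.0 spec, 9.6.1).
# '<' = little-endian, B = 1 byte, H = 2 bytes.
DEVICE_DESCRIPTOR = struct.Struct('<BBHBBBBHHHBBBB')
FIELDS = ('bLength', 'bDescriptorType', 'bcdUSB', 'bDeviceClass',
          'bDeviceSubClass', 'bDeviceProtocol', 'bMaxPacketSize0',
          'idVendor', 'idProduct', 'bcdDevice', 'iManufacturer',
          'iProduct', 'iSerialNumber', 'bNumConfigurations')

def parse_device_descriptor(raw):
    """Turn the first 18 bytes of a captured descriptor into named fields."""
    return dict(zip(FIELDS, DEVICE_DESCRIPTOR.unpack(raw[:18])))

# Example bytes as they might appear in a capture (illustrative values).
raw = struct.pack('<BBHBBBBHHHBBBB', 18, 1, 0x0200, 0, 0, 0, 64,
                  0x045E, 0x02B0, 0x0100, 1, 2, 3, 1)
desc = parse_device_descriptor(raw)
print(hex(desc['idVendor']), hex(desc['idProduct']))  # 0x45e 0x2b0
```

In practice a protocol analyzer or a library like PyUSB does this decoding for you; the point is that because the format is fixed by the spec, every capture can be unpacked the same way.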
Today we’re going to be reverse engineering the Xbox Kinect Motor, one part of the Kinect device.
What is Bilibot
Bilibot, from the German word ‘billig’, or cheap, is a sophisticated robotics platform at an affordable price. A Bilibot consists of:
* a powerful computer
* an iRobot Create
* a Kinect sensor
* mounting hardware to put it all together
* ROS, the Robot Operating System, with research contributions from roboticists all over the world!
On the 25th of June, people in Mainz (Germany) are about to play the classic of all video games on the street: a projector and an Xbox Kinect will project the game screen on the floor, and paddles are controlled by body movement. mediaman colleagues from Mainz had the idea – and a lot of fun while trying, playing and programming.
To use a Kinect with a computer instead of an Xbox, Watson needed a “driver” (basically a bit of software) that did not exist. He joined a small, far-flung, highly dedicated and technically sophisticated community effort dubbed OpenKinect, which sprang up immediately after the Kinect was introduced, to write the code that would make this possible. At the same time, Adafruit, a hobbyist-focused electronics company based in New York, offered $1,000 to the first person or group to write the necessary code in an open-source format.
At the time — this was shortly before the 2010 holiday season — Microsoft’s primary Kinect focus was the mainstream game-playing market. Its first response to OpenKinect seemed predictable: CNET reported an unnamed spokesperson declaring that the company “does not condone the modification of its products” and would “work closely with law enforcement . . . to keep Kinect tamper-resistant.” Adafruit increased its prize, ultimately to $3,000. Within days a developer in Spain posted videos demonstrating that he made his Kinect work with a P.C. OpenKinect refined and spread the open-source driver code, and a variety of “Kinect hacks,” as they came to be called, proliferated in YouTube videos. (An early example involved a Kinect used to create a version of the hand-swipe control contraption Tom Cruise used in “Minority Report.”) Soon Watson and his wife, Emily Gobeille, posted their own video, in which her hand movements were captured by a Kinect and translated onto a screen displaying a computer-generated bird figure, which she controlled like a high-tech puppet.
Long story about the conflict and tension at Microsoft with the Kinect hacking… One note, we published the USB dump (and example) to get folks started on the open-source driver in addition to the bounty project with Johnny Lee.
Ever since the Kinect emerged on the scene, its depth-sensing camera has fascinated legions of creative coders, but the team behind the RGB+D Toolkit is one of the few attempting to transform the gaming console into a real filmmaking tool. Using a Kinect and a standard DSLR camera, like your Canon 5D, these avant-garde image-makers have created a technique that allows you to map video from the SLR onto the Kinect’s 3D data to generate a true CGI and video hybrid.
Why is this exciting? Well, for one thing, convincing CGI is incredibly difficult to do—it took the team behind Rockstar’s L.A. Noire a full 32 cameras and god knows how many man hours to record and digitally reconstruct their characters in 360 degrees. And while the experimental output from the RGB+D team is a far cry from those painstakingly constructed game visuals, that’s kind of not the point. The point is the implications—this has the potential to change the way we think of 3D filmmaking and to significantly lower the barrier to entry using commercially available hardware and open source software.
Today, members of the RGB+D team—James George and Jonathan Minard—released the culmination of their research to date: an excerpt of an ongoing documentary project called Clouds that they’ve been developing alongside the RGB+D Toolkit, their open source video editing application (which looks like a cross between Final Cut Pro and a video game engine). Clouds features interviews with prominent computer hackers, media artists, and critics discussing the creative use of code, the future of data, interfaces, and computational visuals, presented as a series of conversational vignettes.
Kinect as a tool of narrative film was as inevitable as sunshine in the summertime.
Space innovators at the University of Surrey and Surrey Satellite Technology Limited (SSTL) are developing ‘STRaND-2’, a twin-satellite mission to test a novel in-orbit docking system based upon XBOX Kinect technology that could change the way space assets are built, maintained and decommissioned.
STRaND-2 is the latest mission in the cutting edge STRaND (Surrey Training, Research and Nanosatellite Demonstrator) programme, following on from the smartphone-powered STRaND-1 satellite that is near completion. Similar in design to STRaND-1, the identical twin satellites will each measure 30cm (3 unit Cubesat) in length, and utilise components from the XBOX Kinect games controller to scan the local area and provide the satellites with spatial awareness on all three axes.
Kreek is a Kinect-controlled interface which extends a normally two-dimensional multi-touch environment with the perception of depth. This allows the user to literally reach into the interface and gives applications the possibility to interpret parameters like pressure or distance.
Well, this is interesting. A startup called Leap Motion has announced a new gestural controller which they claim is more accurate than the Kinect, and sells at an even lower price point ($70!). From Engadget:
It’s about the size of a pack of gum, and once connected to your computer via USB, it creates a four-cubic-foot virtual workspace. Within that area, it tracks all ten of your fingers simultaneously to within 1/100 of a millimeter — that level of accuracy allows for rudimentary gestures like pinch-to-zoom and more complex actions like manipulating 3D-rendered objects. Naturally, the company isn’t telling much about the black magic making it happen, but Leap Motion claims that its software can be embedded in almost anything with an onboard computer, from phones to refrigerators. Users can customize it to suit their needs with custom gestures and sensitivity settings, in addition to chaining multiple Leap devices together to create a larger workspace. Plus, Leap Motion has created an SDK for devs to create Leap-compatible applications and an app discovery platform to distribute them to others. That means the Leap can work in a variety of use cases, from simply navigating your desktop to gaming and computer-aided design. The best part? Leap brings you this next-gen UX for a mere $69.99, and a select few can pre-order them now, with the full roll-out coming this winter. Full details follow in the PR below, and you can see the Leap in action in the videos after the break.
It’s nice to see them leading with an SDK — the Kinect, which was marketed originally as a game controller, did not have an SDK at launch — but I haven’t been able to figure yet what (if any) restrictions there are for developers. Hopefully, the terms will be more free than the Microsoft SDK.
Anyway, it sounds like it might be fun to play around with. Leap Motion claims the device: “creates a 3D interaction space of 4 cubic feet to precisely interact with and control software on your laptop or desktop computer.” — that’s what it was designed for, but I wonder what else you could make it do? Hmmm…
Vaudeville is a massive 12 foot tall rideable robot made by Suidobashi Heavy Industry (which appears to just be a small group of robot builders). Vaudeville comes complete with a cockpit that one person can strap into and the beast is controlled with a combination of handlebars, a Microsoft Kinect, and a smartphone. The group plans to have the mech fully functional by the end of this year.
Despite my atrociously short attention span, I’ve always loved pinball. Maybe it is something about all the flashing lights and clunking solenoids. Maybe it is just the simple physics at the center of it all. I’m not really sure. My kids, however, don’t share my enthusiasm. I suspect part of it is that they never wandered through a fog filled arcade in the middle of the night, hopped up on Reese’s Pieces with a shrinking pile of quarters in their pocket. The other part might be the fact that they have gotten used to the Nintendo Wii and the Xbox Kinect (we just got one last week).
Watching them jump up and down playing an extremely simple and repetitive game with the Kinect gave me an idea. I envisioned pinball projected on the side of my house, the kids jumping up and down in front of it to move the paddles. Keep reading to see how I plan to build it and what I’ve done so far. There’s a full video, but also text of the entire thing.
Kiss Controller is a game input device that controls the direction and speed of a bowling ball while users are kissing.
Recently, with the improvement of camera capabilities and related tracking systems, game input systems such as Nintendo Wii controllers or Microsoft Kinect games are incorporating more body positions and movements. Kiss Controller is an experimental project that allows users to control a bowling game by moving their tongues while kissing. Unlike existing game input devices, Kiss Controller seeks to generate the emotional experience of a kinetic act while users play the game rather than control games with their body.
The Kiss Controller interface has two components: a customized headset that functions as a sensor receiver and a magnet that provides sensor input. The user affixes a magnet to his/her tongue with Fixodent. Magnetic field sensors are attached to the end of the headset and positioned in front of the mouth. As the user moves their tongue, this creates varying magnetic fields that are used to control games.
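To make the sensing idea concrete, here is a minimal sketch of how a pair of magnetic-field readings might be turned into a direction and speed for the bowling ball. The sensor layout (one sensor on each side of the mouth) and the scaling are assumptions for illustration, not the Kiss Controller’s actual mapping.

```python
def ball_control(left_field, right_field, max_field=100.0):
    """Map two magnetic-field readings to (direction, speed).

    left_field / right_field: field strength measured at the left/right
    sensors (hypothetical units; max_field is an assumed saturation value).
    Returns direction in [-1, 1] (negative = steer left) and speed in [0, 1].
    """
    total = left_field + right_field
    if total <= 0:
        return 0.0, 0.0          # magnet out of range: straight and slow
    direction = (right_field - left_field) / total   # imbalance steers the ball
    speed = min(total / (2 * max_field), 1.0)        # overall strength sets speed
    return direction, speed

print(ball_control(30.0, 30.0))   # centered tongue -> (0.0, 0.3)
print(ball_control(10.0, 40.0))   # tongue toward the right sensor
```

The real system presumably filters noisy readings over time, but the essence is the same: the tongue magnet’s position relative to the sensor array becomes a continuous game input.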
I want to personally thank you for supporting the growth of technology. It makes me proud to know that there are people out there that care about seeing creativity and technology. I support your cause to unlock the potential of the Kinect, and I am also glad to see your donation to the EFF.
Your efforts are appreciated, and I thank you for setting a good example!
I started to get interested in what others were doing with the Xbox Kinect after reading many interesting blog posts and seeing what a recent maker has done with it. Microsoft is quietly, in my view, building a robust community of developers who are hacking and creating in all sorts of powerful, useful, and fun ways as you’ll read here.
We think the Kinect hacking has been one of the best things that has happened to Microsoft, and they seem to be embracing it as well.
The original Kinect helped make the Xbox 360 last year’s bestselling game console; Microsoft has sold more than 18 million Kinects since November 2010. It’s also inspired tinkerers to put the device to unanticipated uses, such as guiding robots and doing 3D modeling. With Kinect for Windows, Microsoft aims to coax professional developers and big companies to create apps that make Kinect as essential in the home, office, and showroom as smartphones are to those on the go. “This is a turnaround chance for Microsoft,” says James McQuivey, an analyst at Forrester Research (FORR). “A chance for them to say this isn’t about video gaming, it isn’t about Windows, it’s about the future of everything.”
The open source community did a great job showing the possibilities once hardware is set “free”.
…we’re delighted to announce the general availability of Microsoft Robotics Developer Studio 4 (RDS 4) which can be downloaded for free from the Microsoft Robotics website. It was just over five months ago that we announced the availability of RDS 4 Beta and since then, the Microsoft Robotics team has been hard at work putting the final touches on RDS 4 to give developers access to the software they need to build robotics applications… our own team has been using RDS 4 for a while now and we’ve come up with a few cool and unique applications. Check out the video of the Kinect Follow Me robot which was created by our team.
“There is a rapidly expanding online community of people who have been able to use the Microsoft Kinect to do really amazing things,” Gould said in an e-mail. “Thanks to their hard work, we have been able to adapt what is essentially a toy to be a part of our video show.”