Two computer science students from the University of Pennsylvania, Eric Berdinis and Jeff Kiske, have hacked together a very impressive tactile feedback system for the visually impaired using a Microsoft Kinect device and a set of vibration actuators. The Kinecthesia is a belt-worn camera system that uses the Kinect sensor's depth data to detect the location and distance of objects in front of the wearer. This information is processed on a BeagleBoard open-hardware computing platform and then used to drive six vibration motors positioned to the left, center, and right of the user.
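The core idea is straightforward to sketch: carve each depth frame into horizontal zones, find the nearest obstacle in each zone, and vibrate the corresponding motor harder the closer that obstacle is. The sketch below is a hypothetical illustration of that mapping, not the students' actual firmware; the zone count, depth range in millimeters, and the function name `zone_intensities` are all assumptions.

```python
# Hypothetical sketch of the Kinecthesia-style processing step:
# split a row of Kinect depth readings (in mm) into left/center/right
# zones, find the nearest valid obstacle in each, and map proximity
# to a 0.0-1.0 vibration intensity for that zone's motor(s).
# Zone count and depth thresholds are illustrative assumptions.

def zone_intensities(depth_row, zones=3, min_mm=500, max_mm=3000):
    """Map the nearest depth per zone to a vibration intensity in [0, 1]."""
    step = len(depth_row) // zones
    intensities = []
    for z in range(zones):
        segment = depth_row[z * step:(z + 1) * step]
        # Kinect reports 0 for pixels it cannot resolve; skip those.
        nearest = min(d for d in segment if d > 0)
        # Closer obstacles produce stronger vibration, clamped to [0, 1].
        strength = (max_mm - nearest) / (max_mm - min_mm)
        intensities.append(max(0.0, min(1.0, strength)))
    return intensities

# Example: obstacle close on the left, far in the center, mid-range right.
print(zone_intensities([600, 700, 2900, 3000, 1500, 1600]))
```

In a real build, each intensity would be fed to a PWM output driving the motors for that zone, so the wearer feels a stronger buzz on the side where an obstacle is closest.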