Using OpenKinect (drivers), OpenCV (image processing and recognition), FestVox (speech synthesis), and CMU Sphinx (speech recognition). Proof of concept. I'm wearing a headset because OpenKinect does not yet support audio input. All of the processing and recognition occurs in real time. Link: http://www.youtube.com/watch?v=fQ59dXOo63o
This work is part of the STARMAC Project in the Hybrid Systems Lab at UC Berkeley (EECS department). http://hybrid.eecs.berkeley.edu/ Researcher: Patrick Bouffard. PI: Prof.
ptone805 is playing instruments with the Mac OS X iLife application GarageBand. The Kinect is connected to GarageBand as if it were a normal MIDI device. Link: http://www.youtube.com/watch?v=u2-e9YMdadE&feature=player_embedded
This uses OpenKinect with the Python bindings to talk to light-dimming hardware over E1.31/DMX via the OLA libraries from opendmx.net. I have a custom Python library that handles the dimming, etc. Link: http://www.youtube.com/watch?v=Kg0Rvj-Seto
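The author's dimming library isn't published, but the glue described (Kinect depth in, DMX levels out through OLA) can be sketched roughly as below. The library names (freenect Python bindings, OLA's ola.ClientWrapper) are real; the mapping function, its near/far thresholds, and the channel layout are illustrative assumptions, not the author's code.

```python
# Sketch: drive a one-channel DMX dimmer from Kinect depth via OLA.
# Assumes the libfreenect Python bindings and the OLA Python client are
# installed and olad is running; thresholds below are hypothetical.
import array


def depth_to_level(raw_depth, near=600, far=900):
    """Map a raw 11-bit Kinect depth value to a 0-255 DMX level.

    Closer than `near` -> full brightness (255); farther than `far` -> off (0);
    linear in between. Thresholds are illustrative, in raw depth units.
    """
    clamped = min(max(raw_depth, near), far)
    return int(round(255 * (far - clamped) / (far - near)))


def send_level(level, universe=1):
    """Send a single-channel DMX frame through OLA (requires a running olad)."""
    from ola.ClientWrapper import ClientWrapper  # deferred: needs OLA installed
    wrapper = ClientWrapper()
    data = array.array('B', [level])
    wrapper.Client().SendDmx(universe, data, lambda state: wrapper.Stop())
    wrapper.Run()


def run_once():
    """Read one depth frame and dim the light by the centre pixel's depth."""
    import freenect  # deferred: needs libfreenect Python bindings + a Kinect
    depth, _timestamp = freenect.sync_get_depth()  # 480x640 raw depth array
    send_level(depth_to_level(int(depth[240, 320])))
```

Calling `run_once()` in a loop would give the continuous hand-controlled dimming shown in the video; E1.31 output is a matter of patching the OLA universe to an E1.31 device in olad's configuration.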
Created an interactive puppet by using skeleton tracking on the arm: determining where the shoulder, elbow, and wrist are, and using those positions to control the movement and posture of the giant funky bird! It was built with openFrameworks and libfreenect.
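The core of such a puppet rig is turning tracked joint positions into pose parameters. A minimal, hedged sketch of one such step, computing the bend angle at the elbow from the three tracked joints (the function name and coordinate convention are hypothetical; the project itself is in openFrameworks/C++):

```python
# Sketch: derive an elbow bend angle from tracked shoulder/elbow/wrist
# positions; such angles could then drive a puppet's limb posture.
import math


def joint_angle(shoulder, elbow, wrist):
    """Angle at the elbow, in radians, from three 2-D joint positions.

    Computed as the angle between the elbow->shoulder and elbow->wrist
    vectors: pi for a straight arm, smaller as the arm bends.
    """
    ax, ay = shoulder[0] - elbow[0], shoulder[1] - elbow[1]
    bx, by = wrist[0] - elbow[0], wrist[1] - elbow[1]
    cos_theta = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    return math.acos(max(-1.0, min(1.0, cos_theta)))  # clamp for float error
```

A straight arm such as `joint_angle((0, 0), (1, 0), (2, 0))` gives pi; a right-angle bend gives pi/2.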
This is a graphical interface inspired by the movie "Minority Report". It uses the Kinect sensor from Microsoft and the recently released libfreenect driver for interfacing with the Kinect on Linux. The graphical interface and the hand-detection software were written at MIT to interface with the open-source robotics package ROS, developed by Willow Garage (willowgarage.com). The hand-detection software showcases the abilities of the Point Cloud Library (PCL), a part of ROS that MIT has been helping to optimize.