Two Kinects at different angles work together to create a 3D video stream
First test of merging the 3D video streams from two Kinect cameras into a single 3D reconstruction. The cameras were placed at an angle of about 90 degrees, aimed at the same spot in 3D space. The two cameras were calibrated internally using the method described in the previous video, and were calibrated externally (with respect to each other) using a flat checkerboard calibration pattern and manual measurements.
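The merge described above boils down to expressing both point clouds in a common coordinate frame: points from the second camera are rotated and translated by the measured extrinsic calibration, then concatenated with the first camera's points. A minimal sketch of that step, assuming a rotation about the vertical axis and a hand-measured translation (the function names here are illustrative, not from the actual project):

```python
import math

def make_extrinsic(angle_deg, tx, ty, tz):
    """Build a rotation about the vertical (y) axis plus a translation,
    as measured for the second camera relative to the first."""
    a = math.radians(angle_deg)
    R = [[ math.cos(a), 0.0, math.sin(a)],
         [ 0.0,         1.0, 0.0        ],
         [-math.sin(a), 0.0, math.cos(a)]]
    return R, (tx, ty, tz)

def to_first_frame(R, t, p):
    """Map a 3D point from the second camera's frame into the first's."""
    x = R[0][0]*p[0] + R[0][1]*p[1] + R[0][2]*p[2] + t[0]
    y = R[1][0]*p[0] + R[1][1]*p[1] + R[1][2]*p[2] + t[1]
    z = R[2][0]*p[0] + R[2][1]*p[1] + R[2][2]*p[2] + t[2]
    return (x, y, z)

def merge_clouds(cloud_a, cloud_b, R, t):
    """Express cloud_b in camera A's frame and concatenate the clouds."""
    return list(cloud_a) + [to_first_frame(R, t, p) for p in cloud_b]
```

With a 90-degree mounting angle, a point one metre to the second camera's right ends up one metre behind the first camera's optical axis, which is what you would expect from the physical setup.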
The HMD rotates the camera view based on head-tracking information; OpenNI is linked only to the head joint input. This let us build a virtual-reality environment that reflects the movement of the user's body by itself, without any special equipment. This is the second demonstration, held in Nagoya on May 18; the video clip shows what will be presented at the seminar, plus a little extra footage. You can see the user moving their limbs and looking down slightly. Seeing the scene reflect your own body's movements feels very strange and interesting.
We are developing the Flexible Action and Articulated Skeleton Toolkit (FAAST), which is middleware to facilitate integration of full-body control with games and VR applications. FAAST currently supports the PrimeSensor and the Microsoft Kinect using the OpenNI framework. In this video, we show how FAAST can be used to control off-the-shelf video games such as World of Warcraft. Since these games would not normally support motion sensing devices, FAAST emulates keyboard input triggered by body posture and specific gestures.
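The core idea of FAAST's keyboard emulation is a binding from a body posture to a key event. A toy sketch of one such binding, assuming hypothetical skeleton fields and thresholds (not FAAST's actual API):

```python
def gesture_to_key(skeleton, lean_threshold=0.15):
    """Map a torso lean (metres, along the camera's depth axis) to a
    movement key, FAAST-style: lean forward -> 'w', lean back -> 's'.
    The 'head_z'/'torso_z' fields are illustrative placeholders."""
    lean = skeleton["head_z"] - skeleton["torso_z"]
    if lean < -lean_threshold:
        return "w"   # head closer to camera than torso: move forward
    if lean > lean_threshold:
        return "s"   # head further away: move backward
    return None      # inside the dead zone: no key emitted
```

In the real middleware the returned key would be injected as synthetic keyboard input, so games like World of Warcraft never know a motion sensor is involved.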
Take photos and depth information of a sculpture from multiple angles, program builds it in virtual reality
I've written a program that uses the Kinect 3D camera to take 3D snapshots which record the nearest object to the camera in each part of the image. By taking multiple 3D snapshots at different times and then merging them to show the closest object, I can create a 3D sculpture that you can walk through. This runs in real time, creating a fully interactive experience.
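Merging snapshots "to show the closest object" is essentially a per-pixel minimum over depth frames, skipping pixels where the sensor returned no reading. A small sketch under that assumption (the Kinect reports 0 for missing depth):

```python
def merge_snapshots(snapshots, no_reading=0):
    """Combine several depth frames (flat lists of per-pixel depths in mm)
    by keeping the nearest valid reading at each pixel."""
    merged = []
    for pixel_values in zip(*snapshots):
        valid = [v for v in pixel_values if v != no_reading]
        merged.append(min(valid) if valid else no_reading)
    return merged
```

Because each frame only contributes where it has a valid, closer reading, snapshots taken at different times accumulate into a single front-surface "sculpture".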
A Kinect project in Python that controls Christmas lights
This uses OpenKinect with the Python bindings to talk to light-dimming hardware over E1.31/DMX, using the OLA libraries from opendmx.net. I have a custom Python library that handles the dimming and so on. Link: http://www.youtube.com/watch?v=Kg0Rvj-Seto
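Before a depth reading can drive a dimmer, it has to be scaled into a DMX channel level (0-255). A hedged sketch of that mapping, with made-up near/far limits in millimetres (the actual project's calibration may differ):

```python
def depth_to_dmx(depth_mm, near=500, far=3000):
    """Map a Kinect depth reading (mm) to a DMX level 0-255:
    closer objects drive the lights brighter; out-of-range values clamp."""
    if depth_mm <= near:
        return 255
    if depth_mm >= far:
        return 0
    return int(round(255 * (far - depth_mm) / (far - near)))
```

The resulting byte would then be written to the desired channel through OLA's E1.31 output.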
Created an interactive puppet using skeleton tracking on the arm: the positions of the shoulder, elbow, and wrist are determined and used to control the movement and posture of a giant funky bird! Built with openFrameworks and libfreenect.
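Driving a puppet limb from shoulder/elbow/wrist positions usually means computing the angle at the elbow from the tracked joints. A small sketch of that calculation (pure geometry, not the project's actual code):

```python
import math

def joint_angle(shoulder, elbow, wrist):
    """Angle at the elbow, in degrees, from three 3D joint positions."""
    v1 = tuple(s - e for s, e in zip(shoulder, elbow))  # elbow -> shoulder
    v2 = tuple(w - e for w, e in zip(wrist, elbow))     # elbow -> wrist
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))
```

Feeding this angle into the puppet rig each frame makes the bird's wing mirror the performer's arm.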
Ever wanted to throw fireballs from your hands? Now you can. This setup uses a combination of FAAST for Kinect input and GlovePIE for the Nunchuk and scripting of special moves. Probably the most interesting aspect is that you can play over the internet or LAN with your friends... Give it a shot, and why not modify my scripts for some other characters?
"Minority Report" inspired GUI. Tracks hands for natural real time interaction.
This is a graphical interface inspired by the movie "Minority Report". It uses the Kinect sensor from Microsoft and the recently released libfreenect driver for interfacing with the Kinect on Linux. The graphical interface and the hand-detection software were written at MIT to interface with the open-source robotics package ROS, developed by Willow Garage (willowgarage.com). The hand-detection software showcases the abilities of the Point Cloud Library (PCL), a part of ROS that MIT has been helping to optimize.
We are a new portal that provides users with a searchable database of community development groups, "starter docs," and tutorials across multiple operating systems and platforms. In addition, we offer historical information on gestural and motion-based interfaces for games, a glossary of gestures, and a library of design guidelines for developers.