Kinect Blocks is a project made by two students in the Emerging Themes class. Basically, it's a 3D content editor for a uniform grid of textured blocks, although it's more of a prototype than a tool ready for real content creation. Technologies used: C/C++, the OpenNI framework, Windows, Visual Studio, OpenGL, GLEW, SOIL, and TinyXML.
Remote 3D video generated by Kinect cameras at each site
Remote collaboration between two sites. 3D video is generated by one Kinect camera at each site. The software is a combination of Vrui, the Vrui collaboration infrastructure, a Kinect 3D video plug-in for the latter, and a Vrui-based viewer for Doom3 maps (the map shown here is mars_city1).
Two Kinects at different angles work together to create a 3D video stream
First test of merging the 3D video streams from two Kinect cameras into a single 3D reconstruction. The cameras were placed at an angle of about 90 degrees, aimed at the same spot in 3D space. The two cameras were calibrated internally using the method described in the previous video, and were calibrated externally (with respect to each other) using a flat checkerboard calibration pattern and manual measurements.
We are developing the Flexible Action and Articulated Skeleton Toolkit (FAAST), which is middleware to facilitate integration of full-body control with games and VR applications. FAAST currently supports the PrimeSensor and the Microsoft Kinect using the OpenNI framework. In this video, we show how FAAST can be used to control off-the-shelf video games such as World of Warcraft. Since these games would not normally support motion sensing devices, FAAST emulates keyboard input triggered by body posture and specific gestures.
They created an interactive puppet by running skeleton tracking on the arm, determining where the shoulder, elbow, and wrist are, and using those joints to control the movement and posture of the giant funky bird! Built with openFrameworks and libfreenect.