
Installing libFreeNect Drivers (Mac OS X)

Thu, 05/05/2011 - 10:20 -- mxg7505

Warning: These steps have been tested on multiple Mac machines; not all installations completed successfully.

Using Homebrew

Note: This tutorial works best if you have a clean install of Snow Leopard. Otherwise, things often break due to library version conflicts.

last tested May 4, 2011

1. Download Homebrew

Install Xcode if you don’t already have it.

Open up the terminal and run the following line:
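The command itself was not preserved when this page was archived. Once Homebrew is installed, the driver install was typically a single formula; the formula name below is an assumption, not confirmed by the original post:

```shell
# Assumes Homebrew is already installed and on your PATH.
# "libfreenect" is the assumed formula name for the OpenKinect driver.
brew install libfreenect
```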


Installing libFreeNect Drivers (Windows 7)

Mon, 05/02/2011 - 18:03 -- mxg7505
  1. Go to the RGBDemo download page and download RGBDemo-0.4.0-Win32 when prompted.
  2. Navigate to where you downloaded it and extract it.
  3. Plug your Kinect into a USB port on your PC (do not skip this step).
  4. Go to the Start menu on Windows.
  5. Right-click on Computer and select Properties.
  6. Select Device Manager.

Installing libFreeNect Drivers (Ubuntu 10.10)

Mon, 05/02/2011 - 17:44 -- mxg7505
last tested April 20, 2011
Note: All instructions below assume a basic working knowledge of entering commands in a terminal.

1. Open a command-line terminal
2. Add the Ubuntu Launchpad PPA:

     sudo add-apt-repository ppa:arne-alamut/freenect

3. Update (resynchronize) the list of packages from the repositories:

     sudo apt-get update

4. Install freenect 0.0, the libfreenect demos, and libfreenect-dev
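The command for this step did not survive in the archived page. Based on the package names in the step text, it was presumably along these lines (the exact package names are inferred, not verified against the PPA):

```shell
# Package names inferred from the step text; verify with "apt-cache search freenect".
sudo apt-get install freenect libfreenect-demos libfreenect-dev
```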


Teaching Kinect to recognize objects on the PC

Tue, 04/26/2011 - 11:48 -- jmo1533

Using OpenKinect (drivers) + OpenCV (image processing and recognition) + FestVox (speech synthesis) + CMU Sphinx (speech recognition). Proof of concept. I'm wearing a headset because OpenKinect does not yet support audio input. All of the processing and recognition occurs in real time.


Remote 3D video generated by Kinect cameras at each site

Tue, 04/26/2011 - 11:11 -- dcw4238

Remote collaboration between two sites. 3D video is generated by one Kinect camera at each site. The software is a combination of Vrui, the Vrui collaboration infrastructure, a Kinect 3D video plug-in for the latter, and a Vrui-based viewer for Doom3 maps (the map shown here is mars_city1).


3D video stream from 2 merged Kinect cameras

Tue, 04/26/2011 - 11:10 -- dcw4238

First test of merging the 3D video streams from two Kinect cameras into a single 3D reconstruction. The cameras were placed at an angle of about 90 degrees, aimed at the same spot in 3D space. The two cameras were calibrated internally using the method described in the previous video, and were calibrated externally (with respect to each other) using a flat checkerboard calibration pattern and manual measurements.

