Kinect 3D Hand Tracking

Note: A library version of the 3D Hand Tracker was also hosted on openni.org.

This work won the 1st prize at the CHALEARN Gesture Recognition demonstration competition, organized in conjunction with ICPR 2012 (Tsukuba, Japan, Nov. 2012).

Overview | Download | References | Links | Acknowledgments | Contact

The software, developed at CVRL/ICS/FORTH, tracks the 3D position, orientation and full articulation of a human hand from markerless visual observations. The developed method:

  • estimates the full articulation of a hand (26 DoFs) involved in unconstrained motion
  • operates on input acquired by easy-to-install and widely supported RGB-D cameras (e.g. Kinect, Xtion)
  • does not require markers or special gloves
  • performs at a rate of 20 fps on modern architectures (with GPU acceleration)
  • does not require calibration
  • does not rely on any proprietary built-in tracking technologies (NiTE, OpenNI, Kinect SDK)

The downloadable demo works either with live RGB-D input or on prerecorded sequences, and outputs the estimated hand-kinematics model parameters together with a visualization of the tracked hand.
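The 26 DoFs mentioned above decompose into 6 for the global hand pose (3 for translation, 3 for rotation) and 20 for finger articulation (4 joint angles per finger, for 5 fingers). The following is a minimal sketch of such a parameterization; the field names and the rotation representation are illustrative assumptions, not the library's actual output format:

```python
from dataclasses import dataclass
from typing import Tuple

# Hypothetical layout of a 26-DoF hand pose vector; the tracker's real
# parameter ordering and rotation encoding may differ (e.g. quaternions).
@dataclass
class HandPose:
    position: Tuple[float, float, float]      # wrist translation (x, y, z): 3 DoFs
    orientation: Tuple[float, float, float]   # global rotation, e.g. Euler angles: 3 DoFs
    joint_angles: Tuple[float, ...]           # 4 articulation angles x 5 fingers: 20 DoFs

    def dof_count(self) -> int:
        # Total degrees of freedom represented by this pose.
        return len(self.position) + len(self.orientation) + len(self.joint_angles)

# Example: a hand at rest, 60 cm in front of the camera.
rest = HandPose(position=(0.0, 0.0, 0.6),
                orientation=(0.0, 0.0, 0.0),
                joint_angles=(0.0,) * 20)
```

Note that some rotation encodings (e.g. a unit quaternion) use 4 numbers for the 3 orientation DoFs, so the raw parameter vector may be longer than 26 entries.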

Videos: tracking a single hand with a Kinect | tracking two hands with a Kinect

Download Demo and Library

By downloading the demo you agree to the terms described in the demo license.

By downloading the library you agree to the terms described in the library license.

Strict system requirements:

  • PC with at least 1 GB of RAM
  • 64-bit Windows OS for the Demo version
  • 64-bit Windows or 64-bit Linux for the Library version
  • CUDA-enabled GPU (Compute Capability 1.0 or newer) with at least 256 MB of RAM
  • Note: the Hand Tracker Library requires the NVIDIA v304 drivers and OpenNI 2.x to work.

Also, if you are interested in performing a live demo, make sure that you have installed the x64 version of your RGB-D camera driver.

The demo itself is provided as an installable package of Windows binaries.

The library is provided as packages for both Linux and Windows.

The demo package relies on a few third-party dependencies.

After installing all drivers and dependencies, make sure to reboot before executing the Demo.

Running the live Kinect Hand Tracking demo: start it from the Start menu or the Desktop shortcut.

Running the Kinect Hand Tracking demo on a prerecorded .oni file: refer to the README for how to provide the .oni path to the executable.

For a short introduction to the usage of the demo, please consult the following video. For best tracking results, make sure that the arm of the tracked hand is covered by a sleeve, so that skin-color detection is effective only in the hand area.

Sample recorded sequences are provided for testing, in addition to the one already included in the installation package. The videos above show 3D hand tracking with a computational budget of 64 particles and 30 generations. Additional sequences (data only) can be downloaded here: oni_sequences.zip.

  • Sequence 1 (download)
  • Sequence 2 (download)
  • Sequence 3 (download)
  • Sequence 4 (download)
  • Sequence 5 (download)
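The computational budget quoted above (64 particles, 30 generations) refers to the Particle Swarm Optimization at the core of the tracker (see the BMVC 2011 reference below): each generation, a swarm of candidate hand poses is scored and nudged toward the best solutions found so far. The following is a toy sketch of a generic PSO loop under that budget; the actual tracker minimizes a GPU-rendered discrepancy between hypothesized hand poses and the observed RGB-D frame, which this example replaces with an arbitrary objective:

```python
import numpy as np

def pso_minimize(objective, dim, bounds, n_particles=64, n_generations=30, seed=0):
    """Minimal Particle Swarm Optimization loop (illustrative only)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # particle positions
    v = np.zeros_like(x)                               # particle velocities
    pbest = x.copy()                                   # per-particle best positions
    pbest_val = np.apply_along_axis(objective, 1, x)   # per-particle best scores
    gbest = pbest[np.argmin(pbest_val)].copy()         # global best position

    w, c1, c2 = 0.72, 1.49, 1.49                       # common PSO constants
    for _ in range(n_generations):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Pull each particle toward its own best and the swarm's best.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Example: minimize a simple quadratic over a 26-dimensional space,
# matching the dimensionality of the hand pose being tracked.
best, val = pso_minimize(lambda p: float(np.sum(p ** 2)),
                         dim=26, bounds=(-1.0, 1.0))
```

The budget trade-off shown in the performance table below follows directly from this structure: the per-frame cost is roughly proportional to particles × generations objective evaluations.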
This demo has been tested, as is, on a few PC configurations with the following performance results:
  • CPU: Pentium(R) Dual-Core CPU T4300 @ 2.10GHz with 4096 MB of RAM, GPU: GeForce GT 240M with 1024 MB of RAM, Tracking FPS: 1.73792
  • CPU: Intel(R) Core(TM)2 CPU 6600 @ 2.40GHz with 4096 MB of RAM, GPU: GeForce 9600 GT with 1024 MB of RAM, Tracking FPS: 2.15686
  • CPU: Intel(R) Core(TM)2 Duo CPU T7500 @ 2.20GHz with 4096 MB of RAM, GPU: Quadro FX 1600M with 256 MB of RAM, Tracking FPS: 2.66695
  • CPU: Intel(R) Core(TM) i7 CPU 950 @ 3.07GHz with 6144 MB of RAM, GPU: GeForce GTX 580 with 1536 MB of RAM, Tracking FPS: 19.9447
If you are interested in adding your configuration to this list, please follow the included instructions to produce runtime statistics and send this information to k3Dht@ics.forth.gr.

References

  • I. Oikonomidis, N. Kyriazis, and A. Argyros, “Efficient model-based 3D tracking of hand articulations using Kinect,” in BMVC 2011, BMVA, 2011.
    [Bibtex]
    @inproceedings{bmvc2011oikonom,
      title = {Efficient model-based 3D tracking of hand articulations using Kinect},
      author = {Oikonomidis, I. and Kyriazis, N. and Argyros, A.},
      booktitle = {BMVC 2011},
      publisher = {BMVA},
      year = {2011}
    }
  • N. Kyriazis, I. Oikonomidis, and A. Argyros, “A GPU-powered computational framework for efficient 3D model-based vision,” ICS-FORTH, Technical Report TR420, July 2011.
    [Bibtex]
    @techreport{kyriazisTR420,
      title = {A GPU-powered computational framework for efficient 3D model-based vision},
      author = {Kyriazis, N. and Oikonomidis, I. and Argyros, A.},
      institution = {ICS-FORTH},
      number = {TR420},
      month = {July},
      year = {2011}
    }

Links

Acknowledgments

  • The contributions of Damien Michel, Pashalis Padeleris and Konstantinos Tzevanidis, members of CVRL/ICS/FORTH, are gratefully acknowledged.
  • This work was partially supported by the IST-FP7-IP-215821 project GRASP. Extensions of this work (in progress) are supported by the IST-FP7-IP-288533 project robohow.cog.
  • GRASP and robohow.cog are projects funded by the European Commission through the Cognition unit, Information Society and Media DG.

Contact

For questions, comments and any kind of feedback please send an e-mail to k3Dht@ics.forth.gr.