Home
OpenPTrack is an open source software (OSS) project launched in 2013 to create a scalable, multi-camera solution for person tracking, specifically aimed at supporting applications in education, art, and culture.
With the advent of commercially available consumer depth sensors, and continued progress in computer vision research on multi-modal image and point cloud processing, robust person tracking with the stability and responsiveness needed to drive interactive applications is now possible at low cost. The results of that research, however, are not easy for application developers to use. We believe a disruptive project is needed to bridge these two worlds and enable artists and creators to work with real-time person tracking. OpenPTrack aims to support "creative coders" in the arts, culture, and educational sectors who wish to experiment with real-time person tracking as an input to their applications.
The project contains numerous state-of-the-art algorithms for RGB and/or depth tracking, and is built on a modular, node-based architecture that supports adding and removing sensor streams online.
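Concretely, OpenPTrack's nodes run on ROS, so downstream applications consume the tracker's output by subscribing to its topics. As a minimal sketch, assuming the topic name `/tracker/tracks`, the `opt_msgs/TrackArray` message type, and per-track `id`/`x`/`y` fields (all assumptions here, not confirmed on this page), a consumer node could look like:

```python
#!/usr/bin/env python
# Minimal ROS subscriber sketch. The topic name "/tracker/tracks" and the
# opt_msgs/TrackArray message layout are assumptions for illustration;
# check your OpenPTrack installation for the actual names.
import rospy
from opt_msgs.msg import TrackArray

def on_tracks(msg):
    # Each track is assumed to carry an id and world-frame coordinates.
    for track in msg.tracks:
        rospy.loginfo("track %d at (%.2f, %.2f)", track.id, track.x, track.y)

if __name__ == "__main__":
    rospy.init_node("opt_track_listener")
    rospy.Subscriber("/tracker/tracks", TrackArray, on_tracks)
    rospy.spin()
```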
OpenPTrack is led by UCLA REMAP and Open Perception. Key collaborators include the University of Padova, [Electroland](http://www.electroland.net/), and Indiana University Bloomington. Code is available under a BSD license. Portions of the work are supported by the National Science Foundation (IIS-1323767).
See the documentation on this GitHub wiki or the equivalent PDF.
If you are working with OpenPTrack v2, use this wiki instead.
- Time Synchronization
- Pre-Calibration Configuration
- Intrinsic Calibration
- Camera Network Calibration
- Person Tracking
- Object Tracking
- World Coordinate Settings
- Tips and Tricks
  - Tested Hardware
  - Network Configuration
  - Imager Mounting and Placement
  - Calibration in Practice
  - Quick Start Example
  - Imager Settings
  - Manual Ground Plane
  - Calibration Refinement
How to receive tracking data in client applications:
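OpenPTrack streams tracking results to clients as JSON messages over UDP. As a rough illustration (the port number and the message fields shown are assumptions, not taken from this page; consult your streamer configuration for the real values), a listener might look like:

```python
# Sketch of a UDP listener for OpenPTrack's JSON tracking stream.
# The port and the field names below are placeholders for illustration.
import json
import socket

UDP_PORT = 21234  # placeholder; use the port your OpenPTrack streamer targets

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", UDP_PORT))

while True:
    data, _addr = sock.recvfrom(65535)      # one JSON packet per datagram
    msg = json.loads(data.decode("utf-8"))
    # Assumed schema: a list of tracks, each with an id and world coordinates.
    for track in msg.get("tracks", []):
        print(track.get("id"), track.get("x"), track.get("y"))
```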
Notes on contributing to OPT.