DJI Tello Target Tracking

๐Ÿ Python 3.8+ โš–๏ธ License: Apache 2.0 ๐Ÿ–ค Code style: Black

Real-time object detection and autonomous tracking for DJI Tello drones using PyTorch and YOLOv8. This modernized version replaces the old HSV color tracking with state-of-the-art deep learning models for robust, accurate object detection and tracking.

✨ Features

  • 🎯 Modern Object Detection: YOLOv8-powered detection (person, ball, car, etc.)
  • 🔄 Robust Tracking: Centroid-based multi-object tracking with trajectory prediction
  • 🚁 Autonomous Flight: PID-controlled smooth target following
  • 🎮 Manual Override: Instant switch between autonomous and manual control
  • 📹 Webcam Demo: Test detection/tracking without drone hardware
  • 🎨 Rich Visualization: Real-time HUD, bounding boxes, trajectories, velocity vectors
  • 🛡️ Safety First: Battery monitoring, emergency stop, configurable limits
  • 🧪 Well Tested: 165+ unit tests with >85% coverage

🎥 Demo

Webcam Demo (No Hardware Required)

python demo_webcam.py

Drone Tracking

python demo_drone.py

📋 Requirements

  • Python 3.8+
  • DJI Tello drone (for drone demos)
  • CUDA-capable GPU (optional, for faster inference; see the quick check after this list)
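
Before running the demos you can confirm that PyTorch actually sees your GPU; if it does not, inference simply falls back to the CPU. A quick check (not part of the project scripts):

import torch

# True when PyTorch can see a CUDA-capable GPU
print(torch.cuda.is_available())
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))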

🚀 Quick Start

1. Installation

# Clone the repository
git clone https://github.com/dronefreak/dji-tello-target-tracking.git
cd dji-tello-target-tracking

# Create the virtual environment
python -m venv .venv

# Activate the virtual environment (on Windows: .venv\Scripts\activate)
source .venv/bin/activate

# Install dependencies
pip install -r requirements.txt

# Or install in development mode
pip install -e .

2. Test Without Drone (Webcam Demo)

# Run with default settings (YOLOv8n, detect all classes)
python demo_webcam.py

# Use a better model
python demo_webcam.py --model yolov8s --confidence 0.6

# Track specific objects only
python demo_webcam.py --classes person ball

# Use a video file instead of webcam
python demo_webcam.py --video test_video.mp4

Webcam Demo Controls:

  • q - Quit
  • t - Toggle tracking on/off
  • d - Switch detector (YOLO/HSV)
  • r - Reset tracker
  • s - Save screenshot
  • h - Toggle HUD
  • f - Toggle FPS display

3. Fly the Drone

# First, test with mock drone (uses webcam, no hardware)
python demo_drone.py --mock

# When ready, fly for real
python demo_drone.py

# With custom settings
python demo_drone.py --model yolov8s --confidence 0.6 --speed 50

โš ๏ธ SAFETY WARNING: Always fly in open areas away from people and obstacles!

Drone Demo Controls:

  • TAB - Takeoff
  • BACKSPACE - Land
  • ESC - Emergency stop
  • SPACE - Toggle autonomous tracking
  • w/s/a/d - Manual control (forward/back/left/right)
  • ↑/↓ - Manual altitude control
  • ←/→ - Manual rotation
  • r - Record video
  • c - Take photo
  • q - Quit (lands first)

📖 Usage Examples

Basic Detection

from src.config import Config
from src.detector import ObjectDetector
import cv2

# Initialize
config = Config()
config.model_name = "yolov8n"
config.confidence_threshold = 0.5
detector = ObjectDetector(config)

# Detect objects
frame = cv2.imread("test.jpg")
detections = detector.detect(frame)

for det in detections:
    print(f"Found {det.class_name} at {det.bbox} with confidence {det.confidence:.2f}")

Single Object Tracking

from src.config import Config
from src.detector import ObjectDetector
from src.tracker import SingleObjectTracker
import cv2

config = Config()
detector = ObjectDetector(config)
tracker = SingleObjectTracker(config)

cap = cv2.VideoCapture(0)

while True:
    ret, frame = cap.read()
    if not ret:
        break

    # Detect closest object to center
    detection = detector.detect_closest_to_center(frame)

    # Update tracker
    target = tracker.update(detection)

    if target and target.disappeared == 0:
        print(f"Tracking ID {target.id} at {target.center}")

    cv2.imshow("Tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()

Autonomous Drone Tracking

from src.config import get_drone_config
from src.detector import ObjectDetector
from src.tracker import SingleObjectTracker
from src.drone_controller import DroneController

# Initialize
config, drone_config = get_drone_config()
detector = ObjectDetector(config)
tracker = SingleObjectTracker(config)
drone = DroneController(config, drone_config)

# Connect and takeoff
drone.connect()
drone.takeoff()

# Enable autonomous tracking
drone.enable_tracking()

while drone.is_flying():
    frame = drone.get_frame()
    detection = detector.detect_closest_to_center(frame)
    target = tracker.update(detection)

    # Drone automatically follows target
    drone.track_target(target)

drone.land()
drone.disconnect()
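
Conceptually, drone.track_target() converts the offset between the target's center and the frame center into velocity commands. The sketch below illustrates the idea with a single-axis PID loop for yaw; it is a standalone example with made-up gains and limits, not the controller shipped in drone_controller.py:

class SimplePID:
    # Minimal one-axis PID controller (illustrative gains, not the project's values)
    def __init__(self, kp=0.4, ki=0.0, kd=0.1, output_limit=100):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.output_limit = output_limit
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt=1 / 30):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        output = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp to the drone's allowed command range
        return max(-self.output_limit, min(self.output_limit, output))


def yaw_command(target_center_x, frame_width, pid):
    # Positive error means the target is right of center, so yaw right
    error = target_center_x - frame_width / 2
    return pid.update(error)

The same structure applies to the vertical and forward/backward axes, with the bounding-box size serving as a rough distance proxy.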

๐Ÿ—๏ธ Project Structure

dji-tello-target-tracking/
├── src/
│   ├── __init__.py
│   ├── config.py           # Configuration management
│   ├── detector.py         # YOLOv8 & HSV detection
│   ├── tracker.py          # Object tracking algorithms
│   ├── drone_controller.py # Tello drone interface
│   └── utils.py            # Helper functions
├── tests/
│   ├── test_config.py
│   ├── test_detector.py
│   ├── test_tracker.py
│   └── test_utils.py
├── docs/                   # Documentation
├── demo_webcam.py          # Webcam demo script
├── demo_drone.py           # Drone tracking script
├── requirements.txt
├── requirements-dev.txt
├── setup.py
├── pyproject.toml
└── README.md

🔧 Configuration

Pre-configured Profiles

from src.config import get_webcam_config, get_drone_config

# Optimized for webcam testing (fast, lower accuracy)
config = get_webcam_config()

# Optimized for drone tracking (accurate, outdoor use)
config, drone_config = get_drone_config()

Custom Configuration

from src.config import ConfigBuilder

config = (ConfigBuilder()
    .with_model("yolov8m")           # Use medium model
    .with_confidence(0.7)            # Higher confidence threshold
    .with_target_classes(["person"]) # Only track persons
    .with_drone_speed(60)            # Faster movement
    .build())

Available YOLO Models

Model     Speed   Accuracy   Use Case
yolov8n   ⚡⚡⚡    ⭐⭐        Webcam demos, testing
yolov8s   ⚡⚡     ⭐⭐⭐       Default drone tracking
yolov8m   ⚡      ⭐⭐⭐⭐      High accuracy tracking
yolov8l   🐌      ⭐⭐⭐⭐⭐     Maximum accuracy
yolov8x   🐌🐌    ⭐⭐⭐⭐⭐     Best possible accuracy
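
All of these weights are published by Ultralytics and downloaded automatically the first time a model is used. If you want to experiment with a model outside this project's detector wrapper, the underlying library can be driven directly (a minimal sketch using the ultralytics package):

from ultralytics import YOLO
import cv2

# Downloads yolov8s.pt on first run, then reuses the cached weights
model = YOLO("yolov8s.pt")

frame = cv2.imread("test.jpg")
results = model(frame, conf=0.5)  # inference with a 0.5 confidence threshold

for box in results[0].boxes:
    cls_id = int(box.cls[0])
    print(results[0].names[cls_id], float(box.conf[0]), box.xyxy[0].tolist())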

🧪 Testing

# Run all tests
pytest

# With coverage
pytest --cov=src --cov-report=html

# Run specific test file
pytest tests/test_tracker.py

# Run with verbose output
pytest -v

# Or use make commands
make test
make test-cov
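
For a flavour of what the suite exercises, a smoke test against the detector's public API might look like the sketch below. This is a hypothetical example written against the interface shown in the usage examples, not one of the shipped tests:

# illustrative smoke test, not part of tests/
import numpy as np

from src.config import Config
from src.detector import ObjectDetector


def test_detect_returns_detections_on_blank_frame():
    config = Config()
    config.confidence_threshold = 0.5
    detector = ObjectDetector(config)

    # A uniform grey frame should yield few or no detections, but detect()
    # must still return an iterable of Detection objects
    frame = np.full((480, 640, 3), 127, dtype=np.uint8)
    detections = detector.detect(frame)

    assert detections is not None
    for det in detections:
        assert 0.0 <= det.confidence <= 1.0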

๐Ÿ› ๏ธ Development

# Install development dependencies
pip install -r requirements-dev.txt

# Format code
make format

# Run linters
make lint

# Run type checking
make type-check

# Run all quality checks
make quality

📊 Performance

Detection Speed (640x480, CPU i7-10750H)

Model     FPS   mAP@50
YOLOv8n   45    0.503
YOLOv8s   30    0.530
YOLOv8m   20    0.557

Detection Speed (640x480, GPU RTX 3060)

Model     FPS   mAP@50
YOLOv8n   120   0.503
YOLOv8s   95    0.530
YOLOv8m   70    0.557
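
These numbers depend heavily on hardware, input resolution, and model size. To reproduce them on your own machine, a rough timing loop over the project's detector is enough (a sketch assuming a webcam at index 0):

import time
import cv2

from src.config import Config
from src.detector import ObjectDetector

config = Config()
config.model_name = "yolov8n"
detector = ObjectDetector(config)

cap = cv2.VideoCapture(0)
frames, start = 0, time.time()

while frames < 200:
    ret, frame = cap.read()
    if not ret:
        break
    detector.detect(frame)
    frames += 1

cap.release()
print(f"{frames / (time.time() - start):.1f} FPS over {frames} frames")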

🆚 Comparison with Original (v1.0)

Feature         v1.0 (2020)         v2.0 (2025)
Detection       HSV color masking   YOLOv8 deep learning
Objects         Single color only   80+ object classes
Tracking        Basic centroid      Multi-object with prediction
Control         Manual PID tuning   Auto-tuned PID
Testing         No tests            165+ unit tests
Documentation   Basic README        Full docs + examples
Code Quality    Mixed style         Black formatted, typed

๐Ÿค Contributing

Contributions are welcome! Please see CONTRIBUTING.md for details.

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

๐Ÿ“ License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

๐Ÿ™ Acknowledgments

  • Ultralytics YOLOv8 for the object detection model
  • DJITelloPy for the Tello drone interface
  • Original project inspiration from PyImageSearch ball tracking tutorial

📮 Contact

โš ๏ธ Safety Disclaimer

IMPORTANT: This software controls a flying drone. Always:

  • Fly in open, outdoor areas away from people and obstacles
  • Follow local drone regulations and laws
  • Monitor battery levels (lands automatically at 10%)
  • Keep manual control ready at all times
  • Practice in mock mode before real flights
  • Never fly over people or near airports
  • Be prepared to use emergency stop (ESC key)

The authors are not responsible for any damage or injury caused by the use of this software.
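
The drone demo monitors the battery for you, but it is good practice to check it yourself before every flight. A minimal pre-flight check using DJITelloPy directly (independent of this project's DroneController; the 30% threshold is just an example margin):

from djitellopy import Tello

tello = Tello()
tello.connect()

battery = tello.get_battery()  # charge level in percent, as reported by the drone
print(f"Battery: {battery}%")

if battery < 30:
    print("Battery too low for a tracking session; charge before flying.")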

๐Ÿ—บ๏ธ Roadmap

  • Add support for more drone models (DJI Mini, Mavic)
  • Implement SLAM for indoor navigation
  • Add gesture recognition for control
  • Multi-drone coordinated tracking
  • Real-time trajectory optimization
  • Mobile app for remote monitoring
  • Cloud training pipeline for custom models

📚 Citation

If you use this project in your research or work, please cite:

@software{dji_tello_tracking_2025,
  author = {dronefreak},
  title = {DJI Tello Target Tracking with YOLOv8},
  year = {2025},
  url = {https://github.com/dronefreak/dji-tello-target-tracking},
  version = {2.0.0}
}

As always, Hare Krishna and Happy Tracking! Please consider starring ⭐ this repo if you find it useful!
