Real-time object detection and autonomous tracking for DJI Tello drones using PyTorch and YOLOv8. This modernized version replaces the old HSV color tracking with state-of-the-art deep learning models for robust, accurate object detection and tracking.
- Modern Object Detection: YOLOv8-powered detection (person, ball, car, etc.)
- Robust Tracking: Centroid-based multi-object tracking with trajectory prediction
- Autonomous Flight: PID-controlled smooth target following
- Manual Override: Instant switch between autonomous and manual control
- Webcam Demo: Test detection/tracking without drone hardware
- Rich Visualization: Real-time HUD, bounding boxes, trajectories, velocity vectors
- Safety First: Battery monitoring, emergency stop, configurable limits
- Well Tested: 165+ unit tests with >85% coverage
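The PID-controlled following mentioned above turns the target's pixel offset from frame center into a velocity command. A minimal sketch of the idea — class name and gains here are illustrative, not the project's actual API:

```python
class PID:
    """Minimal PID controller: turns a pixel error into a velocity command."""

    def __init__(self, kp, ki, kd, limit=100.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.limit = limit            # clamp output to the drone's RC range
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-self.limit, min(self.limit, out))

# One controller per axis: yaw follows the horizontal offset.
yaw_pid = PID(kp=0.4, ki=0.0, kd=0.1)
frame_cx, target_cx = 480, 600        # example: target is 120 px right of center
yaw_cmd = yaw_pid.update(target_cx - frame_cx, dt=1 / 30)
```

The output clamp matters in practice: it keeps a large detection jump from commanding a violent maneuver.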
```bash
python demo_webcam.py
python demo_drone.py
```

- Python 3.8+
- DJI Tello drone (for drone demos)
- CUDA-capable GPU (optional, for faster inference)
```bash
# Clone the repository
git clone https://github.com/dronefreak/dji-tello-target-tracking.git
cd dji-tello-target-tracking

# Create the virtual environment
python -m venv .venv

# Activate the virtual environment
source .venv/bin/activate

# Install dependencies
pip install -r requirements.txt

# Or install in development mode
pip install -e .
```

```bash
# Run with default settings (YOLOv8n, detect all classes)
python demo_webcam.py

# Use a better model
python demo_webcam.py --model yolov8s --confidence 0.6

# Track specific objects only
python demo_webcam.py --classes person ball

# Use a video file instead of webcam
python demo_webcam.py --video test_video.mp4
```

Webcam Demo Controls:
- `q` - Quit
- `t` - Toggle tracking on/off
- `d` - Switch detector (YOLO/HSV)
- `r` - Reset tracker
- `s` - Save screenshot
- `h` - Toggle HUD
- `f` - Toggle FPS display
```bash
# First, test with mock drone (uses webcam, no hardware)
python demo_drone.py --mock

# When ready, fly for real
python demo_drone.py

# With custom settings
python demo_drone.py --model yolov8s --confidence 0.6 --speed 50
```

Drone Demo Controls:
- `TAB` - Takeoff
- `BACKSPACE` - Land
- `ESC` - Emergency stop
- `SPACE` - Toggle autonomous tracking
- `w`/`s`/`a`/`d` - Manual control (forward/back/left/right)
- `↑`/`↓` - Manual altitude control
- `←`/`→` - Manual rotation
- `r` - Record video
- `c` - Take photo
- `q` - Quit (lands first)
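The manual-override behavior above boils down to a priority rule: any manual input wins over the autonomous command for that control cycle. A sketch of the pattern — function and tuple layout are illustrative, not the project's actual controller code:

```python
HOVER = (0, 0, 0, 0)  # (left/right, forward/back, up/down, yaw) RC values

def arbitrate(manual_cmd, auto_cmd, tracking_enabled):
    """Return the RC command to send this control cycle."""
    if manual_cmd is not None:
        return manual_cmd          # manual input always overrides autonomy
    if tracking_enabled and auto_cmd is not None:
        return auto_cmd            # PID-generated follow command
    return HOVER                   # nothing to do: hold position

# Autonomous tracking active, no manual input -> follow command goes through.
cmd = arbitrate(manual_cmd=None, auto_cmd=(0, 20, 0, 5), tracking_enabled=True)
```

Because the check runs every cycle, releasing the manual keys hands control straight back to the tracker.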
```python
from src.config import Config
from src.detector import ObjectDetector
import cv2

# Initialize
config = Config()
config.model_name = "yolov8n"
config.confidence_threshold = 0.5
detector = ObjectDetector(config)

# Detect objects
frame = cv2.imread("test.jpg")
detections = detector.detect(frame)

for det in detections:
    print(f"Found {det.class_name} at {det.bbox} with confidence {det.confidence:.2f}")
```

```python
from src.config import Config
from src.detector import ObjectDetector
from src.tracker import SingleObjectTracker
import cv2

config = Config()
detector = ObjectDetector(config)
tracker = SingleObjectTracker(config)

cap = cv2.VideoCapture(0)
while True:
    ret, frame = cap.read()
    if not ret:
        break

    # Detect closest object to center
    detection = detector.detect_closest_to_center(frame)

    # Update tracker
    target = tracker.update(detection)

    if target and target.disappeared == 0:
        print(f"Tracking ID {target.id} at {target.center}")

    cv2.imshow("Tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
```

```python
from src.config import get_drone_config
from src.detector import ObjectDetector
from src.tracker import SingleObjectTracker
from src.drone_controller import DroneController

# Initialize
config, drone_config = get_drone_config()
detector = ObjectDetector(config)
tracker = SingleObjectTracker(config)
drone = DroneController(config, drone_config)

# Connect and takeoff
drone.connect()
drone.takeoff()

# Enable autonomous tracking
drone.enable_tracking()

while drone.is_flying():
    frame = drone.get_frame()
    detection = detector.detect_closest_to_center(frame)
    target = tracker.update(detection)

    # Drone automatically follows target
    drone.track_target(target)

drone.land()
drone.disconnect()
```

```
dji-tello-target-tracking/
├── src/
│   ├── __init__.py
│   ├── config.py             # Configuration management
│   ├── detector.py           # YOLOv8 & HSV detection
│   ├── tracker.py            # Object tracking algorithms
│   ├── drone_controller.py   # Tello drone interface
│   └── utils.py              # Helper functions
├── tests/
│   ├── test_config.py
│   ├── test_detector.py
│   ├── test_tracker.py
│   └── test_utils.py
├── docs/                     # Documentation
├── demo_webcam.py            # Webcam demo script
├── demo_drone.py             # Drone tracking script
├── requirements.txt
├── requirements-dev.txt
├── setup.py
├── pyproject.toml
└── README.md
```
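The centroid-based approach in `tracker.py` can be sketched in a few lines: each detection is reduced to its box center, and a track is continued by the nearest new centroid within a distance gate. This is an illustrative sketch, not the module's real classes:

```python
import math

def centroid(bbox):
    """Center point of an (x1, y1, x2, y2) box."""
    x1, y1, x2, y2 = bbox
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def match_nearest(track_center, detections, max_dist=100.0):
    """Pick the detection whose centroid is closest to the current track."""
    best, best_d = None, max_dist
    for det in detections:
        c = centroid(det)
        d = math.dist(track_center, c)
        if d < best_d:
            best, best_d = c, d
    return best  # None means the target disappeared this frame

track = (100.0, 100.0)
track = match_nearest(track, [(80, 80, 130, 130), (300, 300, 350, 350)])
```

The distance gate is what lets the tracker report a "disappeared" count instead of jumping to an unrelated detection across the frame.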
```python
from src.config import get_webcam_config, get_drone_config

# Optimized for webcam testing (fast, lower accuracy)
config = get_webcam_config()

# Optimized for drone tracking (accurate, outdoor use)
config, drone_config = get_drone_config()
```

```python
from src.config import ConfigBuilder

config = (ConfigBuilder()
          .with_model("yolov8m")            # Use medium model
          .with_confidence(0.7)             # Higher confidence threshold
          .with_target_classes(["person"])  # Only track persons
          .with_drone_speed(60)             # Faster movement
          .build())
```

| Model | Speed | Accuracy | Use Case |
|---|---|---|---|
| yolov8n | ⚡⚡⚡ | ⭐⭐ | Webcam demos, testing |
| yolov8s | ⚡⚡ | ⭐⭐⭐ | Default drone tracking |
| yolov8m | ⚡ | ⭐⭐⭐⭐ | High accuracy tracking |
| yolov8l | 🐌 | ⭐⭐⭐⭐⭐ | Maximum accuracy |
| yolov8x | 🐌🐌 | ⭐⭐⭐⭐⭐ | Best possible accuracy |
```bash
# Run all tests
pytest

# With coverage
pytest --cov=src --cov-report=html

# Run specific test file
pytest tests/test_tracker.py

# Run with verbose output
pytest -v

# Or use make commands
make test
make test-cov
```

```bash
# Install development dependencies
pip install -r requirements-dev.txt

# Format code
make format

# Run linters
make lint

# Run type checking
make type-check

# Run all quality checks
make quality
```

| Model | FPS | mAP@50 |
|---|---|---|
| YOLOv8n | 45 | 0.503 |
| YOLOv8s | 30 | 0.530 |
| YOLOv8m | 20 | 0.557 |

| Model | FPS | mAP@50 |
|---|---|---|
| YOLOv8n | 120 | 0.503 |
| YOLOv8s | 95 | 0.530 |
| YOLOv8m | 70 | 0.557 |
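FPS figures like those in the tables above come down to a simple timing harness. A sketch of one — here `detect` is a stand-in callable, not the project's detector, and the sleep merely simulates per-frame work:

```python
import time

def measure_fps(detect, frames, warmup=3):
    """Average FPS of `detect` over `frames`, ignoring a few warm-up calls."""
    for f in frames[:warmup]:
        detect(f)                      # let caches / lazy model loading settle
    start = time.perf_counter()
    for f in frames[warmup:]:
        detect(f)
    elapsed = time.perf_counter() - start
    return (len(frames) - warmup) / elapsed

# Stand-in detector: pretend each frame takes about 1 ms to process.
fps = measure_fps(lambda f: time.sleep(0.001), frames=list(range(53)))
```

Discarding warm-up frames matters for GPU numbers in particular, since the first inference pays one-time initialization costs.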

| Feature | v1.0 (2020) | v2.0 (2025) |
|---|---|---|
| Detection | HSV color masking | YOLOv8 deep learning |
| Objects | Single color only | 80+ object classes |
| Tracking | Basic centroid | Multi-object with prediction |
| Control | Manual PID tuning | Auto-tuned PID |
| Testing | No tests | 165+ unit tests |
| Documentation | Basic README | Full docs + examples |
| Code Quality | Mixed style | Black formatted, typed |
Contributions are welcome! Please see CONTRIBUTING.md for details.
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
- Ultralytics YOLOv8 for the object detection model
- DJITelloPy for the Tello drone interface
- Original project inspiration from PyImageSearch ball tracking tutorial
- GitHub: @dronefreak
- Issues: GitHub Issues
IMPORTANT: This software controls a flying drone. Always:
- Fly in open, outdoor areas away from people and obstacles
- Follow local drone regulations and laws
- Monitor battery levels (lands automatically at 10%)
- Keep manual control ready at all times
- Practice in mock mode before real flights
- Never fly over people or near airports
- Be prepared to use emergency stop (ESC key)
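The low-battery failsafe noted above amounts to a threshold guard checked on every control cycle. An illustrative sketch — the thresholds mirror the 10% auto-land behavior, but the function name and structure are assumptions, not the real `DroneController` API:

```python
LAND_AT = 10   # percent; matches the auto-land threshold noted above
WARN_AT = 20   # start warning the pilot well before the failsafe trips

def battery_action(pct):
    """Decide what the control loop should do at a given battery percentage."""
    if pct <= LAND_AT:
        return "land"    # failsafe: land immediately
    if pct <= WARN_AT:
        return "warn"    # flash a HUD warning, keep flying
    return "ok"

action = battery_action(18)
```

Checking every cycle (rather than once at takeoff) is the point: Tello batteries sag quickly under aggressive maneuvering.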
The authors are not responsible for any damage or injury caused by the use of this software.
- Add support for more drone models (DJI Mini, Mavic)
- Implement SLAM for indoor navigation
- Add gesture recognition for control
- Multi-drone coordinated tracking
- Real-time trajectory optimization
- Mobile app for remote monitoring
- Cloud training pipeline for custom models
If you use this project in your research or work, please cite:
```bibtex
@software{dji_tello_tracking_2025,
  author = {dronefreak},
  title = {DJI Tello Target Tracking with YOLOv8},
  year = {2025},
  url = {https://github.com/dronefreak/dji-tello-target-tracking},
  version = {2.0.0}
}
```

As always, Hare Krishna and Happy Tracking! Please consider starring ⭐ this repo if you find it useful!