[HW Accel Support]: Docker Compose and Installation with NVIDIA 5070 #18169
Describe the problem you are having
When trying to deploy Frigate inside a VM in Proxmox, I am getting the detector startup error shown in the log output below.

Version
frigate:stable-tensorrt

Frigate config file
# Frigate Configuration File (config.yml)
# Base configuration - MQTT, Database, Detectors
mqtt:
  enabled: True # Set to True for Home Assistant integration
  host: 192.168.10.* # IP of your Home Assistant VM (which should run an MQTT broker)
  port: 1883
  user: "user_mqtt_user" # Replace with your MQTT username
  password: "pass_mqtt_password" # Replace with your MQTT password
  # topic_prefix: frigate # Default, can be changed if needed

database:
  path: /media/frigate/frigate.db # Path inside the container, maps to NAS
# Available detectors (NVIDIA will be primary)
detectors:
  cpu_detector: # A fallback CPU detector (optional, but good to have)
    type: cpu
  # Coral EdgeTPU detector (if you ever add one)
  # coral_detector:
  #   type: edgetpu
  #   device: usb # or pci depending on your Coral
  # NVIDIA GPU Detector
  # For RTX series, TensorRT is generally preferred for best performance.
  # Check Frigate documentation for the most up-to-date NVIDIA detector config.
  # It usually relies on the NVIDIA Container Toolkit and drivers being present.
  main_nvidia_detector:
    type: nvidia
    # device: 0 # Usually the first GPU passed to the container. Frigate often auto-de>
    # For newer Frigate versions, specific tensorrt options might be available or set >
# FFmpeg settings (usually fine with defaults unless troubleshooting)
# ffmpeg:
#   Optional: Global FFmpeg hardware acceleration arguments (can be overridden per cam>
#   hwaccel_args: preset-nvidia-h264 # Or preset-nvidia-hevc if your cameras use H.265>
#   preset-nvidia-h264 typically includes:
#   - -hwaccel
#   - nvidia
#   - -hwaccel_output_format
#   - cuda
#   - -c:v
#   - h264_cuvid # or hevc_cuvid
#   Ensure your ffmpeg build inside Frigate container supports this.
#   Frigate's official images usually have good NVIDIA support.
#   pass_through_args: [] # Default

# Model (Object detection model - Frigate usually defaults to a good one)
model:
  width: 320 # Default input size for detection model
  height: 320 # Default input size for detection model
  # path: /cpu_models/ssdlite_mobilenet_v2.tflite # Example, Frigate manages this
# Object tracking and filters
objects:
  track: # List of objects to track
    - person
    - car
    - cat
    - dog
    # Add other objects as needed (e.g., bicycle, truck, etc.)
  filters: # Optional: filters per object type
    person:
      min_area: 5000 # Minimum pixel area to consider a detection valid
      max_area: 100000
      threshold: 0.6 # Minimum confidence score (0.0-1.0)
    car:
      min_area: 10000
      threshold: 0.5
# Record settings (global, can be overridden per camera)
record:
  enabled: True
  retain:
    days: 7
    mode: motion
  events: # Events config lives under record
    retain: # Retain config for events lives under events
      default: 10 # How long to keep event data
      # mode: motion # Mode for events is determined by detect logic, not set here exp>

# Snapshots settings (global, can be overridden per camera)
snapshots:
  enabled: True
  timestamp: False # Add timestamp to snapshots
  bounding_box: True # Draw bounding box on snapshots
  retain:
    default: 14 # Days to keep snapshots

# Optional: Birdseye view
# birdseye:
#   enabled: True # False, Restream, Transcode
#   width: 1280
#   height: 720
#   quality: 8
#   mode: objects # 'objects' or 'motion'
# Go2RTC configuration (for WebRTC, MSE, etc. - Frigate manages this by default)
# Usually no need to configure this unless you have advanced needs.

# Camera specific configurations
cameras:
  # Replace with your actual camera configurations
  # Example for one Reolink RLC-810A camera:
  # Give each camera a unique name (no spaces, underscores allowed)
  living_room_cam: # << UNIQUE CAMERA NAME
    ffmpeg:
      inputs:
        # Input for detection (usually a lower-resolution substream)
        - path: rtsp://user:[email protected].*:554/h264Previ>
          roles:
            - detect
        # Input for recording/streaming (usually the high-resolution main stream)
        - path: rtsp://user:[email protected].*:554/h264Previ>
          roles:
            - record
            # - rtmp # If you want to restream via RTMP
    detect: # Detection settings for this camera
      enabled: True
      width: 704 # Substream width (adjust to your Reolink RLC-810A substream resoluti>
      height: 480 # Substream height (adjust to your Reolink RLC-810A substream resolu>
      fps: 5 # FPS for detection processing
      # detector: main_nvidia_detector # Explicitly assign detector if needed, otherwi>
    record: # Recording settings for this camera (overrides global)
      enabled: True
      # retain: # Override global retain if needed
      #   days: 10
    snapshots: # Snapshot settings for this camera (overrides global)
      enabled: True
      # retain: # Override global retain if needed
      #   default: 30
    # Optional: Define zones for this camera
    # zones:
    #   driveway_zone:
    #     coordinates: 0,0,100,0,100,100,0,100 # Example: top-left, top-right, bottom->
    #     objects:
    #       - person
    #       - car
    # Optional: Motion masks for this camera
    # motion:
    #   mask:
    #     - 0,0,100,0,100,10,0,10 # Example mask for the top 10 pixels

  # Add more cameras here, copying the structure above:
  # another_cam:
  #   ffmpeg:
  #     inputs:
  #       - path: rtsp://...
  #         roles:
  #           - detect
  #       - path: rtsp://...
  #         roles:
  #           - record
  #   detect:
  #     ...
  #   ...

docker-compose file or Docker CLI command
version: '3.9' # This line can be removed
services:
  frigate:
    container_name: frigate
    privileged: true
    restart: unless-stopped
    image: ghcr.io/blakeblackshear/frigate:stable-tensorrt
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1 # Or 'all'
              capabilities: [gpu] # Correct placement for device capabilities
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - ./config.yml:/config/config.yml:ro
      - /mnt/psmfrigate_recordings:/media/frigate
      - type: tmpfs
        target: /tmp/cache
        tmpfs:
          size: 1000000000
    ports:
      - "5000:5000"
      - "8971:8971"
      - "8554:8554"
      - "8555:8555/tcp"
      - "8555:8555/udp"
    environment:
      - NVIDIA_DRIVER_CAPABILITIES=all
      - NVIDIA_VISIBLE_DEVICES=all
      # - LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/usr/lib/x86_64-linux-gnu:/usr/local/lib/python3.9/dist-packages/tensorrt # Try with this commented out first

Relevant Frigate log output
WARN[0000] /opt/stacks/frigate/docker-compose.yml: the attribute `version` is obsolete, it will be ignored, please remove it to avoid potential confusion
frigate | s6-rc: info: service s6rc-fdholder: starting
frigate | s6-rc: info: service s6rc-oneshot-runner: starting
frigate | s6-rc: info: service s6rc-oneshot-runner successfully started
frigate | s6-rc: info: service fix-attrs: starting
frigate | s6-rc: info: service s6rc-fdholder successfully started
frigate | s6-rc: info: service fix-attrs successfully started
frigate | s6-rc: info: service legacy-cont-init: starting
frigate | s6-rc: info: service legacy-cont-init successfully started
frigate | s6-rc: info: service trt-model-prepare: starting
frigate | s6-rc: info: service log-prepare: starting
frigate | tensorrt model preparation disabled
frigate | s6-rc: info: service trt-model-prepare successfully started
frigate | s6-rc: info: service log-prepare successfully started
frigate | s6-rc: info: service nginx-log: starting
frigate | s6-rc: info: service go2rtc-log: starting
frigate | s6-rc: info: service frigate-log: starting
frigate | s6-rc: info: service certsync-log: starting
frigate | s6-rc: info: service nginx-log successfully started
frigate | s6-rc: info: service frigate-log successfully started
frigate | s6-rc: info: service go2rtc-log successfully started
frigate | s6-rc: info: service go2rtc: starting
frigate | s6-rc: info: service certsync-log successfully started
frigate | s6-rc: info: service go2rtc successfully started
frigate | s6-rc: info: service go2rtc-healthcheck: starting
frigate | s6-rc: info: service frigate: starting
frigate | s6-rc: info: service go2rtc-healthcheck successfully started
frigate | s6-rc: info: service frigate successfully started
frigate | s6-rc: info: service nginx: starting
frigate | 2025-05-11 19:15:44.070005401 [INFO] Preparing new go2rtc config...
frigate | 2025-05-11 19:15:44.070320317 [INFO] Preparing Frigate...
frigate | 2025-05-11 19:15:44.071051281 [INFO] Starting NGINX...
frigate | 2025-05-11 19:15:44.090293442 [INFO] No TLS certificate found. Generating a self signed certificate...
frigate | 2025-05-11 19:15:44.187736153 [INFO] Starting Frigate...
frigate | 2025-05-11 19:15:44.444089621 [INFO] Starting go2rtc...
frigate | 2025-05-11 19:15:44.498677234 19:15:44.498 INF go2rtc platform=linux/amd64 revision=b2399f3 version=1.9.2
frigate | 2025-05-11 19:15:44.498681141 19:15:44.498 INF config path=/dev/shm/go2rtc.yaml
frigate | 2025-05-11 19:15:44.498947004 19:15:44.498 INF [rtsp] listen addr=:8554
frigate | 2025-05-11 19:15:44.499230692 19:15:44.499 INF [webrtc] listen addr=:8555/tcp
frigate | 2025-05-11 19:15:44.499232014 19:15:44.499 INF [api] listen addr=:1984
frigate | s6-rc: info: service nginx successfully started
frigate | s6-rc: info: service certsync: starting
frigate | s6-rc: info: service certsync successfully started
frigate | s6-rc: info: service legacy-services: starting
frigate | 2025-05-11 19:15:45.582687839 [INFO] Starting certsync...
frigate | s6-rc: info: service legacy-services successfully started
frigate | 2025-05-11 19:15:45.619556755 127.0.0.1 - - [11/May/2025:19:15:45 +0000] "" 400 0 "-" "-" "-"
frigate | 2025-05-11 19:15:46.259918125 [2025-05-11 19:15:46] frigate.util.config INFO : Checking if frigate config needs migration...
frigate | 2025-05-11 19:15:46.259932993 [2025-05-11 19:15:46] frigate.util.config ERROR : Config file is read-only, unable to migrate config file.
frigate | 2025-05-11 19:15:47.381644415 [2025-05-11 19:15:47] frigate.util.services INFO : Automatically detected nvidia hwaccel for video decoding
frigate | 2025-05-11 19:15:47.384257181 [2025-05-11 19:15:47] frigate.app INFO : Starting Frigate (0.15.0-6cb5cfb)
frigate | 2025-05-11 19:15:47.405814117 [2025-05-11 19:15:47] peewee_migrate.logs INFO : Starting migrations
frigate | 2025-05-11 19:15:47.407477967 [2025-05-11 19:15:47] peewee_migrate.logs INFO : There is nothing to migrate
frigate | 2025-05-11 19:15:47.410102907 [2025-05-11 19:15:47] frigate.app INFO : Running database vacuum
frigate | 2025-05-11 19:15:47.433150284 [2025-05-11 19:15:47] frigate.app INFO : Recording process started: 406
frigate | 2025-05-11 19:15:47.433426236 [2025-05-11 19:15:47] frigate.app INFO : Review process started: 413
frigate | 2025-05-11 19:15:47.434683707 [2025-05-11 19:15:47] frigate.app INFO : go2rtc process pid: 103
frigate | 2025-05-11 19:15:47.451782451 [2025-05-11 19:15:47] frigate.app INFO : Output process started: 438
frigate | 2025-05-11 19:15:47.471120355 [2025-05-11 19:15:47] frigate.app INFO : Camera processor started for test_camera_1: 459
frigate | 2025-05-11 19:15:47.471123130 [2025-05-11 19:15:47] frigate.app INFO : Capture process started for test_camera_1: 461
frigate | 2025-05-11 19:15:47.489869243 [2025-05-11 19:15:47] detector.main_nvidia_detector INFO : Starting detection process: 422
frigate | 2025-05-11 19:15:47.565785650 Process detector:main_nvidia_detector:
frigate | 2025-05-11 19:15:47.565788094 Traceback (most recent call last):
frigate | 2025-05-11 19:15:47.565788846 File "/usr/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap
frigate | 2025-05-11 19:15:47.565789267 self.run()
frigate | 2025-05-11 19:15:47.565789737 File "/opt/frigate/frigate/util/process.py", line 41, in run_wrapper
frigate | 2025-05-11 19:15:47.565791050 return run(*args, **kwargs)
frigate | 2025-05-11 19:15:47.565793355 File "/usr/lib/python3.9/multiprocessing/process.py", line 108, in run
frigate | 2025-05-11 19:15:47.565793796 self._target(*self._args, **self._kwargs)
frigate | 2025-05-11 19:15:47.565794236 File "/opt/frigate/frigate/object_detection.py", line 121, in run_detector
frigate | 2025-05-11 19:15:47.565804906 object_detector = LocalObjectDetector(detector_config=detector_config)
frigate | 2025-05-11 19:15:47.565805418 File "/opt/frigate/frigate/object_detection.py", line 68, in __init__
frigate | 2025-05-11 19:15:47.565805848 self.detect_api = create_detector(detector_config)
frigate | 2025-05-11 19:15:47.565818893 File "/opt/frigate/frigate/detectors/__init__.py", line 18, in create_detector
frigate | 2025-05-11 19:15:47.565819313 return api(detector_config)
frigate | 2025-05-11 19:15:47.565829171 File "/opt/frigate/frigate/detectors/plugins/tensorrt.py", line 243, in __init__
frigate | 2025-05-11 19:15:47.565829612 self.engine = self._load_engine(detector_config.model.path)
frigate | 2025-05-11 19:15:47.565830124 File "/opt/frigate/frigate/detectors/plugins/tensorrt.py", line 87, in _load_engine
frigate | 2025-05-11 19:15:47.565830574 with open(model_path, "rb") as f, trt.Runtime(self.trt_logger) as runtime:
frigate | 2025-05-11 19:15:47.565831005 TypeError: expected str, bytes or os.PathLike object, not NoneType
frigate | 2025-05-11 19:15:47.565831426 Exception ignored in: <function TensorRtDetector.__del__ at 0x79f05df15c10>
frigate | 2025-05-11 19:15:47.565831776 Traceback (most recent call last):
frigate | 2025-05-11 19:15:47.565842797 File "/opt/frigate/frigate/detectors/plugins/tensorrt.py", line 262, in __del__
frigate | 2025-05-11 19:15:47.565843209 if self.outputs is not None:
frigate | 2025-05-11 19:15:47.565843619 AttributeError: 'TensorRtDetector' object has no attribute 'outputs'
frigate | 2025-05-11 19:15:47.778580096 [2025-05-11 19:15:47] frigate.api.fastapi_app INFO : Starting FastAPI app
frigate | 2025-05-11 19:15:47.822120264 [2025-05-11 19:15:47] frigate.api.fastapi_app INFO : FastAPI started

Relevant go2rtc log output
N/A

FFprobe output from your camera
N/A

Install method
Proxmox via Docker

Object Detector
TensorRT

Network connection
Wired

Camera make and model
Reolink RLC-810A

Screenshots of the Frigate UI's System metrics pages
N/A

Any other information that may be helpful
Sun May 11 14:00:56 2025
Replies: 2 comments · 9 replies
You need to rebuild your trt-models folder. See the documentation: https://docs.frigate.video/configuration/object_detectors/#nvidia-tensorrt-detector
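For anyone landing here later, a minimal sketch of the two pieces that procedure usually involves on 0.15, based on the linked TensorRT detector docs rather than the poster's exact setup: the YOLO_MODELS environment variable so the -tensorrt image generates the .trt engine at startup, plus a detector/model section pointing at the generated file. The model name yolov7-320 and the paths below are illustrative examples; check the docs for the models supported on your version, and note the later replies suggest the TensorRT version in 0.15 may not support the 5070 at all.

# docker-compose.yml (frigate service) - illustrative, enables engine generation at startup
    environment:
      - YOLO_MODELS=yolov7-320

# config.yml - detector and model pointing at the generated engine
detectors:
  tensorrt:
    type: tensorrt
    device: 0 # first GPU passed into the container

model:
  path: /config/model_cache/tensorrt/yolov7-320.trt
  input_tensor: nchw
  input_pixel_format: rgb
  width: 320
  height: 320

The "tensorrt model preparation disabled" line in the log above is consistent with YOLO_MODELS not being set, which would leave model.path empty and matches the NoneType error in the traceback. Also note the compose file above mounts only config.yml rather than the whole /config directory, so a generated model cache would not persist across container recreation.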
Sorry, but I had to smile:
Frigate 0.16 uses a slightly newer TensorRT version, but in general, if that doesn't work then you are talking about a major version upgrade to TensorRT 10, which is going to be a lot of refactoring.
Realistically though, what you should be able to do instead is just use 0.16, generate the ONNX model (instead of converting to TensorRT), and run via ONNX, which should work fine on the 5070. If you do decide to go this way, keep in mind that 0.16 is still in beta. There are dev-specific docs which walk through how to download these models:
https://deploy-preview-16390--frigate-docs.netlify.app/configuration/object_detectors#onnx
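To make the suggestion concrete, a rough config sketch of the ONNX route on 0.16, assuming a YOLO-NAS model exported to ONNX and copied into /config. The file name, labelmap path, and model_type value are illustrative assumptions; the linked docs walk through downloading/exporting the model and list the exact supported keys.

# config.yml sketch for Frigate 0.16 with the ONNX detector (illustrative, untested)
detectors:
  onnx:
    type: onnx

model:
  model_type: yolonas
  width: 320
  height: 320
  input_pixel_format: bgr
  input_tensor: nchw
  path: /config/yolo_nas_s.onnx
  labelmap_path: /labelmap/coco-80.txt

The detector docs also cover which 0.16 image variants bundle GPU support for onnxruntime, so it's worth confirming the image tag alongside the config.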