[Detector Support]: installing detector on truenas #20493
Describe the problem you are having:
I've been trying to figure out how to get detectors up and running with my Nvidia 1050 Ti. I can't figure out how to download the models using the scripts in the docs, and TrueNAS doesn't recognize the docker commands.

Version: 0.16.1

Frigate config file: N/A

docker-compose file or Docker CLI command: N/A

Relevant Frigate log output:
2025-10-15 12:52:22.118971000 [INFO] Preparing Frigate...
2025-10-15 12:52:22.502772931 [INFO] Starting Frigate...
2025-10-15 12:52:24.459080597 [2025-10-15 08:52:24] frigate.util.config INFO : Checking if frigate config needs migration...
2025-10-15 12:52:24.478048934 [2025-10-15 08:52:24] frigate.util.config INFO : frigate config does not need migration...
2025-10-15 12:52:24.507825979 [2025-10-15 08:52:24] frigate.app INFO : Starting Frigate (0.16.1-e664cb2)
2025-10-15 12:52:24.523774594 [2025-10-15 08:52:24] peewee_migrate.logs INFO : Starting migrations
2025-10-15 12:52:24.524734197 [2025-10-15 08:52:24] peewee_migrate.logs INFO : There is nothing to migrate
2025-10-15 12:52:24.537870840 [2025-10-15 08:52:24] frigate.app INFO : Recording process started: 394
2025-10-15 12:52:24.538479448 [2025-10-15 08:52:24] frigate.app INFO : Review process started: 401
2025-10-15 12:52:24.540567256 [2025-10-15 08:52:24] frigate.app INFO : go2rtc process pid: 124
2025-10-15 12:52:24.552726303 [2025-10-15 08:52:24] detector.onnx INFO : Starting detection process: 418
2025-10-15 12:52:24.554239169 [2025-10-15 08:52:24] frigate.detectors.plugins.onnx INFO : ONNX: loaded onnxruntime module
2025-10-15 12:52:24.554601053 [2025-10-15 08:52:24] frigate.detectors.plugins.onnx INFO : ONNX: loading /mnt/Frigate_Rec/Frigate/model_cache/yolo_nas_s.onnx
2025-10-15 12:52:24.557447954 Process detector:onnx:
2025-10-15 12:52:24.557450944 Traceback (most recent call last):
2025-10-15 12:52:24.557452354 File "/usr/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
2025-10-15 12:52:24.557455361 self.run()
2025-10-15 12:52:24.557457177 File "/opt/frigate/frigate/util/process.py", line 41, in run_wrapper
2025-10-15 12:52:24.557458310 return run(*args, **kwargs)
2025-10-15 12:52:24.557471043 ^^^^^^^^^^^^^^^^^^^^
2025-10-15 12:52:24.557472389 File "/usr/lib/python3.11/multiprocessing/process.py", line 108, in run
2025-10-15 12:52:24.557473738 self._target(*self._args, **self._kwargs)
2025-10-15 12:52:24.557474945 File "/opt/frigate/frigate/object_detection/base.py", line 112, in run_detector
2025-10-15 12:52:24.557476177 object_detector = LocalObjectDetector(detector_config=detector_config)
2025-10-15 12:52:24.557477384 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-10-15 12:52:24.557491816 File "/opt/frigate/frigate/object_detection/base.py", line 57, in __init__
2025-10-15 12:52:24.557492976 self.detect_api = create_detector(detector_config)
2025-10-15 12:52:24.557507962 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-10-15 12:52:24.557509289 File "/opt/frigate/frigate/detectors/__init__.py", line 18, in create_detector
2025-10-15 12:52:24.557510338 return api(detector_config)
2025-10-15 12:52:24.557511289 ^^^^^^^^^^^^^^^^^^^^
2025-10-15 12:52:24.557512479 File "/opt/frigate/frigate/detectors/plugins/onnx.py", line 53, in __init__
2025-10-15 12:52:24.557513504 self.model = ort.InferenceSession(
2025-10-15 12:52:24.557514467 ^^^^^^^^^^^^^^^^^^^^^
2025-10-15 12:52:24.557515905 File "/usr/local/lib/python3.11/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 465, in __init__
2025-10-15 12:52:24.557525452 self._create_inference_session(providers, provider_options, disabled_optimizers)
2025-10-15 12:52:24.557527365 File "/usr/local/lib/python3.11/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 526, in _create_inference_session
2025-10-15 12:52:24.557528767 sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
2025-10-15 12:52:24.557530116 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-10-15 12:52:24.557532610 onnxruntime.capi.onnxruntime_pybind11_state.NoSuchFile: [ONNXRuntimeError] : 3 : NO_SUCHFILE : Load model from /mnt/Frigate_Rec/Frigate/model_cache/yolo_nas_s.onnx failed:Load model /mnt/Frigate_Rec/Frigate/model_cache/yolo_nas_s.onnx failed. File doesn't exist

Install method: Docker CLI

Object Detector: TensorRT

Screenshots of the Frigate UI's System metrics pages: N/A

Any other information that may be helpful: No response
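The `NO_SUCHFILE` error above means the configured model path does not exist inside the container's filesystem, even if the file is present on the TrueNAS host. A minimal sketch of a docker-compose volume mapping that would make the host directory visible at the same path inside the container (the service name and image tag are illustrative assumptions, not from this thread):

```yaml
# Sketch only: bind-mount the host directory containing the model so the
# in-container path matches the model path in the Frigate config.
services:
  frigate:
    image: ghcr.io/blakeblackshear/frigate:0.16.1  # assumed tag
    volumes:
      # Host path taken from the log; without a mapping like this the
      # container cannot see /mnt/Frigate_Rec/Frigate/model_cache.
      - /mnt/Frigate_Rec/Frigate/model_cache:/mnt/Frigate_Rec/Frigate/model_cache
```

In the TrueNAS Apps UI, the equivalent is adding the directory as additional host path storage rather than editing a compose file directly.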
For TensorRT detectors on TrueNAS with your Nvidia 1050 Ti, you need to use the TensorRT build of the Frigate image. The TensorRT detector requires preprocessed model files that are generated on the same hardware platform where they'll run (1). Frigate will automatically generate these models during startup if they're not found, storing them in `/config/model_cache/tensorrt`. To configure this in TrueNAS:
Your detector configuration should look like this (1):

```yaml
detectors:
  tensorrt:
    type: tensorrt
    device: 0

model:
  path: /config/model_cache/tensorrt/yolov7-320.trt
  input_tensor: nchw
  input_pixel_format: rgb
  width: 320
  height: 320
```

The available models include […]. A community user successfully resolved a similar TrueNAS setup by adding the environment variable […].
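Tying the pieces together, here is a hedged docker-compose sketch for the TensorRT setup, assuming the `-tensorrt` image variant and the `YOLO_MODELS` variable described in the Frigate TensorRT docs; the host paths are placeholders and the TrueNAS Apps UI exposes the same settings through its GPU and storage options:

```yaml
# Sketch under stated assumptions; adapt paths and GPU settings to your setup.
services:
  frigate:
    image: ghcr.io/blakeblackshear/frigate:0.16.1-tensorrt  # TensorRT build (assumed tag)
    runtime: nvidia                # expose the 1050 Ti to the container
    environment:
      - YOLO_MODELS=yolov7-320     # model(s) generated into /config/model_cache/tensorrt
    volumes:
      - /path/on/truenas/frigate/config:/config  # placeholder host path
```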
So, I had to add “additional storage” in the Frigate app and mount the exact file location, not just the Config folder. Once I did that, it sees and uses the model. Thanks for your help!!