[Detector Support]: Frigate+Openvino+ODYSSEY - X86J4105 running #20428
Replies: 2 comments
-
You're encountering two separate issues:

**1. OpenVINO GPU Compilation Error**

The primary error indicates that OpenVINO cannot compile the YOLO model for your Intel GPU(1)(2). The Celeron J4105 has an Intel UHD Graphics 600 integrated GPU, which may have limited OpenCL support for complex YOLO models.

**Solution:** Use CPU mode instead of GPU for OpenVINO with your hardware(3):

```yaml
detectors:
  ov_detector:
    type: openvino
    device: CPU
```

OpenVINO is supported on 6th Gen Intel platforms (Skylake) and newer, and it will run on your CPU even though the J4105 is a Gemini Lake processor(3). For YOLO models specifically, CPU mode may be more reliable on your hardware(3).

**2. VAAPI Hardware Acceleration Error**
Your configuration attempts to use VAAPI hardware acceleration, but the decoder isn't available(4). Remove the hwaccel configuration from your camera:

```yaml
cameras:
  home:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/home
          roles:
            - detect
      # Remove hwaccel_args entirely
```

**3. Additional Configuration Issues**

Your config has several problems.
Here's a corrected configuration:

```yaml
detectors:
  ov_detector:
    type: openvino
    device: CPU

model:
  model_type: yolo-generic
  width: 320
  height: 320
  input_tensor: nchw
  input_dtype: float
  path: /models/yolov9t.onnx
  labelmap_path: /labelmap/coco-80.txt

cameras:
  home:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/home
          roles:
            - detect
    detect:
      enabled: true
      width: 320
      height: 320
      fps: 5
```

Note that the

📚 Sources:
Have another question? Just tag @inkeep.
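If you want to retry VAAPI hardware decoding later, rather than removing it permanently, a minimal sketch using Frigate's bundled `preset-vaapi` hwaccel preset — this assumes (not confirmed in this thread) that a working VA-API driver exists for the J4105's UHD 600 and that `/dev/dri` is passed through to the container:

```yaml
# Sketch only: re-add hardware decode once `vainfo` succeeds inside
# the container. preset-vaapi is Frigate's built-in VAAPI preset.
cameras:
  home:
    ffmpeg:
      hwaccel_args: preset-vaapi
      inputs:
        - path: rtsp://127.0.0.1:8554/home
          roles:
            - detect
```

For this to work under Docker Compose, the render device also has to be mapped into the container (e.g. `devices: ["/dev/dri:/dev/dri"]`); otherwise ffmpeg cannot see the GPU and the same decoder error will return.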
-
The AI is correct about the large number of parameters you have misconfigured. I'm guessing you tried to use ChatGPT or something similar to resolve your issues. You should use the free AI available at https://docs.frigate.video instead of other LLMs, as it's trained specifically on Frigate documentation and discussions. Your biggest issue is that you are missing a
-
Describe the problem you are having
I'm encountering an error message when using yolov9t.onnx for inference, indicating the model cannot be run.
The error message is as follows:
Version
0.16-0
Frigate config file
docker-compose file or Docker CLI command
Relevant Frigate log output
Install method
Docker Compose
Object Detector
OpenVino
Screenshots of the Frigate UI's System metrics pages
Any other information that may be helpful
No response