Lightweight Inference server for OpenVINO
Updated Mar 8, 2025 · Python
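A minimal sketch of what a lightweight OpenVINO inference server could look like, not the listed repository's actual code. It assumes FastAPI for the HTTP layer and optimum-intel's OVModelForCausalLM for OpenVINO-backed generation; the model id, route name, and request schema are illustrative.

```python
# Hypothetical minimal inference server: FastAPI in front of an OpenVINO-exported LLM.
from fastapi import FastAPI
from pydantic import BaseModel
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

MODEL_ID = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed default model, swap as needed

app = FastAPI()
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
# export=True converts the PyTorch checkpoint to OpenVINO IR on first load
model = OVModelForCausalLM.from_pretrained(MODEL_ID, export=True)

class GenerateRequest(BaseModel):
    prompt: str
    max_new_tokens: int = 128

@app.post("/generate")
def generate(req: GenerateRequest):
    inputs = tokenizer(req.prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=req.max_new_tokens)
    return {"completion": tokenizer.decode(outputs[0], skip_special_tokens=True)}
```

Run with `uvicorn server:app` (assuming the file is saved as server.py) and POST a JSON body with a `prompt` field to `/generate`.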
Minimal code to run an LLM chatbot from the HuggingFace Hub with OpenVINO
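A sketch of the idea described above, assuming optimum-intel's OVModelForCausalLM and a chat-tuned model from the Hub; the model id and generation settings are assumptions, not taken from the repository.

```python
# Hypothetical minimal chatbot loop: load a Hub model via optimum-intel and chat on the CLI.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed chat-tuned model

tokenizer = AutoTokenizer.from_pretrained(model_id)
# export=True converts the checkpoint to OpenVINO IR so inference runs on the OpenVINO runtime
model = OVModelForCausalLM.from_pretrained(model_id, export=True)

history = []
while True:
    user = input("You: ")
    history.append({"role": "user", "content": user})
    # Build the prompt in the model's expected chat format
    prompt = tokenizer.apply_chat_template(history, tokenize=False, add_generation_prompt=True)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=256)
    # Decode only the newly generated tokens
    reply = tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
    print("Bot:", reply)
    history.append({"role": "assistant", "content": reply})
```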