[Feature] Customized (Local) LLM Backend Integration #58

@yuxuandexter

Description

Plan to support local vLLM and Ollama LLM backends by updating:

- `tools/serving/api_manager.py`
- `tools/serving/api_providers.py`
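A provider for these two backends could be quite thin, since both vLLM (`vllm serve`, default port 8000) and Ollama (default port 11434) expose an OpenAI-compatible `/v1/chat/completions` endpoint. The sketch below is only an illustration of that approach, not the repo's actual `api_providers.py` API; the function names, backend registry, and default URLs are assumptions.

```python
import json
import urllib.request

# Hypothetical registry of local OpenAI-compatible base URLs.
# vLLM's and Ollama's default ports are used here; real deployments
# may differ, so these would likely come from config in api_manager.py.
LOCAL_BACKENDS = {
    "vllm": "http://localhost:8000/v1",
    "ollama": "http://localhost:11434/v1",
}


def build_request(backend: str, model: str, prompt: str):
    """Return (url, payload) for a chat-completion call to a local backend."""
    if backend not in LOCAL_BACKENDS:
        raise ValueError(f"unknown backend: {backend}")
    url = f"{LOCAL_BACKENDS[backend]}/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload


def local_completion(backend: str, model: str, prompt: str, timeout: int = 60) -> str:
    """POST the request to the local server and extract the reply text."""
    url, payload = build_request(backend, model, prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    return body["choices"][0]["message"]["content"]
```

Because both servers speak the same wire format, a single provider function parameterized by base URL may be enough, rather than separate vLLM and Ollama code paths.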

Metadata

Assignees

Labels

No labels

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
