Plan: support local LLM backends (vLLM and Ollama) by updating `tools/serving/api_manager.py` and `tools/serving/api_providers.py`.
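
A minimal sketch of what the new provider calls in `api_providers.py` could look like, assuming both backends are reached through their OpenAI-compatible endpoints (vLLM's server defaults to `http://localhost:8000/v1`; Ollama exposes one at `http://localhost:11434/v1`). The function names, base URLs, and parameters below are illustrative assumptions, not the repo's actual API:

```python
# Hypothetical additions to tools/serving/api_providers.py -- a sketch,
# assuming both local servers speak the OpenAI chat-completions protocol.
from openai import OpenAI

# Default local endpoints (assumptions; make these configurable in practice).
# Local servers ignore the API key, but the client requires a non-empty value.
VLLM_BASE_URL = "http://localhost:8000/v1"
OLLAMA_BASE_URL = "http://localhost:11434/v1"


def local_completion(base_url: str, model: str, system_prompt: str, prompt: str) -> str:
    """Send a chat completion to a local OpenAI-compatible server and return the text."""
    client = OpenAI(base_url=base_url, api_key="EMPTY")
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": prompt},
        ],
        temperature=0.0,
    )
    return response.choices[0].message.content


def vllm_completion(model: str, system_prompt: str, prompt: str) -> str:
    return local_completion(VLLM_BASE_URL, model, system_prompt, prompt)


def ollama_completion(model: str, system_prompt: str, prompt: str) -> str:
    return local_completion(OLLAMA_BASE_URL, model, system_prompt, prompt)
```

Under this sketch, `api_manager.py` would only need to route two new provider names (e.g. `"vllm"` and `"ollama"`) to these functions alongside the existing cloud providers; how that dispatch is wired depends on the manager's current structure.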