I have started serving a model with vLLM, and I want to use the web-based LangSmith UI together with my local model. The code is below. How should I configure the model so that its runs show up on the LangSmith web page?
### Code

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model_name="Qwen2.5-14B-Instruct",
    base_url="http://xxx:9009/v1",  # local vLLM OpenAI-compatible endpoint
    api_key="EMPTY",
    temperature=0,
).bind(response_format={"type": "json_object"})
```
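For context, a minimal sketch of what enabling LangSmith tracing for a local endpoint typically looks like: tracing is driven by environment variables, so the model definition itself does not need to change. The project name and API-key placeholder below are assumptions, not values from the original report.

```python
import os

# Tracing is controlled by environment variables; the values below are
# placeholders and must be replaced with your own LangSmith credentials.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-api-key>"
os.environ["LANGCHAIN_PROJECT"] = "qwen-local"  # hypothetical project name shown in the UI

try:
    from langchain_openai import ChatOpenAI

    # Same local vLLM endpoint as in the issue; with the variables above set,
    # each invocation is recorded as a run in the configured LangSmith project.
    llm = ChatOpenAI(
        model_name="Qwen2.5-14B-Instruct",
        base_url="http://xxx:9009/v1",
        api_key="EMPTY",
        temperature=0,
    ).bind(response_format={"type": "json_object"})
except ImportError:
    llm = None  # langchain-openai not installed in this environment
```

With this in place, calling `llm.invoke(...)` should produce a trace viewable in the LangSmith web UI under the configured project, with no further changes to the model setup.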