Need an example of basic OpenAI Compatible custom model #7654
Replies: 3 comments
-
I figured this out: I needed `dropParams: - "user"`. As a follow-up, I have a header that requires a user name; is it possible to include this dynamically in the YAML?
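For reference, a minimal sketch of where `dropParams` sits in `librechat.yaml`, using the endpoint name and values from the original question below (the placeholder key and URL are from that config, not real credentials):

```yaml
endpoints:
  custom:
    - name: "CustomTest"
      apiKey: "1234"
      baseURL: "https://endpoint.com/v1"
      # Strip the OpenAI "user" param that this endpoint rejects
      dropParams:
        - "user"
```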
-
Additional follow-up: this custom endpoint also fails when any MCP server is added. Using the same MCP server with xAI, for example, works with no problem. Does a custom endpoint require some specific MCP config? The same endpoint with the same MCP server works in Cline.
-
Closing this as the original issue is resolved; will open another re: MCP.
-
I'm having a difficult time setting up a basic OpenAI-compatible custom model. I have no issues on other platforms like Cline/Continue.dev, so I'm likely missing some piece of config. The models fetch works, so the connection is successful, but completion requests return 500s.
Example:

```yaml
endpoints:
  custom:
    - name: "CustomTest"
      apiKey: "1234"
      baseURL: "https://endpoint.com/v1"
      addParams:
        max_tokens: 2048
      headers:
        x-api-version: "beta"
        x-provider: "1234"
        x-region-scope: "us"
        x-user: "tester"
        x-http-token: "1234"
      models:
        default: ['anthropic.claude-sonnet-4-20250514-v1:0']
        fetch: true
      titleConvo: false
      titleModel: 'current_model'
      directEndpoint: false
      summarize: false
      summaryModel: 'current_model'
      forcePrompt: false
      modelDisplayLabel: 'CustomTest'
      iconURL: https://images.crunchbase.com/image/upload/c_pad,f_auto,q_auto:eco,dpr_1/rjqy7ghvjoiu4cd1xjbf
```
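To isolate whether the 500 comes from the endpoint itself or from how LibreChat shapes the request, it can help to replay the call by hand. A minimal sketch, assuming the endpoint follows the OpenAI `/chat/completions` format and that the proxy forwards `apiKey` as a Bearer token (both assumptions, not confirmed by the thread); values are copied from the config above:

```python
import json
import urllib.request

# Copied from the librechat.yaml in the question; the Authorization
# header is an ASSUMPTION about how apiKey is forwarded.
BASE_URL = "https://endpoint.com/v1"
HEADERS = {
    "Content-Type": "application/json",
    "Authorization": "Bearer 1234",
    "x-api-version": "beta",
    "x-provider": "1234",
    "x-region-scope": "us",
    "x-user": "tester",
    "x-http-token": "1234",
}
payload = {
    "model": "anthropic.claude-sonnet-4-20250514-v1:0",
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 2048,
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers=HEADERS,
    method="POST",
)
# Uncomment to send. A 500 here means the endpoint itself rejects the
# request body or headers; success points back at the LibreChat config.
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```

If the hand-built request also fails, comparing its body against what other clients (Cline, Continue.dev) send usually reveals the offending parameter, as it did with `user` here.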