Does BAML support mocking responses? As in: for a given method, return fake mock data instead of calling an LLM.
Answered by sxlijin, Jan 9, 2026
Yep! This is best done by passing in a client registry that replaces your primary LLM with an openai client whose base_url points at a local OpenAI-compatible fake server.

If you have follow-up questions, please keep in mind that we do not use GH Discussions for questions: we do not monitor this channel actively and are most responsive on our Discord server.
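A minimal sketch of the setup the answer describes, in two parts: a tiny OpenAI-compatible fake server built only on the Python standard library, and (in comments, since it requires baml-py and a generated client) the client-registry hookup. The function name `ClassifySentiment`, the client name `FakeGPT`, and the canned reply are all hypothetical; see BAML's ClientRegistry docs for the exact API.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Canned OpenAI-style chat-completion response the fake server always returns.
CANNED = {"choices": [{"message": {"role": "assistant",
                                   "content": '{"sentiment": "positive"}'}}]}

class FakeOpenAI(BaseHTTPRequestHandler):
    def do_POST(self):
        # Every POST (e.g. to /v1/chat/completions) gets the same canned reply.
        body = json.dumps(CANNED).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep test output quiet

# Bind to an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), FakeOpenAI)
threading.Thread(target=server.serve_forever, daemon=True).start()
base_url = f"http://127.0.0.1:{server.server_port}/v1"

# With the fake server running, register it in a client registry and make it
# the primary client (assumes baml-py is installed and `b` is your generated
# BAML client; names below are illustrative):
#
#   from baml_py import ClientRegistry
#   cr = ClientRegistry()
#   cr.add_llm_client("FakeGPT", "openai",
#                     {"model": "gpt-4o-mini", "base_url": base_url,
#                      "api_key": "not-used"})
#   cr.set_primary("FakeGPT")
#   result = b.ClassifySentiment("great!", baml_options={"client_registry": cr})
```

Because the fake server speaks the OpenAI wire format, no BAML code changes are needed: only the registry passed at call time decides whether the real provider or the fake is hit.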