This change allows a user to construct a PipelineContext with multiple
OpenAI clients, such as:
```python
PipelineContext(
    clients={
        "default": OpenAI(base_url="https://foo.local"),
        "server_a": OpenAI(base_url="https://server_a.local"),
        "server_b": OpenAI(base_url="https://server_b.local"),
    }
)
```
And then, within the pipeline yaml, choose which client to apply to
which LLMBlock via a new `client` key, such as:
```yaml
version: "1.0"
blocks:
  - name: server_a_client
    type: LLMBlock
    config:
      client: server_a
      # ...
  - name: server_b_client
    type: LLMBlock
    config:
      client: server_b
      # ...
```
See `docs/examples/multiple_llm_clients` for more details and a full
example.
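For comparison, nothing changes for existing single-client Pipelines. Below is a minimal sketch of the backwards-compatible form, assuming the single client can still be passed via the `client` keyword (the `base_url` is illustrative):
```python
from openai import OpenAI

# PipelineContext comes from this project (import omitted);
# a single client passed via `client` becomes the "default"
# entry in the clients dictionary.
ctx = PipelineContext(client=OpenAI(base_url="https://foo.local"))

# Equivalent to:
# PipelineContext(clients={"default": OpenAI(base_url="https://foo.local")})
```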
Resolves #521
Signed-off-by: Ben Browning <[email protected]>
`CHANGELOG.md` (+6 lines: 6 additions, 0 deletions)
```diff
@@ -2,6 +2,12 @@
 
 ### Features
 
+### Pipelines can now have LLMBlocks with different OpenAI clients
+
+For advanced use-cases, PipelineContext now accepts a `clients` dictionary of string to OpenAI client mappings. The special string of "default" sets the OpenAI client used for LLMBlocks by default, but individual LLMBlocks can override the client used by the `client` parameter in their yaml config.
+
+Backwards-compatibility is maintained for Pipelines that only need a single client, where setting the `client` property on PipelineContext objects just sets the default client in the `clients` dictionary automatically.
+
 ### LLMBlocks can now specify `model_family` or `model_id` in their config
 
 Each `LLMBlock` in a `Pipeline` can now specify `model_family` or `model_id` in their yaml configuration to set the values to use for these blocks, as opposed to setting this for the entire `Pipeline` in the `PipelineContext` object. This is useful for the cases where multiple `LLMBlocks` exist in the same `Pipeline` where each one uses a different model.
```
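To make that last changelog entry concrete, here is a hedged sketch of the pipeline-wide alternative it mentions, assuming `model_family` and `model_id` are accepted by the PipelineContext constructor (all values are hypothetical):
```python
from openai import OpenAI

# Pipeline-wide defaults set on the context (hypothetical values);
# individual LLMBlocks can instead set `model_family` / `model_id`
# under their own `config:` section in the pipeline yaml.
ctx = PipelineContext(
    client=OpenAI(base_url="https://foo.local"),
    model_family="mixtral",       # hypothetical
    model_id="my-teacher-model",  # hypothetical
)
```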
From the documentation added under `docs/examples/multiple_llm_clients` (excerpt):
```diff
+For advanced use-cases, PipelineContext accepts a `clients` dictionary of string to OpenAI client mappings. The special string of "default" sets the OpenAI client used for LLMBlocks by default, but individual LLMBlocks can override the client used by the `client` parameter in their yaml config.
+
+See `pipeline.yaml` in this directory for an example of a Pipeline that uses different clients per `LLMBlock`.
```
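Putting the pieces together, a sketch of how the multi-client example might be driven end to end; the `Pipeline.from_file(...)` and `generate(...)` entry points and the seed data columns are assumptions for illustration:
```python
from datasets import Dataset
from openai import OpenAI

# Hypothetical seed data; the real column names depend on the pipeline's blocks.
seed = Dataset.from_list([{"document": "some source text"}])

# One client per backend, keyed by the names referenced in pipeline.yaml.
ctx = PipelineContext(
    clients={
        "default": OpenAI(base_url="https://foo.local"),
        "server_a": OpenAI(base_url="https://server_a.local"),
        "server_b": OpenAI(base_url="https://server_b.local"),
    }
)

# Assumed entry points (Pipeline comes from this project, import omitted):
pipeline = Pipeline.from_file(ctx, "docs/examples/multiple_llm_clients/pipeline.yaml")
generated = pipeline.generate(seed)
```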