Commit 6e19bb8

(docs) proxy config - show how to set seed, temp on config.yaml
1 parent 6a8d518

File tree: 1 file changed (+4, -1 lines)

1 file changed

+4
-1
lines changed

docs/my-website/docs/proxy/configs.md

Lines changed: 4 additions & 1 deletion
@@ -188,7 +188,7 @@ print(response)
 </Tabs>
 
 
-## Save Model-specific params (API Base, API Keys, Temperature, Headers etc.)
+## Save Model-specific params (API Base, API Keys, Temperature, Max Tokens, Seed, Headers etc.)
 You can use the config to save model-specific information like api_base, api_key, temperature, max_tokens, etc.
 
 [**All input params**](https://docs.litellm.ai/docs/completion/input#input-params-1)
@@ -202,11 +202,14 @@ model_list:
       api_base: https://openai-gpt-4-test-v-1.openai.azure.com/
       api_version: "2023-05-15"
       azure_ad_token: eyJ0eXAiOiJ
+      seed: 12
+      max_tokens: 20
   - model_name: gpt-4-team2
     litellm_params:
       model: azure/gpt-4
       api_key: sk-123
       api_base: https://openai-gpt-4-test-v-2.openai.azure.com/
+      temperature: 0.2
   - model_name: mistral-7b
     litellm_params:
       model: ollama/mistral
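For context, a sketch of what the `model_list` section of `config.yaml` looks like after this change. The `model:` value for `gpt-4-team1` and the surrounding structure are assumptions reconstructed from the diff context (the keys and API bases are the placeholder values from the diff, not real credentials):

```yaml
model_list:
  - model_name: gpt-4-team1
    litellm_params:
      model: azure/chatgpt-v-2   # assumed deployment name; not shown in this diff
      api_base: https://openai-gpt-4-test-v-1.openai.azure.com/
      api_version: "2023-05-15"
      azure_ad_token: eyJ0eXAiOiJ   # placeholder token from the docs
      seed: 12          # new: fixed seed for more reproducible completions
      max_tokens: 20    # new: per-deployment cap on generated tokens
  - model_name: gpt-4-team2
    litellm_params:
      model: azure/gpt-4
      api_key: sk-123   # placeholder key from the docs
      api_base: https://openai-gpt-4-test-v-2.openai.azure.com/
      temperature: 0.2  # new: lower temperature for more deterministic output
  - model_name: mistral-7b
    litellm_params:
      model: ollama/mistral
```

Params set under `litellm_params` are applied to every request routed to that deployment, so each `model_name` can carry its own defaults for seed, temperature, and max_tokens.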

0 commit comments