Commit daada96

feat(api): api update

1 parent eca10e2

File tree

3 files changed: +8 −5 lines

.stats.yml

Lines changed: 2 additions & 2 deletions

@@ -1,4 +1,4 @@
 configured_endpoints: 17
-openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/groqcloud%2Fgroqcloud-321ea3c10bc0eb2b17407a99eb47e8ea88e67bf7bff8bebe3592fbd4f73b1f89.yml
-openapi_spec_hash: d0f5f934d8a12f79db0cbdb7b6b3d0e3
+openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/groqcloud%2Fgroqcloud-98337e5b33a6b805acdfcd318f6acab8683c5b0afb1446cd0c62dff125fad4c0.yml
+openapi_spec_hash: c4ac337673fc0f2bab417fbf379776ee
 config_hash: 6b1c374dcc1ffa3165dd22f52a77ff89

src/groq/resources/chat/completions.py

Lines changed: 4 additions & 2 deletions

@@ -285,7 +285,8 @@ def create(
 
 response_format: An object specifying the format that the model must output. Setting to
 `{ "type": "json_schema", "json_schema": {...} }` enables Structured Outputs
-which ensures the model will match your supplied JSON schema. Setting to
+which ensures the model will match your supplied JSON schema. json_schema
+response format is only supported on llama 4 models. Setting to
 `{ "type": "json_object" }` enables the older JSON mode, which ensures the
 message the model generates is valid JSON. Using `json_schema` is preferred for
 models that support it.

@@ -651,7 +652,8 @@ async def create(
 
 response_format: An object specifying the format that the model must output. Setting to
 `{ "type": "json_schema", "json_schema": {...} }` enables Structured Outputs
-which ensures the model will match your supplied JSON schema. Setting to
+which ensures the model will match your supplied JSON schema. json_schema
+response format is only supported on llama 4 models. Setting to
 `{ "type": "json_object" }` enables the older JSON mode, which ensures the
 message the model generates is valid JSON. Using `json_schema` is preferred for
 models that support it.

src/groq/types/chat/completion_create_params.py

Lines changed: 2 additions & 1 deletion

@@ -138,7 +138,8 @@ class CompletionCreateParams(TypedDict, total=False):
 """An object specifying the format that the model must output.
 
 Setting to `{ "type": "json_schema", "json_schema": {...} }` enables Structured
-Outputs which ensures the model will match your supplied JSON schema. Setting to
+Outputs which ensures the model will match your supplied JSON schema.
+json_schema response format is only supported on llama 4 models. Setting to
 `{ "type": "json_object" }` enables the older JSON mode, which ensures the
 message the model generates is valid JSON. Using `json_schema` is preferred for
 models that support it.
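The docstring change in this commit describes the two `response_format` modes the API accepts. As a minimal sketch, the payloads would look like the following; the schema contents and the `weather_report` name are illustrative assumptions, not part of this commit:

```python
# Sketch of the two response_format payloads described in the docstring.
# The schema and its field names below are illustrative, not from this commit.

# Older JSON mode: constrains the model to emit valid JSON,
# but not to any particular schema.
json_object_format = {"type": "json_object"}

# Structured Outputs: constrains the model to match the supplied JSON schema.
# Per this commit's docstring, json_schema is only supported on llama 4 models.
json_schema_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "weather_report",  # hypothetical schema name
        "schema": {
            "type": "object",
            "properties": {
                "city": {"type": "string"},
                "temperature_c": {"type": "number"},
            },
            "required": ["city", "temperature_c"],
        },
    },
}

print(json_schema_format["type"])  # json_schema
```

Either dict would be passed as the `response_format` argument to `client.chat.completions.create(...)` in the groq SDK, matching the `CompletionCreateParams` TypedDict touched by this commit.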
