Like in https://chat.openai.com/
Replies: 1 comment
Hi @AndrewBPC

You can set the `stream` parameter on the completions endpoint: https://platform.openai.com/docs/api-reference/completions/create

When you specify `stream=true` in your request, you would then need to retrieve the response using the `getResponse()` method rather than `toModel()` or `toArray()`. You can then access the stream interface and loop over it, outputting it in chunks.

Unfortunately, we don't have any detailed documentation on that, but hopefully this points you in the right direction.
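As a rough illustration, here is a minimal sketch of that loop in PHP. It assumes the object returned by `getResponse()` is a PSR-7 `ResponseInterface` and that the API streams server-sent events (`data: ...` lines ending with a `[DONE]` sentinel); the client call chain in the usage comment is hypothetical and will depend on how you create completions in your code.

```php
<?php

use Psr\Http\Message\ResponseInterface;

/**
 * Read a completions response created with stream=true and echo the text
 * chunks as they arrive. Pass in the PSR-7 response from getResponse()
 * (instead of calling toModel() or toArray()).
 */
function echoCompletionStream(ResponseInterface $response): void
{
    $body   = $response->getBody();
    $buffer = '';

    while (!$body->eof()) {
        $buffer .= $body->read(1024);

        // The API streams server-sent events: lines that start with "data: ".
        while (($pos = strpos($buffer, "\n")) !== false) {
            $line   = trim(substr($buffer, 0, $pos));
            $buffer = substr($buffer, $pos + 1);

            if ($line === '' || strpos($line, 'data: ') !== 0) {
                continue;
            }

            $payload = substr($line, strlen('data: '));
            if ($payload === '[DONE]') {
                return; // end-of-stream sentinel
            }

            $chunk = json_decode($payload, true);
            // Each event carries a partial completion in choices[0].text.
            echo $chunk['choices'][0]['text'] ?? '';
            flush(); // push output immediately instead of buffering it
        }
    }
}

// Usage (hypothetical call chain -- adapt to how you build the request):
// $handler = $client->completions()->create([/* ... */, 'stream' => true]);
// echoCompletionStream($handler->getResponse());
```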