
[Feature Request] - streaming support for API calls #19

Open

Sebusml opened this issue Apr 5, 2024 · 1 comment

@Sebusml (Contributor) commented Apr 5, 2024

WHAT?
Add streaming support for API responses.

WHY?
Improves the user experience for long or slow completions.

Additional requirements

  • Support TypeScript.
  • Support serverless JS runtimes such as Cloudflare Pages and Vercel.
  • If this feature requires creating a TS client, consider returning a ReadableStream (see the sketch below).

REFERENCE
OpenAI supports this with Chat Completions and the Assistants API.
Reference: https://platform.openai.com/docs/api-reference/streaming
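
To make the request concrete, here is a rough sketch of what such a client surface could look like. Everything in it (`StreamingClient`, `generateTextStream`, the option fields) is hypothetical; the substance of the request is only that the method resolves to a web-standard `ReadableStream`, which Cloudflare Pages and Vercel edge runtimes support natively (Node's stream APIs are not available there).

```ts
// Hypothetical API shape -- the names and option fields are illustrative only.
interface StreamOptions {
  model: string;
  prompt: string;
}

interface StreamingClient {
  // Resolving to a web-standard ReadableStream (rather than a Node.js stream)
  // lets callers pass the result straight into `new Response(...)` in
  // serverless and edge runtimes.
  generateTextStream(opts: StreamOptions): Promise<ReadableStream<string>>;
}
```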

@Sebusml (Contributor, Author) commented Apr 5, 2024

For reference, this is how I currently have to handle OpenAI's streamed responses in my backend and forward them to the frontend:

```ts
import OpenAI from 'openai';

const openai = new OpenAI();

// `SYSTEM_PROMPT` is defined elsewhere in my app.
declare const SYSTEM_PROMPT: string;

async function handleRequest(userText: string): Promise<Response> {
  const textStream = await openai.chat.completions.create({
    model: 'gpt-4',
    messages: [
      { role: 'system', content: SYSTEM_PROMPT },
      { role: 'user', content: userText }
    ],
    stream: true
  });

  const encoder = new TextEncoder();
  return new Response(
    new ReadableStream({
      async start(controller) {
        // Forward each content delta from the OpenAI stream as encoded bytes.
        for await (const chunk of textStream) {
          const message = chunk.choices[0]?.delta?.content || '';
          controller.enqueue(encoder.encode(message));
        }

        // Close the stream once all chunks are processed.
        controller.close();
      },
      cancel() {
        console.log('cancel and abort');
      }
    }),
    {
      headers: {
        'cache-control': 'no-cache',
        'Content-Type': 'text/event-stream'
      }
    }
  );
}
```

Ideally, the API should return a ReadableStream, so all I would need to do is wrap it in a Response.
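
For comparison, here is a sketch of what the handler above could collapse to if the client resolved to a `ReadableStream<string>`. The `client` object and `generateTextStream` method are hypothetical stand-ins for whatever the library ends up exposing.

```ts
// Hypothetical: `client.generateTextStream` stands in for the proposed API.
declare const client: {
  generateTextStream(opts: { model: string; prompt: string }): Promise<ReadableStream<string>>;
};

async function handleRequest(userText: string): Promise<Response> {
  const stream = await client.generateTextStream({ model: 'gpt-4', prompt: userText });

  // TextEncoderStream converts the text chunks to bytes for the Response body;
  // all of the manual ReadableStream plumbing above disappears.
  return new Response(stream.pipeThrough(new TextEncoderStream()), {
    headers: {
      'cache-control': 'no-cache',
      'Content-Type': 'text/event-stream'
    }
  });
}
```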

@baur-krykpayev added the enhancement (New feature or request) label on May 6, 2024