
✨ feat: Support reasoning output for OpenRouter reasoning models (like deepseek-r1). | Support reading and displaying OpenRouter deep-thinking output #5903

Open
deephbz wants to merge 3 commits into main from feature/suppport-openrouter-reasoning-tokens

Conversation


@deephbz deephbz commented Feb 8, 2025

💻 变更类型 | Change Type

  • [x] ✨ feat

🔀 变更说明 | Description of Change

Business Description

Retrieve reasoning (i.e. deep-thinking) output for all OpenRouter reasoning models and display it in the lobe-chat UI.

Fully tested: unit tests as well as integration testing.

Technical Description

Unlike most other DeepSeek-R1 providers, which simply return <think> ... </think> inside the reply message, OpenRouter uses a separate "reasoning" key to return reasoning output. It also requires adding include_reasoning: true to the request payload. See their docs.

Therefore, supporting it requires two changes (illustrative sketches of both follow the verification notes below):

  1. Set include_reasoning: true via the handlePayload callback. This does not affect non-reasoning models at all. Verification:
    - you can see the reasoning output in the raw response:
    (screenshot: `reasoning` field present in the streamed response)
  2. Transform the response stream from
{'role': 'assistant', 'content': None, 'reasoning': 'Okay'}
{'role': 'assistant', 'content': 'start-content', 'reasoning': ''}
{'role': 'assistant', 'content': 'foo bar', 'reasoning': ''}

to

{'role': 'assistant', 'content': '<think>Okay'}
{'role': 'assistant', 'content': '</think>start-content'}
{'role': 'assistant', 'content': 'foo bar'}

This transformed content is consistent with other platforms: a single 'content' stream of thinking tokens followed by normal output tokens. As a result, the thinking part is automatically picked up and displayed.

Verification:
(screenshot: reasoning displayed in the lobe-chat UI)
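
As a rough illustration of change 1, here is a minimal TypeScript sketch of a handlePayload hook that injects the flag. The payload shape and hook name are illustrative assumptions, not the exact lobe-chat internals:

```ts
// Minimal sketch, assuming a provider-level handlePayload hook.
// `ChatStreamPayload` below is an illustrative type, not lobe-chat's real one.
interface ChatStreamPayload {
  model: string;
  messages: { role: string; content: string }[];
  stream?: boolean;
  [key: string]: unknown;
}

const handlePayload = (payload: ChatStreamPayload) => ({
  ...payload,
  // OpenRouter only emits the separate `reasoning` field when this flag is set;
  // non-reasoning models simply ignore it.
  include_reasoning: true,
  stream: true,
});
```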
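And for change 2, a hedged sketch of folding the `reasoning` deltas into a single <think>-wrapped content stream. The chunk shapes mirror the example above; this is a sketch of the idea, not the exact code in the PR:

```ts
// Hedged sketch: merge OpenRouter's `reasoning` deltas into one content stream.
interface OpenRouterDelta {
  role: 'assistant';
  content: string | null;
  reasoning?: string | null;
}

const createReasoningMerger = () => {
  let thinking = false; // true while we are inside the <think> block

  return (delta: OpenRouterDelta): { role: 'assistant'; content: string } => {
    let content = '';

    if (delta.reasoning) {
      // First reasoning token opens the <think> block.
      if (!thinking) {
        content += '<think>';
        thinking = true;
      }
      content += delta.reasoning;
    }

    if (delta.content) {
      // First normal token closes the <think> block.
      if (thinking) {
        content += '</think>';
        thinking = false;
      }
      content += delta.content;
    }

    return { role: 'assistant', content };
  };
};

// Usage with the chunks from the description:
const merge = createReasoningMerger();
merge({ role: 'assistant', content: null, reasoning: 'Okay' });        // -> '<think>Okay'
merge({ role: 'assistant', content: 'start-content', reasoning: '' }); // -> '</think>start-content'
merge({ role: 'assistant', content: 'foo bar', reasoning: '' });       // -> 'foo bar'
```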

📝 补充信息 | Additional Information

Solves #5766, related to #5860.


vercel bot commented Feb 8, 2025

@deephbz is attempting to deploy a commit to the LobeHub Team on Vercel.

A member of the Team first needs to authorize it.

@dosubot dosubot bot added the size:M This PR changes 30-99 lines, ignoring generated files. label Feb 8, 2025
@lobehubbot
Member

👍 @deephbz

Thank you for raising your pull request and contributing to our Community
Please make sure you have followed our contributing guidelines. We will review it as soon as possible.
If you encounter any problems, please feel free to connect with us.

@dosubot dosubot bot added the 🌠 Feature Request New feature or request | 特性与建议 label Feb 8, 2025
Contributor

gru-agent bot commented Feb 8, 2025

TestGru Assignment

Summary

| Link | CommitId | Status | Reason |
| --- | --- | --- | --- |
| Detail | 7c7b5ba | 🚫 Skipped | No files need to be tested: `src/config/modelProviders/openrouter.ts` ("The code does not contain any functions or classes."), `src/libs/agent-runtime/openrouter/index.ts` ("no test value") |

Tip

You can @gru-agent and leave your feedback; TestGru will make adjustments based on your input.

@deephbz deephbz changed the title from "✨ feat: Support include_reasoning for OpenRouter provider's models." to "✨ feat: Support reasoning output for OpenRouter reasoning models (like deepseek-r1). | Support reading and displaying OpenRouter deep-thinking output" Feb 9, 2025
@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. and removed size:M This PR changes 30-99 lines, ignoring generated files. labels Feb 9, 2025
@deephbz deephbz force-pushed the feature/suppport-openrouter-reasoning-tokens branch 2 times, most recently from ca6d009 to d579c59, on February 9, 2025 04:41
Contributor

@arvinxx arvinxx Feb 9, 2025


This implementation is not OK; please refer to https://github.com/lobehub/lobe-chat/blob/main/src/libs/agent-runtime/utils/streams/openai.ts#L90-L100 and make OpenRouter's chunk parsing part of the OpenAI stream transformer.
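
For readers following along: the linked code is the shared OpenAI-compatible stream transformer. A hedged sketch of what emitting the `reasoning` delta as its own chunk type there might look like; the chunk and return shapes here are assumptions, not the actual lobe-chat types:

```ts
// Hedged sketch of handling a `reasoning` delta inside a shared
// OpenAI-style chunk transformer. All type names below are illustrative.
interface StreamChunkDelta {
  content?: string | null;
  reasoning?: string | null;
}

interface StreamProtocolChunk {
  id: string;
  type: 'text' | 'reasoning' | 'stop';
  data: string;
}

const transformDelta = (id: string, delta: StreamChunkDelta): StreamProtocolChunk => {
  // Emit reasoning tokens as a dedicated chunk type so the UI can render
  // them as a "thinking" block instead of plain message text.
  if (delta.reasoning) return { id, type: 'reasoning', data: delta.reasoning };
  if (delta.content) return { id, type: 'text', data: delta.content };
  return { id, type: 'stop', data: '' };
};
```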

Author


Done, please review.

@dosubot dosubot bot added size:M This PR changes 30-99 lines, ignoring generated files. and removed size:L This PR changes 100-499 lines, ignoring generated files. labels Feb 11, 2025
its "reasoning" to be consistent with most other platforms: reasoning outputs wrapped by <think> XML tag.
@deephbz deephbz force-pushed the feature/suppport-openrouter-reasoning-tokens branch from 87b944d to bb8bcfa on February 11, 2025 08:54
@deephbz deephbz force-pushed the feature/suppport-openrouter-reasoning-tokens branch from bb8bcfa to e471961 on February 11, 2025 09:18
Labels
🌠 Feature Request New feature or request | 特性与建议 size:M This PR changes 30-99 lines, ignoring generated files.