Commit

Merge branch 'main' into wenxin
hezhijie0327 authored Feb 5, 2025
2 parents 17d89ff + f899c82 commit 4c03422
Showing 47 changed files with 670 additions and 34 deletions.
1 change: 1 addition & 0 deletions .env.example
Original file line number Diff line number Diff line change
@@ -106,6 +106,7 @@ OPENAI_API_KEY=sk-xxxxxxxxx

### DeepSeek AI ####

# DEEPSEEK_PROXY_URL=https://api.deepseek.com/v1
# DEEPSEEK_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxx

### Qwen AI ####
60 changes: 60 additions & 0 deletions CHANGELOG.md
@@ -2,6 +2,66 @@

# Changelog

## [Version 1.51.0](https://github.com/lobehub/lobe-chat/compare/v1.50.5...v1.51.0)

<sup>Released on **2025-02-05**</sup>

#### ✨ Features

- **misc**: Add reasoning tag support for custom models via UI or ENV.

#### 🐛 Bug Fixes

- **misc**: Fix deepseek-v3 & qvq model tag fetch error from SiliconCloud, fix model ability missing.

<br/>

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### What's improved

- **misc**: Add reasoning tag support for custom models via UI or ENV, closes [#5684](https://github.com/lobehub/lobe-chat/issues/5684) ([3499403](https://github.com/lobehub/lobe-chat/commit/3499403))

#### What's fixed

- **misc**: Fix deepseek-v3 & qvq model tag fetch error from SiliconCloud, closes [#5741](https://github.com/lobehub/lobe-chat/issues/5741) ([ee61653](https://github.com/lobehub/lobe-chat/commit/ee61653))
- **misc**: Fix model ability missing, closes [#5739](https://github.com/lobehub/lobe-chat/issues/5739) ([0e1a022](https://github.com/lobehub/lobe-chat/commit/0e1a022))

</details>

<div align="right">

[![](https://img.shields.io/badge/-BACK_TO_TOP-151515?style=flat-square)](#readme-top)

</div>

### [Version 1.50.5](https://github.com/lobehub/lobe-chat/compare/v1.50.4...v1.50.5)

<sup>Released on **2025-02-04**</sup>

#### 💄 Styles

- **misc**: Add/Update Aliyun Cloud Models, update GitHub Models.

<br/>

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### Styles

- **misc**: Add/Update Aliyun Cloud Models, closes [#5613](https://github.com/lobehub/lobe-chat/issues/5613) ([95cd822](https://github.com/lobehub/lobe-chat/commit/95cd822))
- **misc**: Update GitHub Models, closes [#5683](https://github.com/lobehub/lobe-chat/issues/5683) ([ed4e048](https://github.com/lobehub/lobe-chat/commit/ed4e048))

</details>

<div align="right">

[![](https://img.shields.io/badge/-BACK_TO_TOP-151515?style=flat-square)](#readme-top)

</div>

### [Version 1.50.4](https://github.com/lobehub/lobe-chat/compare/v1.50.3...v1.50.4)

<sup>Released on **2025-02-04**</sup>
17 changes: 17 additions & 0 deletions changelog/v1.json
@@ -1,4 +1,21 @@
[
{
"children": {
"features": ["Add reasoning tag support for custom models via UI or ENV."],
"fixes": [
"Fix deepseek-v3 & qvq model tag fetch error from SiliconCloud, fix model ability missing."
]
},
"date": "2025-02-05",
"version": "1.51.0"
},
{
"children": {
"improvements": ["Add/Update Aliyun Cloud Models, update GitHub Models."]
},
"date": "2025-02-04",
"version": "1.50.5"
},
{
"children": {
"fixes": ["Fix invalid utf8 character."]
4 changes: 2 additions & 2 deletions docs/changelog/2025-01-22-new-ai-provider.mdx
@@ -1,8 +1,8 @@
---
title: LobeChat Launches New AI Provider Management System
description: >-
LobeChat has revamped its AI Provider Management System, now supporting custom AI providers and models.
LobeChat has revamped its AI Provider Management System, now supporting custom
AI providers and models.
tags:
- LobeChat
- AI Provider
33 changes: 33 additions & 0 deletions docs/changelog/2025-02-02-deepseek-r1.mdx
@@ -0,0 +1,33 @@
---
title: >-
LobeChat Integrates DeepSeek R1, Bringing a Revolutionary Chain of Thought Experience
description: >-
LobeChat v1.49.12 fully supports the DeepSeek R1 model, providing users with an unprecedented interactive experience in the chain of thought.
tags:
- LobeChat
- DeepSeek
- Chain of Thought
---

# Perfect Integration of DeepSeek R1 and its Deep Thinking Experience 🎉

After nearly 10 days of meticulous refinement, LobeChat has fully integrated the DeepSeek R1 model in version v1.49.12, offering users a revolutionary interactive experience in the chain of thought!

## 🚀 Major Updates

- 🤯 **Comprehensive Support for DeepSeek R1**: Now fully integrated in both the Community and Cloud versions ([lobechat.com](https://lobechat.com)).
- 🧠 **Real-Time Chain of Thought Display**: Transparently presents the AI's reasoning process, making the resolution of complex issues clear and visible.
- ⚡️ **Deep Thinking Experience**: Utilizing Chain of Thought technology, it provides more insightful AI conversations.
- 💫 **Intuitive Problem Analysis**: Makes the analysis of complex issues clear and easy to understand.

## 🌟 How to Use

1. Upgrade to LobeChat v1.49.12 or visit [lobechat.com](https://lobechat.com).
2. Select the DeepSeek R1 model in the settings.
3. Experience a whole new level of intelligent conversation!

## 📢 Feedback and Support

If you encounter any issues while using the application or have suggestions for new features, feel free to engage with us through GitHub Discussions. Let's work together to create a better LobeChat!
29 changes: 29 additions & 0 deletions docs/changelog/2025-02-02-deepseek-r1.zh-CN.mdx
@@ -0,0 +1,29 @@
---
title: LobeChat 重磅集成 DeepSeek R1,带来革命性思维链体验
description: LobeChat v1.49.12 已完整支持 DeepSeek R1 模型,为用户带来前所未有的思维链交互体验
tags:
- DeepSeek R1
- CoT
- 思维链
---

# 完美集成 DeepSeek R1 ,开启思维链新体验

经过近 10 天的精心打磨,LobeChat 已在 v1.49.12 版本中完整集成了 DeepSeek R1 模型,为用户带来革命性的思维链交互体验!

## 🚀 重大更新

- 🤯 **DeepSeek R1 全面支持**: 现已在社区版与 Cloud 版([lobechat.com](https://lobechat.com))中完整接入
- 🧠 **实时思维链展示**: 透明呈现 AI 的推理过程,让复杂问题的解决过程清晰可见
- ⚡️ **深度思考体验**: 通过 Chain of Thought 技术,带来更具洞察力的 AI 对话
- 💫 **直观的问题解析**: 让复杂问题的分析过程变得清晰易懂

## 🌟 使用方式

1. 升级到 LobeChat v1.49.12 或访问 [lobechat.com](https://lobechat.com)
2. 在设置中选择 DeepSeek R1 模型
3. 开启全新的智能对话体验!

## 📢 反馈与支持

如果您在使用过程中遇到任何问题,或对新功能有任何建议,欢迎通过 GitHub Discussions 与我们交流。让我们一起打造更好的 LobeChat!
6 changes: 6 additions & 0 deletions docs/changelog/index.json
@@ -2,6 +2,12 @@
"$schema": "https://github.com/lobehub/lobe-chat/blob/main/docs/changelog/schema.json",
"cloud": [],
"community": [
{
"image": "https://github.com/user-attachments/assets/5fe4c373-ebd0-42a9-bdca-0ab7e0a2e747",
"id": "2025-02-02-deepseek-r1",
"date": "2025-02-02",
"versionRange": ["1.47.8", "1.49.12"]
},
{
"image": "https://github.com/user-attachments/assets/7350f211-61ce-488e-b0e2-f0fcac25caeb",
"id": "2025-01-22-new-ai-provider",
7 changes: 7 additions & 0 deletions docs/self-hosting/environment-variables/model-provider.mdx
@@ -169,6 +169,13 @@ If you need to use Azure OpenAI to provide model services, you can refer to the

## DeepSeek AI

### `DEEPSEEK_PROXY_URL`

- Type: Optional
- Description: If you manually configure the DeepSeek API proxy, you can use this configuration item to override the default DeepSeek API request base URL
- Default: -
- Example: `https://xxxx.models.ai.azure.com/v1`
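
Taken together, the DeepSeek variables can be combined in a deployment `.env` file. The following is only an illustrative sketch — the proxy host and key are placeholders, not real values:

```shell
# Illustrative sketch only — placeholder values, not real credentials.
# Override the default DeepSeek API base URL (optional):
DEEPSEEK_PROXY_URL=https://my-deepseek-proxy.example.com/v1
# API key used to authenticate against DeepSeek (required):
DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxx
```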

### `DEEPSEEK_API_KEY`

- Type: Required
@@ -167,6 +167,13 @@ LobeChat 在部署时提供了丰富的模型服务商相关的环境变量,

## DeepSeek AI

### `DEEPSEEK_PROXY_URL`

- 类型:可选
- 描述:如果您手动配置了 DeepSeek API 代理,可以使用此配置项覆盖默认的 DeepSeek API 请求基础 URL
- 默认值: -
- 示例: `https://xxxx.models.ai.azure.com/v1`

### `DEEPSEEK_API_KEY`

- 类型:必选
4 changes: 4 additions & 0 deletions locales/ar/modelProvider.json
@@ -229,6 +229,10 @@
"title": "معرف النموذج"
},
"modalTitle": "تكوين النموذج المخصص",
"reasoning": {
"extra": "هذا الإعداد سيفتح فقط قدرة النموذج على التفكير العميق، التأثير الفعلي يعتمد بالكامل على النموذج نفسه، يرجى اختبار ما إذا كان هذا النموذج يمتلك القدرة على التفكير العميق القابل للاستخدام",
"title": "يدعم التفكير العميق"
},
"tokens": {
"extra": "تعيين الحد الأقصى لعدد الرموز المدعومة من قبل النموذج",
"title": "أقصى نافذة سياق",
4 changes: 4 additions & 0 deletions locales/bg-BG/modelProvider.json
@@ -229,6 +229,10 @@
"title": "ID на модела"
},
"modalTitle": "Конфигурация на персонализиран модел",
"reasoning": {
"extra": "Тази конфигурация ще активира само способността на модела за дълбоко мислене, конкретният ефект зависи изцяло от самия модел, моля, тествайте сами дали моделът притежава налична способност за дълбоко мислене",
"title": "Поддръжка на дълбоко мислене"
},
"tokens": {
"extra": "Настройте максималния брой токени, поддържани от модела",
"title": "Максимален контекстуален прозорец",
4 changes: 4 additions & 0 deletions locales/de-DE/modelProvider.json
@@ -229,6 +229,10 @@
"title": "Modell-ID"
},
"modalTitle": "Benutzerdefinierte Modellkonfiguration",
"reasoning": {
"extra": "Diese Konfiguration aktiviert nur die Fähigkeit des Modells zu tiefem Denken. Die tatsächlichen Ergebnisse hängen vollständig vom Modell selbst ab. Bitte testen Sie selbst, ob das Modell über die Fähigkeit zum tiefen Denken verfügt.",
"title": "Unterstützung für tiefes Denken"
},
"tokens": {
"extra": "Maximale Token-Anzahl für das Modell festlegen",
"title": "Maximales Kontextfenster",
4 changes: 4 additions & 0 deletions locales/en-US/modelProvider.json
@@ -229,6 +229,10 @@
"title": "Model ID"
},
"modalTitle": "Custom Model Configuration",
"reasoning": {
"extra": "This configuration will enable the model's deep thinking capabilities, and the specific effects depend entirely on the model itself. Please test whether this model has usable deep thinking abilities.",
"title": "Support Deep Thinking"
},
"tokens": {
"extra": "Set the maximum number of tokens supported by the model",
"title": "Maximum Context Window",
4 changes: 4 additions & 0 deletions locales/es-ES/modelProvider.json
@@ -229,6 +229,10 @@
"title": "ID del modelo"
},
"modalTitle": "Configuración del modelo personalizado",
"reasoning": {
"extra": "Esta configuración solo activará la capacidad de pensamiento profundo del modelo, el efecto específico depende completamente del modelo en sí, por favor, pruebe si este modelo tiene la capacidad de pensamiento profundo utilizable",
"title": "Soporte para pensamiento profundo"
},
"tokens": {
"extra": "Establecer el número máximo de tokens que el modelo puede soportar",
"title": "Máximo de ventana de contexto",
4 changes: 4 additions & 0 deletions locales/fa-IR/modelProvider.json
@@ -229,6 +229,10 @@
"title": "شناسه مدل"
},
"modalTitle": "پیکربندی مدل سفارشی",
"reasoning": {
"extra": "این تنظیم فقط قابلیت تفکر عمیق مدل را فعال می‌کند و تأثیر دقیق آن کاملاً به خود مدل بستگی دارد، لطفاً خودتان آزمایش کنید که آیا این مدل قابلیت تفکر عمیق قابل استفاده را دارد یا خیر",
"title": "پشتیبانی از تفکر عمیق"
},
"tokens": {
"extra": "حداکثر تعداد توکن‌های پشتیبانی شده توسط مدل را تنظیم کنید",
"title": "حداکثر پنجره زمینه",
4 changes: 4 additions & 0 deletions locales/fr-FR/modelProvider.json
@@ -229,6 +229,10 @@
"title": "ID du modèle"
},
"modalTitle": "Configuration du modèle personnalisé",
"reasoning": {
"extra": "Cette configuration activera uniquement la capacité de réflexion approfondie du modèle. Les résultats dépendent entièrement du modèle lui-même, veuillez tester si ce modèle possède une capacité de réflexion approfondie utilisable.",
"title": "Support de la réflexion approfondie"
},
"tokens": {
"extra": "Définir le nombre maximal de tokens pris en charge par le modèle",
"title": "Fenêtre de contexte maximale",
4 changes: 4 additions & 0 deletions locales/it-IT/modelProvider.json
@@ -229,6 +229,10 @@
"title": "ID del modello"
},
"modalTitle": "Configurazione modello personalizzato",
"reasoning": {
"extra": "Questa configurazione attiverà solo la capacità di pensiero profondo del modello; l'effetto specifico dipende interamente dal modello stesso. Si prega di testare autonomamente se il modello possiede una capacità di pensiero profondo utilizzabile.",
"title": "Supporto per il pensiero profondo"
},
"tokens": {
"extra": "Imposta il numero massimo di token supportati dal modello",
"title": "Finestra di contesto massima",
4 changes: 4 additions & 0 deletions locales/ja-JP/modelProvider.json
@@ -229,6 +229,10 @@
"title": "モデル ID"
},
"modalTitle": "カスタムモデル設定",
"reasoning": {
"extra": "この設定は、モデルの深い思考能力を有効にするだけです。具体的な効果はモデル自体に依存しますので、このモデルが利用可能な深い思考能力を持っているかどうかはご自身でテストしてください。",
"title": "深い思考をサポート"
},
"tokens": {
"extra": "モデルがサポートする最大トークン数を設定する",
"title": "最大コンテキストウィンドウ",
4 changes: 4 additions & 0 deletions locales/ko-KR/modelProvider.json
@@ -229,6 +229,10 @@
"title": "모델 ID"
},
"modalTitle": "사용자 정의 모델 구성",
"reasoning": {
"extra": "이 설정은 모델의 심층 사고 능력만을 활성화합니다. 구체적인 효과는 모델 자체에 따라 다르므로, 해당 모델이 사용 가능한 심층 사고 능력을 갖추고 있는지 직접 테스트해 보시기 바랍니다.",
"title": "심층 사고 지원"
},
"tokens": {
"extra": "모델이 지원하는 최대 토큰 수 설정",
"title": "최대 컨텍스트 창",
4 changes: 4 additions & 0 deletions locales/nl-NL/modelProvider.json
@@ -229,6 +229,10 @@
"title": "Model ID"
},
"modalTitle": "Configuratie van aangepast model",
"reasoning": {
"extra": "Deze configuratie schakelt alleen de mogelijkheid voor diepgaand denken van het model in. Het specifieke effect hangt volledig af van het model zelf, test zelf of dit model in staat is tot bruikbaar diepgaand denken.",
"title": "Ondersteuning voor diepgaand denken"
},
"tokens": {
"extra": "Stel het maximale aantal tokens in dat door het model wordt ondersteund",
"title": "Maximale contextvenster",
4 changes: 4 additions & 0 deletions locales/pl-PL/modelProvider.json
@@ -229,6 +229,10 @@
"title": "ID modelu"
},
"modalTitle": "Konfiguracja niestandardowego modelu",
"reasoning": {
"extra": "Ta konfiguracja włączy jedynie zdolność modelu do głębokiego myślenia, a konkretne efekty w pełni zależą od samego modelu. Proszę samodzielnie przetestować, czy model ma zdolność do głębokiego myślenia.",
"title": "Wsparcie dla głębokiego myślenia"
},
"tokens": {
"extra": "Ustaw maksymalną liczbę tokenów wspieranych przez model",
"title": "Maksymalne okno kontekstu",
4 changes: 4 additions & 0 deletions locales/pt-BR/modelProvider.json
@@ -229,6 +229,10 @@
"title": "ID do Modelo"
},
"modalTitle": "Configuração do Modelo Personalizado",
"reasoning": {
"extra": "Esta configuração ativará apenas a capacidade de pensamento profundo do modelo, e o efeito específico depende totalmente do próprio modelo. Por favor, teste se este modelo possui a capacidade de pensamento profundo utilizável.",
"title": "Suporte a Pensamento Profundo"
},
"tokens": {
"extra": "Configurar o número máximo de tokens suportados pelo modelo",
"title": "Janela de contexto máxima",
4 changes: 4 additions & 0 deletions locales/ru-RU/modelProvider.json
@@ -229,6 +229,10 @@
"title": "ID модели"
},
"modalTitle": "Настройка пользовательской модели",
"reasoning": {
"extra": "Эта настройка активирует возможность глубокого мышления модели, конкретный эффект полностью зависит от самой модели, пожалуйста, протестируйте, обладает ли модель доступной способностью к глубокому мышлению",
"title": "Поддержка глубокого мышления"
},
"tokens": {
"extra": "Установите максимальное количество токенов, поддерживаемое моделью",
"title": "Максимальное окно контекста",
4 changes: 4 additions & 0 deletions locales/tr-TR/modelProvider.json
@@ -229,6 +229,10 @@
"title": "Model ID"
},
"modalTitle": "Özel Model Yapılandırması",
"reasoning": {
"extra": "Bu yapılandırma yalnızca modelin derin düşünme yeteneğini açacaktır, belirli etkiler tamamen modelin kendisine bağlıdır, lütfen bu modelin kullanılabilir derin düşünme yeteneğine sahip olup olmadığını kendiniz test edin",
"title": "Derin düşünmeyi destekler"
},
"tokens": {
"extra": "Modelin desteklediği maksimum Token sayısını ayarlayın",
"title": "Maksimum bağlam penceresi",
4 changes: 4 additions & 0 deletions locales/vi-VN/modelProvider.json
@@ -229,6 +229,10 @@
"title": "ID mô hình"
},
"modalTitle": "Cấu hình mô hình tùy chỉnh",
"reasoning": {
"extra": "Cấu hình này sẽ chỉ kích hoạt khả năng suy nghĩ sâu của mô hình, hiệu quả cụ thể hoàn toàn phụ thuộc vào chính mô hình, vui lòng tự kiểm tra xem mô hình này có khả năng suy nghĩ sâu có thể sử dụng hay không",
"title": "Hỗ trợ suy nghĩ sâu"
},
"tokens": {
"extra": "Cài đặt số Token tối đa mà mô hình hỗ trợ",
"title": "Cửa sổ ngữ cảnh tối đa",
4 changes: 4 additions & 0 deletions locales/zh-CN/modelProvider.json
@@ -229,6 +229,10 @@
"title": "模型 ID"
},
"modalTitle": "自定义模型配置",
"reasoning": {
"extra": "此配置将仅开启模型深度思考的能力,具体效果完全取决于模型本身,请自行测试该模型是否具备可用的深度思考能力",
"title": "支持深度思考"
},
"tokens": {
"extra": "设置模型支持的最大 Token 数",
"title": "最大上下文窗口",