Is there a plan to support local LLM translation? #7
Answered by VirgilClyne
xuanlingzi asked this question in Q&A
Replies: 3 comments · 13 replies
- (Accepted answer; body not captured in this snapshot · 4 replies)
Answer selected by VirgilClyne
- You can use structured output to constrain the model's output format, and the Gemini Flash models are fast enough. On the web side I use Immersive Translate for YouTube videos and the experience is pretty good 😃 (0 replies)
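For anyone curious what that could look like in practice, here is a minimal sketch of calling a Gemini Flash model with the API's structured-output option (responseSchema) so the translated lines come back as parseable JSON. The model name, schema shape, and function are illustrative assumptions, not code from Translate.mjs:

```js
// Hypothetical sketch: ask a Gemini Flash model to translate subtitle lines and
// constrain the reply to a JSON array via the API's responseSchema option.
const GEMINI_KEY = process.env.GEMINI_API_KEY; // assumption: key supplied via environment
const MODEL = "gemini-2.0-flash";              // assumption: any Flash-tier model

export async function translateLines(lines, targetLang = "zh-Hans") {
  const body = {
    contents: [{
      parts: [{
        text: `Translate each subtitle line into ${targetLang}. Keep the order.\n` +
              lines.map((t, i) => `${i}\t${t}`).join("\n"),
      }],
    }],
    generationConfig: {
      responseMimeType: "application/json",
      // The schema pins the output to [{ index, text }, ...] so it parses reliably.
      responseSchema: {
        type: "ARRAY",
        items: {
          type: "OBJECT",
          properties: { index: { type: "INTEGER" }, text: { type: "STRING" } },
          required: ["index", "text"],
        },
      },
    },
  };

  const res = await fetch(
    `https://generativelanguage.googleapis.com/v1beta/models/${MODEL}:generateContent?key=${GEMINI_KEY}`,
    { method: "POST", headers: { "Content-Type": "application/json" }, body: JSON.stringify(body) },
  );
  if (!res.ok) throw new Error(`Gemini request failed: ${res.status}`);
  const data = await res.json();
  // With responseMimeType set to JSON, the text part should parse directly.
  return JSON.parse(data.candidates[0].content.parts[0].text);
}
```

Pinning the schema this way avoids having to strip prose or markdown out of the model's reply before parsing.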
- (Comment body not captured in this snapshot · 9 replies)
Original question from xuanlingzi:
If you run a reasonably fast model locally with ollama/lmstudio, translating through it would not be subject to API call limits. I see that Translate.mjs already includes call methods for baidu and youdao; is there a plan to develop something like this?
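As a rough illustration of the idea in the question, below is a minimal sketch of a translator backed by a local Ollama server, which would sidestep remote API quotas entirely. The endpoint and response shape are Ollama's standard /api/chat API; the function name, the model tag, and how it might sit alongside the existing baidu/youdao methods in Translate.mjs are assumptions, not the project's actual code:

```js
// Hypothetical sketch: translate a batch of subtitle lines through a model served
// locally by Ollama (default port 11434), so no remote API quota applies.
const OLLAMA_URL = "http://localhost:11434/api/chat"; // Ollama's standard chat endpoint
const LOCAL_MODEL = "qwen2.5:7b";                      // assumption: any model you have pulled locally

export async function translateWithOllama(lines, targetLang = "zh-Hans") {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: LOCAL_MODEL,
      stream: false, // return one complete response instead of a token stream
      messages: [
        {
          role: "system",
          content: `You are a subtitle translator. Translate each numbered line into ${targetLang} and reply with the same numbering, one line per input line.`,
        },
        { role: "user", content: lines.map((t, i) => `${i + 1}. ${t}`).join("\n") },
      ],
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  // Strip the "N. " prefixes back off to get an array aligned with the input lines.
  return data.message.content
    .split("\n")
    .filter(Boolean)
    .map((line) => line.replace(/^\d+\.\s*/, ""));
}
```

LM Studio could be wired up the same way by pointing the request at its OpenAI-compatible endpoint (http://localhost:1234/v1/chat/completions) instead.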