What to do about ollama 403? #174
Comments
Just add an environment variable when starting ollama.
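A minimal sketch of that suggestion, assuming the variable meant here is OLLAMA_ORIGINS (it appears in the server config log further down); loosening it to allow all origins before starting the server:

# Linux/macOS shell
OLLAMA_ORIGINS="*" ollama serve

# Windows PowerShell (set it in the same session that runs the server)
$env:OLLAMA_ORIGINS = "*"
ollama serve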
Still getting 403 after adding the environment variable.
When you start it from the command line on your machine, use this command. Then, to check whether it actually works, open another console and run:

curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?"
}'

If that test passes, the server side is fine. There is also one option field in the kiss-translator extension settings that needs to be filled in.
The test always fails, always with the same message. The model itself is fine: connecting to /v1/chat/completions from various ChatGPT-compatible UIs and asking it to translate works every time. And the format I'm using is exactly the one from the KISS Ollama settings, just with {{from}}, {{to}} and {{text}} replaced by concrete values.
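For reference, a sketch of the kind of request described above, using ollama's OpenAI-compatible chat endpoint; the model name is taken from the thread, and the translation prompt wording is only an assumed example:

curl http://localhost:11434/v1/chat/completions -d '{
  "model": "llama3.2",
  "messages": [
    {"role": "user", "content": "Translate the following text from en to zh-CN: Why is the sky blue?"}
  ]
}'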
I saw another issue about this.
logs:
PS C:\Users\60461> ollama serve
2024/08/07 14:40:41 routes.go:1011: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST: OLLAMA_KEEP_ALIVE: OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_MODELS: OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR:D:\Software\scoop\apps\ollama_cderv\current\ollama_runners OLLAMA_TMPDIR:]"
time=2024-08-07T14:40:41.359+08:00 level=INFO source=images.go:740 msg="total blobs: 5"
time=2024-08-07T14:40:41.360+08:00 level=INFO source=images.go:747 msg="total unused blobs removed: 0"
time=2024-08-07T14:40:41.360+08:00 level=INFO source=routes.go:1057 msg="Listening on 127.0.0.1:11434 (version 0.1.42)"
time=2024-08-07T14:40:41.361+08:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu_avx cpu_avx2 cuda_v11.3 rocm_v5.7 cpu]"
time=2024-08-07T14:40:41.532+08:00 level=INFO source=types.go:71 msg="inference compute" id=GPU-5392dfca-71f5-39f0-f508-c97cd317dd59 library=cuda compute=8.9 driver=12.4 name="NVIDIA GeForce RTX 4060 Ti" total="16.0 GiB" available="14.9 GiB"
[GIN] 2024/08/07 - 14:40:50 | 403 | 0s | 127.0.0.1 | POST "/api/generate"
[GIN] 2024/08/07 - 14:40:51 | 403 | 0s | 127.0.0.1 | POST "/api/generate"
[GIN] 2024/08/07 - 14:40:52 | 403 | 0s | 127.0.0.1 | POST "/api/generate"