Frontend still reaches out to database.build with own LLM #157
Comments
Hey @tino, thanks for the issue. Based on the path in those 2 requests, I think those are related to the deployments feature, which is separate from LLM API calls (it's checking to see if you have previously made any deployments so it can update the UI accordingly). Deployments should only work when logged in, so you could try logging out to prevent those requests from completing.
Logged in to Supabase? I'm not. And I'm not logged in to database.build either, as I've set up the "Bring your own LLM" option. Or did you mean something else?
I was referring to logging in with database.build. Dug a bit deeper and realized we might still be making requests to the deployment endpoint (to get the status of previous deployments, if any) even when logged out, which is a bug. We should add a check to make sure we're logged in first. PRs welcome!
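For anyone picking this up, the fix is essentially a guard in front of the deployment-status fetch. Here's a minimal sketch of the idea, assuming a hypothetical `/api/deployments` route and `session` object (the actual route and helpers in the repo will differ):

```js
// Hypothetical sketch: only query the deployment-status endpoint when a
// session exists. '/api/deployments' and the session shape are placeholders.
async function getDeployments(session) {
  // Logged-out (e.g. BYO-LLM) sessions have no deployments, so skip the
  // request to database.build entirely.
  if (!session) {
    return [];
  }

  const response = await fetch('/api/deployments', { credentials: 'include' });
  if (!response.ok) {
    throw new Error(`Failed to load deployments: ${response.status}`);
  }
  return response.json();
}
```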
Hey @ArjixWasTaken, you're the first to report requests to that route. To be clear, you'll still see a request to that route in the network logs, but in BYO-LLM mode it's being intercepted by the service worker and rerouted directly to OpenAI (see the Size column in the above screenshot).
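For context, the interception works roughly like the sketch below. This is only an illustration of the pattern, not database.build's actual worker code; the `/api/chat` path, provider URL, and key handling are all placeholders:

```js
// Hypothetical service worker sketch: catch requests aimed at the app's own
// LLM route and re-issue them directly against the user's provider, so only
// the rerouted request ever leaves the browser.
const PROVIDER_URL = 'https://api.openai.com/v1/chat/completions';
const PROVIDER_API_KEY = 'sk-...'; // user-supplied key in BYO-LLM mode

self.addEventListener('fetch', (event) => {
  const url = new URL(event.request.url);

  // '/api/chat' is a stand-in for the app's LLM route.
  if (url.pathname !== '/api/chat') return;

  event.respondWith(
    (async () => {
      // The original URL still shows up in the network tab, but the body is
      // sent to the provider instead of database.build's servers.
      const body = await event.request.clone().arrayBuffer();
      return fetch(PROVIDER_URL, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          Authorization: `Bearer ${PROVIDER_API_KEY}`,
        },
        body,
      });
    })()
  );
});
```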
My bad, after looking at the ollama logs, it does indeed intercept the request and contact my local ollama server. I'll try to debug it myself then, since it's probably a user error on my side.

Edit: I had to write a reverse proxy to see what ollama responds with (couldn't find another way), and it looks like the problem is with the model I am using (…).

For reference I'll leave the proxy code here, in case anyone in the same situation wants to debug what ollama responds with :)

```js
// pnpm add http-proxy
import httpProxy from 'http-proxy';

// Forward everything on port 11435 to the local ollama server on 11434.
const proxy = httpProxy.createProxyServer({
  target: 'http://localhost:11434',
}).listen(11435);

// Buffer each proxied response and print the full body once it has ended.
proxy.on('proxyRes', async (res) => {
  const { promise, resolve } = Promise.withResolvers();

  const chunks = [];
  res.on('data', (chunk) => chunks.push(chunk));
  res.on('end', resolve);

  await promise;

  const buffer = Buffer.concat(chunks);
  console.log(buffer.toString('utf-8'));
});
```

@gregnr
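A quick usage note on the proxy above: it listens on port 11435 and forwards to ollama's default 11434, so (assuming the BYO-LLM settings let you change the base URL) pointing it at http://localhost:11435 instead of 11434 routes the traffic through the proxy and logs every response. `Promise.withResolvers()` also needs a fairly recent Node release (v22 or newer, I believe).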
Bug report
Describe the bug
From the readme, I expected that if you "Bring your own LLM", everything would reside in the browser, and the only outbound conversations would be with the LLM provider directly.
To Reproduce
Steps to reproduce the behavior; please provide code snippets or a repository:
Expected behavior
No requests to database.build beyond loading the HTML and static files.
Screenshots
System information