
Frontend still reaches out to database.build with own LLM #157


Open
tino opened this issue Jan 31, 2025 · 6 comments
Labels
bug Something isn't working

Comments

@tino

tino commented Jan 31, 2025

Bug report

Describe the bug

The readme led me to expect that with "Bring your own LLM", everything would run in the browser, and the only network conversations would be directly with the LLM provider.

To Reproduce

Steps to reproduce the behavior, please provide code snippets or a repository:

  1. Set up with your own OpenAI LLM key
  2. Enter a message
  3. See database.build/db/... requests in browser debugger that are not handled by the service worker (and aren't static)

Expected behavior

No requests to database.build beyond loading the html and static files.

Screenshots

[screenshot]

System information

  • OS: macOS
  • Browser (if applies): Arc (Chrome)
@tino tino added the bug Something isn't working label Jan 31, 2025
@gregnr
Collaborator

gregnr commented Feb 18, 2025

Hey @tino, thanks for the issue. Based on the path in those 2 requests, I think those are related to the deployments feature, which is separate from LLM API calls (it's checking to see if you have previously made any deployments so it can update the UI accordingly).

Deployments should only work when logged in, so you could try logging out to prevent those requests from completing.

@tino
Author

tino commented Mar 10, 2025

Logged in to Supabase? I'm not. And I'm not logged in to database.build either, since I've set up "Bring your own LLM". Or did you mean something else?

@gregnr
Collaborator

gregnr commented Mar 20, 2025

I was referring to logging in to database.build. I dug a bit deeper and realized we might still be making requests to the deployment endpoint (to get the status of previous deployments, if any) even when logged out, which is a bug. We should add a check to make sure we're logged in first. PRs welcome!
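A minimal sketch of the guard described above, using hypothetical names (`getSession` and `fetchDeploymentStatus` are illustrative, not the actual database.build internals): when there is no session, no network request is made at all.

```javascript
// Hypothetical guard: only hit the deployments endpoint when logged in.
// `getSession` resolves to a session object (or null when logged out);
// `fetchDeploymentStatus` performs the request to the deployments endpoint.
async function checkDeployments(getSession, fetchDeploymentStatus) {
  const session = await getSession();
  if (!session) {
    // Logged out (e.g. pure BYO-LLM usage): skip the request entirely.
    return null;
  }
  return fetchDeploymentStatus(session);
}
```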

@ArjixWasTaken

This comment has been minimized.

@gregnr
Collaborator

gregnr commented Apr 13, 2025

Hey @ArjixWasTaken, you're the first to report requests to /api/chat in BYO-LLM mode. Can you confirm this is what you meant?

To be clear, you'll still see a request to that route in the network logs, but in BYO-LLM mode it's being intercepted by the service worker and rerouted directly to OpenAI (see the Size column in the above screenshot).
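A rough sketch of that routing decision, under the assumption that only /api/chat is rerouted in BYO-LLM mode; the OpenAI URL and the function name are illustrative, not the actual database.build service worker code:

```javascript
// Illustrative routing decision for the service worker's fetch handler.
const OPENAI_CHAT_URL = 'https://api.openai.com/v1/chat/completions';

function rerouteTarget(requestUrl, byoLlmEnabled) {
  const url = new URL(requestUrl);
  if (byoLlmEnabled && url.pathname === '/api/chat') {
    // Intercepted: DevTools still shows the /api/chat entry, but the
    // request is sent to the user's own provider instead.
    return OPENAI_CHAT_URL;
  }
  // Everything else (static assets, other API routes) goes to
  // database.build as usual.
  return requestUrl;
}
```

Inside the service worker, a fetch handler would then call something like `fetch(rerouteTarget(event.request.url, true), ...)` within `event.respondWith`.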

@ArjixWasTaken

ArjixWasTaken commented Apr 13, 2025

My bad; after looking at the Ollama logs, the service worker does indeed intercept the request and contact my local Ollama server.
The request just comes back with a 400/404 status code, so it fails.

I'll try to debug it myself then, since it's probably a user error on my side.

Edit: I had to write a reverse proxy to see what Ollama responds with (I couldn't find another way), and it turns out the model I'm using (gemma3:4b) does not support tools!

For reference, I'll leave the proxy code here in case anyone in the same situation wants to debug what Ollama responds with :)

// pnpm add http-proxy
import httpProxy from 'http-proxy';

// Forward everything on :11435 to the local Ollama server on :11434.
// Point the app's Ollama base URL at http://localhost:11435 to capture traffic.
const proxy = httpProxy
    .createProxyServer({ target: 'http://localhost:11434' })
    .listen(11435);

// Log each upstream response body as it passes through.
proxy.on('proxyRes', async (proxyRes) => {
    const { promise, resolve } = Promise.withResolvers(); // Node 22+

    const chunks = [];
    proxyRes.on('data', (chunk) => chunks.push(chunk));
    proxyRes.on('end', resolve);

    await promise;

    const buffer = Buffer.concat(chunks);
    console.log(buffer.toString('utf-8'));
});

@gregnr
On that note, it would be very helpful if the service worker responded with the upstream error body; then I'd immediately be able to tell what went wrong.
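What that could look like, as a minimal sketch with a hypothetical helper (not the actual database.build code): instead of failing silently, the service worker passes the provider's error body through so the page can display it (e.g. Ollama's "does not support tools" message).

```javascript
// Hypothetical service-worker helper: surface upstream errors to the page.
async function forwardWithErrors(upstreamResponse) {
  if (!upstreamResponse.ok) {
    const detail = await upstreamResponse.text();
    // Re-wrap the provider's error text with the original status code.
    return new Response(detail, {
      status: upstreamResponse.status,
      headers: { 'Content-Type': 'text/plain' },
    });
  }
  return upstreamResponse;
}
```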
