
Consider using lmnr.ai and their observe() method to see if I can trace input/output from method calls. #325

Open
burtonator opened this issue Jan 18, 2025 · 3 comments

Comments

@burtonator

Better than building this by hand PLUS I get the function call duration.
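To make the idea concrete, here is a minimal stdlib-only sketch of what an observe()-style wrapper does: it captures a function call's inputs, output, and wall-clock duration as a span. This is a hypothetical stand-in for illustration, not the actual lmnr.ai API (which ships spans to a tracing backend rather than an in-memory list).

```python
import functools
import time

def observe(fn):
    """Hypothetical stand-in for an observe()-style decorator:
    records each call's inputs, output, and duration in milliseconds."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        duration_ms = (time.perf_counter() - start) * 1000
        # A real tracer would export this span; here we just collect it.
        wrapper.spans.append({
            "name": fn.__name__,
            "input": (args, kwargs),
            "output": result,
            "duration_ms": duration_ms,
        })
        return result
    wrapper.spans = []
    return wrapper

@observe
def add(a, b):
    return a + b

add(2, 3)
```

After the call, `add.spans[0]` holds the traced name, input, output, and duration for inspection.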

https://www.lmnr.ai/

@dinmukhamedm
Member

@burtonator not sure you've created this in a correct repo :)
But very curious to learn about your use case

@burtonator
Author

@burtonator not sure you've created this in a correct repo :) But very curious to learn about your use case

Hilarious! You're right. Wrong repo. Sorry. I was just about to fall asleep and wanted to file a ticket for this!

Anyway, I wrote something like this internally for a workflow agent (not a fully autonomous agent).

I want input/output tracing and a performance UI, but the other 50% of this is that I need resume support: if the same job starts up again, it should resume from cache and continue executing.
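The resume-from-cache half can be sketched as a disk-backed memoization decorator: each step's result is checkpointed to disk under a key derived from the function name and arguments, so a restarted job skips completed steps. This is a hedged sketch of the idea, not any library's API; `CACHE_DIR` and `resumable` are hypothetical names.

```python
import functools
import hashlib
import json
import pickle
from pathlib import Path

CACHE_DIR = Path("./.job_cache")  # assumed checkpoint location, adjust to taste

def resumable(fn):
    """Hypothetical sketch: persist each step's result on disk so a
    restarted job resumes from cache instead of re-running the step."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        CACHE_DIR.mkdir(exist_ok=True)
        # Derive a stable cache key from the function name and arguments.
        key = hashlib.sha256(
            json.dumps([fn.__name__, args, kwargs],
                       sort_keys=True, default=str).encode()
        ).hexdigest()
        path = CACHE_DIR / f"{key}.pkl"
        if path.exists():  # step already completed in a prior run
            return pickle.loads(path.read_bytes())
        result = fn(*args, **kwargs)
        path.write_bytes(pickle.dumps(result))  # checkpoint for future runs
        return result
    return wrapper
```

A real implementation would also need cache invalidation (e.g. hashing the function's source) and would be where the cache-dependency graph mentioned below could hang off.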

It would also be nice to view cache dependencies as a graph, but that's for the future.

Most of the functions I want to observe are "slow," meaning 500-5000 ms.

Happy to discuss.

@skull8888888
Collaborator

@burtonator happy to help you get started with Laminar. It's actually extremely easy because we auto-instrument the majority of LLM providers and frameworks. Check out our getting-started guide here: https://docs.lmnr.ai/tracing/introduction. If you have any questions, ping me anytime.
