
feat: Memory Integrations (chat history) #209

Merged
3 commits merged into tmc:main on Jul 23, 2023

Conversation

nidzola
Contributor

@nidzola commented Jul 21, 2023

PR Checklist

  • Read the Contributing documentation.
  • Read the Code of conduct documentation.
  • Name your Pull Request title clearly, concisely, and prefixed with the name of the primarily affected package you changed according to Good commit messages (such as memory: add interfaces for X, Y or util: add whizzbang helpers).
  • Check that there isn't already a PR that solves the problem the same way to avoid creating a duplicate.
  • Provide a description in this PR that addresses what the PR is solving, or reference the issue that it solves (e.g. Fixes #123).
  • Describes the source of new concepts.
  • References existing implementations as appropriate.
  • Contains test coverage for new functions.
  • Passes all golangci-lint checks.

Description

  • Adds the ability to plug in any memory integration for the buffer.
  • With this, we can pass in any store we'd like to use to keep track of the chat history.
    [Screenshot: 2023-07-21 at 6:00:32 PM]
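The "pass in any store" idea in the description can be sketched with Go's functional-options pattern. This is a hedged, self-contained sketch, not the actual langchaingo API: the names `ChatMessageHistory`, `InMemoryHistory`, `WithChatHistory`, and `NewBuffer` are illustrative stand-ins for whatever the PR adds.

```go
package main

import "fmt"

// ChatMessageHistory is the kind of store interface the PR describes:
// any backend (in-memory, Postgres, Redis, ...) can record the chat
// history as long as it implements these methods. The exact method
// names in langchaingo may differ; this is a sketch.
type ChatMessageHistory interface {
	AddUserMessage(text string)
	AddAIMessage(text string)
	Messages() []string
}

// InMemoryHistory is a trivial store used for illustration.
type InMemoryHistory struct {
	messages []string
}

func (h *InMemoryHistory) AddUserMessage(text string) {
	h.messages = append(h.messages, "human: "+text)
}

func (h *InMemoryHistory) AddAIMessage(text string) {
	h.messages = append(h.messages, "ai: "+text)
}

func (h *InMemoryHistory) Messages() []string { return h.messages }

// Buffer keeps a reference to whatever history store is passed in.
type Buffer struct {
	History ChatMessageHistory
}

// Option configures a Buffer, mirroring the functional-option style
// this PR introduces for the memory buffer.
type Option func(*Buffer)

// WithChatHistory swaps the default store for a caller-provided one.
func WithChatHistory(h ChatMessageHistory) Option {
	return func(b *Buffer) { b.History = h }
}

// NewBuffer falls back to an in-memory store when no option is given.
func NewBuffer(opts ...Option) *Buffer {
	b := &Buffer{History: &InMemoryHistory{}}
	for _, opt := range opts {
		opt(b)
	}
	return b
}

func main() {
	store := &InMemoryHistory{} // could be a Postgres-backed store instead
	buf := NewBuffer(WithChatHistory(store))
	buf.History.AddUserMessage("hello")
	buf.History.AddAIMessage("hi there")
	fmt.Println(len(store.Messages())) // 2
}
```

A Postgres-backed store (as discussed below in the thread) would simply implement the same interface and be passed to the same option.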

@Struki84
Contributor

Very nice! As I understand it, this will allow me to implement a Postgres-based chat message history, which I attempted in #186.

@zivkovicn
Contributor

zivkovicn commented Jul 21, 2023

> Very nice! As I understand it, this will allow me to implement a Postgres-based chat message history, which I attempted in #186.

Exactly! 😄 You just define the store and pass it down to the buffer; it could be anything (psql or any other store).

@Struki84
Contributor

@tmc Can we add option.WithTokenLimit() or something of the kind here? I know the Python library has a TokenBuffer class to flush chat messages that overflow the token limit for a model. Not sure if we need an entirely new buffer for that, so we might consider it... cc @FluffyKebab

related to #125

@nidzola
Contributor Author

nidzola commented Jul 21, 2023

> @tmc Can we add option.WithTokenLimit() or something of the kind here? I know the Python library has a TokenBuffer class to flush chat messages that overflow the token limit for a model. Not sure if we need an entirely new buffer for that, so we might consider it... cc @FluffyKebab
>
> related to #125

I'll add the token limit when I find the time, like they have in the Python library; maybe we can do it as a separate PR.
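The token-limit behavior discussed above (flushing the oldest messages once the history overflows a model's token budget) can be sketched as follows. This is an illustrative stand-in, not langchaingo code: a real TokenBuffer would count tokens with the model's tokenizer, whereas this sketch approximates tokens with whitespace-separated words.

```go
package main

import (
	"fmt"
	"strings"
)

// pruneToTokenLimit drops the oldest messages until the rough token
// count of the remaining history fits under limit. Counting fields
// (words) is a stand-in for a real tokenizer.
func pruneToTokenLimit(messages []string, limit int) []string {
	count := func(msgs []string) int {
		n := 0
		for _, m := range msgs {
			n += len(strings.Fields(m))
		}
		return n
	}
	for len(messages) > 0 && count(messages) > limit {
		messages = messages[1:] // flush the oldest message first
	}
	return messages
}

func main() {
	history := []string{
		"human: hello there",
		"ai: hi how can I help",
		"human: summarize this document",
	}
	// With a budget of 10 "tokens", the oldest message is flushed
	// and the two most recent messages remain.
	fmt.Println(pruneToTokenLimit(history, 10))
}
```

This is the behavior the Python ConversationTokenBufferMemory provides; the actual Go implementation landed later in #217.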

Owner

@tmc left a comment


Could you add or augment an example showing use?

@tmc merged commit a698f9b into tmc:main on Jul 23, 2023
3 checks passed
@nidzola
Contributor Author

nidzola commented Jul 23, 2023

> Could you add or augment an example showing use?

TestBufferMemoryWithChatHistoryOption mimics the store (which could be anything) and serves as a usage example. Do you think we need something more?

@nidzola deleted the feat-adding-memory-buffer-options branch on July 23, 2023, 19:27
@nidzola
Contributor Author

nidzola commented Jul 24, 2023

> @tmc Can we add option.WithTokenLimit() or something of the kind here? I know the Python library has a TokenBuffer class to flush chat messages that overflow the token limit for a model. Not sure if we need an entirely new buffer for that, so we might consider it... cc @FluffyKebab
>
> related to #125

@tmc @Struki84 Added the token limit logic in #217.

@Struki84 mentioned this pull request Jul 28, 2023