
Add support for Gemini models #75

Open
harshpare12 wants to merge 2 commits into main

Conversation

harshpare12

Commits (2):
  • Add support for Gemini-1.5-flash and Gemini-1.5-pro. (Author: Harsh Pare)
  • Add support for Gemini-1.5-flash and Gemini-1.5-pro. (Author: Harsh Pare)
Review thread on launch_scientist.py (Outdated)

@conglu1997 (Collaborator) left a comment:

Thank you! Would you also be able to add a note in the README describing what keys are needed?

Do these work out of the box with the Aider API?
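The README note about keys might reduce to a snippet like the following. This is a sketch only: the environment-variable name `GEMINI_API_KEY` and the helper `get_gemini_key` are illustrative assumptions, not part of this PR.

```python
import os

def get_gemini_key(env=os.environ):
    """Fetch the Gemini API key, failing loudly with a helpful message.

    Assumes the key lives in GEMINI_API_KEY (name is illustrative; the
    actual variable would be documented in the README).
    """
    key = env.get("GEMINI_API_KEY")
    if key is None:
        raise RuntimeError(
            "Set GEMINI_API_KEY before running launch_scientist.py"
        )
    return key
```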

@@ -199,6 +214,17 @@ def get_response_from_llm(
        )
        content = response.choices[0].message.content
        new_msg_history = new_msg_history + [{"role": "assistant", "content": content}]
    elif "gemini" in model:
        new_msg_history = msg_history + [{"role": "user", "parts": msg}]
        response = client.generate_content(
@conglu1997 (Collaborator):

No system message!

@harshpare12 (Author):

Yes, it looks like Gemini doesn't currently support system messages. This is what I found online:
Discussion1, Discussion2

@harshpare12 (Author):

Hi @conglu1997, are these changes good to merge?

@conglu1997 (Collaborator) commented Aug 29, 2024:

I think we will need a solution for the system messages; the code won't work without them.

@conglu1997 (Collaborator):

We made a lot of changes that make adding models much easier; could you move these changes to llm.py? I think we can solve the system message problem by adding an extra user message at the front!
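The workaround suggested here can be sketched as follows: since Gemini has no system role, fold the system prompt into a leading user turn, plus a short model acknowledgement so turns still alternate. This is a sketch under that assumption; `inject_system_prompt` and the acknowledgement text are illustrative, not part of the repository.

```python
def inject_system_prompt(system_prompt, msg_history):
    """Prepend the system prompt as a user turn (Gemini has no system role).

    A brief "model" acknowledgement keeps the history alternating
    user/model, which the Gemini chat API expects.
    """
    preamble = [
        {"role": "user", "parts": [system_prompt]},
        {"role": "model", "parts": ["Understood."]},
    ]
    return preamble + msg_history

history = inject_system_prompt(
    "You are a helpful research assistant.",
    [{"role": "user", "parts": ["Summarize the paper."]}],
)
# history[0] now carries the system prompt as an ordinary user turn.
```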

@harshpare12 (Author):

Hi @conglu1997, sure. I'll go over the changes made in the repository and update this PR accordingly.
