Added new model options for pplx sonar model #201
base: main
Conversation
Walkthrough

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant PromptExecution
    participant LLMService
    User->>PromptExecution: initiate prompt execution (with optional search_mode, search_context_size)
    PromptExecution->>LLMService: send payload (includes search_mode, web_search_options.search_context_size if set)
    LLMService-->>PromptExecution: return results
    PromptExecution-->>User: deliver results
```
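To make the "send payload" step concrete, a request carrying both optional parameters might look roughly like the dict below. This is an illustrative sketch only: the model identifier, message content, and any other request fields are placeholders, not taken from the PR.

```python
# Illustrative payload shape only; field placement mirrors the diagram above.
# Model name and message content are placeholders, not from the PR.
payload = {
    "model": "perplexity/sonar",  # placeholder model identifier
    "messages": [{"role": "user", "content": "example query"}],
    # Optional search parameters, included only when set on the step:
    "search_mode": "web",
    "web_search_options": {"search_context_size": "medium"},
}
print(payload)
```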
Possibly related PRs
Suggested reviewers
Poem
✨ Finishing Touches
Actionable comments posted: 1
♻️ Duplicate comments (1)
athina/steps/llm.py (1)
475-484: Same code duplication issue as in the execute_async method.

This is the third instance of identical search parameter logic. The refactoring suggestion from the execute_async method applies here as well.
🧹 Nitpick comments (1)
athina/steps/llm.py (1)
380-389: Code duplication detected - consider refactoring.

This logic is identical to the implementation in the execute method. Consider extracting this into a helper method to eliminate duplication:

```python
def _build_search_params(self):
    """Build search-related parameters for LLM service calls."""
    params = {}
    if self.search_mode:
        params["search_mode"] = self.search_mode
    web_search_options = {}
    if self.search_context_size:
        web_search_options["search_context_size"] = self.search_context_size
    if web_search_options:
        params["web_search_options"] = web_search_options
    return params
```

Then use **self._build_search_params() in all three methods.
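As a usage illustration of that suggestion, here is a minimal, self-contained sketch of how the helper's result could be splatted into a call site. The class name, constructor, and the way execute forwards its kwargs are assumptions for demonstration; only the two search attributes and the helper body come from the review.

```python
from typing import Any, Dict, Optional


class PromptExecutionSketch:
    """Stand-in for PromptExecution; only the search attributes are from the PR."""

    def __init__(self, search_mode: Optional[str] = None,
                 search_context_size: Optional[str] = None) -> None:
        self.search_mode = search_mode
        self.search_context_size = search_context_size

    def _build_search_params(self) -> Dict[str, Any]:
        """Build search-related parameters for LLM service calls."""
        params: Dict[str, Any] = {}
        if self.search_mode:
            params["search_mode"] = self.search_mode
        web_search_options: Dict[str, Any] = {}
        if self.search_context_size:
            web_search_options["search_context_size"] = self.search_context_size
        if web_search_options:
            params["web_search_options"] = web_search_options
        return params

    def execute(self, messages: list) -> Dict[str, Any]:
        # In the real execute/execute_async/stream methods these kwargs would be
        # forwarded to the LLM client call instead of being returned directly.
        return {"messages": messages, **self._build_search_params()}


step = PromptExecutionSketch(search_mode="web", search_context_size="high")
print(step.execute([{"role": "user", "content": "hello"}]))
```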
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
athina/steps/llm.py (4 hunks)
pyproject.toml (1 hunks)
🔇 Additional comments (2)
pyproject.toml (1)
3-3: LGTM: Appropriate version bump for new features.

The minor version increment correctly reflects the addition of new optional search functionality in the PromptExecution class.
athina/steps/llm.py (1)
284-293: Implementation looks correct with noted asymmetry.

The conditional parameter passing is well-implemented. Note the asymmetric structure where search_mode is top-level while search_context_size is nested under web_search_options - this may be intentional based on the API requirements.
```python
search_mode: Optional[str] = None
search_context_size: Optional[str] = None
```
🛠️ Refactor suggestion
Consider type correction and add documentation.
- search_context_size should likely be Optional[int] instead of Optional[str], since context size is typically a numeric value.
- Missing docstring documentation for the new attributes explaining their purpose and expected values.
```diff
- search_mode: Optional[str] = None
- search_context_size: Optional[str] = None
+ search_mode: Optional[str] = None
+ search_context_size: Optional[int] = None
```
Also consider adding docstring documentation:
"""
...existing docstring...
search_mode (Optional[str]): The search mode for web search functionality.
search_context_size (Optional[int]): The context size for search results.
"""
🤖 Prompt for AI Agents
In athina/steps/llm.py around lines 231 to 232, change the type of
search_context_size from Optional[str] to Optional[int] to correctly represent
it as a numeric value. Additionally, update the relevant docstring to include
descriptions for both search_mode and search_context_size, explaining their
purpose and expected types as Optional[str] and Optional[int] respectively.
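Putting the two suggestions together, the corrected declarations plus docstring entries might read as below. This is a sketch only: the class docstring summary line and the omission of other attributes are placeholders; only the two fields and their suggested types come from the review.

```python
from typing import Optional


class PromptExecution:  # placeholder shell; base classes and other attributes omitted
    """Executes a prompt against an LLM service.

    Attributes:
        search_mode (Optional[str]): The search mode for web search functionality.
        search_context_size (Optional[int]): The context size for search results.
    """

    search_mode: Optional[str] = None
    search_context_size: Optional[int] = None
```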
Summary by CodeRabbit
New Features
Chores