
Conversation

samelhousseini (Collaborator)

Description

New AI-powered Recommender of Recommenders

References

Using OpenAI Reasoning Models

Checklist:

  • [x] I have updated the documentation accordingly.
  • [x] This PR is being made to the staging branch AND NOT TO the main branch.


@loomlike (Collaborator) left a comment:

Thank you for your contribution! I've left a few comments; these changes are not mandatory but would be nice to have. Feel free to reply if you have any questions or concerns.

Below is an at-a-glance map of how deeply OpenAI's o3 reasoning model can help you with each algorithm in the list.


| Algorithm | Model expertise level\* |

How was this table derived?

from rich.console import Console
console = Console()

from tenacity import (

redundant import (line 12)

if model == "o4-mini":
    return tiktoken.get_encoding("o200k_base")
else:
    return tiktoken.get_encoding("o200k_base")

To handle the else case, I recommend using encoding_for_model, which will also raise a KeyError for you if the model name is not defined in the lookup table: https://github.com/openai/tiktoken/blob/4560a8896f5fb1d35c6f8fd6eee0399f9a1a27ca/tiktoken/model.py#L80

Also, if you want to keep the else, I suggest using elif for all conditions except the first one to make the chain logically correct. Alternatively, you can drop the else entirely, since each if branch already returns.
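For reference, a minimal sketch of that approach (assuming the installed tiktoken version knows the model name):

```python
import tiktoken

def get_encoding(model: str) -> tiktoken.Encoding:
    # encoding_for_model looks the name up in tiktoken's built-in model
    # table (including prefix matches) and raises KeyError for unknown
    # models, so no fallback else branch is needed.
    return tiktoken.encoding_for_model(model)

enc = get_encoding("gpt-4o")   # -> o200k_base
# get_encoding("not-a-model")  # -> raises KeyError
```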

# print(">>>>>>>>>>>>>>>>> call_llm model_info", model_info)

if (model_info.model_name == "gpt-4o") or ((model_info.model_name == "gpt-45")):
    return call_4(messages, model_info.client, model_info.model, temperature)

You may simplify this conditional logic by distinguishing between reasoning and non-reasoning models instead of listing all the models individually.

I would recommend using the Responses API instead of the Chat Completions API for this type of task, though.
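Something along these lines could work (a sketch; REASONING_MODELS is a hypothetical set, and call_o1/call_4 are the helpers already defined in this PR):

```python
# Hypothetical set of reasoning-model names; extend as needed.
REASONING_MODELS = {"o1", "o1-mini", "o3", "o3-mini", "o4-mini"}

def call_llm(messages, model_info, temperature=0.2):
    if model_info.model_name in REASONING_MODELS:
        # Reasoning models take a reasoning_effort knob, not temperature.
        return call_o1(messages, model_info.client, model_info.model)
    # Every non-reasoning chat model goes through the same call path.
    return call_4(messages, model_info.client, model_info.model, temperature)
```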

@retry(wait=wait_random_exponential(min=1, max=30), stop=stop_after_attempt(10))
def call_o1(messages, client, model, reasoning_effort="medium"):
    # print(f"\nCalling OpenAI APIs with {len(messages)} messages - Model: {model} - Endpoint: {client._base_url}\n")
    response = client.chat.completions.create(model=model, messages=messages, reasoning_effort=reasoning_effort)

Please consider using the Responses API if possible, for both reasoning and non-reasoning models.
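For illustration, a minimal Responses API call that covers both kinds of models (a sketch, not this PR's code; the reasoning parameter is simply omitted for non-reasoning models):

```python
from openai import OpenAI

client = OpenAI()

def call_model(messages, model, reasoning_effort=None):
    # The Responses API accepts chat-style message lists as `input`.
    kwargs = {"model": model, "input": messages}
    if reasoning_effort is not None:
        # Only reasoning models accept the `reasoning` parameter.
        kwargs["reasoning"] = {"effort": reasoning_effort}
    response = client.responses.create(**kwargs)
    return response.output_text
```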

openai_embedding_model_info = {
    "KEY": os.getenv('OPENAI_API_KEY'),
    "MODEL": os.getenv('OPENAI_MODEL_EMBEDDING'),
    "DIMS": 3072 if os.getenv('AZURE_OPENAI_MODEL_EMBEDDING') == "text-embedding-3-large" else 1536

Which models are covered by the else case here? Can we be certain that all their dimensions are 1536?
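One way to make this explicit is a lookup that fails loudly for unverified models (the dimensions below are the published ones for OpenAI's current embedding models):

```python
import os

# Published output dimensions for OpenAI embedding models.
EMBEDDING_DIMS = {
    "text-embedding-3-large": 3072,
    "text-embedding-3-small": 1536,
    "text-embedding-ada-002": 1536,
}

model = os.getenv("OPENAI_MODEL_EMBEDDING")
dims = EMBEDDING_DIMS[model]  # raises KeyError instead of silently assuming 1536
```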

prompt_path = os.path.join(prompts_directory, prompt_name)
if os.path.exists(prompt_path): return prompt_path

return os.path.join(module_directory, 'prompts', prompt_name)

Will this filepath always exist? If not, raising a ValueError or FileNotFoundError seems the right choice.
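For example, a fragment of the function above, reusing its variable names:

```python
fallback_path = os.path.join(module_directory, 'prompts', prompt_name)
if not os.path.exists(fallback_path):
    # Fail loudly rather than returning a path that may not exist.
    raise FileNotFoundError(
        f"Prompt '{prompt_name}' not found in '{prompts_directory}' "
        f"or in the module's prompts directory"
    )
return fallback_path
```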

    img.save(new_image_path, 'JPEG')
    return new_image_path
else:
    return None

Maybe raise a ValueError, or at least warn users that the file is not a PNG?
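Something like the following (the function and variable names here are hypothetical, mirroring the snippet above):

```python
import warnings
from PIL import Image

def convert_png_to_jpeg(image_path):
    # Hypothetical helper; warns on non-PNG input instead of
    # silently returning None.
    if not image_path.lower().endswith(".png"):
        warnings.warn(f"'{image_path}' is not a PNG; skipping conversion.")
        return None
    new_image_path = image_path[:-4] + ".jpg"
    img = Image.open(image_path).convert("RGB")  # JPEG has no alpha channel
    img.save(new_image_path, "JPEG")
    return new_image_path
```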




from pathlib import Path

Could you move this import statement to the top of the file?



import uuid
import hashlib

Same here; maybe move all imports to the top of the file.

@setuc self-assigned this on Jun 6, 2025
@setuc added the enhancement (New feature or request) label on Jun 6, 2025