Bug: "WARNING: [EchoAgent] Error: 'choices' " with local installation of Ollama for LLMs #111


Closed
ambki opened this issue Mar 9, 2025 · 2 comments

Comments

@ambki

ambki commented Mar 9, 2025

I've made a clean default installation of AgentForge (via pip), with a default Ollama instance running in Docker, inside a miniconda environment (Python 3.11). The operating system is:

Release Ubuntu 24.04.2 LTS (Noble Numbat) 64-bit
Kernel Linux 6.8.0-55-generic x86_64
MATE 1.26.2
No GPUs.

After configuring the EchoAgent as described in the AgentForge docs and running the command "$ python run_agent.py", I get the following error:

WARNING: [EchoAgent] Error: 'choices'. Retrying in 2 seconds...
WARNING: [EchoAgent] Error: 'choices'. Retrying in 4 seconds...
WARNING: [EchoAgent] Error: 'choices'. Retrying in 8 seconds...

It doesn't stop unless I kill the process.

I've been investigating a bit, and the problem is that AgentForge expects a 'choices' field in the JSON response (see /home/{USER}/miniconda3/envs/agentforge/lib/python3.11/site-packages/agentforge/apis/ollama_api.py), while my Ollama instance returns no 'choices' field. This is my output from Ollama:

$ curl -X POST http://localhost:11434/api/generate -d '{"model": "llama3.1:latest", "prompt": "You are EchoAgent, an AI that repeats what the user says. User input: Hello, world!", "stream": false}'

{"model":"llama3.1:latest","created_at":"2025-03-09T13:34:10.232821069Z","response":"Hello, world!","done":true,"done_reason":"stop","context":[128006,882,128007,271,2675,527,38906,17230,11,459,15592,430,44694,1148,279,1217,2795,13,2724,1988,25,22691,11,1917,0,128009,128006,78191,128007,271,9906,11,1917,0],"total_duration":4018373987,"load_duration":31050538,"prompt_eval_count"

Suggested solution

I've modified ollama_api.py (path: /home/{USER}/miniconda3/envs/agentforge/lib/python3.11/site-packages/agentforge/apis) as shown below so it handles this response correctly, without removing the script's current capabilities (i.e., it still reads the 'choices' field from a local Ollama response when present):

import requests
import json
from .base_api import BaseModel


class Ollama(BaseModel):

    @staticmethod
    def _prepare_prompt(model_prompt):
        return model_prompt

    def _do_api_call(self, prompt, **filtered_params):
        url = filtered_params.pop('host_url', 'http://localhost:11434/api/generate')
        headers = {'Content-Type': 'application/json'}
        data = {
            "model": self.model_name,
            "system": prompt.get('system'),
            "prompt": prompt.get('user'),
            **filtered_params
        }

        response = requests.post(url, headers=headers, json=data)

        if response.status_code != 200:
            self.logger.log(f"Request error: {response}", 'error')
            return None

        return response.json()

    # Original _process_response (the last two lines of the stock script), kept commented out:
    # def _process_response(self, raw_response):
    #     return raw_response['choices'][0]['message']['content']

    # Generalized _process_response so ollama_api.py handles the native Ollama endpoints too
    def _process_response(self, raw_response):
        # Handle the different Ollama endpoint responses
        if 'response' in raw_response:  # /api/generate
            return raw_response['response']
        elif 'message' in raw_response:  # /api/chat
            return raw_response['message']['content']
        else:  # OpenAI-style 'choices' fallback (current behavior)
            return raw_response['choices'][0]['message']['content']
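
To sanity-check the branching, here is a minimal standalone sketch; the process helper and the sample payloads are mine for illustration and are not part of AgentForge:

# Hypothetical standalone check of the branching logic in _process_response.
def process(raw_response):
    if 'response' in raw_response:      # /api/generate
        return raw_response['response']
    elif 'message' in raw_response:     # /api/chat
        return raw_response['message']['content']
    else:                               # OpenAI-style fallback
        return raw_response['choices'][0]['message']['content']

# Sample payloads (illustrative shapes, not captured output):
generate_resp = {"model": "llama3.1:latest", "response": "Hello, world!", "done": True}
chat_resp = {"model": "llama3.1:latest", "message": {"role": "assistant", "content": "Hello, world!"}, "done": True}
openai_resp = {"choices": [{"message": {"role": "assistant", "content": "Hello, world!"}}]}

for r in (generate_resp, chat_resp, openai_resp):
    print(process(r))  # each prints: Hello, world!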

I hope this is clear enough; otherwise I'm happy to elaborate further.

@anselale
Collaborator

anselale commented Jun 1, 2025

Hey, thanks a ton for the detailed report and for digging into the root cause! AgentForge was indeed expecting an OpenAI-style choices field.

We’ve just updated the Ollama integration in our dev branch to handle both formats (as well as /api/chat), so it’ll work out of the box with your setup and other Ollama endpoints. Your suggested solution was super helpful, thanks for sharing it!

This fix will be included in the next release. If you run into any more issues or have other suggestions, please let us know. Really appreciate your help making AgentForge better!

@ambki
Author

ambki commented Jun 2, 2025

No worries, happy to have provided a suitable solution. Keep up the good work!
