
[Bug]: Crawl returning 'str' object has no attribute 'choices #979


Open
jacobshenn opened this issue Apr 12, 2025 · 5 comments
Labels
🐞 Bug · 🩺 Needs Triage

Comments

@jacobshenn

crawl4ai version

0.5.0

Expected Behavior

Return a normal crawl matching my schema.

Current Behavior

I am crawling a set of about 600 links. For some links, the crawl works perfectly, but for others, the crawler returns:
[
  {
    "index": 0,
    "error": true,
    "tags": ["error"],
    "content": "'str' object has no attribute 'choices'"
  }
]

There is no pattern to which links trigger this, which makes me wonder whether it is an API issue. Has anyone seen or encountered this bug?
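For what it's worth, this AttributeError is the generic Python failure you get when code expects an OpenAI-style completion object (something with a .choices list) but is handed a plain string instead, for example an error message returned by the provider. A minimal sketch of that pattern (hypothetical names only, not crawl4ai's actual internals, which route through litellm):

def call_provider(prompt: str):
    # Assume the upstream API call fails and the failure comes back as a
    # plain string instead of being raised -- an assumption for illustration.
    return "Rate limit exceeded"

response = call_provider("Extract the schema fields from this page")
print(response.choices[0].message.content)
# AttributeError: 'str' object has no attribute 'choices'

If something like this is happening, it would also explain why only some of the 600 links fail: whichever requests hit a provider error get a string back, while the rest succeed.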

Is this reproducible?

Yes

Inputs Causing the Bug

Steps to Reproduce

Code snippets

OS

macOS

Python version

3.13

Browser

Chrome

Browser version

No response

Error logs & Screenshots (if applicable)

No response

jacobshenn added the 🐞 Bug and 🩺 Needs Triage labels on Apr 12, 2025
@RaccoonOnion

Same here. Using deepseek chat API

@jacobshenn
Author

> Same here. Using deepseek chat API

Hey! Thanks for replying. Are you using the DeepSeek API inside crawl4ai, or are you using it standalone?


@RaccoonOnion

> Are you using the DeepSeek API inside crawl4ai, or are you using it standalone?

Inside LLMExtractionStrategy as:

llm_strategy = LLMExtractionStrategy(
    llm_config=LLMConfig(provider="deepseek/deepseek-chat", api_token=os.getenv("DEEPSEEK_API")),
    schema=LeaderboardEntry.model_json_schema(),
    extraction_type="schema",
    instruction=INSTRUCTION_TO_LLM,
    chunk_token_threshold=1000,
    overlap_rate=0.0,
    apply_chunking=True,
    input_format="markdown",
    extra_args={"temperature": 0.0, "max_tokens": 2048},
)
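For context, a strategy like this is normally handed to the crawler through CrawlerRunConfig. A rough sketch following the crawl4ai 0.5.x docs (the URL, instruction, and schema fields are placeholders, and exact import paths may differ between versions):

import asyncio
import json
import os

from pydantic import BaseModel

from crawl4ai import AsyncWebCrawler, CrawlerRunConfig, LLMConfig
from crawl4ai.extraction_strategy import LLMExtractionStrategy

class LeaderboardEntry(BaseModel):
    # Stand-in schema; the real fields live in the reporter's model.
    rank: int
    name: str
    score: float

async def main():
    llm_strategy = LLMExtractionStrategy(
        llm_config=LLMConfig(provider="deepseek/deepseek-chat",
                             api_token=os.getenv("DEEPSEEK_API")),
        schema=LeaderboardEntry.model_json_schema(),
        extraction_type="schema",
        instruction="Extract every leaderboard entry.",  # placeholder instruction
        input_format="markdown",
    )
    config = CrawlerRunConfig(extraction_strategy=llm_strategy)
    async with AsyncWebCrawler() as crawler:
        result = await crawler.arun(url="https://example.com/leaderboard", config=config)
        # extracted_content is a JSON string; on failure it holds error
        # entries like the one reported above instead of schema rows.
        print(json.loads(result.extracted_content))

asyncio.run(main())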

@jacobshenn
Author

I'm running my APIs through OpenRouter and getting this error.

llm_strategy = LLMExtractionStrategy(
    llm_config=LLMConfig(
        provider="deepseek/deepseek-chat",
        api_token=os.getenv("openrouter"),
        base_url="https://openrouter.ai/api/v1",
    ),
)

Output:

{'index': 0, 'error': True, 'tags': ['error'], 'content': 'litellm.BadRequestError: DeepseekException - {"error":{"message":"deepseek-chat is not a valid model ID","code":400},"user_id":"user_2th1C5iID3WInICREZPY1NCmXhb"}'}

Have you run into this at all?
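One hedged observation on that 400: OpenRouter is rejecting the bare model ID "deepseek-chat", which usually means the vendor prefix was lost on the way to the API. If the provider string is passed straight to litellm, OpenRouter models are normally addressed as "openrouter/<vendor>/<model>", so a config along these lines may be worth trying (the env var name is a placeholder, and base_url may be optional once the openrouter/ prefix is used):

import os

from crawl4ai import LLMConfig

# Sketch only: use litellm's OpenRouter prefix so the full slug
# "deepseek/deepseek-chat" reaches OpenRouter as the model ID.
# "OPENROUTER_API_KEY" stands in for whichever env var holds the key.
llm_config = LLMConfig(
    provider="openrouter/deepseek/deepseek-chat",
    api_token=os.getenv("OPENROUTER_API_KEY"),
    base_url="https://openrouter.ai/api/v1",
)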

@dhruvthak3r

Yes, I'm facing the same issue with a DeepSeek model through the Groq API.
