
ChatOpenAI wrapping a local vLLM service fails in a chained call #34067

@AhriOR

Checked other resources

  • This is a bug, not a usage question.
  • I added a clear and descriptive title that summarizes this issue.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
  • This is not related to the langchain-community package.
  • I posted a self-contained, minimal, reproducible example. A maintainer can copy it and run it AS IS.

Package (Required)

  • langchain
  • langchain-openai
  • langchain-anthropic
  • langchain-classic
  • langchain-core
  • langchain-cli
  • langchain-model-profiles
  • langchain-tests
  • langchain-text-splitters
  • langchain-chroma
  • langchain-deepseek
  • langchain-exa
  • langchain-fireworks
  • langchain-groq
  • langchain-huggingface
  • langchain-mistralai
  • langchain-nomic
  • langchain-ollama
  • langchain-perplexity
  • langchain-prompty
  • langchain-qdrant
  • langchain-xai
  • Other / not sure / general

Example Code (Python)

import logging

from langchain_core.messages import HumanMessage
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from pydantic import BaseModel

logger = logging.getLogger(__name__)

# Placeholders so the example is self-contained; point these at your vLLM server.
qwen_api_key = "EMPTY"                  # vLLM key (ignored by default)
qwen_url = "http://localhost:8000/v1"   # vLLM server address


class InputSchema(BaseModel):
    """Minimal stand-in for the request schema used in the project."""
    query: str


llm = ChatOpenAI(
    model="Qwen3-1.7B",    # model served by vLLM
    temperature=0,
    api_key=qwen_api_key,  # vLLM key
    base_url=qwen_url,     # vLLM server address
)

async def intense_recognize_qwen(input: InputSchema) -> str | None:
    """Given the user's described symptoms, decide whether the patient has a
    cold, a fever, or uncertain symptoms.
    :arg input: the user's input
    :return: the Qwen model's reply
    """
    logger.info("intense recognize qwen tool start")
    prompt = ChatPromptTemplate.from_template(
        """You are an experienced doctor who can accurately determine, from a patient's symptoms, whether the patient has a cold, a fever, or uncertain symptoms.
        User input: {input}
        Please reply based on the input content.

        Task: identify only the core symptom type in the user's input. Strictly output exactly one of "cold", "fever", or "uncertain symptoms", with no extra text, explanation, or punctuation, so the result is directly usable. Do not output your reasoning process.
        Criteria (clear reference):

        Classify as "cold": the input explicitly mentions "cold", or contains at least one of: nasal congestion, runny nose, sore throat, cough, headache, body aches
        Classify as "fever": the input explicitly mentions "fever", or contains at least one expression of high body temperature (e.g. "38 degrees"), feeling hot all over, or chills (implying abnormal temperature)
        Classify as "uncertain symptoms": the input only contains vague descriptions such as "unwell" or "feeling bad", or mentions none of the cold/fever content above

        Examples (to help you apply the criteria quickly):

        User input:
        "I have a stuffy nose and a cough today; I think I caught a cold"
        Model output:
        {{
            "symptom": "{{cold}}"
        }}

        User input:
        "My temperature is 38.5 degrees and I feel hot all over"
        Model output:
        {{
            "symptom": "{{fever}}"
        }}

        User input:
        "I just don't feel very well"
        Model output:
        {{
            "symptom": "{{uncertain symptoms}}"
        }}


        User input: {input}
        Your answer (identify only the core symptom type; strictly output exactly one of "cold", "fever", or "uncertain symptoms", with no extra text, explanation, or punctuation, so the result is directly usable).
        Do not include any extra content (such as a <think> reasoning block):
        `json
        {{
            "symptom": "{{symptom}}"
        }}


        """
    )

    # Use a RunnableSequence instead of LLMChain
    chain = prompt | llm | StrOutputParser()
    print(chain)

    input_message = input.query
    try:
        # The template variable is filled with a HumanMessage, not a plain str.
        result = await chain.ainvoke({"input": HumanMessage(content=input_message)})
        return result
    except Exception as e:
        print(f"Qwen model call failed: {e}")
        logger.error(f"Qwen model call failed: {e}")
        return None

Error Message and Stack Trace (if applicable)

Qwen model call failed: 'str' object has no attribute 'model_dump'

Description

The chain's output is a str rather than a recognizable object,
so result cannot be produced.
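The exception itself comes from later code calling `.model_dump()` on the chain result: with `StrOutputParser` at the end of the chain, the result is a plain `str`, not an `AIMessage`/pydantic object. A minimal reproduction of just that failure mode, with no LangChain involved:

```python
result = "cold"  # what `prompt | llm | StrOutputParser()` hands back: a plain str

try:
    result.model_dump()  # only pydantic models (e.g. AIMessage) have this method
except AttributeError as exc:
    print(exc)  # 'str' object has no attribute 'model_dump'
```

Two ways out, depending on what the caller needs: consume the result as text (e.g. decode the JSON yourself), or drop `StrOutputParser` from the chain so it returns an `AIMessage`, which does support `model_dump()`.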

System Info

(vllm-env) cby@DESKTOP-1CP7TE5:/mnt/c/Users/Mechrevo/PycharmProjects/vllm_langgraph$ python -m langchain_core.sys_info

System Information

OS: Linux
OS Version: #1 SMP PREEMPT_DYNAMIC Thu Jun 5 18:30:46 UTC 2025
Python Version: 3.12.12 | packaged by Anaconda, Inc. | (main, Oct 21 2025, 20:16:04) [GCC 11.2.0]

Package Information

langchain_core: 1.0.4
langchain: 1.0.7
langchain_community: 0.4.1
langsmith: 0.4.40
langchain_classic: 1.0.0
langchain_openai: 1.0.2
langchain_pinecone: 0.2.13
langchain_text_splitters: 1.0.0
langchainhub: 0.1.15
langgraph_sdk: 0.2.9

Optional packages not installed

langserve

Other Dependencies

aiohttp: 3.13.2
dataclasses-json: 0.6.7
httpx: 0.28.1
httpx-sse: 0.4.3
jsonpatch: 1.33
langgraph: 1.0.3
numpy: 1.26.4
openai: 2.6.1
orjson: 3.11.4
packaging: 24.2
pinecone: 7.3.0
pydantic: 2.12.3
pydantic-settings: 2.11.0
PyYAML: 6.0.2
pyyaml: 6.0.2
requests: 2.32.5
requests-toolbelt: 1.0.0
rich: 14.2.0
simsimd: 6.5.3
sqlalchemy: 2.0.44
SQLAlchemy: 2.0.44
tenacity: 8.5.0
tiktoken: 0.12.0
types-requests: 2.32.4.20250913
typing-extensions: 4.15.0
zstandard: 0.25.0

Labels

bug, openai