
[HELP] Error when running inference on a reward model; thanks everyone. How do I run vLLM inference on a reward model trained from a Qwen base? #4045

Open
EvilCalf opened this issue Apr 30, 2025 · 3 comments

Comments

@EvilCalf

Exception in callback VllmEngine.patch_remove_log.<locals>.new_log_task_completion(error_callback=<bound method...7efd7bbd70b0>>)(<Task finishe...dimensions'")>) at /root/paddlejob/workspace/env_run/ms-swift/swift/llm/infer/infer_engine/vllm_engine.py:497
handle: <Handle VllmEngine.patch_remove_log.<locals>.new_log_task_completion(error_callback=<bound method...7efd7bbd70b0>>)(<Task finishe...dimensions'")>) at /root/paddlejob/workspace/env_run/ms-swift/swift/llm/infer/infer_engine/vllm_engine.py:497>
Traceback (most recent call last):
  File "/root/miniconda3/envs/swift/lib/python3.12/asyncio/events.py", line 84, in _run
    self._context.run(self._callback, *self._args)
  File "/root/paddlejob/workspace/env_run/ms-swift/swift/llm/infer/infer_engine/vllm_engine.py", line 499, in new_log_task_completion
    return_value = task.result()
                   ^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/vllm/engine/async_llm_engine.py", line 863, in run_engine_loop
    result = task.result()
             ^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/vllm/engine/async_llm_engine.py", line 786, in engine_step
    request_outputs = await self.engine.step_async(virtual_engine)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/vllm/engine/async_llm_engine.py", line 356, in step_async
    outputs = await self.model_executor.execute_model_async(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/vllm/executor/executor_base.py", line 265, in execute_model_async
    output = await make_async(self.execute_model)(execute_model_req)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/vllm/executor/executor_base.py", line 140, in execute_model
    output = self.collective_rpc("execute_model",
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/vllm/executor/uniproc_executor.py", line 56, in collective_rpc
    answer = run_method(self.driver_worker, method, args, kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/vllm/utils.py", line 2456, in run_method
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/vllm/worker/worker_base.py", line 420, in execute_model
    output = self.model_runner.execute_model(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/vllm/worker/pooling_model_runner.py", line 148, in execute_model
    self.model.pooler(hidden_states=hidden_or_intermediate_states,
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/vllm/model_executor/models/adapters.py", line 80, in pooler
    return self._pooler(hidden_states, pooling_metadata)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/vllm/model_executor/layers/pooler.py", line 100, in forward
    pooled_data = self.head(pooled_data, pooling_metadata)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/swift/lib/python3.12/site-packages/vllm/model_executor/layers/pooler.py", line 224, in forward
    pooling_param.dimensions
AttributeError: 'NoneType' object has no attribute 'dimensions'

vllm==0.8.5, ms-swift==3.4.0


CUDA_VISIBLE_DEVICES=1,2 swift infer \
    --model /root/paddlejob/workspace/env_run/xujiading/model_science/rm_2/v1-20250428-193329/checkpoint-937 \
    --infer_backend vllm \
    --dataset /root/paddlejob/workspace/env_run/xujiading/model_science/rm_2/v1-20250428-193329/val_dataset.jsonl \
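
For context, the traceback shows the crash happens inside vLLM's pooler (pooler.py line 224): the per-request pooling parameters reaching the pooler head are None, so accessing `.dimensions` fails. Below is a minimal repro sketch outside ms-swift, assuming vLLM 0.8.5, that `LLM.encode` accepts an explicit `PoolingParams`, and that the checkpoint loads as a classification/pooling model; the path and GPU count are taken from the command above, everything else is illustrative:

```python
# Hypothetical repro sketch: query the checkpoint through vLLM's pooling API directly.
# If this works, the missing PoolingParams likely comes from how the swift request is
# built; if it also crashes, the problem is on the vLLM side.
from vllm import LLM, PoolingParams

MODEL_PATH = ("/root/paddlejob/workspace/env_run/xujiading/model_science/"
              "rm_2/v1-20250428-193329/checkpoint-937")

# task="classify" (or "reward") puts the engine in pooling mode instead of generation.
llm = LLM(model=MODEL_PATH, task="classify", tensor_parallel_size=2)

# Pass PoolingParams explicitly so the per-request `pooling_param` is not None
# by the time it reaches pooler.py.
outputs = llm.encode(["a single test prompt"], pooling_params=PoolingParams())
print(outputs[0].outputs.data)
```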
@EvilCalf
Author

Using pt as the inference backend works fine.
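
For anyone who just needs scores while the vLLM path is broken, the working pt backend corresponds roughly to loading the checkpoint with transformers' sequence-classification head. A minimal sketch, assuming the checkpoint is a Qwen2ForSequenceClassification with a single label; the chat formatting is illustrative:

```python
# Rough equivalent of the pt backend: score one sample with transformers directly.
# Assumes the checkpoint is a Qwen2ForSequenceClassification with num_labels == 1.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_PATH = ("/root/paddlejob/workspace/env_run/xujiading/model_science/"
              "rm_2/v1-20250428-193329/checkpoint-937")

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_PATH, torch_dtype=torch.bfloat16, device_map="auto"
)
model.eval()

messages = [
    {"role": "user", "content": "How do I sort a list in Python?"},
    {"role": "assistant", "content": "Use sorted(my_list)."},
]
text = tokenizer.apply_chat_template(messages, tokenize=False)
inputs = tokenizer(text, return_tensors="pt").to(model.device)

with torch.no_grad():
    # For a reward model the classification head outputs a single logit per sequence.
    score = model(**inputs).logits[0, 0].item()
print(score)
```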

@EvilCalf
Author

When I call it from Python, what comes out is generated text rather than a reward score. Any advice?

@EvilCalf
Author

EvilCalf commented Apr 30, 2025

Could anyone advise how to run vLLM inference on a reward model trained from a Qwen base (Qwen2ForSequenceClassification / Qwen3ForSequenceClassification)?
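
One hedged way to do this directly with vLLM (rather than through `swift infer`) is to load the checkpoint in pooling mode with `task="reward"` and call `LLM.encode`, which is the pattern vLLM's docs use for Qwen-based reward models; whether it works for this checkpoint depends on its `architectures` entry and on the installed vLLM version:

```python
# Sketch: score a prompt/response pair with the Qwen-based RM via vLLM's pooling API.
# Assumes the checkpoint's config lists Qwen2ForSequenceClassification (or the Qwen3
# equivalent) in "architectures" and that vLLM can map it to a pooling model.
from vllm import LLM

llm = LLM(
    model="path/to/qwen-rm-checkpoint",  # e.g. the checkpoint-937 directory above
    task="reward",                        # pooling mode: return scores, not generations
)

prompt = "User: How do I sort a list in Python?\nAssistant: Use sorted(my_list)."
outputs = llm.encode([prompt])

# For a single-label reward head the pooled output is the scalar reward.
print(outputs[0].outputs.data)
```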
