
Error when running the MindNLP example qwen1.5-0.5b.py on the Orange Pi AI Pro #2131

@dannyyudong

Description

Describe the bug
An error is raised when running the MindNLP example qwen1.5-0.5b.py.

  • Hardware Environment:

    Orange Pi AI Pro (8T, 8GB), Ascend 310B4

  • Software Environment (versions can be confirmed with the sketch after this list):
    - MindSpore version: 2.4.1
    - MindNLP version: 0.4.1
    - Python version: 3.9
    - OS platform and distribution (e.g., Linux Ubuntu 22.04):
    - CANN version: 8.0.RC3

  • Execute Mode (Mandatory) (PyNative/Graph):

    Please delete the mode not involved:
    /mode pynative
    /mode graph
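
For reference, the versions listed under Software Environment above can be confirmed directly from the board's Python environment; a minimal sketch (it assumes both packages expose the usual __version__ attribute):

    import sys
    import mindspore
    import mindnlp

    # Report the interpreter and framework versions actually being imported
    print("Python   :", sys.version.split()[0])
    print("MindSpore:", mindspore.__version__)
    print("MindNLP  :", mindnlp.__version__)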

The MindSpore installation check has already been run and passed.
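
The "MindSpore check" above refers to the standard installation self-check; a minimal sketch of that step (run_check() is MindSpore's documented verification API; targeting the Ascend backend explicitly is an assumption about how the board is configured):

    import mindspore

    # Assumption: the Ascend backend on the 310B4 is the intended target.
    mindspore.set_context(device_target="Ascend")

    # Documented self-check: runs a small calculation on the device and prints
    # a success message when the installation is healthy.
    mindspore.run_check()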

Screenshots / Logs (Mandatory)

(base) HwHiAiUser@orangepiaipro:~/qwen1.5$ python qwen1.5-0.5b.py
/usr/local/miniconda3/lib/python3.9/site-packages/numpy/core/getlimits.py:499: UserWarning: The value of the smallest subnormal for <class 'numpy.float64'> type is zero.
setattr(self, word, getattr(machar, word).flat[0])
/usr/local/miniconda3/lib/python3.9/site-packages/numpy/core/getlimits.py:89: UserWarning: The value of the smallest subnormal for <class 'numpy.float64'> type is zero.
return self._float_to_str(self.smallest_subnormal)
/usr/local/miniconda3/lib/python3.9/site-packages/numpy/core/getlimits.py:499: UserWarning: The value of the smallest subnormal for <class 'numpy.float32'> type is zero.
setattr(self, word, getattr(machar, word).flat[0])
/usr/local/miniconda3/lib/python3.9/site-packages/numpy/core/getlimits.py:89: UserWarning: The value of the smallest subnormal for <class 'numpy.float32'> type is zero.
return self._float_to_str(self.smallest_subnormal)
Building prefix dict from the default dictionary ...
Loading model from cache /tmp/jieba.cache
Loading model cost 2.165 seconds.
Prefix dict has been built successfully.
1.26kB [00:00, 2.87MB/s]
2.65MB [00:01, 1.52MB/s]
1.59MB [00:01, 1.63MB/s]
6.70MB [00:02, 3.43MB/s]
661B [00:00, 1.27MB/s]
100%|█████████████████████████████████████| 1.15G/1.15G [02:04<00:00, 9.92MB/s]
Qwen2ForCausalLM has generative capabilities, as prepare_inputs_for_generation is explicitly overwritten. However, it doesn't directly inherit from GenerationMixin. PreTrainedModel will NOT inherit from GenerationMixin, and this model will lose the ability to call generate and other related functions.

  • If you are the owner of the model architecture code, please modify your model class such that it inherits from GenerationMixin (after PreTrainedModel, otherwise you'll get an exception).
  • If you are not the owner of the model architecture class, please contact the model code owner to update it.

Sliding Window Attention is enabled but not implemented for eager; unexpected results may be encountered.
[WARNING] DEVICE(37016,e7ffee3b1020,python):2025-08-14-15:24:39.217.814 [mindspore/ccsrc/plugin/device/ascend/hal/device/ascend_memory_adapter.cc:116] Initialize] Free memory size is less than half of total memory size. Device 0 Device HBM total size:7912181760 Device HBM free size:2654588928 may be other processes occupying this card, check as: ps -ef|grep python
Traceback (most recent call last):
  File "/home/HwHiAiUser/qwen1.5/qwen1.5-0.5b.py", line 14, in <module>
    model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen1.5-0.5B-Chat",low_cpu_mem_usage=True)
  File "/home/HwHiAiUser/.local/lib/python3.9/site-packages/mindnlp/transformers/models/auto/auto_factory.py", line 510, in from_pretrained
    return model_class.from_pretrained(
  File "/home/HwHiAiUser/.local/lib/python3.9/site-packages/mindnlp/transformers/modeling_utils.py", line 3126, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/home/HwHiAiUser/.local/lib/python3.9/site-packages/mindnlp/transformers/models/qwen2/modeling_qwen2.py", line 669, in __init__
    self.model = Qwen2Model(config)
  File "/home/HwHiAiUser/.local/lib/python3.9/site-packages/mindnlp/transformers/models/qwen2/modeling_qwen2.py", line 475, in __init__
    [Qwen2DecoderLayer(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]
  File "/home/HwHiAiUser/.local/lib/python3.9/site-packages/mindnlp/transformers/models/qwen2/modeling_qwen2.py", line 475, in <listcomp>
    [Qwen2DecoderLayer(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]
  File "/home/HwHiAiUser/.local/lib/python3.9/site-packages/mindnlp/transformers/models/qwen2/modeling_qwen2.py", line 371, in __init__
    self.self_attn = QWEN2_ATTENTION_CLASSES[config._attn_implementation](config, layer_idx)
  File "/home/HwHiAiUser/.local/lib/python3.9/site-packages/mindnlp/transformers/models/qwen2/modeling_qwen2.py", line 276, in __init__
    self.rotary_emb = Qwen2RotaryEmbedding(
  File "/home/HwHiAiUser/.local/lib/python3.9/site-packages/mindnlp/transformers/models/qwen2/modeling_qwen2.py", line 145, in __init__
    self._set_cos_sin_cache(
  File "/home/HwHiAiUser/.local/lib/python3.9/site-packages/mindnlp/transformers/models/qwen2/modeling_qwen2.py", line 155, in _set_cos_sin_cache
    emb = ops.cat(
  File "/home/HwHiAiUser/.local/lib/python3.9/site-packages/mindnlp/core/ops/array.py", line 25, in cat
    return mindspore.mint.cat(tensors, dim)
  File "/home/HwHiAiUser/.local/lib/python3.9/site-packages/mindspore/mint/__init__.py", line 609, in cat
    return ops.auto_generate.cat(tensors, dim)
  File "/home/HwHiAiUser/.local/lib/python3.9/site-packages/mindspore/ops/auto_generate/gen_ops_def.py", line 1444, in cat
    return concat_impl(tensors, axis)
  File "/home/HwHiAiUser/.local/lib/python3.9/site-packages/mindspore/ops/auto_generate/pyboost_inner_prim.py", line 146, in __call__
    return _convert_stub(super().__call__(tensors, axis))
RuntimeError: aclnnCatGetWorkspaceSize call failed, please check!

  • Ascend Error Message:

EZ1001: [PID: 37016] 2025-08-14-15:25:11.185.069 tensor 0 not implemented for DT_FLOAT, should be in dtype support list [].[THREAD:37016]

(Please search "CANN Common Error Analysis" at https://www.mindspore.cn for error code description)


  • C++ Call Stack: (For framework developers)

mindspore/ops/kernel/ascend/pyboost/auto_generate/concat.cc:55 operator()
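
The failure originates in Qwen2RotaryEmbedding._set_cos_sin_cache, where ops.cat concatenates two float32 tensors, and the Ascend message says the aclnn Concat kernel rejects DT_FLOAT. The sketch below isolates the same mindspore.mint.cat call outside MindNLP; the shapes are illustrative, and this is a hypothesis to confirm, not a fix:

    import numpy as np
    import mindspore

    mindspore.set_context(device_target="Ascend")

    # Mimic emb = ops.cat((freqs, freqs), dim=-1) from _set_cos_sin_cache:
    # two float32 halves of the rotary-embedding table joined on the last axis.
    freqs = mindspore.Tensor(np.random.rand(1024, 32).astype(np.float32))
    emb = mindspore.mint.cat((freqs, freqs), dim=-1)
    print(emb.shape, emb.dtype)

If this snippet fails with the same aclnnCatGetWorkspaceSize error, the problem lies in the MindSpore/CANN concat kernel on the 310B4 rather than in the example script; casting the inputs to float16 before the concatenation would then be something to test as a workaround, not a confirmed fix.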
