
Qwen/phi2 model conversion error #52

Open
xhzheng1895 opened this issue Jul 3, 2024 · 2 comments

xhzheng1895 commented Jul 3, 2024

I ran Qwen1.5-1.8B-Chat and Qwen-1_8B-Chat separately, and both fail with similar errors.
Taking Qwen1.5-1.8B-Chat as the example:
With transformers==4.31.0:

Traceback (most recent call last):
  File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 135, in load_hf
    self.model = AutoModelForCausalLM.from_pretrained(model_path, trust_remote_code=True).float().eval()
  File "/home/xinzhe02/.local/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 526, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/home/xinzhe02/.local/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 1109, in from_pretrained
    config_class = get_class_from_dynamic_module(
  File "/home/xinzhe02/.local/lib/python3.9/site-packages/transformers/dynamic_module_utils.py", line 500, in get_class_from_dynamic_module
    return get_class_in_module(class_name, final_module.replace(".py", ""))
  File "/home/xinzhe02/.local/lib/python3.9/site-packages/transformers/dynamic_module_utils.py", line 200, in get_class_in_module
    module = importlib.import_module(module_path)
  File "/usr/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 972, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 972, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 972, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 984, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'transformers_modules.Qwen1'
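This first failure is probably not Qwen-specific: with older transformers, the dynamic-module loader turns the checkpoint folder name into a Python module name, so the dots in "Qwen1.5-1.8B-Chat" get split and produce the bogus module path 'transformers_modules.Qwen1' seen above. A minimal sketch of a workaround, assuming a local checkpoint copied to a dot-free directory (paths are illustrative):

import shutil
from transformers import AutoModelForCausalLM

src = "../Qwen1.5-1.8B-Chat"   # original checkpoint directory; the dots break the dynamic import
dst = "../Qwen1_5-1_8B-Chat"   # dot-free copy, so transformers_modules gets a valid module name
shutil.copytree(src, dst, dirs_exist_ok=True)

model = AutoModelForCausalLM.from_pretrained(dst, trust_remote_code=True).float().eval()

With the dot-free path (and --path pointed at the copy), the same from_pretrained call in llm_export.py should no longer hit this ModuleNotFoundError.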

With the latest transformers (version==4.42.3):

/home/xinzhe02/.cache/huggingface/modules/transformers_modules/Qwen1.5-1.8B-Chat/modeling_qwen2.py:306: TracerWarning: Iterating over a tensor might cause the trace to be incorrect. Passing a tensor of different shape won't change the number of iterations executed (and might lead to errors or silently give incorrect results).
  cos, sin = rotary_pos_emb
/home/xinzhe02/.cache/huggingface/modules/transformers_modules/Qwen1.5-1.8B-Chat/modeling_qwen2.py:323: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attn_weights.size() != (bsz, self.num_heads, q_len, kv_seq_len):
/home/xinzhe02/.cache/huggingface/modules/transformers_modules/Qwen1.5-1.8B-Chat/modeling_qwen2.py:330: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attention_mask.size() != (bsz, 1, q_len, kv_seq_len):
/home/xinzhe02/.cache/huggingface/modules/transformers_modules/Qwen1.5-1.8B-Chat/modeling_qwen2.py:342: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attn_output.size() != (bsz, self.num_heads, q_len, self.head_dim):
export done!
Killed

The ONNX output directory does contain files here, so the failure probably happens during the ONNX-to-MNN conversion.

cmdline:
python3 llm_export.py --type Qwen1_5-1_8B-Chat \
    --path ../Qwen1.5-1.8B-Chat \
    --export --export_mnn \
    --onnx_path ./Qwen1_5-1_8B-Chat-Int4-S-ONNX \
    --mnn_path ./Qwen1_5-1_8B-Chat-Int4-S-MNN
python version: 3.9.19
Apart from transformers, all other packages follow requirements.txt.
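"export done!" followed by "Killed" usually means the OS OOM-killed the process during the ONNX-to-MNN step, which matches the observation above that the ONNX directory is already populated. One way to confirm is to run that step on its own and watch memory. A sketch only; the llm.onnx filename and the MNNConvert binary location are assumptions, not taken from the log:

import subprocess

# Run the ONNX -> MNN conversion by itself; if this also gets Killed, the machine
# simply runs out of RAM while MNNConvert loads the exported ONNX model.
subprocess.run([
    "./MNNConvert", "-f", "ONNX",
    "--modelFile", "./Qwen1_5-1_8B-Chat-Int4-S-ONNX/llm.onnx",  # assumed output name
    "--MNNModel", "./Qwen1_5-1_8B-Chat-Int4-S-MNN/llm.mnn",
    "--weightQuantBits", "4",
], check=True)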


xhzheng1895 commented Jul 3, 2024

phi-2 also fails with different errors depending on the transformers version...
With transformers < 4.37.0:

Traceback (most recent call last):
  File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 1404, in <module>
    llm_exporter = llm_models[model_type](args)
  File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 1142, in __init__
    super().__init__(args)
  File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 121, in __init__
    self.load_hf(args.path)
  File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 137, in load_hf
    self.model = AutoModel.from_pretrained(model_path, trust_remote_code=True).float().eval()
  File "/home/xinzhe02/.local/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 525, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/home/xinzhe02/.local/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 1050, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/home/xinzhe02/.local/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 748, in __getitem__
    raise KeyError(key)
KeyError: 'phi'
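This KeyError comes from AutoConfig falling back to the built-in model-type table: the 'phi' architecture was only added to transformers in 4.37.0, so older versions cannot resolve the checkpoint's model_type at all. A quick check, as a sketch:

from transformers.models.auto.configuration_auto import CONFIG_MAPPING

# On transformers < 4.37.0 this prints False, which is exactly the KeyError above.
print("phi" in CONFIG_MAPPING.keys())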

With transformers >= 4.37.0:

Traceback (most recent call last):
  File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 1404, in <module>
    llm_exporter = llm_models[model_type](args)
  File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 1142, in __init__
    super().__init__(args)
  File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 122, in __init__
    self.load_model()
  File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 1147, in load_model
    transformer = self.model.transformer
  File "/home/xinzhe02/.local/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1709, in __getattr__
    raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
AttributeError: 'PhiForCausalLM' object has no attribute 'transformer'
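This second failure is the mirror image: once transformers ships phi natively (>= 4.37.0), AutoModel builds the built-in PhiForCausalLM, which keeps its decoder under .model, while llm_export.py's load_model expects the older remote-code layout with a .transformer attribute. A hedged sketch of the difference (the checkpoint path is illustrative):

from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("../phi-2", trust_remote_code=True).float().eval()

# Native PhiForCausalLM (transformers >= 4.37.0) exposes the decoder as .model;
# the exporter's `self.model.transformer` only exists for the old remote-code class.
decoder = model.model if hasattr(model, "model") else model.transformer
print(type(model).__name__, type(decoder).__name__)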


haozixu commented Jul 4, 2024

My guess is that you need to replace the implementation in HF transformers with the corresponding .py from llm_models.
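If the repo indeed ships patched modeling files in its llm_models/ directory, the swap would just be a file copy into the downloaded checkpoint. The paths below are hypothetical; check the actual layout of llm_models/ first:

import shutil

# Hypothetical paths: overwrite the checkpoint's remote-code modeling file with the
# patched one from llm-export's llm_models/ directory, as suggested above.
shutil.copy("./llm_models/phi-2/modeling_phi.py", "../phi-2/modeling_phi.py")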

xhzheng1895 changed the title from "Qwen-1_8B-Chat model conversion error" to "Qwen/phi2 model conversion error" on Jul 5, 2024