Qwen/phi2 model conversion errors #52
Comments
phi-2 likewise reports different errors across the various transformers versions...
I suspect you need to replace the implementation in hf transformers with the corresponding .py from llm_models
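The workaround suggested above amounts to overwriting the modeling file that transformers loads (via trust_remote_code) with the patched copy shipped in this repo's llm_models directory. A minimal sketch, assuming the patched file and the downloaded HF model directory both contain a file with the same name (the default name `modeling_qwen.py` here is an assumption, not something the issue confirms):

```python
# Sketch of the suggested workaround: copy the repo's patched modeling .py
# over the one shipped with the downloaded HF model, so that
# trust_remote_code picks up the patched implementation.
import shutil
from pathlib import Path


def patch_modeling_file(llm_models_dir: str, hf_model_dir: str,
                        name: str = "modeling_qwen.py") -> Path:
    """Overwrite the model's remote-code file with the patched copy.

    `name` is an assumed default; adjust it to whatever modeling file
    your model directory actually ships.
    """
    src = Path(llm_models_dir) / name
    dst = Path(hf_model_dir) / name
    shutil.copyfile(src, dst)  # the original file is overwritten in place
    return dst
```

Back up the original file first if you want to be able to revert the model directory afterwards.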
I ran both Qwen1.5-1.8B-Chat and Qwen-1_8B-Chat, and they report similar errors.
Taking Qwen1.5-1.8B-Chat as an example:
With transformers==4.31.0:
With the latest transformers (version==4.42.3):
The ONNX output directory does contain files here, so the error probably occurs during the ONNX-to-MNN conversion.
cmdline:
python3 llm_export.py --type Qwen1_5-1_8B-Chat \
    --path ../Qwen1.5-1.8B-Chat \
    --export --export_mnn \
    --onnx_path ./Qwen1_5-1_8B-Chat-Int4-S-ONNX \
    --mnn_path ./Qwen1_5-1_8B-Chat-Int4-S-MNN
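To support the diagnosis that the ONNX export succeeded and the failure lies in the ONNX-to-MNN step, a small stdlib check can confirm the export directory actually holds non-empty model files before blaming the MNN converter. This is only a sketch; the directory name matches the command above, but which filenames llm_export.py produces is not stated in the issue:

```python
# List non-empty regular files in the ONNX export directory. If this comes
# back empty (or with zero-byte files), the ONNX export itself failed;
# otherwise the ONNX -> MNN conversion step is the likelier culprit.
from pathlib import Path


def nonempty_files(onnx_dir: str) -> list:
    """Return sorted names of non-empty regular files in `onnx_dir`."""
    return sorted(p.name for p in Path(onnx_dir).iterdir()
                  if p.is_file() and p.stat().st_size > 0)
```

For example, `nonempty_files("./Qwen1_5-1_8B-Chat-Int4-S-ONNX")` should list the exported model artifacts if the first stage completed.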
python version: 3.9.19
All other packages except transformers follow requirement.txt