
Want to merge the files in the release #32

Open
JokerJostar opened this issue Mar 12, 2024 · 3 comments
@JokerJostar

A question:

embedding.onnx
lm.onnx
block_[0-23].onnx
What do these three kinds of files each do? And if I want to merge them, do I use only the block files, or all of them together? My direct conversion of chatglm3-6b failed.

@wangzhaode
Owner

You can check the code for what each file does; if you want to export a merged model, use --export.

@wangzhaode wangzhaode self-assigned this Mar 12, 2024
@JokerJostar
Author

> You can check the code for what each file does; if you want to export a merged model, use --export.

python3 llm_export.py --path ../chatglm3-6b --export --onnx_path ./chatglm2-6b-onnx
The device support i8sdot:0, support fp16:0, support i8mm: 0
Traceback (most recent call last):
  File "/home/jostar/workspace/PycharmProjects/Rk/llm-export/llm_export.py", line 1081, in <module>
    llm_exporter = llm_models[model_type](args)
  File "/home/jostar/workspace/PycharmProjects/Rk/llm-export/llm_export.py", line 562, in __init__
    super().__init__(args)
  File "/home/jostar/workspace/PycharmProjects/Rk/llm-export/llm_export.py", line 511, in __init__
    super().__init__(args)
  File "/home/jostar/workspace/PycharmProjects/Rk/llm-export/llm_export.py", line 90, in __init__
    self.sp_model = spm.SentencePieceProcessor(tokenizer_model)
  File "/home/jostar/.local/lib/python3.10/site-packages/sentencepiece/__init__.py", line 447, in Init
    self.Load(model_file=model_file, model_proto=model_proto)
  File "/home/jostar/.local/lib/python3.10/site-packages/sentencepiece/__init__.py", line 905, in Load
    return self.LoadFromFile(model_file)
  File "/home/jostar/.local/lib/python3.10/site-packages/sentencepiece/__init__.py", line 310, in LoadFromFile
    return _sentencepiece.SentencePieceProcessor_LoadFromFile(self, arg)
RuntimeError: Internal: src/sentencepiece_processor.cc(1101) [model_proto->ParseFromArray(serialized.data(), serialized.size())]

Is this a sentencepiece problem? I installed it from requirements.txt.
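The `ParseFromArray` RuntimeError above means sentencepiece could not parse the tokenizer model file, which usually points at a missing or corrupt file rather than at the sentencepiece install itself. A common cause is cloning a Hugging Face checkpoint without git-lfs, leaving a small text pointer file in place of the real tokenizer.model. A minimal stdlib sketch to check for that (the `diagnose_tokenizer` helper and the path are illustrative assumptions, not part of llm-export):

```python
import os

# Files downloaded as git-lfs pointers start with this magic string.
LFS_MAGIC = b"version https://git-lfs"

def diagnose_tokenizer(path):
    """Return a short diagnosis string for a sentencepiece model file."""
    if not os.path.isfile(path):
        return "missing"
    with open(path, "rb") as f:
        head = f.read(len(LFS_MAGIC))
    if head == LFS_MAGIC:
        # The real weights were never downloaded, only the LFS stub.
        return "git-lfs pointer (run `git lfs pull` in the checkpoint dir)"
    if os.path.getsize(path) < 100 * 1024:
        # A real sentencepiece model for a 6B LLM is typically ~1 MB.
        return "suspiciously small (re-download the file)"
    return "looks plausible; try loading it with sentencepiece directly"

print(diagnose_tokenizer("../chatglm3-6b/tokenizer.model"))
```

If this reports a git-lfs pointer or a tiny file, re-fetch the checkpoint before retrying the export.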

@Franklin-L


Does anyone have experience converting a fine-tuned chatglm3 to ONNX?

3 participants