Why does the code from Hugging Face keep saying it cannot recognize the model type `tinyllava`?
#162
Comments
Try loading it with the load_pretrained_model function in load_model.py |
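For reference, a minimal sketch of that suggestion (the import path and return order are assumptions; check tinyllava/model/load_model.py in your checkout for the exact signature):

```python
# Sketch: load a local checkpoint through the Factory's own loader instead of
# transformers' AutoModel. Import path and return order are assumptions --
# verify them against tinyllava/model/load_model.py.
from tinyllava.model.load_model import load_pretrained_model

model_name_or_path = "/path/to/local/TinyLLaVA-checkpoint"  # hypothetical local path
model, tokenizer, image_processor, context_len = load_pretrained_model(model_name_or_path)
print(type(model))
```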
It still doesn't work |
And what's strange is that I am clearly passing the local path of the model downloaded from Hugging Face, and I also set model = TinyLlavaForConditionalGeneration.from_pretrained(model_name_or_path, local_files_only=True,  # use local files only |
It is probably an environment issue. Re-run pip install -e . and try again |
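A quick check that the editable install is actually the one being resolved (assuming the package installs under the name tinyllava):

```python
# After `pip install -e .`, this should print a path inside your local
# TinyLLaVA_Factory checkout, not one under site-packages.
import tinyllava
print(tinyllava.__file__)
```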
Could you please share your model structure? Could you print it out? I really cannot get it to run |
|
@MichealZhangxa Hi, have you solved this problem of loading the local model? |
No. If you solve it, please let me know |
In the weights folder, add to config.json: "auto_map": { |
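A sketch of that edit as a script, assuming the checkpoint folder already contains matching configuration/modeling .py files; the module and class names below are illustrative and must be adjusted to the files that actually ship with the weights:

```python
# Sketch: add an "auto_map" entry to the checkpoint's config.json so that
# transformers' Auto* classes know which remote-code files to load.
# Module/class names are illustrative -- match them to the .py files
# actually present in the weight folder.
import json
import os

ckpt_dir = "/path/to/local/TinyLLaVA-checkpoint"  # hypothetical path
cfg_path = os.path.join(ckpt_dir, "config.json")

with open(cfg_path) as f:
    cfg = json.load(f)

cfg["auto_map"] = {
    "AutoConfig": "configuration.TinyLlavaConfig",
    "AutoModelForCausalLM": "modeling_tinyllava_phi.TinyLlavaForConditionalGeneration",
}

with open(cfg_path, "w") as f:
    json.dump(cfg, f, indent=2)
```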
@ZhangXJ199 |
Replace the AutoModel loading with TinyLlavaForConditionalGeneration.from_pretrained, as used in the load_pretrained_model function |
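A minimal sketch of that replacement for a local checkpoint (the import path is an assumption and may differ between Factory versions):

```python
# Sketch: load with the Factory's own model class rather than AutoModel,
# which fails because `tinyllava` is not a registered model type in a
# stock transformers install. Import path is an assumption.
from transformers import AutoTokenizer
from tinyllava.model import TinyLlavaForConditionalGeneration

model_name_or_path = "/path/to/local/TinyLLaVA-checkpoint"  # hypothetical local path
model = TinyLlavaForConditionalGeneration.from_pretrained(
    model_name_or_path,
    local_files_only=True,  # use local files only
)
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, local_files_only=True)
```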
I tried that before and it still didn't work
|
Download the model to a local folder and try again; change the HF path to an absolute local path |
@ZhangXJ199 |
@ZhangXJ199 |
Yes, TinyLLaVA-Phi-2-SigLIP-3.1B is the first-generation model we released. It includes the modeling_tinyllava_phi.py file, and its config.json contains auto_map, so it can be loaded directly with transformers. |
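For that first-generation checkpoint, the transformers-only path looks roughly like this (a sketch following the usual trust_remote_code pattern, not the official snippet):

```python
# Sketch: TinyLLaVA-Phi-2-SigLIP-3.1B ships modeling_tinyllava_phi.py and an
# "auto_map" in config.json, so plain transformers can load it when remote
# code execution is allowed.
from transformers import AutoModelForCausalLM, AutoTokenizer

hf_path = "tinyllava/TinyLLaVA-Phi-2-SigLIP-3.1B"
model = AutoModelForCausalLM.from_pretrained(hf_path, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(hf_path, use_fast=False)
```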
@ZhangXJ199 Great, thanks, I'll give it a try |
May I ask: the model loaded with TinyLlavaForConditionalGeneration.from_pretrained has no chat function here. How can that be resolved? @ZhihuaGao @ZhangXJ199 |
You can use the generate function for inference |
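A rough sketch of what inference via generate can look like, continuing from a model/tokenizer/image_processor loaded as above; the kwarg names (e.g. images=) and the prompt construction are assumptions based on the LLaVA-style generate override, so adapt them from the Factory's own inference script:

```python
# Sketch only: a real prompt must include the image placeholder token and use
# the Factory's conversation template/preprocessing; this just shows where the
# text and image tensors go. The images= kwarg name may differ by version.
import torch
from PIL import Image

prompt_ids = tokenizer("What is in the image?", return_tensors="pt").input_ids
image = Image.open("example.jpg").convert("RGB")  # hypothetical image file
pixel_values = image_processor(image, return_tensors="pt")["pixel_values"]

with torch.inference_mode():
    output_ids = model.generate(
        prompt_ids,
        images=pixel_values,   # assumed kwarg; check the model's generate()
        max_new_tokens=128,
        do_sample=False,
    )
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```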
Zhang199/TinyLLaVA-Qwen2-0.5B-SigLIP cannot be loaded, but tinyllava/TinyLLaVA-Phi-2-SigLIP-3.1B can; upgrading the transformers library did not help either