
peft set_adapter raises an error #1792

@guxinyu1225

Description

Describe the bug (Mandatory)
After loading multiple LoRA adapters with peft, calling set_adapter to activate one of them raises AttributeError: 'LlamaForCausalLM' object has no attribute 'cells'.

  • Hardware Environment (Ascend/GPU/CPU):

Please delete the backends not involved:
/device ascend/GPU/CPU/kirin/other chips

  • Software Environment (Mandatory):
    -- MindSpore version (e.g., 1.7.0.Bxxx): 2.3.0
    -- Python version (e.g., Python 3.7.5): 3.9.10
    -- OS platform and distribution (e.g., Linux Ubuntu 16.04): Ubuntu 18.04

  • Execution Mode (Mandatory) (PyNative/Graph):

/mode pynative

To Reproduce (Mandatory)
Before loading tloen/alpaca-lora-7b and 22h/cabrita-lora-v0-1, I used

import torch
import mindspore

def torch_to_mindspore(ckpt_path, save_path):
    """Convert a PyTorch LoRA checkpoint to a MindSpore checkpoint."""
    state_dict = torch.load(ckpt_path, map_location="cpu")
    ms_ckpt = []
    for k, v in state_dict.items():
        # Rename projection weights and transpose them to MindSpore layout.
        if 'wq' in k:
            k = k.replace('wq', 'w_q')
            v = v.transpose(0, 1)
        if 'wk' in k:
            k = k.replace('wk', 'w_k')
            v = v.transpose(0, 1)
        if 'wv' in k:
            k = k.replace('wv', 'w_v')
            v = v.transpose(0, 1)
        if 'wo' in k:
            k = k.replace('wo', 'w_o')
            v = v.transpose(0, 1)
        if 'w1' in k:
            k = k.replace('w1', 'w_1')
            v = v.transpose(0, 1)
        if 'w2' in k:
            k = k.replace('w2', 'w_2')
            v = v.transpose(0, 1)
        if 'w3' in k:
            k = k.replace('w3', 'w_3')
            v = v.transpose(0, 1)
        if 'output' in k:
            v = v.transpose(0, 1)
        if 'rope' in k:
            continue  # skip rotary-embedding buffers
        ms_ckpt.append({'name': k, 'data': mindspore.Tensor(v.numpy())})
    mindspore.save_checkpoint(ms_ckpt, save_path)

to convert adapter_model.bin to adapter_model.ckpt.
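The weight-name mapping used by the converter can be exercised in isolation on plain strings (a minimal sketch; no torch or mindspore required, and the example key names are illustrative, not taken from the actual checkpoint):

```python
# Same substring replacements as the converter above, applied to key names only.
RENAMES = {'wq': 'w_q', 'wk': 'w_k', 'wv': 'w_v', 'wo': 'w_o',
           'w1': 'w_1', 'w2': 'w_2', 'w3': 'w_3'}

def rename_key(k):
    """Apply each rename whose source substring occurs in the key."""
    for old, new in RENAMES.items():
        if old in k:
            k = k.replace(old, new)
    return k

print(rename_key('attention.wq.lora_A.weight'))   # attention.w_q.lora_A.weight
print(rename_key('feed_forward.w1.lora_B.weight'))  # feed_forward.w_1.lora_B.weight
```

Keys without a matching substring (e.g. embedding weights) pass through unchanged, which matches the converter's behavior.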

from mindnlp.transformers import LlamaForCausalLM, LlamaTokenizer
from mindnlp.peft import PeftModel, LoraConfig
model_name = "baffo32/decapoda-research-llama-7B-hf"
tokenizer = LlamaTokenizer.from_pretrained(model_name)
model = LlamaForCausalLM.from_pretrained(model_name)
model = PeftModel.from_pretrained(model, "tloen/alpaca-lora-7b", adapter_name="eng_alpaca")
peft_config = LoraConfig.from_pretrained("22h/cabrita-lora-v0-1")
model.add_adapter(adapter_name="portuguese_alpaca", peft_config=peft_config)
model.load_adapter("./cabrita-lora-v0-1", adapter_name="portuguese_alpaca")
model.set_adapter("eng_alpaca")
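One possible reading of the traceback (an assumption, not confirmed by the source): mindnlp's PeftModel, like Hugging Face peft, forwards unknown attribute lookups to the wrapped base model via `__getattr__`, so when `set_adapter` internally reaches a `.cells()` call, the lookup lands on LlamaForCausalLM, which does not define it. The stub classes below are hypothetical and only illustrate that lookup path:

```python
# Hypothetical stand-ins: FakeBaseModel plays LlamaForCausalLM,
# FakePeftModel plays mindnlp's PeftModel wrapper.
class FakeBaseModel:
    pass

class FakePeftModel:
    def __init__(self, base):
        self.base_model = base

    def __getattr__(self, name):
        # Attributes not found on the wrapper are looked up on the base model,
        # mirroring HF peft's __getattr__ fallback.
        return getattr(self.base_model, name)

m = FakePeftModel(FakeBaseModel())
try:
    m.cells()  # the same lookup set_adapter would trigger
except AttributeError as e:
    print(e)  # 'FakeBaseModel' object has no attribute 'cells'
```

This would explain why the error names LlamaForCausalLM rather than PeftModel, even though set_adapter is called on the wrapper.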

Expected behavior (Mandatory)
The eng_alpaca adapter is activated.

Screenshots / Logs (Mandatory)
[Screenshot of the AttributeError traceback]

