prepare_inputs_for_generation missing when using newer transformers version #2769

@Hrovatin

Description

System Info

The error depends on the transformers version: it occurs with 4.56.0 but not with 4.49.0 (versions in between were not tested).
Using peft 0.17.1, Linux, Python 3.12.11.

Who can help?

No response

Reproduction

from transformers import AutoModelForMaskedLM
from peft import LoraConfig, get_peft_model

# I actually use a local copy of the weights, but I would expect the same with the remote ones
model = AutoModelForMaskedLM.from_pretrained("Synthyra/ESMplusplus_small", trust_remote_code=True)

lora_target_modules = ["layernorm_qkv.1", "out_proj", "query", "key",
                       "value", "dense"]
lora_config = LoraConfig(
    r=32,
    lora_alpha=64,
    lora_dropout=0.01,
    bias="none",
    target_modules=lora_target_modules,
    use_rslora=True,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

Expected behavior

When applying PEFT to ESMplusplusForMaskedLM (from https://github.com/Synthyra/FastPLMs/blob/4b38cb271bdbb06eecb310218f4c3b33b5ad8c12/modeling_esm_plusplus.py#L846), I sometimes get the error "object has no attribute 'prepare_inputs_for_generation'", depending on the transformers version, as explained above.
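
A possible workaround, as a minimal sketch: assuming the failure comes from the CAUSAL_LM task type routing PEFT through its causal-LM wrapper (which looks up prepare_inputs_for_generation on the base model), declaring a non-generation task type such as TaskType.FEATURE_EXTRACTION should avoid that code path. The task type choice here is an assumption, not a confirmed fix.

from transformers import AutoModelForMaskedLM
from peft import LoraConfig, TaskType, get_peft_model

model = AutoModelForMaskedLM.from_pretrained("Synthyra/ESMplusplus_small", trust_remote_code=True)
lora_config = LoraConfig(
    r=32,
    lora_alpha=64,
    lora_dropout=0.01,
    bias="none",
    target_modules=["layernorm_qkv.1", "out_proj", "query", "key", "value", "dense"],
    use_rslora=True,
    # Assumption: a non-generation task type keeps PEFT from wrapping the model in its
    # causal-LM class, so prepare_inputs_for_generation is never accessed on the base model.
    task_type=TaskType.FEATURE_EXTRACTION,
)
model = get_peft_model(model, lora_config)

Since ESMplusplusForMaskedLM is not used for autoregressive generation, dropping the CAUSAL_LM task type should not change the intended fine-tuning setup, but I have not verified this on the affected transformers version.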
