
Issue with Trainer on WMDP #119

Open
@chengjiali

Description

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task
  • My own task or dataset (give details below)

Reproduction

Hi, thank you for adding the WMDP dataset.

I pulled the recently merged changes and set up the dataset, then tried to run an experiment on WMDP with the following command.

#!/bin/bash

export MASTER_PORT=$(python -c "import socket; s=socket.socket(); s.bind(('', 0)); print(s.getsockname()[1]); s.close()")
echo "Master Port: $MASTER_PORT"

per_device_train_batch_size=4
gradient_accumulation_steps=2

model=zephyr-7b-beta

data_splits=(
    "cyber"
    "bio"
)

trainers=(
    "GradAscent"
    "GradDiff"
    "NPO"
    "SimNPO"
)


for data_split in "${data_splits[@]}"; do
    for trainer in "${trainers[@]}"; do

        task_name=wmdp_${model}_${data_split}_${trainer}

        CUDA_VISIBLE_DEVICES=4,5,6,7 accelerate launch --config_file configs/accelerate/default_config.yaml --main_process_port $MASTER_PORT \
            src/train.py --config-name=unlearn.yaml \
            experiment=unlearn/wmdp/default.yaml \
            model=${model} \
            data_split=${data_split} \
            trainer=${trainer} \
            task_name=${task_name} \
            paths.output_dir=saves/unlearn/${task_name}/ \
            trainer.args.per_device_train_batch_size=${per_device_train_batch_size} \
            trainer.args.gradient_accumulation_steps=${gradient_accumulation_steps} \
            trainer.args.ddp_find_unused_parameters=true \
            trainer.args.gradient_checkpointing=true
    done
done

Errors:
For GradAscent (GA):

  File "/user/open-unlearning/src/train.py", line 48, in main
    trainer, trainer_args = load_trainer(
                            ^^^^^^^^^^^^^
  File "/user/open-unlearning/src/trainer/__init__.py", line 64, in load_trainer
    trainer = trainer_cls(
              ^^^^^^^^^^^^
  File "/user/open-unlearning/src/trainer/base.py", line 19, in __init__
    super().__init__(*args, **kwargs)
TypeError: Trainer.__init__() got an unexpected keyword argument 'gamma'

For GradDiff (GD):

Traceback (most recent call last):
  File "/user/open-unlearning/src/train.py", line 48, in main
    trainer, trainer_args = load_trainer(
                            ^^^^^^^^^^^^^
  File "/user/open-unlearning/src/trainer/__init__.py", line 64, in load_trainer
    trainer = trainer_cls(
              ^^^^^^^^^^^^
  File "/user/open-unlearning/src/trainer/unlearn/grad_diff.py", line 8, in __init__
    super().__init__(*args, **kwargs)
  File "/user/open-unlearning/src/trainer/base.py", line 19, in __init__
    super().__init__(*args, **kwargs)
TypeError: Trainer.__init__() got an unexpected keyword argument 'steering_coeff'
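For context, here is a minimal, self-contained sketch of how this class of error can arise. The UnlearnTrainer stand-in and the way it handles gamma and steering_coeff below are illustrative assumptions, not the actual classes in this repo; the point is only that any method hyperparameter left in the kwargs forwarded to the base Trainer.__init__ triggers exactly this TypeError.

# Minimal illustration (not the repo's code): a subclass that consumes only some
# method hyperparameters before calling super().__init__ forwards the rest,
# and the base class then rejects them with the TypeError shown above.

class Trainer:
    """Stand-in for transformers.Trainer: accepts only the arguments it declares."""
    def __init__(self, model=None, args=None):
        self.model = model
        self.args = args


class UnlearnTrainer(Trainer):
    """Hypothetical unlearning trainer: consumes gamma but not steering_coeff."""
    def __init__(self, *args, gamma=None, **kwargs):
        self.gamma = gamma
        super().__init__(*args, **kwargs)  # whatever is left in kwargs reaches Trainer.__init__


try:
    # steering_coeff is never consumed, so it ends up at the base class.
    UnlearnTrainer(model=None, args=None, gamma=1.0, steering_coeff=20.0)
except TypeError as err:
    print(err)  # e.g. "Trainer.__init__() got an unexpected keyword argument 'steering_coeff'"

So my guess is that the method-specific hyperparameters from the WMDP experiment config are being forwarded to the base Trainer instead of being consumed by the unlearning trainer, but I have not verified where that happens.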

Could you help me look into this, and could you please provide an official script for WMDP unlearning? Thank you for your help.

Expected behavior

Training runs to completion without errors for all data splits and trainers.
