experimental result #1

@xiehou-design

Description

Hi, your work is very interesting! But I ran into some problems when following the README to reproduce the project. I ran the train_eae.sh file with CONFIG set to config_eae_ace05e_AMRBart+prefixgen+copy_enccrossprefix_tp40.json, and the only change I made to the JSON was the pretrained-model directory paths. However, the results I get are lower than those reported in your paper.
The log from the most recent epoch is as follows:

[2023-06-27 15:41:31] - __main__ - Epoch 45
Train 45: 100%|████████████████████████| 1051/1051 [05:07<00:00,  3.42it/s]
[2023-06-27 15:46:38] - __main__ - Average training loss : 0.039728544652462006...
Dev 45: 100%|██████████████████████████████| 38/38 [00:26<00:00,  1.45it/s]
[2023-06-27 15:47:05] - __main__ - --------------------------Dev Scores---------------------------------
[2023-06-27 15:47:05] - __main__ - Role I     - P: 70.79 ( 429/ 606), R: 54.79 ( 429/ 783), F: 61.77
[2023-06-27 15:47:05] - __main__ - Role C     - P: 66.01 ( 400/ 606), R: 51.09 ( 400/ 783), F: 57.60
[2023-06-27 15:47:05] - __main__ - ---------------------------------------------------------------------
[2023-06-27 15:47:05] - __main__ - {'epoch': 45, 'dev_scores': {'arg_id': (0.7079207920792079, 0.5478927203065134, 0.6177105831533477), 'arg_cls': (0.6600660066006601, 0.5108556832694764, 0.5759539236861051)}}
[2023-06-27 15:47:05] - __main__ - Current best
[2023-06-27 15:47:05] - __main__ - {'best_epoch': 36, 'best_scores': {'arg_id': (0.7207357859531772, 0.5504469987228607, 0.6241853729181752), 'arg_cls': (0.6822742474916388, 0.5210727969348659, 0.5908761766835626)}}
[2023-06-27 15:47:05] - __main__ - ./output/eae_ace05e_AMRBart+prefixgen+copy_enccrossprefix_tp40/20230627_101723/train.log
[2023-06-27 15:47:05] - __main__ - Done!
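As a sanity check on the log above: the F-scores are the harmonic mean of P and R, and since both share the same numerator (correct predictions), F reduces to 2·TP / (pred + gold). Recomputing the epoch-45 Role I dev scores from the raw counts:

```python
# Recompute the epoch-45 "Role I" dev scores from the counts in the log:
# P: 70.79 (429/606), R: 54.79 (429/783), F: 61.77
correct, predicted, gold = 429, 606, 783

precision = correct / predicted
recall = correct / gold
f1 = 2 * precision * recall / (precision + recall)  # == 2*correct/(predicted+gold)

print(f"P: {precision:.4f}, R: {recall:.4f}, F: {f1:.4f}")
# → P: 0.7079, R: 0.5479, F: 0.6177
```

These match the `dev_scores` tuple `(0.7079..., 0.5478..., 0.6177...)` printed by the script, so the gap to the paper is in the model output itself, not in the scoring.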

The contents of config_eae_ace05e_AMRBart+prefixgen+copy_enccrossprefix_tp40.json are as follows:

{
    "dataset": "ace05e",
    "gpu_device": 0,
    "seed": 100,
    "train_file": "./processed_data/ace05e_bart/train.w1.oneie.json",
    "dev_file": "./processed_data/ace05e_bart/dev.w1.oneie.json",
    "test_file": "./processed_data/ace05e_bart/test.w1.oneie.json",
    "finetune_dir": "./data/eae_ace05e/",
    "train_finetune_file": "./data/eae_ace05e/train_all.pkl",
    "dev_finetune_file": "./data/eae_ace05e/dev_all.pkl",
    "test_finetune_file": "./data/eae_ace05e/test_all.pkl", 
    "vocab_file": "./data/eae_ace05e/vocab.json",
    "model_type": "AMR+prefixgen+copy",
    "output_dir": "./output/eae_ace05e_AMRBart+prefixgen+copy_enccrossprefix_tp40",
    "cache_dir": "./cache",
    "model_name": "/media/ubuntu/projects/pretrainModel/bart-large",
    "input_style": ["event_type_sent", "triggers", "template", "na_token"],   
    "output_style": ["argument:sentence"], 
    "max_epoch": 45,
    "warmup_epoch": 5,
    "train_batch_size": 4,
    "eval_batch_size": 12,
    "accumulate_step": 1,
    "learning_rate": 1e-05,
    "weight_decay": 1e-05,
    "grad_clipping": 5.0,
    "beam_size": 1,
    "max_length": 250,
    "max_output_length": 50,
    "ignore_first_header": true,
    "use_encoder_prefix": true,
    "use_cross_prefix": true,
    "use_decoder_prefix": false,
    "AMR_model_path": "/media/ubuntu/projects/pretrainModel/AMRBART-large-finetuned-AMR3.0-AMR2Text-v2",
    "latent_dim": 1024,
    "prefix_length": 40,
    "freeze_AMR": false,
    "freeze_bart": false,
    "freeze_prefixprojector": false,
    "pretrained_model_path": null
}
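When a reproduction lands below the reported numbers, the config fields most worth double-checking are the seed, the effective batch size (train_batch_size × accumulate_step), and the learning-rate/warmup settings, since any of these can shift the final F1 by a point or more. A minimal sketch (hypothetical helper, not part of this repo) that loads such a JSON and reports those reproduction-sensitive fields:

```python
import json

# Hypothetical helper, not part of the repo: load an EAE config and report
# the fields most likely to explain run-to-run differences in final scores.
def summarize_config(path):
    with open(path) as f:
        cfg = json.load(f)
    # Effective batch size = per-step batch size x gradient accumulation steps.
    effective_batch = cfg["train_batch_size"] * cfg["accumulate_step"]
    return {
        "seed": cfg["seed"],
        "effective_batch_size": effective_batch,
        "learning_rate": cfg["learning_rate"],
        "warmup_epoch": cfg["warmup_epoch"],
        "max_epoch": cfg["max_epoch"],
        "prefix_length": cfg["prefix_length"],
    }

# For the config above this would report: seed=100,
# effective_batch_size=4 (4 x 1), learning_rate=1e-05, warmup_epoch=5.
```

If the authors trained with a different effective batch size or seed than what ships in the JSON, that alone could account for part of the gap.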
