
Using HuggingFace AMRBartTokenizer #21

Open

@Zoher15

Hi @goodbai-nlp ,

This is great work! Thanks for making your model available on huggingface. Makes things easier.

However, I am not sure I follow the instructions for generating AMRs. I simply want to generate an AMR for a sentence, and in your instructions the easiest way to do so seems to be via HuggingFace:

from transformers import BartForConditionalGeneration
from model_interface.tokenization_bart import AMRBartTokenizer      # We use our own tokenizer to process AMRs

model = BartForConditionalGeneration.from_pretrained("xfbai/AMRBART-large-finetuned-AMR3.0-AMRParsing-v2")
tokenizer = AMRBartTokenizer.from_pretrained("xfbai/AMRBART-large-finetuned-AMR3.0-AMRParsing-v2")

Are you expecting us to install your repo as a Python package? If not, how do you expect us to import your tokenizer from model_interface in our own scripts?
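For reference, the workaround I am currently considering is to clone the repo and put the directory containing model_interface on sys.path before importing. This is only a sketch, not a documented method, and the clone path below is hypothetical:

```python
import sys

# Hypothetical path to a local clone of the AMRBART repo;
# adjust to wherever `git clone` placed it, pointing at the
# directory that contains the model_interface package.
AMRBART_DIR = "/path/to/AMRBART/fine-tune"

# Make model_interface importable without installing the repo as a package.
if AMRBART_DIR not in sys.path:
    sys.path.insert(0, AMRBART_DIR)

# With the clone on sys.path, the import from the README should resolve:
try:
    from model_interface.tokenization_bart import AMRBartTokenizer
except ImportError:
    # Raised here only because the placeholder path above does not exist.
    pass
```

Is that the intended usage, or is there a packaged way to get the tokenizer?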
