Batch Inference #440

@GGital

Hello everyone, I'm currently running image-captioning inference with OFA-Huge fine-tuned on COCO over roughly 48k images, but it is very slow because I can only process 1 image per batch (about 1 image/sec, which works out to roughly 13 hours for the whole dataset). Is there any way to do batched inference on my test set while still keeping beam search generation?
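To frame what I'm hoping for, here is a rough sketch of batched generation following the usage shown on the Hugging Face OFA model card (which requires the OFA-Sys fork of `transformers`). `patch_resize_transform` and `test_images` are placeholders for my own preprocessing and dataset, and the `patch_images` kwarg and prompt come from the model card, so they may differ for the fairseq checkpoints:

```python
import torch
from torch.utils.data import DataLoader

# Sketch only, not the official OFA inference script. `model`, `tokenizer`,
# `patch_resize_transform`, and `test_images` are placeholders for whatever
# the checkpoint/repo on your side provides.

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device).eval()

# Encoder prompt used for OFA-style captioning (from the model card).
prompt_ids = tokenizer([" what does the image describe?"],
                       return_tensors="pt").input_ids

def collate(images):
    # Stack per-image tensors into a single (B, C, H, W) batch.
    return torch.stack([patch_resize_transform(img) for img in images])

loader = DataLoader(test_images, batch_size=16, collate_fn=collate)

captions = []
with torch.no_grad():
    for batch in loader:
        batch = batch.to(device)
        # generate() runs beam search independently for each sample in the
        # batch, so batching improves throughput without changing the search.
        out_ids = model.generate(
            prompt_ids.expand(batch.size(0), -1).to(device),
            patch_images=batch,      # kwarg name per the HF OFA port
            num_beams=5,
            no_repeat_ngram_size=3,
        )
        captions.extend(
            tokenizer.batch_decode(out_ids, skip_special_tokens=True))
```

Assuming beam search is batched this way, a batch size of 16 should cut the wall-clock time well below 13 hours, so I mainly want to confirm whether the OFA code path supports this.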
