
Sam2 in 🤗 Transformers #682

@yonigozlan


Hello!

I'm an ML engineer at Hugging Face, and we're (finally) implementing Sam2 in Transformers! The PR is here and should be almost ready to merge.

Since the checkpoints on the Hub (such as this one for 2.1 tiny) don't contain safetensors or config files that would conflict with Transformers checkpoints, I was wondering if you would be open to hosting the Transformers checkpoints in the existing repos, alongside the existing weights and YAML configs?

This wouldn't change anything for current users of these repos, but it would let users who want the Transformers implementation load the model via:

```python
from transformers import AutoModel

sam2_model = AutoModel.from_pretrained("facebook/sam2.1-hiera-tiny")
```

Happy to provide more information if needed, and looking forward to your answer!
