Hello!
I'm an ML engineer at Hugging Face, and we're (finally) implementing Sam2 in Transformers! The PR is here, and should be almost ready to go.
Seeing as the checkpoints on the Hub (such as this one for 2.1 tiny) don't contain safetensors or config files that would conflict with transformers checkpoints, I was wondering if you would be open to hosting the transformers checkpoints on the existing repo, alongside the existing weights and yaml configs?
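For reference, one way to double-check that the current repo layout wouldn't clash with the files a transformers checkpoint typically adds (e.g. `config.json` and `model.safetensors`) is to list the repo contents with `huggingface_hub`. This is just an illustrative sketch; the expected file names are assumptions:

```python
from huggingface_hub import list_repo_files

# List what facebook/sam2.1-hiera-tiny currently ships on the Hub.
# Expectation (assumption): only the original .pt weights and yaml config,
# i.e. no config.json or *.safetensors that a transformers checkpoint would add.
files = list_repo_files("facebook/sam2.1-hiera-tiny")
print(files)
```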
It wouldn't change anything for the current users of this repo, but it would enable users wanting to use the transformers implementation to load it via:

```python
from transformers import AutoModel

sam2_model = AutoModel.from_pretrained("facebook/sam2.1-hiera-tiny")
```
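And to illustrate that the existing workflow stays untouched, current users of the sam2 package would keep loading from the same repo exactly as today. A minimal sketch, based on the loading path shown in this repo's README:

```python
from sam2.sam2_image_predictor import SAM2ImagePredictor

# Existing users keep pulling the original .pt weights + yaml config
# from the same Hub repo, unaffected by any added transformers files.
predictor = SAM2ImagePredictor.from_pretrained("facebook/sam2.1-hiera-tiny")
```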
Happy to give more information if needed, and looking forward to your answer!