Description
Currently, the CLI says HF_TOKEN is required when using --model-tokenizer <model-id>, even for public models.
However, most public models on Hugging Face can be downloaded without a token, so forcing users to always set HF_TOKEN is unnecessary. In a Kubernetes environment, the pod running genai-bench may not have the model's PVC mounted, which means the tokenizer has to be downloaded from Hugging Face anyway; requiring a token for that download is an inconvenient design.
Expected behavior:
- Try unauthenticated download first.
- Use HF_TOKEN only when access is denied (e.g., for private/gated models); see the sketch after this list.
- Update docs to mark HF_TOKEN as optional, not required.
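For illustration, here is a minimal sketch of that fallback, assuming the tokenizer is fetched via transformers' AutoTokenizer (the actual loading path inside genai-bench may differ, and the helper name load_tokenizer is just an example):

```python
import os

from huggingface_hub.utils import GatedRepoError, RepositoryNotFoundError
from transformers import AutoTokenizer


def load_tokenizer(model_id: str):
    """Try an unauthenticated download first; use HF_TOKEN only if access is denied."""
    try:
        # Public models: no token needed, so don't require one up front.
        return AutoTokenizer.from_pretrained(model_id)
    except (GatedRepoError, RepositoryNotFoundError, OSError):
        # Private/gated models (or repos hidden from anonymous users):
        # retry with HF_TOKEN only if the user actually set it.
        token = os.environ.get("HF_TOKEN")
        if not token:
            raise
        return AutoTokenizer.from_pretrained(model_id, token=token)
```

This keeps public-model runs token-free while still supporting gated models whenever HF_TOKEN happens to be set.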
Thanks!