HF - Model Export Issue with OSS models #3609

@ep150de

Description

🐛 Describe the bug

When trying to convert OpenAI's gpt-oss model using the model export and model optimizer Spaces on Hugging Face, I get the following error:

Error: The checkpoint you are trying to load has model type gpt_oss but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
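For context, this error comes from the auto-class lookup in Transformers: the loader reads model_type from the checkpoint's config.json and matches it against a registry of architectures known to the installed release, so a Transformers version that predates gpt_oss support has no entry for it. A minimal sketch of that lookup (names here are illustrative, not the actual Transformers internals):

```python
# Illustrative sketch of a Transformers-style auto-class resolving a
# checkpoint's "model_type" against a registry of known architectures.
# KNOWN_ARCHITECTURES stands in for the real registry; an older release
# simply lacks a "gpt_oss" entry, which triggers the reported error.

KNOWN_ARCHITECTURES = {"gpt2", "llama", "mistral"}  # no "gpt_oss" here

def resolve_model_type(model_type: str) -> str:
    """Return the model type if recognized, else fail like Transformers does."""
    if model_type not in KNOWN_ARCHITECTURES:
        raise ValueError(
            f"The checkpoint you are trying to load has model type {model_type} "
            "but Transformers does not recognize this architecture."
        )
    return model_type
```

Under this view, the fix on the Space's side is to bump its pinned transformers dependency to a release whose registry includes gpt_oss.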

Environment

WebUI - Huggingface browser
Windows 11
Chrome Browser

Minimal Reproducible Example

Repro steps:

1. Open https://huggingface.co/spaces/OpenVINO/nncf-quantization or https://huggingface.co/spaces/OpenVINO/export
2. Enter the hub model ID (openai/gpt-oss-20b) in the model ID input box. Model source: https://huggingface.co/openai/gpt-oss-20b
3. Run the export.
4. The run fails; see the error in the output.

Are you going to submit a PR?

  • Yes, I'd like to help by submitting a PR!

Metadata

Assignees: no one assigned
Labels: bug (Something isn't working)
Type: no type
Projects: no projects
Milestone: no milestone
Relationships: none yet
Development: no branches or pull requests