Feature Request: Add Ernie4.5MoE support #14465

Open
@EliEron

Description

Prerequisites

  • I am running the latest code. Mention the version if possible as well.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new and useful enhancement to share.

Feature Description

Currently, llama.cpp only supports the dense 0.3B ERNIE 4.5 model, but all other models in the ERNIE 4.5 family are MoE models, which have not been implemented yet.

Motivation

ERNIE 4.5 is an exciting new family of open, Apache 2.0-licensed multimodal models from Baidu, boasting some pretty impressive benchmark results.

There is a lot of hype around these models, and I feel a lot of people would love to see them fully supported in llama.cpp, especially since these models should run quite well on CPU-centric setups thanks to their MoE architecture.

Possible Implementation

There are open PRs for vLLM and SGLang that can be used as reference.
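To illustrate the kind of computation such support would need to cover, here is a minimal, framework-agnostic sketch of top-k expert routing for a single token, which is the core pattern that makes MoE inference cheap on CPU (only a few experts are evaluated per token). The expert count, top-k value, and toy expert layers below are placeholders for illustration only, not ERNIE 4.5's actual configuration or llama.cpp's ggml-based implementation.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <numeric>
#include <vector>

struct Expert {
    // Toy "expert": just scales the hidden state (a stand-in for an FFN block).
    float scale;
    std::vector<float> forward(const std::vector<float> & x) const {
        std::vector<float> y(x.size());
        for (size_t i = 0; i < x.size(); ++i) y[i] = scale * x[i];
        return y;
    }
};

// Route one token through the top_k highest-scoring experts and mix their
// outputs with softmax-normalized router weights.
std::vector<float> moe_forward(const std::vector<float> & x,
                               const std::vector<float> & router_logits,
                               const std::vector<Expert> & experts,
                               int top_k) {
    // select the indices of the top_k largest router logits
    std::vector<int> idx(router_logits.size());
    std::iota(idx.begin(), idx.end(), 0);
    std::partial_sort(idx.begin(), idx.begin() + top_k, idx.end(),
        [&](int a, int b) { return router_logits[a] > router_logits[b]; });

    // softmax over the selected logits only
    const float max_logit = router_logits[idx[0]];
    std::vector<float> w(top_k);
    float sum = 0.0f;
    for (int k = 0; k < top_k; ++k) {
        w[k] = std::exp(router_logits[idx[k]] - max_logit);
        sum += w[k];
    }

    // weighted sum of the selected experts' outputs; the remaining experts
    // are never evaluated, which is why only the active parameters cost compute
    std::vector<float> out(x.size(), 0.0f);
    for (int k = 0; k < top_k; ++k) {
        const std::vector<float> y = experts[idx[k]].forward(x);
        for (size_t i = 0; i < x.size(); ++i) out[i] += (w[k] / sum) * y[i];
    }
    return out;
}

int main() {
    const std::vector<float> x = {1.0f, 2.0f, 3.0f};
    const std::vector<float> router_logits = {0.1f, 2.0f, -1.0f, 1.5f};
    const std::vector<Expert> experts = {{0.5f}, {1.0f}, {2.0f}, {1.5f}};

    const std::vector<float> out = moe_forward(x, router_logits, experts, /*top_k=*/2);
    for (float v : out) printf("%.3f ", v);
    printf("\n");
}
```

In llama.cpp this logic would of course be expressed as graph operations with the model's real router, expert FFN weights, and any shared-expert handling taken from the conversion of the ERNIE 4.5 checkpoints; the referenced vLLM and SGLang PRs should clarify those details.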
