
GPU out of memory when running example code #479

@Lliar-liar

Description


Hi,
When running infer_llm_base.py in InternLM-XComposer/InternLM-XComposer-2.5-OmniLive/example, my 40 GB A100 runs out of memory. I'm using the model locally (downloaded from Hugging Face). Is this a problem with the model or with the code?
PS: Please provide a more detailed guide for running inference with InternLM-XComposer-2.5-OmniLive. Thanks!
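For context on the memory budget, here is a back-of-envelope estimate of the GPU memory taken by model weights alone, assuming a roughly 7B-parameter base LLM (an assumption; I have not verified the exact parameter count against the repo):

```python
def weight_gib(params_billion: float, bytes_per_param: int) -> float:
    """Approximate memory (GiB) occupied by model weights alone."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

fp32 = weight_gib(7, 4)  # float32: 4 bytes per parameter
bf16 = weight_gib(7, 2)  # bfloat16/float16: 2 bytes per parameter
print(f"fp32: {fp32:.1f} GiB, bf16: {bf16:.1f} GiB")
```

Note these figures cover weights only: activations, the KV cache, and any additional vision/audio modules loaded alongside the LLM all add to this, and loading in float32 roughly doubles the weight footprint compared to half precision.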
