[Bug] Regression: "opea/vllm-gaudi:latest" container in crash loop #1038
Comments
The only change in OPEA Git since v1.1 is the dropping of the […]. However, comparing the […] shows quite a few differences, also in the sizes of the installed layers. => I think the problem is on the Habana repo side. Recent Gaudi vLLM dependency changes are one possibility: https://github.com/HabanaAI/vllm-fork/commits/habana_main/requirements-hpu.txt Maybe the new HPU deps do not correctly handle the pod's Gaudi plugin device request, which allows vLLM (write) access to only one of the 8 devices in the node?
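The layer comparison mentioned above can be reproduced locally. A minimal sketch, assuming both tags have already been pulled (the `layers` helper is mine, not from the report):

```shell
# layers IMAGE: print one RootFS layer digest per line for a locally
# available image, so two tags can be compared with diff.
layers() {
  docker inspect --format '{{range .RootFS.Layers}}{{println .}}{{end}}' "$1"
}

# Usage (requires "docker pull" of both tags first):
# diff <(layers opea/vllm-gaudi:1.1) <(layers opea/vllm-gaudi:latest)
```

Identical layer lists would mean the two tags ship the same filesystem; any diff output points at where the images diverge.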
Not sure how the docker files are built. Here both v1.1 and latest are built from the same commit id @ashahba
@xiguiw As you can see from the OPEA script, it […] [1] It would be faster to clone the specific commit instead of first cloning the whole repo and only then checking out that specific commit.
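The shallow-fetch idea in [1] can be sketched roughly as follows. This is an assumption about how the build could work, not the actual OPEA build script; GitHub serves reachable commits by SHA, so fetching vllm-fork this way should work:

```shell
# shallow_clone_commit REPO_URL SHA DIR: fetch only the wanted commit at
# depth 1 instead of cloning the full history and then checking it out.
shallow_clone_commit() {
  repo_url=$1; sha=$2; dir=$3
  git init -q "$dir" &&
  git -C "$dir" remote add origin "$repo_url" &&
  git -C "$dir" fetch -q --depth 1 origin "$sha" &&
  git -C "$dir" checkout -q FETCH_HEAD
}

# Usage (the commit id is a placeholder, not taken from this report):
# shallow_clone_commit https://github.com/HabanaAI/vllm-fork.git <commit-sha> vllm-fork
```

This downloads a single commit's tree rather than the whole history, which is the speedup being suggested.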
@eero-t I mean, if the vllm-gaudi image is built from the same commit in v1.1 and latest, it does not make sense that v1.1 works but latest fails. If the v1.1 and latest docker images are built with different commit IDs, that is possible.
Priority
Undecided
OS type
Ubuntu
Hardware type
Gaudi2
Installation method
Deploy method
Running nodes
Single Node
What's the version?
https://hub.docker.com/layers/opea/vllm-gaudi/latest/images/sha256-d2c0b0aa88cd26ae2084990663d8d789728f658bacacd8a49cc5b81a6a022c58
Description
The vllm-gaudi:latest container does not find devices and is in a crash loop. But if I change the latest tag to 1.1, it works fine, i.e. this is a regression.

Reproduce steps
Apply: opea-project/GenAIInfra#610
Then run ChatQnA from GenAIInfra:
$ helm install chatqna chatqna/ --skip-tests --values chatqna/gaudi-vllm-values.yaml ...
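Once the chart is installed, the crash-loop evidence for the "Raw log" section can be gathered along these lines. The pod-name match on "vllm" is an assumption; adjust it to the chart's actual pod names:

```shell
# collect_vllm_crash_info: describe the vllm pod and print its crash log.
collect_vllm_crash_info() {
  pod=$(kubectl get pods -o name | grep vllm | head -n 1)
  kubectl describe "$pod"
  # --previous shows the log of the last crashed container instance, if any;
  # fall back to the current container's log otherwise.
  kubectl logs --previous "$pod" 2>/dev/null || kubectl logs "$pod"
}
```

`kubectl logs --previous` is the key part for a CrashLoopBackOff pod, since the current container may have no output yet.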
Raw log