[CI/build] Add libraries needed for building VLLM wheel to the test docker image. #29672
Purpose
Fix building the VLLM wheel using the VLLM docker image. Mostly fixes issue #29669.
In pull request #29270 we switched from the devel to the basic nvidia image as the base for the default and test VLLM docker images. The basic nvidia image doesn't contain all of the libraries needed to build a VLLM wheel. Even though we install the nvrtc library in the test VLLM docker image, CMake can't find it. When I install the dev version of the nvrtc library instead, CMake does find it, but the build then fails while compiling kernels. I found the minimal set of dev libraries needed to compile the VLLM wheel successfully. The original test image size is 24.8 GB; with the dev libraries it is 28.6 GB, and with the devel nvidia image it would be 34.4 GB. In any case, the changes to the test image do not increase the size of the default docker image.
This change doesn't address the separate problem that the build Python package is missing from the test VLLM docker image.
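For illustration, the change amounts to installing CUDA dev packages into the test stage of the Dockerfile, along the lines of the sketch below. The package names here are hypothetical placeholders; the authoritative minimal set is in this PR's diff.

```dockerfile
# Hypothetical sketch -- the real package list lives in this PR's diff.
# Installs the CUDA dev packages (headers plus dev libraries) that CMake
# needs to configure and compile VLLM kernels inside the test image.
RUN apt-get update && apt-get install -y --no-install-recommends \
        cuda-nvrtc-dev-12-4 \
        cuda-cudart-dev-12-4 \
    && rm -rf /var/lib/apt/lists/*
```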
Test Plan
Build the VLLM test docker image.

```
docker build ./vllm --target test --tag ... --file ./vllm/docker/Dockerfile
```

Run the VLLM test docker image.

```
docker container run --rm -it --network host --gpus all --shm-size=2g --entrypoint /bin/bash -v [map vllm dir] ...
```

Build the full VLLM wheel.

```
pip install build && cd vllm && python3 -m build --wheel
```

Test Result
The VLLM wheel is built successfully.
Essential Elements of an Effective PR Description Checklist
supported_models.md and examples for a new model.