LLM demos adjustments for Windows #2940

Merged: 23 commits merged into main from win_llm_demo on Jan 16, 2025
Conversation

@mzegla (Collaborator) commented Dec 20, 2024

No description provided.

@mzegla changed the title from "Include self-contained Python in the package and adjust demo" to "LLM demos adjustments for Windows" on Jan 9, 2025
@mzegla marked this pull request as ready for review on January 13, 2025 13:23
@rasapala (Collaborator) left a comment

Some comments.

```bat
copy %cd%\bazel-out\x64_windows-opt\bin\src\python39.dll dist\windows\ovms
if !errorlevel! neq 0 exit /b !errorlevel!
xcopy C:\opt\ovms-python-3.9.6-embed dist\windows\ovms\python /E /I /H
if %errorlevel% neq 0 (
```

Use !errorlevel! here as well, since delayed expansion is already used for the check above.
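
A minimal sketch of the suggested change, assuming the intent is simply to use delayed expansion for both checks; the body of the second if block is not shown in the reviewed hunk:

```bat
copy %cd%\bazel-out\x64_windows-opt\bin\src\python39.dll dist\windows\ovms
if !errorlevel! neq 0 exit /b !errorlevel!
xcopy C:\opt\ovms-python-3.9.6-embed dist\windows\ovms\python /E /I /H
if !errorlevel! neq 0 (
    rem handle the xcopy failure; the original block body is not part of the hunk
    exit /b !errorlevel!
)
```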

setupvars.bat (outdated)
```bat
:: limitations under the License.
::
@echo off
setlocal EnableExtensions EnableDelayedExpansion
```

remove setlocal and endlocal, otherwise the settings will not propagate to the calling terminal.
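
A sketch of what this would look like, assuming the script's purpose is to export environment settings to the caller; the variable names below are illustrative, not taken from the actual setupvars.bat:

```bat
:: limitations under the License.
::
@echo off
:: no setlocal/endlocal here, so the variables set below remain
:: visible in the terminal that called this script
set "OVMS_DIR=%~dp0"
set "PATH=%OVMS_DIR%;%PATH%"
```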

pip3 install -U -r demos/common/export_models/requirements.txt

Download the export script, install its dependencies and create a directory for the models:
```console
curl https://raw.githubusercontent.com/openvinotoolkit/model_server/refs/heads/main/demos/common/export_models/export_model.py
```

This command only shows the file content - it does not download it.
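
A sketch of the corrected command, using curl's -o option to write the script to a file instead of printing it to the console:

```console
curl https://raw.githubusercontent.com/openvinotoolkit/model_server/refs/heads/main/demos/common/export_models/export_model.py -o export_model.py
```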


**CPU**
```console
python demos/common/export_models/export_model.py text_generation --source_model meta-llama/Meta-Llama-3-8B-Instruct --weight-format fp16 --kv_cache_precision u8 --config_file_path models/config.json --model_repository_path models --overwrite_models
```

Between the demos, some commands use `python demos/common/export_models/export_model.py` and others use `python export_model.py` - maybe unify?
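
As an illustration, the two forms that appear across the demos are the repository-relative one shown above and a short form that assumes the working directory already contains the script (arguments here copied from the CPU example above), e.g.:

```console
cd demos/common/export_models
python export_model.py text_generation --source_model meta-llama/Meta-Llama-3-8B-Instruct --weight-format fp16 --kv_cache_precision u8 --config_file_path models/config.json --model_repository_path models --overwrite_models
```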

```console
git clone --branch v0.6.0 --depth 1 https://github.com/vllm-project/vllm
cd vllm
pip3 install -r requirements-cpu.txt --extra-index-url https://download.pytorch.org/whl/cpu
cd benchmarks
wget https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered/resolve/main/ShareGPT_V3_unfiltered_cleaned_split.json # sample dataset
curl https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered/resolve/main/ShareGPT_V3_unfiltered_cleaned_split.json # sample dataset
```

Add "-o" so curl saves the downloaded file instead of printing it.
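
A sketch of the curl line with the suggested option applied, saving the dataset under the same filename wget would have used:

```console
curl https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered/resolve/main/ShareGPT_V3_unfiltered_cleaned_split.json -o ShareGPT_V3_unfiltered_cleaned_split.json # sample dataset
```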

@mzegla merged commit 1b5032b into main on Jan 16, 2025
8 checks passed
@mzegla deleted the win_llm_demo branch on March 6, 2025 09:52