Trying to get this working under Windows #12

@SoftologyPro

I install without Apex using this batch file...

@echo off

cd
echo *** Deleting Lumina-Video directory if it exists
if exist Lumina-Video\. rd /S /Q Lumina-Video

echo *** Cloning Lumina-Video repository
git clone https://github.com/Alpha-VLLM/Lumina-Video
cd Lumina-Video

echo *** Creating venv
python -m venv venv
call venv\scripts\activate.bat

echo *** Installing requirements
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install https://huggingface.co/datasets/Softology-Pro/VoC/resolve/main/flash_attn-2.7.1.post1+cu124torch2.5.1cxx11abiFALSE-cp310-cp310-win_amd64.whl
pip install imageio
pip install https://huggingface.co/datasets/Softology-Pro/VoC/resolve/main/triton-3.0.0-cp310-cp310-win_amd64.whl

echo *** Installing GPU Torch
pip uninstall -y torch
pip uninstall -y torch
pip install --no-cache-dir --ignore-installed --force-reinstall --no-warn-conflicts torch==2.5.1+cu124 torchvision torchaudio --index-url https://download.pytorch.org/whl/cu124

rem echo *** Installing Apex
rem git clone https://github.com/NVIDIA/apex
rem cd apex
rem pip install wheel
rem pip install setuptools
rem pip install -v --disable-pip-version-check --no-cache-dir --no-build-isolation --config-settings "--build-option=--cpp_ext" --config-settings "--build-option=--cuda_ext" ./

echo *** Downloading models
if not exist "ckpts" mkdir "ckpts"
if not exist "ckpts\f24R960" mkdir "ckpts\f24R960"
huggingface-cli download --resume-download Alpha-VLLM/Lumina-Video-f24R960 --local-dir ./ckpts/f24R960

call venv\scripts\deactivate.bat
cd ..

echo *** Finished Lumina-Video install
echo.
echo *** Scroll up and check for errors.  Do not assume it worked.
pause
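
As a sanity check after the install (a diagnostic sketch, not part of the repo), the following confirm that the CUDA build of torch actually landed and show which distributed backends the Windows wheel ships with:

python -c "import torch, torch.distributed as dist; print(torch.__version__, torch.version.cuda, torch.cuda.is_available(), dist.is_nccl_available(), dist.is_gloo_available())"
python -c "import flash_attn; print(flash_attn.__version__)"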

And then run with this batch file...

@echo off
set USE_LIBUV=0
set TORCHDYNAMO_VERBOSE=1
set PL_TORCH_DISTRIBUTED_BACKEND=gloo
cd Lumina-Video
call venv\scripts\activate.bat
python -u generate.py --ckpt ./ckpts/f24R960 --resolution 1248x704 --fps 24 --frames 96 --prompt "Shrek eating pizza" --neg_prompt "" --sample_config f24F96R960
call venv\scripts\deactivate.bat
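
For reference, the rendezvous seems to be picking up kubernetes.docker.internal:10000 from somewhere (see the socket warning below); explicitly setting the env:// rendezvous variables to localhost in the run batch file might sidestep that part. The values here are assumptions, not something the repo documents:

set MASTER_ADDR=127.0.0.1
set MASTER_PORT=29500
set RANK=0
set WORLD_SIZE=1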

But it gives these errors...

D:\Tests\Lumina-Video>run
D:\Tests\Lumina-Video\Lumina-Video\models\components.py:6: UserWarning: Cannot import apex RMSNorm, switch to vanilla implementation
  warnings.warn("Cannot import apex RMSNorm, switch to vanilla implementation")
D:\Tests\Lumina-Video\Lumina-Video\venv\lib\site-packages\torch\distributed\distributed_c10d.py:730: UserWarning: Attempted to get default timeout for nccl backend, but NCCL support is not compiled
  warnings.warn(
[W312 20:07:28.000000000 socket.cpp:752] [c10d] The client socket has failed to connect to [kubernetes.docker.internal]:10000 (system error: 10049 - The requested address is not valid in its context.).
Traceback (most recent call last):
  File "D:\Tests\Lumina-Video\Lumina-Video\generate.py", line 238, in <module>
    main()
  File "D:\Tests\Lumina-Video\Lumina-Video\generate.py", line 224, in main
    dist.init_process_group("nccl")
  File "D:\Tests\Lumina-Video\Lumina-Video\venv\lib\site-packages\torch\distributed\c10d_logger.py", line 83, in wrapper
    return func(*args, **kwargs)
  File "D:\Tests\Lumina-Video\Lumina-Video\venv\lib\site-packages\torch\distributed\c10d_logger.py", line 97, in wrapper
    func_return = func(*args, **kwargs)
  File "D:\Tests\Lumina-Video\Lumina-Video\venv\lib\site-packages\torch\distributed\distributed_c10d.py", line 1527, in init_process_group
    default_pg, _ = _new_process_group_helper(
  File "D:\Tests\Lumina-Video\Lumina-Video\venv\lib\site-packages\torch\distributed\distributed_c10d.py", line 1750, in _new_process_group_helper
    raise RuntimeError("Distributed package doesn't have NCCL built in")
RuntimeError: Distributed package doesn't have NCCL built in
D:\Tests\Lumina-Video\Lumina-Video>

Any ideas how I can get this working under Windows (on a 24GB 4090)?
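
For what it's worth, the Windows wheels of PyTorch are built without NCCL, so the hard-coded dist.init_process_group("nccl") in generate.py will always raise this error there, and PL_TORCH_DISTRIBUTED_BACKEND appears to be a PyTorch Lightning setting with no effect on a direct torch.distributed call. A possible (untested) workaround sketch is to select the backend at runtime in generate.py and fall back to gloo on Windows; whether the rest of the pipeline then runs correctly on a single 4090 is unconfirmed:

import os
import torch.distributed as dist

# Single-process rendezvous defaults (assumed values) so the TCPStore
# binds to localhost instead of a stale hostname.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
os.environ.setdefault("RANK", "0")
os.environ.setdefault("WORLD_SIZE", "1")

# NCCL is not compiled into the Windows wheels; fall back to gloo there.
backend = "nccl" if dist.is_nccl_available() else "gloo"
dist.init_process_group(backend)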
