
DisTorch 2 - offloading to same GPU unload issue #155

@ppereirasky

Description


Hi there,

First of all, huge congrats on the most useful ComfyUI extension around, and for the great work building it. 👍

I am currently running Wan 2.2 I2V 14B fp8_e5m2 on my 5090 using your DisTorch 2 nodes.

I am offloading 1 GB of each model (high noise and low noise) to a second GPU, a 5060 Ti 16 GB, which already holds the VAE and the T5 models.
This works great. However, I noticed that when the workflow switches from the high-noise model to the low-noise model (both residing almost entirely on the 5090), the 1 GB "distorched" to the 5060 Ti is unloaded first, and only then is the 1 GB of the low-noise model loaded.

Even with enough free VRAM on the 5060 Ti, the two "distorched" 1 GB chunks don't remain loaded together: before one is used, the other is unloaded. It works, but it would be faster if both stayed in VRAM.

Best Regards and Merry Christmas :)
