GETTING VERY WEIRD ERROR #607
Replies: 1 comment · 5 replies
-
I'll paste the console logs in a moment. I don't understand it: it seems to get to 99% finished, even shows a preview, and then it fails.
-
I'm also not loading xformers, so I don't know where it gets that from.
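One way to check whether xformers is even present in the webui's virtual environment is to ask the interpreter itself. A minimal sketch (run it with the venv's own `python.exe`; the helper name is mine, not part of the webui):

```python
import importlib.util

def is_installed(module_name: str) -> bool:
    """Return True if the module can be found on this interpreter's path."""
    return importlib.util.find_spec(module_name) is not None

# Run with the webui venv's python to see whether xformers is present at all.
print("xformers installed:", is_installed("xformers"))
```

If this prints `False` but the webui still mentions xformers, the message is likely coming from a different environment or a cached launch option.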
-
```
venv "A:\AMD-AUTO1111\stable-diffusion-webui-amdgpu\venv\Scripts\Python.exe"
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
Repository Not Found for url: https://huggingface.co/None/resolve/main/config.json.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
Failed to create model quickly; will retry using slow method.
To create a public link, set
```
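The `Repository Not Found for url: https://huggingface.co/None/...` line usually means the code built a Hugging Face URL from a model id that was never set, so the literal string `None` ended up in the path. A minimal sketch of how that happens (the variable name is illustrative, not the webui's actual code):

```python
model_id = None  # a checkpoint id that was never resolved (illustrative)
url = f"https://huggingface.co/{model_id}/resolve/main/config.json"
print(url)  # https://huggingface.co/None/resolve/main/config.json
```

So the error is downstream of a missing/unresolved model, not a problem with Hugging Face itself.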
-
OK, I deleted the previous install (I had followed a YouTube channel's recommendations). Instead I used this from your page: install Python 3.10.6 (ticking Add to PATH) and Git, and now it is working.
-
I also added Strawberry Perl and ZLUDA to the PATH environment variable. It just runs super slow, like 4 times slower than my previous RTX 3060.
-
So this is the error message I get when I try to run an image generation. I installed everything as stipulated, and I'm running an AMD 9070 card:
```
NotImplementedError: No operator found for `memory_efficient_attention_forward` with inputs:
     query       : shape=(1, 6912, 1, 512) (torch.float32)
     key         : shape=(1, 6912, 1, 512) (torch.float32)
     value       : shape=(1, 6912, 1, 512) (torch.float32)
     attn_bias   : <class 'NoneType'>
     p           : 0.0
`decoderF` is not supported because:
    max(query.shape[-1] != value.shape[-1]) > 128
    device=cpu (supported: {'cuda'})
    attn_bias type is <class 'NoneType'>
    operator wasn't built - see `python -m xformers.info` for more info
`[email protected]` is not supported because:
    max(query.shape[-1] != value.shape[-1]) > 256
    device=cpu (supported: {'cuda'})
    dtype=torch.float32 (supported: {torch.float16, torch.bfloat16})
    operator wasn't built - see `python -m xformers.info` for more info
`tritonflashattF` is not supported because:
    max(query.shape[-1] != value.shape[-1]) > 128
    device=cpu (supported: {'cuda'})
    dtype=torch.float32 (supported: {torch.float16, torch.bfloat16})
    operator wasn't built - see `python -m xformers.info` for more info
    triton is not available
`cutlassF` is not supported because:
    device=cpu (supported: {'cuda'})
    operator wasn't built - see `python -m xformers.info` for more info
`smallkF` is not supported because:
    max(query.shape[-1] != value.shape[-1]) > 32
    device=cpu (supported: {'cuda'})
    operator wasn't built - see `python -m xformers.info` for more info
    unsupported embed per head: 512
```

Can anyone translate what this all means?
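Roughly: xformers tried every memory-efficient attention kernel it ships, and each one rejected these inputs for some combination of three reasons: the tensors are on `cpu` (every kernel wants `cuda`), the dtype is `float32` (most kernels want `float16`/`bfloat16`), and the per-head embedding dim is 512 (the listed kernels top out at 32/128/256). A plain-Python restatement of those checks (an illustrative sketch only; limits are taken from the error text above, not from xformers' actual dispatch code):

```python
def why_attention_kernel_fails(shape, dtype, device):
    """Re-check, in plain Python, the constraints listed in the error above.
    Illustrative sketch; not xformers' real dispatch logic."""
    reasons = []
    if device != "cuda":
        reasons.append(f"device={device} (supported: cuda)")
    if dtype not in ("torch.float16", "torch.bfloat16"):
        reasons.append(f"dtype={dtype} (supported: float16/bfloat16)")
    if shape[-1] > 256:  # largest per-head dim any listed kernel accepts
        reasons.append(f"embed per head {shape[-1]} exceeds every kernel's limit")
    return reasons

# The exact inputs from the error message:
for reason in why_attention_kernel_fails((1, 6912, 1, 512), "torch.float32", "cpu"):
    print(reason)
```

All three checks fail here, which is why every kernel in the list bails out. The deeper issue is that the model is running on CPU at all; on an AMD card the webui should be using its ZLUDA/ROCm device rather than falling back to CPU with xformers enabled.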