
Some questions about FlashAttention #20

Open
@Rolin-zrl

Description


Building the wheel from source is very slow, so I downloaded the prebuilt wheel "flash_attn-2.6.1+cu118torch2.1cxx11abiTRUE-cp310-cp310-linux_x86_64.whl" and installed it via pip, matching my Linux setup of torch==2.1, cuda==11.8, and python==3.10. However, importing it still fails with the error below:
ImportError: /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.26' not found (required by /home/whuav/anaconda3/envs/MeshAnything/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so)
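For reference, a minimal diagnostic sketch to confirm which GLIBCXX symbol versions the system libstdc++ actually exports (assumptions: it is run on the same machine, `strings` from binutils is installed, and the library path is the one from the traceback):

```python
# Diagnostic sketch, not a fix: list the GLIBCXX_* symbol versions that
# /usr/lib/x86_64-linux-gnu/libstdc++.so.6 provides, to check whether
# GLIBCXX_3.4.26 is really absent from the system library.
import subprocess

out = subprocess.run(
    ["strings", "/usr/lib/x86_64-linux-gnu/libstdc++.so.6"],
    capture_output=True, text=True, check=True,
).stdout
versions = sorted({line for line in out.splitlines() if line.startswith("GLIBCXX_")})
print("\n".join(versions))
```

If GLIBCXX_3.4.26 is not in the output, the system libstdc++ is older than what the flash_attn binary was built against.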

Any suggestions?
