To use the `--use-flash-attention` feature, the `flash-attn` package must be installed first, e.g. with the command: `J:\ComfyUI\ComfyUI_windows_portable\python_embeded\python.exe -m pip install flash-attn` ...
I am encountering an error while attempting to install the flash-attn library on my Windows 11 machine with CUDA 11.8. Even though the nvcc compiler and CUDA are properly installed and on the PATH, the ...
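A common cause of such build failures is a mismatch between the CUDA toolkit that `nvcc` reports and the CUDA version PyTorch was built against. As a minimal sketch (the helper name and sample string are illustrative, not from flash-attn itself), one can parse the version out of `nvcc --version` output before attempting the install:

```python
import re

def parse_nvcc_version(nvcc_output: str) -> tuple[int, int]:
    """Extract the (major, minor) CUDA release from `nvcc --version` text."""
    m = re.search(r"release (\d+)\.(\d+)", nvcc_output)
    if not m:
        raise ValueError("could not find a CUDA release string in nvcc output")
    return int(m.group(1)), int(m.group(2))

# Hypothetical line from `nvcc --version` on a CUDA 11.8 toolkit:
sample = "Cuda compilation tools, release 11.8, V11.8.89"
print(parse_nvcc_version(sample))  # -> (11, 8)
```

Comparing this tuple against the CUDA version reported by the installed PyTorch build (e.g. `torch.version.cuda`) is a quick sanity check before a lengthy source compile.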