Hi @apachemycat, would you mind sharing the version of flash-attn in your environment? I am using flash-attn==2.5.7 and everything looks good. Also, you can replace dropout_layer_norm with torch.nn.functional.layer_norm plus dropout, although kernel acceleration may not be supported that way.
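The suggested fallback could look like the sketch below. This is a minimal, unfused substitute for flash-attn's fused dropout_layer_norm kernel, assuming the fused op applies dropout to the input, adds the residual, and then layer-normalizes; the function name and signature here are illustrative, not flash-attn's actual API.

```python
import torch
import torch.nn.functional as F

def dropout_add_layer_norm(x, residual, weight, bias,
                           p=0.1, eps=1e-5, training=True):
    """Unfused fallback for a fused dropout + residual-add + layer-norm op.

    Hypothetical signature for illustration; assumes the fused kernel's
    order of operations is dropout -> residual add -> layer norm.
    """
    # Apply dropout to the input (no-op when training=False or p=0).
    out = F.dropout(x, p=p, training=training)
    # Add the residual branch, if one is provided.
    if residual is not None:
        out = out + residual
    # Normalize over the last dimension with the given affine parameters.
    return F.layer_norm(out, (x.shape[-1],), weight, bias, eps)
```

This runs entirely through stock PyTorch ops, so it avoids the missing dropout_layer_norm module at the cost of the fused kernel's speed.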
Is there an existing issue for this bug?
🐛 Describe the bug
ModuleNotFoundError: No module named 'dropout_layer_norm'
[2024-05-17 03:23:11,932] torch.distributed.elastic.multiprocessing.api: [ERROR] failed (exitcode: 1) local_rank: 0 (pid: 615) of binary: /usr/bin/python
dropout_layer_norm is deprecated by flash_attn, so is there any other choice?
Environment
No response