Can not load flash_attn_2_cuda #3

Open
DuroCuri opened this issue Oct 5, 2024 · 0 comments
Comments

DuroCuri commented Oct 5, 2024

2024-10-05 22:57:55 - INFO - =======Loading Dataset=======
Traceback (most recent call last):
File "E:\llm\florence2-ft-simple\main.py", line 63, in <module>
main(args)
File "E:\llm\florence2-ft-simple\main.py", line 34, in main
model = AutoModelForCausalLM.from_pretrained(args.model_dir, trust_remote_code=True).to(device)
File "e:\anaconda3\envs\florence2\lib\site-packages\transformers\models\auto\auto_factory.py", line 553, in from_pretrained
model_class = get_class_from_dynamic_module(
File "e:\anaconda3\envs\florence2\lib\site-packages\transformers\dynamic_module_utils.py", line 552, in get_class_from_dynamic_module
return get_class_in_module(class_name, final_module, force_reload=force_download)
File "e:\anaconda3\envs\florence2\lib\site-packages\transformers\dynamic_module_utils.py", line 249, in get_class_in_module
module_spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "C:\Users\Administrator\.cache\huggingface\modules\transformers_modules\Florence-2-large\modeling_florence2.py", line 63, in <module>
from flash_attn.bert_padding import index_first_axis, pad_input, unpad_input # noqa
File "e:\anaconda3\envs\florence2\lib\site-packages\flash_attn\__init__.py", line 3, in <module>
from flash_attn.flash_attn_interface import (
File "e:\anaconda3\envs\florence2\lib\site-packages\flash_attn\flash_attn_interface.py", line 10, in <module>
import flash_attn_2_cuda as flash_attn_cuda
ImportError: DLL load failed while importing flash_attn_2_cuda: The specified procedure could not be found.

I cannot run this fine-tuning script.
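Not part of the original report, but a minimal sketch of one way to check whether a flash-attn install is actually usable before loading the model, and to fall back to PyTorch's SDPA attention otherwise. The traceback above shows the import of the flash_attn package fails at its CUDA extension (flash_attn_2_cuda), so testing `import flash_attn` alone catches this case. The helper name `flash_attn_usable` is hypothetical, and whether the `attn_implementation` fallback is honored depends on the model's remote code:

```python
import importlib.util


def flash_attn_usable() -> bool:
    """Return True only if flash_attn is installed AND its compiled
    CUDA extension loads (on Windows this often raises ImportError:
    "DLL load failed while importing flash_attn_2_cuda")."""
    if importlib.util.find_spec("flash_attn") is None:
        return False  # package not installed at all
    try:
        # Importing the package runs flash_attn/__init__.py, which in
        # turn imports flash_attn_2_cuda -- exactly where the report fails.
        import flash_attn  # noqa: F401
        return True
    except ImportError:
        return False  # installed but the native extension cannot load


# Pick an attention backend transformers understands; "sdpa" needs no
# extra native dependencies.
attn_impl = "flash_attention_2" if flash_attn_usable() else "sdpa"
```

One could then pass `attn_implementation=attn_impl` to `AutoModelForCausalLM.from_pretrained(...)`; uninstalling the broken flash-attn wheel (or installing a build matched to the local CUDA/PyTorch versions) is the other common route.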
