
[BUG] AttributeError: module 'torch.nn.functional' has no attribute 'scaled_dot_product_attention' #5534

Closed
harborsarah opened this issue May 14, 2024 · 2 comments · Fixed by #6517
Assignees
Labels
bug Something isn't working compression

Comments


Dear authors,

I installed DeepSpeed via `pip install deepspeed`. My torch version is 1.12.1 with CUDA 11.3.
However, when I try to use the following code to count FLOPs, it gives me the error: `AttributeError: module 'torch.nn.functional' has no attribute 'scaled_dot_product_attention'`

```python
if step == profile_step:  # if using multi nodes, check global_rank == 0 as well
    prof.stop_profile()
    flops = prof.get_total_flops()
    macs = prof.get_total_macs()
    params = prof.get_total_params()
    if print_profile:
        prof.print_model_profile(profile_step=profile_step)
    prof.end_profile()
```

I checked the README, and it says the package supports torch>=1.9, so it should work with my version. Do you know how to solve this problem? Thanks a lot.
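For anyone hitting this, a quick way to confirm whether your torch build has the attribute at all (a generic diagnostic snippet, not part of DeepSpeed):

```python
import torch
import torch.nn.functional as F

# scaled_dot_product_attention first shipped (as a beta API) in torch 2.0,
# so on torch 1.12.1 this reports that the attribute is missing.
print("torch version:", torch.__version__)
print("has scaled_dot_product_attention:",
      hasattr(F, "scaled_dot_product_attention"))
```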


@harborsarah harborsarah added bug Something isn't working compression labels May 14, 2024
lemyx commented Sep 2, 2024

I have hit the same error.

My environment is torch 1.13.1+cu117.

loadams (Contributor) commented Sep 10, 2024

Hi @lemyx or @harborsarah - could you test with #6517? I believe I've made the required changes there.

github-merge-queue bot pushed a commit that referenced this issue Sep 12, 2024

Changes from #4724 broke support for torch<2.0 in the flops profiler, as `scaled_dot_product_attention` [wasn't
added](https://pytorch.org/docs/2.0/generated/torch.nn.functional.scaled_dot_product_attention.html#torch.nn.functional.scaled_dot_product_attention)
until a beta version in torch 2.0.

Resolved: #5534
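For context, the usual shape of such a fix is to gate the profiler hook on attribute availability. A hypothetical sketch of that pattern (see #6517 for the actual change):

```python
import torch.nn.functional as F

# Hypothetical guard: only wrap F.scaled_dot_product_attention with a
# FLOPs-counting hook when the running torch build actually provides it
# (the function was added, as beta, in torch 2.0).
if hasattr(F, "scaled_dot_product_attention"):
    _original_sdpa = F.scaled_dot_product_attention

    def _sdpa_with_flops(*args, **kwargs):
        # a real profiler would accumulate FLOPs here before delegating
        return _original_sdpa(*args, **kwargs)

    F.scaled_dot_product_attention = _sdpa_with_flops
# On torch < 2.0 there is nothing to patch, so profiling proceeds
# without attention-specific FLOPs counting instead of raising.
```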

Todo:
- [ ] Test this
- [ ] Issue resolution with users.