attention mask & bias #1

Merged · 73 commits merged into dptech-corp:main · Oct 13, 2022
Conversation

robotcator (Collaborator):

Add attention mask & bias.

Currently, because of the `ldg` instruction problem, only even sequence lengths are supported.
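
For context, the computation the kernel gains is out = softmax(Q·Kᵀ · scale + bias) · V, with masked positions excluded from the softmax. Below is a minimal host-side reference sketch of those semantics; the function name `attn_ref`, the row-major layouts, and the mask convention are illustrative and not the kernel's actual API, and the assert mirrors the even-seqlen restriction noted above.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

// Reference semantics only; illustrative names, not the kernel's API.
// out = softmax(Q K^T * scale + bias, with masked scores set to -inf) * V
// Row-major shapes: q, k, v, out = [seqlen][head_dim]; bias, mask = [seqlen][seqlen].
// Assumes every row of the mask keeps at least one position.
void attn_ref(const std::vector<float>& q, const std::vector<float>& k,
              const std::vector<float>& v, const std::vector<float>& bias,
              const std::vector<char>& mask,  // 1 = keep, 0 = mask out
              std::vector<float>& out, int seqlen, int head_dim, float scale) {
    assert(seqlen % 2 == 0);  // mirrors the even-seqlen restriction above
    std::vector<float> s(seqlen);
    for (int i = 0; i < seqlen; ++i) {
        float max_s = -INFINITY;
        for (int j = 0; j < seqlen; ++j) {
            float dot = 0.f;
            for (int d = 0; d < head_dim; ++d)
                dot += q[i * head_dim + d] * k[j * head_dim + d];
            // Bias is added to the scaled score; masking removes the position entirely.
            s[j] = mask[i * seqlen + j] ? dot * scale + bias[i * seqlen + j]
                                        : -INFINITY;
            max_s = std::max(max_s, s[j]);
        }
        float denom = 0.f;
        for (int j = 0; j < seqlen; ++j) {
            s[j] = std::exp(s[j] - max_s);  // numerically stable softmax
            denom += s[j];
        }
        for (int d = 0; d < head_dim; ++d) {
            float acc = 0.f;
            for (int j = 0; j < seqlen; ++j)
                acc += s[j] * v[j * head_dim + d];
            out[i * head_dim + d] = acc / denom;
        }
    }
}
```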

The hunk under review sets up the global-memory tiles for the bias and for dS:

```cpp
// (The using-declaration below is assumed by symmetry with Gmem_tile_ds;
// the quoted hunk appears to start mid-declaration.)
using Gmem_tile_bias = typename Kernel_traits::Gmem_tile_bias;
Gmem_tile_bias gmem_bias(params, binfo, tidx, loop_step_idx);  // attention-bias tile

using Gmem_tile_ds = typename Kernel_traits::Gmem_tile_ds;
Gmem_tile_ds gmem_ds(params, binfo, tidx, loop_step_idx);      // dS tile
```
Member: Should `ds` be disabled during the forward pass?

robotcator (Collaborator, Author): Yes, this part is for the backward pass.
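
(In other words, dS, the gradient of the post-mask/bias attention scores and the quantity the bias gradient is reduced from, is only materialized in the backward kernel. A hypothetical guard sketching that split is below; the flag `Is_forward`, the helper name, and the `Params`/`BlockInfo` types are illustrative, since the real forward and backward kernels are typically separate functions.)

```cpp
// Hypothetical sketch (C++17 if constexpr). The actual code likely splits
// forward and backward into separate kernels, so a flag like Is_forward
// may not exist; Params and BlockInfo stand in for the real types.
template <typename Kernel_traits, bool Is_forward,
          typename Params, typename BlockInfo>
__device__ void maybe_setup_ds(const Params& params, const BlockInfo& binfo,
                               int tidx, int loop_step_idx) {
    if constexpr (!Is_forward) {
        // Only the backward pass writes dS (and, reduced from it, dBias).
        using Gmem_tile_ds = typename Kernel_traits::Gmem_tile_ds;
        Gmem_tile_ds gmem_ds(params, binfo, tidx, loop_step_idx);
        // ... compute dS and store it through gmem_ds ...
    }
}
```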

@guolinke guolinke merged commit a80a963 into dptech-corp:main Oct 13, 2022