
Fuse layer_norm and add_bias_input between layers #4

Merged
merged 3 commits into wangxicoding:opt_gpt_generation from fuse_ln on Apr 22, 2022

Conversation


@ZzSean commented Apr 21, 2022

PR types

Others

PR changes

Others

Describe

Fuse layer_norm and add_bias_input between layers
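For reference, the two ops being fused are the bias/residual add that follows a GEMM and the layer normalization that starts the next layer. A minimal NumPy sketch of the semantics such a fused kernel is expected to reproduce (the function name, argument names, and shapes here are illustrative, not the actual kernel interface in this PR):

```python
import numpy as np

def add_bias_input_layernorm(x, bias, residual, gamma, beta, eps=1e-5):
    """Reference semantics: out = LayerNorm(x + bias + residual).

    x:        [batch, seq_len, hidden] output of the previous GEMM (bias not yet added)
    bias:     [hidden]                 bias of that GEMM
    residual: [batch, seq_len, hidden] residual (skip-connection) input
    gamma, beta: [hidden]              layer_norm scale and shift
    """
    y = x + bias + residual                      # add_bias_input
    mean = y.mean(axis=-1, keepdims=True)        # per-token statistics
    var = y.var(axis=-1, keepdims=True)
    return (y - mean) / np.sqrt(var + eps) * gamma + beta   # layer_norm
```

Computing this in a single kernel avoids materializing the intermediate `x + bias + residual` tensor in global memory between the two separate ops.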

@wangxicoding merged commit 00fa94c into wangxicoding:opt_gpt_generation on Apr 22, 2022
@ZzSean deleted the fuse_ln branch on June 6, 2022 07:04