do not use assign=True for nn.LayerNorm
wkpark committed Sep 25, 2024
1 parent 7d56c4a commit 5d32084
Showing 1 changed file with 1 addition and 1 deletion.
modules/sd_disable_initialization.py (2 changes: 1 addition & 1 deletion)

@@ -176,7 +176,7 @@ def __enter__(self):
         def load_from_state_dict(original, module, state_dict, prefix, *args, **kwargs):
             used_param_keys = []

-            if isinstance(module, (torch.nn.Linear, torch.nn.Conv2d, torch.nn.GroupNorm, torch.nn.LayerNorm)):
+            if isinstance(module, (torch.nn.Linear, torch.nn.Conv2d, torch.nn.GroupNorm,)):
                 # HACK add assign=True to local_metadata for some cases
                 args[0]['assign_to_params_buffers'] = True

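For context, a minimal sketch (not part of the commit) of the mechanism the hack relies on. It assumes PyTorch >= 2.1, where load_state_dict accepts assign=True and _load_from_state_dict honours local_metadata['assign_to_params_buffers']:

    # Sketch only, not from the commit: shows what assign=True changes when loading
    # a state_dict, assuming PyTorch >= 2.1 (where both load_state_dict(assign=True)
    # and the per-module local_metadata['assign_to_params_buffers'] flag exist).
    import torch
    import torch.nn as nn

    meta_linear = nn.Linear(4, 4, device="meta")   # parameters are meta tensors with no storage
    real_state = nn.Linear(4, 4).state_dict()      # real tensors to load

    # With assign=True the loaded tensors replace the module's parameters directly
    # instead of being copied into them, which is the same effect the hack achieves
    # per module by setting local_metadata['assign_to_params_buffers'] = True.
    meta_linear.load_state_dict(real_state, assign=True)
    print(meta_linear.weight.device)               # cpu: the meta parameters were replaced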
