🐛 Describe the bug
Hi, I checked out the branch example/llama and modified the code from

to

Then I ran

```
python colossalai run --nproc_per_node 1 --hostfile hostfile --master_addr 172.17.0.2 benchmark.py --plugin "gemini_cpu" -l 512 -g -b 6
```

and the following error occurred:
```
Loading checkpoint shards:   0%|          | 0/2 [00:09<?, ?it/s]
Traceback (most recent call last):
  File "/home/Colosal/examples/language/llama/benchmark.py", line 206, in <module>
    main()
  File "/home/Colosal/examples/language/llama/benchmark.py", line 146, in main
    model = AutoModelForCausalLM.from_pretrained(
  File "/opt/conda/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 493, in from_pretrained
    return model_class.from_pretrained(
  File "/opt/conda/lib/python3.9/site-packages/transformers/modeling_utils.py", line 2910, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/opt/conda/lib/python3.9/site-packages/transformers/modeling_utils.py", line 3267, in _load_pretrained_model
    new_error_msgs, offload_index, state_dict_index = _load_state_dict_into_meta_model(
  File "/opt/conda/lib/python3.9/site-packages/transformers/modeling_utils.py", line 719, in _load_state_dict_into_meta_model
    set_module_tensor_to_device(model, param_name, param_device, **set_module_kwargs)
  File "/opt/conda/lib/python3.9/site-packages/accelerate/utils/modeling.py", line 320, in set_module_tensor_to_device
    new_value = param_cls(new_value, requires_grad=old_value.requires_grad).to(device)
  File "/opt/conda/lib/python3.9/site-packages/colossalai/lazy/lazy_init.py", line 160, in __new__
    elem = func(*args, **{**kwargs, 'device': 'meta'})
TypeError: 'LazyTensor' object is not callable
```
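For context on the failure mode: the last frames show accelerate re-wrapping a parameter by calling its class (`param_cls(new_value, ...)`), and ColossalAI's lazy-init hook then invoking `func(...)`, where `func` turns out to be a `LazyTensor` instance rather than a constructor. A minimal, self-contained sketch of that pattern (plain Python, no torch/colossalai; the class and names here are illustrative stand-ins, not the real library code):

```python
class LazyTensor:
    """Illustrative stand-in for a lazily materialized tensor."""

    def __init__(self, ctor):
        # ctor is expected to be a *callable* that builds the real value
        self._ctor = ctor

    def materialize(self):
        # Mirrors `elem = func(*args, ...)` in the traceback: if _ctor is
        # itself a LazyTensor instance instead of a callable, this raises
        # TypeError: 'LazyTensor' object is not callable.
        return self._ctor()


# Correct usage: a callable is passed in
ok = LazyTensor(lambda: [0.0] * 4)
print(ok.materialize())  # [0.0, 0.0, 0.0, 0.0]

# Failure mode mirroring the report: an instance is stored where a
# constructor was expected, then invoked as if it were a function
bad = LazyTensor(LazyTensor(lambda: [0.0] * 4))
try:
    bad.materialize()
except TypeError as e:
    print(e)  # 'LazyTensor' object is not callable
```

In the real stack, the same shape of bug appears when accelerate's meta-device loading path and ColossalAI's lazy initialization both try to own tensor construction, so one layer receives the other's wrapper object where it expected a tensor class.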
Environment
Built from this repo, on the example/llama branch, using its Dockerfile.