When I perform multihead fine-tuning, the error below occurs, but naive fine-tuning runs without any error. How can I solve this problem? Thank you.
```
/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/e3nn/o3/_wigner.py:10: FutureWarning: You are using torch.load with weights_only=False (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for weights_only will be flipped to True. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via torch.serialization.add_safe_globals. We recommend you start setting weights_only=True for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  _Jd, _W3j_flat, _W3j_indices = torch.load(os.path.join(os.path.dirname(__file__), 'constants.pt'))
/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/mace/cli/run_train.py:143: FutureWarning: (same torch.load weights_only=False warning as above)
  model_foundation = torch.load(
/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/mace/calculators/mace.py:128: FutureWarning: (same torch.load weights_only=False warning as above)
  torch.load(f=model_path, map_location=device)
```
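The FutureWarnings above are not the cause of the crash, but for reference, a minimal sketch of the safe-loading pattern the warning recommends (assuming torch >= 1.13, where the `weights_only` keyword is available; the file name `checkpoint.pt` is just an illustrative placeholder, not a path from this log):

```python
import torch

# Save a plain tensor state dict, then reload it with weights_only=True.
# weights_only=True restricts unpickling to tensors and other allowlisted
# types, so loading an untrusted file cannot execute arbitrary code.
tensor = torch.arange(4, dtype=torch.float32)
torch.save({"weights": tensor}, "checkpoint.pt")

state = torch.load("checkpoint.pt", weights_only=True)
print(torch.equal(state["weights"], tensor))  # True for this round trip
```

If a checkpoint contains custom Python classes (as full MACE model files do), `weights_only=True` will refuse to load them unless they are allowlisted via `torch.serialization.add_safe_globals`, which is why these libraries still pass `weights_only=False` and only emit a warning.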
```
Traceback (most recent call last):
  File "/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/ase/io/extxyz.py", line 419, in _read_xyz_frame
    line = next(lines)
           ^^^^^^^^^^^
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/mace/tools/multihead_tools.py", line 163, in assemble_mp_data
    select_samples(dict_to_namespace(args_samples))
  File "/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/mace/cli/fine_tuning_select.py", line 248, in select_samples
    atoms_list_pt = ase.io.read(args.configs_pt, index=":")
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/ase/io/formats.py", line 797, in read
    return list(_iread(filename, index, format, io, parallel=parallel,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/ase/parallel.py", line 302, in new_generator
    for result in generator(*args, **kwargs):
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/ase/io/formats.py", line 866, in _iread
    for dct in io.read(fd, *args, **kwargs):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/ase/io/formats.py", line 624, in wrap_read_function
    yield from read(filename, index, **kwargs)
  File "/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/ase/io/extxyz.py", line 726, in read_xyz
    yield _read_xyz_frame(fileobj, natoms, properties_parser, nvec)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/ase/io/extxyz.py", line 421, in _read_xyz_frame
    raise XYZError('ase.io.extxyz: Frame has {} atoms, expected {}'
ase.io.extxyz.XYZError: ase.io.extxyz: Frame has 41 atoms, expected 64

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/bin/mace_run_train", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/mace/cli/run_train.py", line 62, in main
    run(args)
  File "/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/mace/cli/run_train.py", line 261, in run
    collections = assemble_mp_data(args, tag, head_configs)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/hpc2hdd/home/guanghualiu/anaconda3/envs/mace_multihead_env/lib/python3.12/site-packages/mace/tools/multihead_tools.py", line 183, in assemble_mp_data
    raise RuntimeError(
RuntimeError: Model or descriptors download failed and no local model found
```
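The root cause in the traceback is the `XYZError`: one frame in the pretraining configurations file declares 64 atoms in its header line, but fewer atom lines follow, so the file is truncated or malformed and `ase.io.read` fails before the multihead data can be assembled. A minimal sketch (plain Python, no ASE dependency) that walks an extended-XYZ file and reports the first frame whose declared atom count does not match the atom lines present; the file name passed in is whatever you gave via the pretraining-configs option:

```python
def check_extxyz(path):
    """Return 'ok', or a message naming the first frame with a bad atom count.

    An extxyz frame is: one line with the atom count, one comment line,
    then exactly that many atom lines.
    """
    with open(path) as fh:
        lines = [ln.rstrip("\n") for ln in fh]
    i, frame = 0, 0
    while i < len(lines):
        if not lines[i].strip():          # tolerate blank separator lines
            i += 1
            continue
        natoms = int(lines[i])            # header line: declared atom count
        body = lines[i + 2 : i + 2 + natoms]   # skip the comment line
        got = sum(1 for ln in body if ln.strip())
        if got != natoms:
            return f"frame {frame}: declared {natoms} atoms, found {got}"
        i += 2 + natoms
        frame += 1
    return "ok"
```

Running this over the file should point at the frame behind "Frame has 41 atoms, expected 64"; regenerating or removing that frame lets `ase.io.read(..., index=":")` succeed. Note the final `RuntimeError` ("download failed and no local model found") is misleading: `assemble_mp_data` wraps any failure in `select_samples`, including this parse error, in that message.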