Since the lattice matrix $L$ is not rotation invariant, we instead predict the 6 lattice parameters, i.e. the lengths of the 3 lattice vectors and the angles between them. We normalize the lengths of lattice vectors with $N^{1/3}$, where $N$ is the number of atoms, to ensure that the lengths for materials of different sizes are at the same scale.
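The normalization quoted above can be sketched as a pair of inverse helpers. These function names are hypothetical (not from the codebase); the math is just division and multiplication by $N^{1/3}$:

```python
import numpy as np


def normalize_lattice_lengths(abc, num_atoms):
    """Scale lattice parameters (a, b, c) by N^(1/3), where N is the
    number of atoms, so lengths are comparable across cell sizes.

    Hypothetical helper illustrating the CDVAE-style normalization;
    `abc` is a length-3 sequence of lattice vector lengths.
    """
    abc = np.asarray(abc, dtype=float)
    return abc / num_atoms ** (1 / 3)


def unnormalize_lattice_lengths(abc_norm, num_atoms):
    """Invert the normalization to recover the original lengths."""
    abc_norm = np.asarray(abc_norm, dtype=float)
    return abc_norm * num_atoms ** (1 / 3)
```

For example, an 8-atom cell with $a = b = c = 5$ Å normalizes to lengths of 2.5, since $8^{1/3} = 2$.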
This is turning out to be a lot more troublesome than I thought. WIP in the `normalize-lattice-lengths` branch.
The main issue is dealing with `a_range`, `b_range`, and `c_range` during RGB scaling and unscaling. Dividing by the cube root of the number of atoms was easy enough, but then I need new ranges, and this would need to be reflected in the behavior of `fit` based on the `self.normalize_lengths_by_atoms` kwarg. It will probably need another Colab notebook (or an update to an existing one) that computes quantiles from the normalized values rather than the original values.
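To make the coupling concrete, here is a minimal sketch of what the conditional scaling/unscaling might look like. The function names and the `(low, high)` range convention are assumptions for illustration; in practice `a_range` would come from quantiles computed over normalized lengths when `normalize_lengths_by_atoms` is true:

```python
import numpy as np


def scale_length(a, a_range, num_atoms, normalize_lengths_by_atoms=True):
    """Map a lattice length onto a [0, 255] pixel value.

    Sketch only: when `normalize_lengths_by_atoms` is True, the length
    is first divided by N^(1/3), so `a_range` must be a range over
    *normalized* values for the mapping to use the full pixel range.
    """
    if normalize_lengths_by_atoms:
        a = a / num_atoms ** (1 / 3)
    lo, hi = a_range
    return np.clip((a - lo) / (hi - lo), 0.0, 1.0) * 255


def unscale_length(pixel, a_range, num_atoms, normalize_lengths_by_atoms=True):
    """Invert the pixel scaling back to a lattice length in Angstroms."""
    lo, hi = a_range
    a = pixel / 255 * (hi - lo) + lo
    if normalize_lengths_by_atoms:
        a = a * num_atoms ** (1 / 3)
    return a
```

The point of the sketch is that the same boolean must gate both directions, and that `a_range` has a different meaning (and different defaults) depending on its value.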
The more hyperparameters like this, the more default ranges that will need to be created. Right now, the defaults are based on primitive cells, and I haven't incorporated the defaults that hasan-sayeed produced for conventional cells in #96. By adding a second boolean that affects default ranges, I end up with $2^2$ default hyperparameter sets; another one and it becomes $2^3$, and so on. I could also leave it to the user, but then it throws off closed-loop hyperparameter optimization. It's also hard to know, without trying both options, whether I should stick with one or the other, and whether it's worth the effort to answer that question.
Related methods from CDVAE (emphasis added):
See `get_reduced_structure()`
Originally posted by @sgbaird in #94 (comment)