Implement ZBL potential #134
Merged

Changes shown are from 8 of the 13 commits.

Commits:
18adfdc  Began implementing ZBL potential (peastman)
78bf317  Apply smooth cutoff to ZBL (peastman)
c63a1a3  Allow cutoff to be specified in config file (peastman)
b17d504  Workaround for pytorch bug (peastman)
d6709d8  Bug fixes to ZBL potential (peastman)
d23e650  Adapted to new API for priors (peastman)
eb9fd15  Merge branch 'main' into zbl (peastman)
9e035cb  Added test case for ZBL (peastman)
53d1e4d  Attempt at fixing test failure on CI (peastman)
7016f5f  Support multiple prior models (peastman)
7069d52  Tests and fixes for multiple priors (peastman)
4dc5268  Clarification to docstring (peastman)
9066935  Fixed error (peastman)
torchmdnet/priors/__init__.py:

```diff
@@ -1 +1,2 @@
 from torchmdnet.priors.atomref import Atomref
+from torchmdnet.priors.zbl import ZBL
```
New file torchmdnet/priors/zbl.py (@@ -0,0 +1,54 @@):

```python
import torch
from torchmdnet.priors.base import BasePrior
from torchmdnet.models.utils import Distance, CosineCutoff


class ZBL(BasePrior):
    """This class implements the Ziegler-Biersack-Littmark (ZBL) potential for screened nuclear repulsion.
    It is described in https://doi.org/10.1007/978-3-642-68779-2_5 (equations 9 and 10 on page 147).  It
    is an empirical potential that does a good job of describing the repulsion between atoms at very short
    distances.

    To use this prior, the Dataset must provide the following attributes.

    atomic_number: 1D tensor of length max_z.  atomic_number[z] is the atomic number of atoms with atom type z.
    distance_scale: multiply by this factor to convert coordinates stored in the dataset to meters
    energy_scale: multiply by this factor to convert energies stored in the dataset to Joules
    """
    def __init__(self, cutoff_distance, max_num_neighbors, atomic_number=None, distance_scale=None, energy_scale=None, dataset=None):
        super(ZBL, self).__init__()
        if atomic_number is None:
            atomic_number = dataset.atomic_number
        if distance_scale is None:
            distance_scale = dataset.distance_scale
        if energy_scale is None:
            energy_scale = dataset.energy_scale
        atomic_number = torch.as_tensor(atomic_number, dtype=torch.int8)
        self.register_buffer("atomic_number", atomic_number)
        self.distance = Distance(0, cutoff_distance, max_num_neighbors=max_num_neighbors)
        self.cutoff = CosineCutoff(cutoff_upper=cutoff_distance)
        self.cutoff_distance = cutoff_distance
        self.max_num_neighbors = max_num_neighbors
        self.distance_scale = distance_scale
        self.energy_scale = energy_scale

    def get_init_args(self):
        return {'cutoff_distance': self.cutoff_distance,
                'max_num_neighbors': self.max_num_neighbors,
                'atomic_number': self.atomic_number,
                'distance_scale': self.distance_scale,
                'energy_scale': self.energy_scale}

    def reset_parameters(self):
        pass

    def post_reduce(self, y, z, pos, batch):
        edge_index, distance, _ = self.distance(pos, batch)
        atomic_number = self.atomic_number[z[edge_index]]
        # 5.29e-11 is the Bohr radius in meters.  All other numbers are magic constants from the ZBL potential.
        a = 0.8854*5.29177210903e-11/(atomic_number[0]**0.23 + atomic_number[1]**0.23)
        d = distance*self.distance_scale/a
        f = 0.1818*torch.exp(-3.2*d) + 0.5099*torch.exp(-0.9423*d) + 0.2802*torch.exp(-0.4029*d) + 0.02817*torch.exp(-0.2016*d)
        f *= self.cutoff(distance)
        # Compute the energy, converting to the dataset's units.  Multiply by 0.5 because every atom pair
        # appears twice.
        return y + 0.5*(2.30707755e-28/self.energy_scale/self.distance_scale)*torch.sum(f*atomic_number[0]*atomic_number[1]/distance, dim=-1)
```
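For reference, the functional form that post_reduce evaluates can be written out as follows. This is a sketch reconstructed from the constants in the code above; the cited reference (equations 9 and 10) is the authoritative source. Here a_0 = 5.29177210903e-11 m is the Bohr radius, and the prefactor 2.30707755e-28 J·m corresponds to e^2/(4*pi*epsilon_0).

```latex
E_{\mathrm{ZBL}}(r) = \frac{Z_1 Z_2\, e^2}{4\pi\varepsilon_0\, r}\,\phi\!\left(\frac{r}{a}\right),
\qquad a = \frac{0.8854\, a_0}{Z_1^{0.23} + Z_2^{0.23}},
\qquad
\phi(x) = 0.1818\, e^{-3.2 x} + 0.5099\, e^{-0.9423 x}
        + 0.2802\, e^{-0.4029 x} + 0.02817\, e^{-0.2016 x}
```

The CosineCutoff factor in the code multiplies phi so that the pair term goes smoothly to zero at cutoff_distance, and the factor of 0.5 compensates for each atom pair appearing twice in edge_index.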
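A minimal usage sketch of the class above. The atom-type-to-atomic-number mapping, the unit choices (Angstrom and eV), and all tensor values are assumptions made up for illustration; the constructor and post_reduce signatures are taken from the diff.

```python
import torch
from torchmdnet.priors.zbl import ZBL

# Assumed mapping: atom type z is itself the atomic number (identity lookup table).
atomic_number = torch.arange(10, dtype=torch.int8)

prior = ZBL(
    cutoff_distance=4.0,            # in the dataset's distance units (Angstrom here)
    max_num_neighbors=32,
    atomic_number=atomic_number,
    distance_scale=1e-10,           # Angstrom -> meters
    energy_scale=1.602176634e-19,   # eV -> Joules
)

# A single H-O pair, 1.0 Angstrom apart, in one molecule.
z = torch.tensor([1, 8])                           # atom types
pos = torch.tensor([[0.0, 0.0, 0.0],
                    [0.0, 0.0, 1.0]])              # Angstrom
batch = torch.zeros(2, dtype=torch.long)
y = torch.zeros(1)                                 # energy predicted by the network

y_with_zbl = prior.post_reduce(y, z, pos, batch)   # adds the screened nuclear repulsion
```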
The loading of the pretrained models (https://github.com/torchmd/torchmd-net/tree/main/examples#loading-checkpoints) fails:
It sounds like we want to redo how prior args are specified as described in #26 (comment). That means we can switch back to prior_args for this value. There will still be compatibility issues, because it will need to become a list with args for multiple prior models, but I can add a check for that case for backward compatibility. I'll go ahead and make the changes in this PR.
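A sketch of the kind of backward-compatibility check described above, assuming older checkpoints stored prior_args as a single dict while newer ones store a list of dicts. The helper name is hypothetical, not the actual implementation in the PR.

```python
def normalize_prior_args(prior_args):
    """Hypothetical helper: accept either the old single-dict format or the new list format."""
    if prior_args is None:
        return []
    if isinstance(prior_args, dict):
        # Old checkpoints stored the args of a single prior model as one dict.
        return [prior_args]
    return list(prior_args)
```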
Might be a good idea to add a test case for loading model checkpoints from a previous version.
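One possible shape for such a test, assuming a small checkpoint produced by an earlier release is committed alongside the test data and that load_model from torchmdnet.models.model is the loading entry point; the path and file name are placeholders.

```python
from os.path import dirname, join

from torchmdnet.models.model import load_model


def test_load_old_checkpoint():
    # Placeholder path: a checkpoint saved by a previous release, stored with the test data.
    ckpt = join(dirname(__file__), "data", "old_version_checkpoint.ckpt")
    # Loading should succeed despite the changed prior_args format.
    model = load_model(ckpt)
    assert model is not None
```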