Counter-intuitive Batch SCF Charge Convention #179

Closed
MoleOrbitalHybridAnalyst opened this issue Oct 18, 2024 · 2 comments · Fixed by #180
@MoleOrbitalHybridAnalyst

My molecules all have charge = 1, and I found that I have to set the charge to 1 / batch_size to get correct results in batch mode. A simple example is as follows:

import torch

import dxtb
from dxtb.typing import DD

dd: DD = {"device": torch.device("cpu"), "dtype": torch.double}


numbers = torch.tensor(
    [
        [8, 1, 1, 1, 8, 1, 1, 8, 1, 1, 8, 1, 1, 8, 1, 1, 8, 1, 1],
        [8, 1, 1, 1, 8, 1, 1, 8, 1, 1, 8, 1, 1, 8, 1, 1, 8, 1, 1]
    ],
    device=dd["device"],
)
positions = torch.tensor(
    [
        [
                   [ -7.68281384,   1.3350934 ,   0.74846383],
                   [ -5.7428588 ,   1.31513411,   0.36896714],
                   [ -8.23756184,  -0.19765779,   1.67193897],
                   [ -8.13313558,   2.93710683,   1.6453921 ],
                   [ -2.95915993,   1.40005084,   0.24966306],
                   [ -2.1362031 ,   1.4795743 ,  -1.38758999],
                   [ -2.40235213,   2.84218589,   1.24419946],
                   [ -8.2640369 ,   5.79677268,   2.54733192],
                   [ -8.68767571,   7.18194193,   1.3350556 ],
                   [ -9.27787497,   6.09327071,   4.03498102],
                   [ -9.34575393,  -2.54164384,   3.28062124],
                   [ -8.59029812,  -3.46388688,   4.6567765 ],
                   [-10.71898011,  -3.58163572,   2.65211723],
                   [ -9.5591796 ,   9.66793334,  -0.53212042],
                   [ -8.70438089,  11.29169941,  -0.5990394 ],
                   [-11.12723654,   9.8483266 ,  -1.43755624],
                   [ -2.69970054,   5.55135395,   2.96084179],
                   [ -1.59244386,   6.50972855,   4.06699298],
                   [ -4.38439138,   6.18065165,   3.1939773 ]
        ],
        [
                  [ -7.67436676,   1.33433562,   0.74512468],
                  [ -5.75285545,   1.30220838,   0.37189432],
                  [ -8.23155251,  -0.20308887,   1.67397231],
                  [ -8.15184386,   2.94589406,   1.6474141 ],
                  [ -2.96380866,   1.39739578,   0.24572676],
                  [ -2.14413995,   1.48993378,  -1.37321106],
                  [ -2.39808135,   2.86614761,   1.25247646],
                  [ -8.26855335,   5.79452391,   2.54948621],
                  [ -8.69277797,   7.18061912,   1.33247046],
                  [ -9.28819287,   6.08797948,   4.03809906],
                  [ -9.3377226 ,  -2.54245643,   3.27861813],
                  [ -8.59693106,  -3.48501402,   4.65503795],
                  [-10.72627446,  -3.59514726,   2.66139579],
                  [ -9.55955755,   9.6716561 ,  -0.53106973],
                  [ -8.7077635 ,  11.28708848,  -0.59527696],
                  [-11.12540351,   9.87000175,  -1.44181568],
                  [ -2.70194931,   5.55490663,   2.9641866 ],
                  [ -1.60305656,   6.49854138,   4.07984311],
                  [ -4.39083534,   6.17898869,   3.18702311]
        ],
    ],
    **dd
)
charge = torch.tensor([0.5, 0.5], **dd)


# no conformers -> batched mode 1
opts = {"verbosity": 6, "batch_mode": 1}

calc = dxtb.Calculator(numbers, dxtb.GFN1_XTB, opts=opts, **dd)
result = calc.energy(positions, chrg=charge)

print(result.detach().numpy())

This gives the correct energy:

[-34.77268237 -34.77261976]

If I have charge = torch.tensor([1, 1], **dd) instead, I get the wrong energies:

[-34.02695165 -34.02703838]

which are the energies of the molecules as if each had charge = 2.

So it seems that the charge tensor is multiplied by batch_size somewhere internally. I find this behavior quite counter-intuitive, and it is probably not the intended behavior.
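A minimal sketch of the suspected mechanism (hypothetical, not dxtb's actual internals): if the batched charge tensor is reduced with a sum instead of being kept per sample, each molecule effectively sees batch_size times its own charge, which matches the observed "charge = 2" energies.

```python
import torch

# Per-molecule total charges for a batch of two molecules.
charge = torch.tensor([1.0, 1.0])

# Correct handling: keep one charge per sample along the batch dimension.
per_sample = charge  # tensor([1., 1.])

# Suspected buggy reduction (hypothetical): summing across the batch
# dimension yields 2.0 for every molecule, reproducing the symptom.
buggy_total = charge.sum()  # tensor(2.)

print(per_sample.tolist(), buggy_total.item())
```

This also explains why setting the charges to 1 / batch_size happens to recover the correct energies.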

@marvinfriede
Member

Thanks for raising this issue. I confirmed that something is indeed going wrong here. I will take a look.

@marvinfriede marvinfriede linked a pull request Oct 20, 2024 that will close this issue
@marvinfriede
Member

I fixed it and created the bugfix release v0.1.1. You can install it from pip now. It will also be available via conda in a few days.
