
Conversation

@matthewcarbone
Owner

Complete, backwards-incompatible overhaul of the code. I will be:

  • Testing everything
  • Including complete CI/coverage
  • Documenting everything
  • Ensuring docs build properly
  • Adding a new logging system built on loguru
  • Bunches of other stuff

matthewcarbone and others added 30 commits April 17, 2022 09:15
* Start to add docs
* Start to add type checking
* Remove analyze.py
* Add ParallelDenseExecutor which is parallel over k,w points
* Move the petsc4py solver to its own submodule, so that people who
  wish to run MPI jobs without PETSc can still import a parallel
  module without triggering an import error (see the sketch below)
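A minimal sketch of the import-guard pattern this enables, assuming an mpi4py-based parallel module and a separate PETSc-only module (file names and the helper are illustrative, not the actual package layout):

```python
# parallel.py -- hypothetical module; importable with only mpi4py installed
from mpi4py import MPI


def get_rank_and_size(comm=MPI.COMM_WORLD):
    """Return this process's rank and the communicator size."""
    return comm.Get_rank(), comm.Get_size()


# petsc_solver.py -- hypothetical module; the petsc4py dependency is isolated
# here, so the ImportError fires only when the PETSc solver is requested
try:
    from petsc4py import PETSc  # noqa: F401
except ImportError as err:
    raise ImportError(
        "petsc4py is required for the PETSc solver; install it or use "
        "the dense parallel solver instead"
    ) from err
```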
* Added saving of G and exec times to disk and generated comparisons
  to ground truth for PETSc and serial dense (continued fraction)
* Changed getRank/getSize syntax in base.py (not affecting other
  executors) to collect G from the last rank while staying on rank 0
  (see the sketch below)
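A hedged sketch of that last-rank-to-rank-0 handoff, using plain mpi4py point-to-point calls (the array contents and file name are stand-ins, not the project's base.py):

```python
# Illustrative sketch: the last rank ends up holding the final Green's
# function G, but rank 0 owns the disk I/O, so G is shipped over first.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()
last = size - 1

# Stand-in for the per-rank result; in practice this is the computed G.
G = np.full(4, float(rank))

if size > 1:
    if rank == last:
        comm.send(G, dest=0, tag=77)
    elif rank == 0:
        G = comm.recv(source=last, tag=77)

if rank == 0:
    np.save("G.npy", G)  # rank 0 alone writes results to disk
```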
@matthewcarbone
Owner Author

matthewcarbone commented Jun 11, 2022

@Chiffafox hey Stepan! I have overhauled the entire API to be much more user-friendly. Everything in models.py and the entire engine module are done. I've more or less compared the basis constructions against the previous version of the code in your branch (new_lorentzfits).

At this point, I'm going to take a shot at the executors. Primarily, I'm going to implement a "global syntax" for checkpoint-restart and object serialization (so far a lot is MSONable; we'll make it pickle-able too). Just wanted to keep you in the loop on this.
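For the serialization piece, a minimal sketch of what "MSONable plus pickle-able" could look like in practice, using monty's MSONable (the Checkpoint class is made up for illustration):

```python
import json
import pickle

from monty.json import MSONable, MontyDecoder, MontyEncoder


class Checkpoint(MSONable):
    """Toy stand-in for a solver state object."""

    def __init__(self, step, energy):
        self.step = step
        self.energy = energy


ckpt = Checkpoint(step=10, energy=-1.23)

# MSONable round trip: JSON-friendly dict, reconstructed by MontyDecoder
blob = json.dumps(ckpt.as_dict(), cls=MontyEncoder)
restored = json.loads(blob, cls=MontyDecoder)

# pickle round trip: fast binary checkpoints on top of the same object
restored2 = pickle.loads(pickle.dumps(ckpt))
assert restored2.step == ckpt.step and restored2.energy == ckpt.energy
```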

Honestly, we could start writing a computational paper on this if you want... what do you think?

Edit: oh, and the framework for two and three dimensions is in place, e.g., the 2- and 3-dimensional Config objects behave properly when removing/adding phonons (a lot of thanks to my student on this)!

@Chiffafox
Collaborator

Chiffafox commented Jun 17, 2022 via email

@matthewcarbone
Owner Author

Hey @Chiffafox, sounds good. Yeah, if it's no different to you, definitely branch off of mc/v1. But having your input and help with the executors (or, as they're now called, solvers) would be amazing. Thanks!

@matthewcarbone
Owner Author

matthewcarbone commented Jul 24, 2022

@Chiffafox I have implemented the new versions of the solvers. They should be MPI-compatible by default, with like 10x less code. Still have to implement the dispersion stuff, but there may even be a way to parallelize that too. Let me know what you think!

I think the next thing to do is add the checkpoint/restart to the solver objects, then finally to re-implement your PETSc accelerators. Then we're done with the code, basically!
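As a rough illustration of the "MPI-compatible by default" idea, here's how distributing (k, w) points over ranks might look with mpi4py (the grids, the toy integrand, and all names are invented for this sketch):

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

k_grid = np.linspace(0.0, np.pi, 32)
w_grid = np.linspace(-2.0, 2.0, 64)
points = [(k, w) for k in k_grid for w in w_grid]

# Round-robin assignment of (k, w) points to ranks.
my_points = points[rank::size]

# Hypothetical stand-in for evaluating G(k, w) at each assigned point.
my_results = [(k, w, np.sin(k) / (w**2 + 1.0)) for k, w in my_points]

# Rank 0 gathers every rank's slice and stitches the spectrum together.
gathered = comm.gather(my_results, root=0)
if rank == 0:
    all_results = [item for chunk in gathered for item in chunk]
    print(f"collected {len(all_results)} (k, w) points")
```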

The finite-temperature code was having some issues. Before this fix,
it only worked properly when there was a single phonon type. Now
everything is parsed properly for more than one phonon type.
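To make the scope of the fix concrete, a toy illustration of parsing more than one phonon type (the dict layout and all names here are invented for the sketch, not the actual parser):

```python
# Invented structure for illustration: the model carries a list of phonon
# types, and every consumer loops over the list instead of assuming one.
phonon_types = [
    {"coupling": 0.5, "omega": 1.0},  # e.g., a Holstein-like term
    {"coupling": 0.3, "omega": 0.5},  # e.g., a Peierls-like term
]

# Parsing works identically for one phonon type or many.
for i, phonon in enumerate(phonon_types):
    print(f"phonon type {i}: g = {phonon['coupling']}, omega = {phonon['omega']}")
```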
@matthewcarbone
Owner Author

I think 97f9d42 should fix the finite-T issues with H+P, but we should definitely test it.

@matthewcarbone matthewcarbone added this to the v1 Release milestone Jul 24, 2022
@matthewcarbone matthewcarbone merged commit 1628b3f into master Sep 14, 2022
@matthewcarbone matthewcarbone deleted the mc/v1 branch September 14, 2022 20:37