Merge branch 'development' into topic-pyAMReX
RemiLehe committed Jul 12, 2023
2 parents cabce53 + f30799b commit 3c4644d
Showing 248 changed files with 5,392 additions and 2,768 deletions.
67 changes: 66 additions & 1 deletion .clang-tidy
Original file line number Diff line number Diff line change
@@ -1,6 +1,71 @@
Checks: '-*,
bugprone-argument-comment,
bugprone-assert-side-effect,
bugprone-bad-signal-to-kill-thread,
bugprone-bool-pointer-implicit-conversion,
bugprone-branch-clone,
bugprone-copy-constructor-init,
bugprone-dangling-handle,
bugprone-dynamic-static-initializers,
-bugprone-easily-swappable-parameters,
bugprone-exception-escape,
bugprone-fold-init-type,
bugprone-forward-declaration-namespace,
bugprone-forwarding-reference-overload,
-bugprone-implicit-widening-of-multiplication-result,
bugprone-inaccurate-erase,
bugprone-incorrect-roundings,
bugprone-infinite-loop,
bugprone-integer-division,
bugprone-lambda-function-name,
bugprone-macro-parentheses,
bugprone-macro-repeated-side-effects,
bugprone-misplaced-operator-in-strlen-in-alloc,
bugprone-misplaced-pointer-arithmetic-in-alloc,
-bugprone-misplaced-widening-cast,
bugprone-move-forwarding-reference,
bugprone-multiple-statement-macro,
bugprone-no-escape,
bugprone-not-null-terminated-result,
bugprone-parent-virtual-call,
bugprone-posix-return,
bugprone-redundant-branch-condition,
bugprone-reserved-identifier,
bugprone-signal-handler,
bugprone-signed-char-misuse,
bugprone-sizeof-container,
bugprone-sizeof-expression,
bugprone-spuriously-wake-up-functions,
bugprone-string-constructor,
bugprone-string-integer-assignment,
bugprone-string-literal-with-embedded-nul,
bugprone-stringview-nullptr,
bugprone-suspicious-enum-usage,
bugprone-suspicious-include,
bugprone-suspicious-memory-comparison,
bugprone-suspicious-memset-usage,
bugprone-suspicious-missing-comma,
bugprone-suspicious-semicolon,
bugprone-suspicious-string-compare,
bugprone-swapped-arguments,
bugprone-terminating-continue,
bugprone-throw-keyword-missing,
bugprone-too-small-loop-variable,
bugprone-undefined-memory-manipulation,
bugprone-undelegated-constructor,
bugprone-unhandled-exception-at-new,
bugprone-unhandled-self-assignment,
bugprone-unused-raii,
bugprone-unused-return-value,
bugprone-use-after-move,
bugprone-virtual-near-miss,
cppcoreguidelines-avoid-goto,
misc-const-correctness,
modernize-avoid-bind,
modernize-use-nullptr,
performance-faster-string-find,
performance-for-range-copy,
readability-non-const-parameter
'

HeaderFilterRegex: 'Source[a-z_A-Z0-9\/]+\.H$'
2 changes: 1 addition & 1 deletion .github/workflows/cuda.yml
@@ -111,7 +111,7 @@ jobs:
which nvcc || echo "nvcc not in PATH!"
git clone https://github.com/AMReX-Codes/amrex.git ../amrex
cd ../amrex && git checkout --detach d9bae8ce9e69a962154a9340a0fb8ae9895c1fde && cd -
cd ../amrex && git checkout --detach 0236a3732cc8e399e01eacd0253a782da40ce1f7 && cd -
make COMP=gcc QED=FALSE USE_MPI=TRUE USE_GPU=TRUE USE_OMP=FALSE USE_PSATD=TRUE USE_CCACHE=TRUE -j 2
build_nvhpc21-11-nvcc:
2 changes: 1 addition & 1 deletion .github/workflows/insitu.yml
@@ -39,7 +39,7 @@ jobs:
CC: gcc
CMAKE_PREFIX_PATH: /ascent/install/lib/cmake/
container:
image: alpinedav/ascent:0.9.1
image: alpinedav/ascent:0.9.2
steps:
- uses: actions/checkout@v3
- name: Configure
3 changes: 2 additions & 1 deletion Docs/requirements.txt
@@ -11,13 +11,14 @@ docutils>=0.17.1

# PICMI API docs
# note: keep in sync with version in ../requirements.txt
picmistandard==0.24.0
picmistandard==0.25.0
# for development against an unreleased PICMI version, use:
# picmistandard @ git+https://github.com/picmi-standard/picmi.git#subdirectory=PICMI_Python

pygments
recommonmark
sphinx>=5.3
sphinx-copybutton
sphinx-design
sphinx_rtd_theme>=1.1.1
sphinxcontrib-bibtex
8 changes: 7 additions & 1 deletion Docs/source/acknowledge_us.rst
@@ -53,9 +53,15 @@ Prior WarpX references

If your project uses a specific algorithm or component, please consider citing the respective publications in addition.

- Sandberg R T, Lehe R, Mitchell C E, Garten M, Qiang J, Vay J-L and Huebl A.
**Hybrid Beamline Element ML-Training for Surrogates in the ImpactX Beam-Dynamics Code**.
14th International Particle Accelerator Conference (IPAC'23), WEPA101, *in print*, 2023.
`preprint <https://www.ipac23.org/preproc/pdf/WEPA101.pdf>`__,
`DOI:10.18429/JACoW-IPAC-23-WEPA101 <https://doi.org/10.18429/JACoW-IPAC-23-WEPA101>`__

- Huebl A, Lehe R, Zoni E, Shapoval O, Sandberg R T, Garten M, Formenti A, Jambunathan R, Kumar P, Gott K, Myers A, Zhang W, Almgren A, Mitchell C E, Qiang J, Sinn A, Diederichs S, Thevenet M, Grote D, Fedeli L, Clark T, Zaim N, Vincenti H, Vay JL.
**From Compact Plasma Particle Sources to Advanced Accelerators with Modeling at Exascale**.
Proceedings of the 20th Advanced Accelerator Concepts Workshop (AAC'22), *submitted* 2023.
Proceedings of the 20th Advanced Accelerator Concepts Workshop (AAC'22), *in print*, 2023.
`arXiv:2303.12873 <https://arxiv.org/abs/2303.12873>`__

- Huebl A, Lehe R, Mitchell C E, Qiang J, Ryne R D, Sandberg R T, Vay JL.
1 change: 1 addition & 0 deletions Docs/source/conf.py
@@ -47,6 +47,7 @@
'sphinx.ext.mathjax',
'sphinx.ext.napoleon',
'sphinx.ext.viewcode',
'sphinx_copybutton',
'sphinx_design',
'breathe',
'sphinxcontrib.bibtex'
2 changes: 1 addition & 1 deletion Docs/source/dataanalysis/openpmdviewer.rst
@@ -65,4 +65,4 @@ by using the command:
ts.slider()

You can also access the particle and field data as numpy arrays with the methods ``ts.get_field`` and ``ts.get_particle``.
See the openPMD-viewer tutorials `here <https://github.com/openPMD/openPMD-viewer/tree/master/tutorials>`_ for more info.
See the openPMD-viewer tutorials `here <https://github.com/openPMD/openPMD-viewer/tree/dev/docs/source/tutorials>`_ for more info.
56 changes: 54 additions & 2 deletions Docs/source/developers/checksum.rst
@@ -41,8 +41,11 @@ See additional options
* ``--rtol`` relative tolerance for the comparison
* ``--atol`` absolute tolerance for the comparison (a sum of both is used by ``numpy.isclose()``)

Reset a benchmark from a plotfile you know is correct
-----------------------------------------------------
Reset a benchmark with new values that you know are correct
-----------------------------------------------------------

Reset a benchmark from a plotfile generated locally
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

This is using ``checksumAPI.py`` as a Python script.

@@ -61,3 +64,52 @@ Since this will automatically change the JSON file stored on the repo, make a separate commit:
git add <test name>.json
git commit -m "reset benchmark for <test name> because ..." --author="Tools <warpx@lbl.gov>"
Reset a benchmark from the Azure pipeline output on Github
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Alternatively, the benchmarks can be reset using the output of the Azure continuous integration (CI) tests on Github. The output can be accessed by following the steps below:

* On the Github page of the Pull Request, find (one of) the pipeline(s) failing due to benchmarks that need to be updated and click on "Details".

.. figure:: https://user-images.githubusercontent.com/49716072/135133589-1fd8a626-ff93-4b9e-983f-acee028e0e4e.png
:alt: Screen capture showing how to access Azure pipeline output on Github.

* Click on "View more details on Azure pipelines".

.. figure:: https://user-images.githubusercontent.com/49716072/135133596-8f73afa2-969e-49a4-b4a6-184a4f478a44.png
:alt: Screen capture showing how to access Azure pipeline output on Github.

* Click on "Build & test".

.. figure:: https://user-images.githubusercontent.com/49716072/135133607-87324124-6145-4589-9a92-dcc8ea9432e4.png
:alt: Screen capture showing how to access Azure pipeline output on Github.

From this output, there are two options to reset the benchmarks:

#. For each of the tests failing due to benchmark changes, the output contains the content of the new benchmark file, as shown below.
This content can be copied and pasted into the corresponding benchmark file.
For instance, if the failing test is ``LaserAcceleration_BTD``, this content can be pasted into the file ``Regression/Checksum/benchmarks_json/LaserAcceleration_BTD.json``.

.. figure:: https://user-images.githubusercontent.com/49716072/244415944-3199a933-990b-4bde-94b1-162b7e8e22be.png
:alt: Screen capture showing how to read new benchmark file from Azure pipeline output.

#. If there are many tests failing in a single Azure pipeline, it might become more convenient to update the benchmarks automatically.
WarpX provides a script for this, located in ``Tools/DevUtils/update_benchmarks_from_azure_output.py``.
This script can be used by following the steps below:

* From the Azure output, click on "View raw log".

.. figure:: https://user-images.githubusercontent.com/49716072/135133617-764b6daf-a8e4-4a50-afae-d4b3a7568b2f.png
:alt: Screen capture showing how to download raw Azure pipeline output.

* This should lead to a page that looks like the image below. Save it as a text file on your local computer.

.. figure:: https://user-images.githubusercontent.com/49716072/135133624-310df207-5f87-4260-9917-26d5af665d60.png
:alt: Screen capture showing how to download raw Azure pipeline output.

* On your local computer, go to the WarpX folder and cd to the ``Tools/DevUtils`` folder.

* Run the command ``python update_benchmarks_from_azure_output.py /path/to/azure_output.txt``. The benchmarks included in that Azure output should now be updated.

* Repeat this for every Azure pipeline (e.g. ``cartesian2d``, ``cartesian3d``, ``qed``) that contains benchmarks that need to be updated.
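Copying each printed benchmark file out of the raw log (option 1 above) can in principle be scripted. The sketch below is purely illustrative: the log markers it assumes (``New file for <test>:`` followed by a JSON block) are hypothetical and do not reflect the actual format of the Azure output; the supported tool for this task remains ``update_benchmarks_from_azure_output.py``.

```python
# Hypothetical sketch: scan a saved raw CI log for printed benchmark files.
# The marker format ("New file for <test>:" followed by a JSON object) is an
# assumption for illustration only, not the actual Azure output format.
import json
import re

def extract_benchmarks(log_text):
    benchmarks = {}
    # Lazily match from the opening brace to the first "}" on its own line.
    for match in re.finditer(r"New file for (\S+):\s*(\{.*?\n\})",
                             log_text, re.DOTALL):
        test_name, body = match.groups()
        benchmarks[test_name] = json.loads(body)
    return benchmarks

log = """New file for LaserAcceleration_BTD:
{
  "lev=0": {"Ex": 1.0}
}"""
print(extract_benchmarks(log))
```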
86 changes: 84 additions & 2 deletions Docs/source/developers/python.rst
@@ -1,7 +1,89 @@
.. _development-python:

Python interface
================
Processing PICMI Input Options
==============================

The input parameters in a WarpX PICMI file are processed in two layers.
The first layer is the Python level API, which mirrors the :ref:`C++ application input structure <running-cpp-parameters>`; the second is the translation from the PICMI input to the equivalent :ref:`app (AMReX) input file parameters <running-cpp-parameters>`.

The two layers are described below.

Input parameters
----------------

In a C++ input file, each of the parameters has a prefix, for example ``geometry`` in ``geometry.prob_lo``.
For each of these prefixes, an instance of a Python class is created and the parameters saved as attributes.
This construction is used since the lines in the input file look very much like a Python assignment statement,
assigning attributes of class instances, for example ``geometry.dims = 3``.

Many of the prefix instances are predefined, for instance ``geometry`` is created in the file ``Python/pywarpx/Geometry.py``.
In that case, ``geometry`` is an instance of the class ``Bucket`` (specified in ``Python/pywarpx/Bucket.py``),
the general class for prefixes.
It is called ``Bucket`` since its main purpose is a place to hold attributes.
Most of the instances are instances of the ``Bucket`` class.
There are exceptions, such as ``constants`` and ``diagnostics`` where extra processing is needed.

Instances can also be created as needed.
For example, for the particle species, an instance is created for each species listed in ``particles.species_names``.
This gives a place to hold the parameters for the species, e.g., ``electrons.mass``.

The instances are then used to generate the input parameters.
Each instance can generate a list of strings, one for each attribute.
This happens in the ``Bucket.attrlist`` method.
The strings will be the lines as in an input file, for example ``"electrons.mass = m_e"``.
The lists for each instance are gathered into one long list in the ``warpx`` instance (of the class ``WarpX`` defined in
``Python/pywarpx/WarpX.py``).
This instance has access to all of the predefined instances as well as lists of the generated instances.
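The ``Bucket`` pattern described above can be sketched in a few lines. This is a hypothetical, simplified re-implementation for illustration only; the real class in ``Python/pywarpx/Bucket.py`` is more involved.

```python
# Minimal sketch of the "Bucket" pattern (simplified, for illustration;
# the real class lives in Python/pywarpx/Bucket.py).

class Bucket:
    """Holds input parameters as attributes under a common prefix."""

    def __init__(self, prefix):
        object.__setattr__(self, "prefix", prefix)
        object.__setattr__(self, "attrs", {})

    def __setattr__(self, name, value):
        # Every plain attribute assignment becomes an input parameter.
        self.attrs[name] = value

    def attrlist(self):
        # One "prefix.attr = value" string per attribute, i.e. one
        # line of an input file per attribute.
        return [f"{self.prefix}.{k} = {v}" for k, v in self.attrs.items()]

# Assigning attributes looks just like lines in a WarpX input file:
geometry = Bucket("geometry")
geometry.dims = 3
electrons = Bucket("electrons")
electrons.mass = "m_e"

print(geometry.attrlist())   # ['geometry.dims = 3']
print(electrons.attrlist())  # ['electrons.mass = m_e']
```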

In both of the ways that WarpX can be run with Python, that list of input parameter strings will be generated.
This is done in the routine ``WarpX.create_argv_list`` in ``Python/pywarpx/WarpX.py``.
If WarpX will be run directly in Python, that list will be sent to the ``amrex_init`` routine as the ``argv``.
This is as if all of the input parameters had been specified on the command line.
If Python is only used as a preprocessor to generate the input file, the list contains the strings that are written out to create the
input file.

There are two input parameters that do not have prefixes, ``max_step`` and ``stop_time``.
These are handled via keyword arguments in the ``WarpX.create_argv_list`` method.
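Assembling the ``argv``-style list could look roughly like the sketch below. This is a hypothetical simplification; the real logic lives in ``WarpX.create_argv_list`` in ``Python/pywarpx/WarpX.py``.

```python
# Hypothetical sketch of gathering per-prefix parameter lines into one
# argv-style list (simplified; names and structure assumed).

def create_argv_list(prefix_lines, max_step=None, stop_time=None):
    """prefix_lines: list of lists of 'prefix.attr = value' strings."""
    argv = []
    # The two prefix-less parameters are handled via keyword arguments.
    if max_step is not None:
        argv.append(f"max_step = {max_step}")
    if stop_time is not None:
        argv.append(f"stop_time = {stop_time}")
    for lines in prefix_lines:
        argv.extend(lines)
    return argv

argv = create_argv_list(
    [["geometry.dims = 3"], ["electrons.mass = m_e"]],
    max_step=100,
)
# Either passed to amrex_init as argv, or written out as an input file:
print("\n".join(argv))
```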

Conversion from PICMI
---------------------

In the PICMI implementation, defined in ``Python/pywarpx/picmi.py``, a class was written for each PICMI class that
inherits from the PICMI class and processes the input.
Each of the WarpX classes has two methods, ``init`` and ``initialize_inputs``.
The ``init`` method is called during the creation of the class instances, which happens in the user's PICMI input file.
This is part of the standard - each of the PICMI classes calls the method ``handle_init`` from its ``__init__`` constructor.
The main purpose is to process application-specific keyword arguments (those that start with ``warpx_``, for example).
These are then passed into the ``init`` methods.
In the WarpX implementation, in ``init``, each of the WarpX-specific arguments is saved as an attribute of the implementation
class instances.

It is in the second method, ``initialize_inputs``, where the PICMI input parameters are translated into WarpX input parameters.
This method is called later, during initialization.
The prefix instances described above are all accessible in the implementation classes (via the ``pywarpx`` module).
For each PICMI input quantity, the appropriate WarpX input parameters are set in the prefix classes.
As needed, for example in the ``Species`` class, the dynamic prefix instances are created and the attributes set.
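The two-phase pattern described above can be sketched as follows. All class and parameter names here are hypothetical stand-ins; the real implementation classes are in ``Python/pywarpx/picmi.py``.

```python
# Hypothetical sketch of the two-phase PICMI pattern: code-specific
# "warpx_*" keyword arguments are captured in init(), and the PICMI
# parameters are translated into code inputs later, in initialize_inputs().
# All names here are illustrative, not the real WarpX implementation.

class PICMI_Species:
    """Stand-in for a PICMI standard base class."""
    def __init__(self, name=None, mass=None, **kw):
        self.name, self.mass = name, mass
        self.handle_init(kw)

    def handle_init(self, kw):
        # The standard passes unrecognized keyword arguments to init().
        self.init(kw)

class Species(PICMI_Species):
    def init(self, kw):
        # Phase 1: save code-specific arguments as attributes.
        self.do_qed = kw.pop("warpx_do_qed", False)

    def initialize_inputs(self, inputs):
        # Phase 2: translate PICMI parameters into prefixed inputs.
        inputs[f"{self.name}.mass"] = self.mass
        inputs[f"{self.name}.do_qed"] = int(self.do_qed)

electrons = Species(name="electrons", mass="m_e", warpx_do_qed=True)
inputs = {}
electrons.initialize_inputs(inputs)
print(inputs)  # {'electrons.mass': 'm_e', 'electrons.do_qed': 1}
```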

Simulation class
----------------

The ``Simulation`` class ties it all together.
In a PICMI input file, all information is passed into the ``Simulation`` class instance, either through the constructor
or through ``add_`` methods.
Its ``initialize_inputs`` routine initializes the input parameters it handles and also calls the ``initialize_inputs``
methods of all of the PICMI class instances that have been passed in, such as the field solver, the particles species,
and the diagnostics.
As with other PICMI classes, the ``init`` routine is called by the constructor and ``initialize_inputs`` is called during
initialization.
The initialization happens when either the ``write_input_file`` or the ``step`` method is called.
After ``initialize_inputs`` is finished, the attributes of the prefix instances have been filled in, and the process described
above happens, where the prefix instances are looped over to generate the list of input parameter strings (that is either written
out to a file or passed in as ``argv``).
The two parameters that do not have a prefix, ``max_step`` and ``stop_time``, are passed into ``WarpX.create_argv_list`` as keyword
arguments.
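The overall flow through the ``Simulation`` class can be sketched as below. This is a hypothetical simplification of the behavior described above, not the real ``Python/pywarpx/picmi.py`` code.

```python
# Hypothetical sketch of the Simulation flow: initialization is triggered
# by write_input_file() or step(), calls initialize_inputs() on every
# registered PICMI object, and collects the input parameter lines.

class Simulation:
    def __init__(self, max_step=None, stop_time=None):
        self.max_step = max_step
        self.stop_time = stop_time
        self.picmi_objects = []   # species, solvers, diagnostics, ...
        self.inputs = []          # collected "prefix.attr = value" lines
        self._initialized = False

    def add_species(self, species):
        self.picmi_objects.append(species)

    def _initialize_inputs(self):
        if self._initialized:
            return
        self._initialized = True
        # The two prefix-less parameters come in as keyword arguments.
        if self.max_step is not None:
            self.inputs.append(f"max_step = {self.max_step}")
        if self.stop_time is not None:
            self.inputs.append(f"stop_time = {self.stop_time}")
        for obj in self.picmi_objects:
            obj.initialize_inputs(self.inputs)

    def write_input_file(self, file_name):
        self._initialize_inputs()
        with open(file_name, "w") as f:
            f.write("\n".join(self.inputs) + "\n")

    def step(self):
        self._initialize_inputs()
        # ...would pass self.inputs as argv to amrex_init here...

class ElectronSpecies:
    def initialize_inputs(self, inputs):
        inputs.append("electrons.mass = m_e")

sim = Simulation(max_step=10)
sim.add_species(ElectronSpecies())
sim.step()
print(sim.inputs)  # ['max_step = 10', 'electrons.mass = m_e']
```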

Python runtime interface
========================

The Python interface provides low and high level access to much of the data in WarpX.
With the low level access, a user has direct access to the underlying memory contained
19 changes: 19 additions & 0 deletions Docs/source/highlights.rst
@@ -14,6 +14,12 @@ Plasma-Based Acceleration

Scientific works in laser-plasma and beam-plasma acceleration.

#. Sandberg R T, Lehe R, Mitchell C E, Garten M, Qiang J, Vay J-L, Huebl A.
**Hybrid Beamline Element ML-Training for Surrogates in the ImpactX Beam-Dynamics Code**.
14th International Particle Accelerator Conference (IPAC'23), WEPA101, *in print*, 2023.
`preprint <https://www.ipac23.org/preproc/pdf/WEPA101.pdf>`__,
`DOI:10.18429/JACoW-IPAC-23-WEPA101 <https://doi.org/10.18429/JACoW-IPAC-23-WEPA101>`__

#. Wang J, Zeng M, Li D, Wang X, Gao J.
**High quality beam produced by tightly focused laser driven wakefield accelerators**.
arXiv pre-print, 2023.
@@ -76,6 +82,18 @@ Particle Accelerator & Beam Physics

Scientific works in particle and beam modeling.

#. Sandberg R T, Lehe R, Mitchell C E, Garten M, Qiang J, Vay J-L, Huebl A.
**Hybrid Beamline Element ML-Training for Surrogates in the ImpactX Beam-Dynamics Code**.
14th International Particle Accelerator Conference (IPAC'23), WEPA101, *in print*, 2023.
`preprint <https://www.ipac23.org/preproc/pdf/WEPA101.pdf>`__,
`DOI:10.18429/JACoW-IPAC-23-WEPA101 <https://doi.org/10.18429/JACoW-IPAC-23-WEPA101>`__

#. Tan W H, Piot P, Myers A, Zhang W, Rheaume T, Jambunathan R, Huebl A, Lehe R, Vay J-L.
**Simulation studies of drive-beam instability in a dielectric wakefield accelerator**.
13th International Particle Accelerator Conference (IPAC'22), MOPOMS012, 2022.
`DOI:10.18429/JACoW-IPAC2022-MOPOMS012 <https://doi.org/10.18429/JACoW-IPAC2022-MOPOMS012>`__


High Energy Astrophysical Plasma Physics
****************************************

@@ -86,6 +104,7 @@ Scientific works in astrophysical plasma modeling.
arXiv pre-print, 2023.
`DOI:10.48550/arXiv.2304.10566 <https://doi.org/10.48550/arXiv.2304.10566>`__


Microelectronics
****************

39 changes: 39 additions & 0 deletions Docs/source/install/batch/lsf.rst
@@ -0,0 +1,39 @@
Job Submission
''''''''''''''

* ``bsub your_job_script.bsub``


Job Control
'''''''''''

* interactive job:

* ``bsub -P $proj -W 2:00 -nnodes 1 -Is /bin/bash``

* `details for my jobs <https://docs.olcf.ornl.gov/systems/summit_user_guide.html#monitoring-jobs>`_:

* ``bjobs 12345`` all details for job with <job id> ``12345``
* ``bjobs [-l]`` all jobs under my user name
* ``jobstat -u $(whoami)`` job eligibility
* ``bjdepinfo 12345`` job dependencies on other jobs

* details for queues:

* ``bqueues`` list queues

* communicate with job:

* ``bkill <job id>`` abort job
* ``bpeek [-f] <job id>`` peek into ``stdout``/``stderr`` of a job
* ``bkill -s <signal number> <job id>`` send signal or signal name to job
* ``bchkpnt`` and ``brestart`` checkpoint and restart job (untested/unimplemented)
* ``bmod -W 1:30 12345`` change the walltime of a job (currently not allowed)
* ``bstop <job id>`` prevent the job from starting
* ``bresume <job id>`` release the job to be eligible for run (after it was set on hold)


References
''''''''''

* https://www.ibm.com/docs/en/spectrum-lsf
