Release 21.08 #2158

Merged
merged 2 commits into ECP-WarpX:development from release-21.08 on Aug 3, 2021

Conversation

@ax3l ax3l commented Aug 3, 2021

Prepare the August release of WarpX:

```
# update dependencies
./Tools/Release/updateAMReX.py
./Tools/Release/updatePICSAR.py
# bump version number
./Tools/Release/newVersion.sh
```

Following this workflow: https://warpx.readthedocs.io/en/latest/maintenance/release.html

@ax3l ax3l added the labels component: documentation (Docs, readme and manual) and component: third party (Changes in WarpX that reflect a change in a third-party library) Aug 3, 2021
@ax3l ax3l enabled auto-merge (squash) August 3, 2021 19:44

@RevathiJambunathan RevathiJambunathan left a comment

Thanks Axel

@ax3l ax3l merged commit 6810fb0 into ECP-WarpX:development Aug 3, 2021
@ax3l ax3l deleted the release-21.08 branch August 3, 2021 19:54
roelof-groenewald added a commit to ModernElectron/WarpX that referenced this pull request Aug 11, 2021
* CI: Add Missing Python Analysis for EB Test (ECP-WarpX#2147)

* CI: Add Missing Python Analysis for EB Test

* Use 1 MPI Process for Azure

* a few _rt (ECP-WarpX#2146)

* PSATD: div Cleaning Implemented only with psatd.J_linear_in_time=1 (ECP-WarpX#2142)

* BTD: Don't Flush If Written (ECP-WarpX#2148)

Written BTD buffers for lab snapshot data are reset to zero size
(count). When we do the final write of all partly filled buffers
in `FilterComputePackFlushLastTimestep`, we should not write such
already completed backtransformed lab snapshots again.

* openPMD: `groupBased` Option Missing (ECP-WarpX#2149)

The `groupBased` iteration encoding (input: `g`) was not parsed.

* Remove predefined constants from example input files (ECP-WarpX#2153)

* BTD_ReducedSliceDiag: BTD Plotfiles (ECP-WarpX#2152)

By accident, the 2nd test did not use plotfile output.

* Allow extra particle attributes (besides ux, uy, uz and w) to be set at particle creation in AddNParticles() (ECP-WarpX#2115)

* exposes AddRealComp to Python to allow extra particle attributes to be added at runtime; also includes a new function to grab a particle data array from the name of the component rather than the index

* added functionality to AddNParticles() to allow extra particle attributes to also be set during particle creation

* added function to get index of a particle component given the PID name

* changed new get component index and get_particle_arrays_from_comp_name functions to take species name as argument rather than species id

* changed warpx_addRealComp to accept a species name as input and only add the new component for that species

* added a test of the pywarpx bridge to get particle data and add new particle attributes at runtime

* changed all particle interacting functions in libwarpx to use the species name rather than id, also changed the functions to get particle array data to use the component name rather than index

* updated test according to PR ECP-WarpX#2119 changes

* removed unneeded BL_ASSERT(nattr == 1) statement

* fixed bug in add_particles to correctly determine the number of extra attributes

* fixed bug in AddNParticles if fewer attribute values are passed than the number of extra arrays for the species

* use isinstance(attr, ndarray) rather than type(attr) is np.ndarray

* generalize_runtime_comps_io

* fix OpenPMD

* fix OpenPMD

* fix plot flags in WritePlotFile

* fix offset and comment

* changed extra pid test to not use an underscore in the pid name

* switched _libwarpx.py::add_particles to use kwargs to accept the weight and extra attribute arrays

* License update in test file

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* fix typo

* added a test with unique_particles=False

* Apply suggestions from code review

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* updated docstring and comments

Co-authored-by: atmyers <atmyers2@gmail.com>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
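
For orientation, here is a hypothetical Python sketch of how the extended interface described in the entries above might be used. This is not code from the PR: the module path `pywarpx._libwarpx`, the function names `add_real_comp`, `add_particles`, and `get_particle_arrays`, the keyword names, and the species name `electrons` are assumptions pieced together from the commit messages, and the calls only make sense inside an initialized WarpX Python simulation; check `Python/pywarpx/_libwarpx.py` for the real interface.

```
# Hypothetical sketch only -- names and signatures are assumptions based on the
# commit messages above; consult Python/pywarpx/_libwarpx.py for the real API.
# Requires an initialized WarpX simulation that defines an "electrons" species.
import numpy as np
from pywarpx import _libwarpx

n = 10

# register an extra per-particle real attribute for this species at runtime
_libwarpx.add_real_comp("electrons", "newPid")

# inject particles; the weight is an explicit argument (see ECP-WarpX#2161) and
# extra attributes are passed as keyword arguments named after the component
_libwarpx.add_particles(
    "electrons",
    x=np.random.rand(n), y=np.random.rand(n), z=np.random.rand(n),
    ux=np.zeros(n), uy=np.zeros(n), uz=np.zeros(n),
    w=np.full(n, 1.0e12),
    newPid=np.arange(n, dtype=float),
    unique_particles=True,
)

# read the extra attribute back by component name rather than by index
vals = _libwarpx.get_particle_arrays("electrons", "newPid", level=0)
```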

* Add predefined density profile to parameters documentation (ECP-WarpX#2155)

* Release 21.08 (ECP-WarpX#2158)

* AMReX/PICSAR: 21.08
* WarpX: 21.08

* initialize guard cells for macroscopic properties (ECP-WarpX#2159)

* openPMD: Style Cleaning (Dec/Def) (ECP-WarpX#2160)

In our code base, definitions and declarations of functions need a
space after their name. This makes them easy to search for.

Ref.:
  https://warpx.readthedocs.io/en/21.08/developers/contributing.html#style-and-conventions

* openPMD: Use Steps if != BTD (ECP-WarpX#2157)

* openPMD: Use Steps if != BTD

For all diagnostics except back-transformed diagnostics, we can write iterations efficiently, in temporally increasing order of iteration number.

This lets us guarantee to HPC I/O libraries how data will be arranged; e.g., we can use ADIOS2 BeginStep() and EndStep().

This enables paths to:
- streaming workflows, such as ADIOS2 SST or SSC, where we stage data
  over the network instead of using files
  https://openpmd-api.readthedocs.io/en/0.14.0/usage/streaming.html

This mitigates:
- host-side memory aggregation for ADIOS2 with openPMD `groupBased`
  iteration encoding
  https://openpmd-api.readthedocs.io/en/0.14.0/backends/adios2.html#memory-usage

* openPMD: Open Iterations Explicitly

Explicitly open iterations. Usually, file-open operations are delayed until the first load/storeChunk operation is flushed. In parallel contexts where, in the future, we might want to do only particle writes from a few ranks and no field writes, this avoids hangs caused by the non-collective nature of the first flush.

The Streaming API (i.e., Series::writeIterations()) will call this
method implicitly as well, but back-transformed diags (particles)
might still need this.

* openPMD: 0.13.0+

Needed for streaming API and collective open.
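
As a standalone illustration of the write pattern this commit adopts (not WarpX's diagnostics code; the file name, mesh name, and array contents below are arbitrary), here is a minimal openPMD-api Python sketch using writeIterations(), which hands out iterations in increasing order so that streaming-capable backends such as ADIOS2 can wrap each one in a BeginStep()/EndStep() pair:

```
# Minimal openPMD-api sketch (Python); illustrates write_iterations() only and
# is unrelated to WarpX's actual diagnostics code.
import numpy as np
import openpmd_api as io

# file-based series; "%T" expands to the iteration number
series = io.Series("sample_%T.h5", io.Access.create)

for step in range(3):
    # write_iterations() yields iterations in increasing order, letting
    # streaming-capable backends map each one to a BeginStep()/EndStep() pair
    it = series.write_iterations()[step]

    E_x = it.meshes["E"]["x"]
    E_x.reset_dataset(io.Dataset(np.dtype("float64"), [8]))
    E_x.store_chunk(np.linspace(0.0, 1.0, 8))

    # closing the iteration ends the step and flushes its data
    it.close()

del series  # finalize the series
```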

* openPMD: Missing Include (ECP-WarpX#2162)

Fix compile issue on Conda-Forge (Windows with Clang).

* PEC Analysis: Remove Unused Imports (ECP-WarpX#2165)

Fix a LGTM warning on unused imports in our PEC analysis scripts.

* BinaryCollision: use more general particle data structure (ECP-WarpX#2137)

* Added B field to plasma lens (ECP-WarpX#2163)

* Added B field to plasma lens

* Fix B field and updated CI test to include the B field

* Updated benchmark

* Fixed bug where specifying write_dir for particle diagnostic did not work (ECP-WarpX#2167)

* Add particle weight as an explicit argument for _libwarpx.py::add_particles() (ECP-WarpX#2161)

* added particle weight as an explicit argument for _libwarpx.py::add_particles()

* changes requested during code review

* RZ PSATD: Time Averaging for Multi-J Algorithm (ECP-WarpX#2141)

* RZ PSATD: Time Averaging for Multi-J Algorithm

* Fix Wrong Signs in Bm

* Use Time Averaging in CI Test, Update Benchmark

* Minor Fix

* Shift parsing of physical/mathematical constants from hard-coding to table lookup (ECP-WarpX#2128)

* Shift parsing of physical/mathematical constants from hard-coding to table lookup
* Make constants map local and static for now, until there's a reason for it to be accessible/modifiable
* Accept rewording
* Accept rewording

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
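
As a language-agnostic illustration of the pattern this commit describes (not WarpX's parser, which is C++; the constant set, names, and lookup precedence here are only a sketch), symbol resolution consults a single static table of named constants instead of a chain of hard-coded comparisons:

```
import math

# illustrative table of predefined constants; the authoritative set and values
# live in WarpX's C++ parser, not here
_CONSTANTS = {
    "pi": math.pi,
    "q_e": 1.602176634e-19,    # elementary charge [C]
    "m_e": 9.1093837015e-31,   # electron mass [kg]
    "clight": 299792458.0,     # speed of light [m/s]
}

def resolve_symbol(name, user_variables):
    """Resolve a parser symbol: user-defined variables first, then the constant
    table. (The precedence shown here is illustrative only.)"""
    if name in user_variables:
        return user_variables[name]
    if name in _CONSTANTS:
        return _CONSTANTS[name]
    raise KeyError(f"unknown symbol '{name}'")

# e.g. resolve_symbol("q_e", {"n0": 1.0e24}) returns 1.602176634e-19
```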

* Moved some routines to .cpp file (ECP-WarpX#2168)

* In add_particles, used None for input parameter values (ECP-WarpX#2169)

* More cleanup of particle boundaries (ECP-WarpX#2171)

* Moved some routines to .cpp file

* Moved more stuff to a header file

* make parameter less prone to numerical issues in single precision (ECP-WarpX#2173)

* ES Solver: Fix SP Build (ECP-WarpX#2174)

Fix HIP:
```
/home/runner/work/WarpX/WarpX/Source/FieldSolver/ElectrostaticSolver.cpp:434:9: error: non-constant-expression cannot be narrowed from type 'double' to 'float' in initializer list [-Wc++11-narrowing]
        1.-beta[0]*beta[0], 1.-beta[1]*beta[1], 1.-beta[2]*beta[2])});
        ^~~~~~~~~~~~~~~~~~
/home/runner/work/WarpX/WarpX/build_sp/_deps/fetchedamrex-src/Src/Base/AMReX_SPACE.H:151:31: note: expanded from macro 'AMREX_D_DECL'
                              ^
/home/runner/work/WarpX/WarpX/Source/FieldSolver/ElectrostaticSolver.cpp:434:9: note: insert an explicit cast to silence this issue
        1.-beta[0]*beta[0], 1.-beta[1]*beta[1], 1.-beta[2]*beta[2])});
        ^~~~~~~~~~~~~~~~~~
```

* EB: RZ Warnings (ECP-WarpX#2176)

Not an implementation yet, just adding aborts and silencing warnings.

* fix typo (ECP-WarpX#2175)

* Fix: Performance Tests (Boundary) (ECP-WarpX#2178)

The performance tests had been aborting since we changed the boundary condition inputs last month. This fixes them.

* CI: Cover Performance Tests (ECP-WarpX#2179)

* CI: Cover Performance Tests

Make sure their syntax stays up-to-date.

* Performance Test in CI: Fake Output

Add "fake" output so we do not have unused variables when the tests
are run.

* reverted (or fixed) various unnecessary changes between the ME fork and upstream/development

Co-authored-by: Edoardo Zoni <59625522+EZoni@users.noreply.github.com>
Co-authored-by: MaxThevenet <maxence.thevenet@desy.de>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
Co-authored-by: Neïl Zaim <49716072+NeilZaim@users.noreply.github.com>
Co-authored-by: atmyers <atmyers2@gmail.com>
Co-authored-by: Revathi  Jambunathan <41089244+RevathiJambunathan@users.noreply.github.com>
Co-authored-by: David Grote <grote1@llnl.gov>
Co-authored-by: Kevin Z. Zhu <86268612+KZhu-ME@users.noreply.github.com>
Co-authored-by: Phil Miller <unmobile+gh@gmail.com>
Co-authored-by: Luca Fedeli <luca.fedeli@cea.fr>