Tools-For-OpenFOAM-Automation

All presented scripts need to be run in your OpenFOAM case directory. controlDict_automation.py needs to run while the OpenFOAM simulation is running; additionally running controlDict_automation_plot.py to access the data generated by controlDict_automation.py means two Python instances running at once. The easiest way to keep these instances independent is to run Python in environments like Conda. The code below was written for Python 3.12.

snappyHexMesh_batchMesher.py

As all Foamers know, OpenFOAM is top-tier CFD software. Nevertheless, 'ease-of-use' or even 'convenience' are not words that normally come to mind when we speak about OpenFOAM. There are tools like PyFoam that achieve similar control over OpenFOAM; however, I wanted to do this on my own. First of all, a good mesh is the most important input to OF. My personal mesher of choice is snappyHexMesh. A lot has been said about sHM, and I won't add to that. My experiences with it are mixed, but there is no open-source mesher that handles multiregion or AMI cases better, so I have to deal with it.

The problems I have with it mostly concern big meshes (10M+ cells) and the waiting time. Even with MPI and 32 cores, a waiting time of 1h+ is too long if you can't be sure your resulting mesh works. And that's the main problem with sHM: you never really can be, especially with AMI, because sHM is somewhat non-deterministic (maybe there's a better word for it, IDK) regarding its results. What do I mean by that? Even with an IDENTICAL snappyHexMeshDict, a second meshing run does not lead to exactly the same result. There are variations in the mesh; the runs are only 'kind of' the same. They even have different checkMesh results, which should tell you something.

Normally, when meshing a new geometry, I'd start with some default parameters, look at the checkMesh results and also at snappyHexMesh.log. Unfortunately, checkMesh can't do it all. The most important value (in my opinion at least) that checkMesh cannot deliver is the layerPercentage (not an official OF term). If you layer more than one patch, these values can only be found in sHM.log. Therefore, one of the goals of the presented software is to aggregate important mesh quality parameters in one place. Currently, this script can only vary input parameters for snappyHexMeshDict, repeatedly mesh in parallel, write logs, evaluate some results in these logs and aggregate the information for the user.
Ideally, the user will then pick one of these meshes for his simulation. Attempts at automated optimization were not really fruitful, so at the moment we just brute-force a better mesh with variations of important parameters and repeated meshing.
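To illustrate the layerPercentage idea, here is a minimal, hedged sketch of how such a value can be scraped from a snappyHexMesh log. The table layout in SAMPLE_LOG and the patch names are assumptions for illustration only; the real sHM layer summary may differ between versions, so the regex will likely need adapting to your log.

```python
import re

# Hypothetical excerpt of an sHM layer-coverage summary table (assumed layout).
SAMPLE_LOG = """\
patch      faces    layers   overall thickness
                             [m]       [%]
-----      -----    ------   ---       ---
rotorWall  11520    2.97     0.00055   92.3
statorWall 8640     2.85     0.00051   88.1
"""

def layer_percentages(log_text):
    """Return {patch_name: layer_percentage} parsed from the summary table."""
    result = {}
    for line in log_text.splitlines():
        # patch name, face count, layer count, thickness [m], thickness [%]
        m = re.match(r"^(\w+)\s+(\d+)\s+([\d.]+)\s+([\d.eE+-]+)\s+([\d.]+)\s*$", line)
        if m:
            result[m.group(1)] = float(m.group(5))
    return result

print(layer_percentages(SAMPLE_LOG))
```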

How to use this script: First of all, you need a working snappyHexMesh case, meaning you have a (nearly) watertight .stl in triSurface, a correct surfaceFeatureExtractDict, a correct blockMeshDict, a decomposeParDict and finally a working snappyHexMeshDict. This is the workflow we also use in the exec_meshing() function inside this script. After meshing, the mesh is reconstructed and the processor/ folders are cleaned. Of course, if you need createBaffles, mergeOrSplitBaffles or other operations on your mesh, these can easily be included. An important feature of this script is that some of the logging is done in /var/tmp/. My advice is to mount it as tmpfs in order to keep it in RAM; this reduces the write cycles on your SSD or NVMe. After you have completed setting up the meshing, you are ready to enter data into the script. Currently, the following parameters can be varied for automated mesh generation:
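The workflow described above can be sketched roughly as follows. This is not the script's actual implementation, just a minimal outline under assumptions: the tool names are the standard OpenFOAM utilities, the core count is hypothetical and must match your decomposeParDict, and real cases may need extra steps (createBaffles etc.) inserted into the pipeline.

```python
import shutil
import subprocess

N_PROCS = 4  # assumption: must match numberOfSubdomains in decomposeParDict

def meshing_commands(n_procs):
    """Build the command pipeline for one parallel snappyHexMesh run."""
    return [
        ["surfaceFeatureExtract"],
        ["blockMesh"],
        ["decomposePar", "-force"],
        ["mpirun", "-np", str(n_procs), "snappyHexMesh", "-overwrite", "-parallel"],
        ["reconstructParMesh", "-constant"],
    ]

def exec_meshing(n_procs=N_PROCS):
    """Run the full meshing pipeline, one log file per tool, then clean up."""
    for cmd in meshing_commands(n_procs):
        name = "snappyHexMesh" if cmd[0] == "mpirun" else cmd[0]
        with open(f"log.{name}", "w") as log:
            subprocess.run(cmd, stdout=log, stderr=subprocess.STDOUT, check=True)
    # remove the processor directories after reconstruction
    for i in range(n_procs):
        shutil.rmtree(f"processor{i}", ignore_errors=True)

# exec_meshing()  # run this inside your OpenFOAM case directory
```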

-) NUMBER_OPTIMIZATIONS: the total number of loops; the number of meshes is this value +1, because the original intention was to let the script optimize mesh parameters itself. This has proven to be very difficult, so at the moment only a few parameters are varied, and the script can't do any separate meshing. Separate meshing is a popular technique with snappyHexMesh where you first mesh and snap until satisfied and THEN do the layering on this mesh. I only discovered this weeks ago; it has been a good alternative so far (although some other annoying steps are needed to get such a mesh running in OpenFOAM, but I won't go into that).

-) interval_maxNonOrtho: set the interval in which maxNonOrtho is varied. The following variations of maxNonOrtho can be performed: equal steps along the given interval, quasi-equal steps in the interval with slight randomization of the value, or totally uniform randomness of the value in the interval. You can comment and uncomment the appropriate lines in the code.

-) interval_maxBoundarySkewness: set the interval in which maxBoundarySkewness is varied. Same applies like above on how this parameter can be varied.

-) interval_layerFeatureAngle: set the interval in which layerFeatureAngle is varied. Same applies like above on how this parameter can be varied.

The following input parameters are also included but are currently not varied by the script (this is also noted in the script); only their maximum value is used and written to snappyHexMeshDict:

-) interval_maxInternalSkewness

-) interval_relaxedMaxNonOrtho

-) interval_relaxedMaxBoundarySkewness

-) interval_relaxedMaxInternalSkewness

After entering your desired limits in these intervals, you should decide how the values are generated (equal linear steps, quasi-equal steps or uniform randomness); just comment and uncomment the appropriate lines in the loop. You may also want to change the direction in which the intervals are stepped in case of equal and quasi-equal steps. After that you may run the script; it should be placed in your OpenFOAM case directory. After the zeroth mesh (Mesh_0) it will tell you an estimated meshing time, which is important especially with big meshes.
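The three variation modes can be sketched like this. The interval values, the loop count and the ±5 % jitter fraction are assumptions for illustration; the script's actual numbers may differ.

```python
import random

# Hypothetical inputs mirroring the script's naming.
interval_maxNonOrtho = (55.0, 75.0)
NUMBER_OPTIMIZATIONS = 3  # number of meshes is NUMBER_OPTIMIZATIONS + 1

def equal_steps(interval, n):
    """Equal linear steps across the interval, n+1 values total."""
    lo, hi = interval
    step = (hi - lo) / n
    return [lo + i * step for i in range(n + 1)]

def quasi_equal_steps(interval, n, jitter=0.05, rng=random):
    """Equal steps, each nudged by up to +-5 % of one step (assumed fraction),
    clamped so the value stays inside the interval."""
    lo, hi = interval
    step = (hi - lo) / n
    return [min(hi, max(lo, v + rng.uniform(-jitter, jitter) * step))
            for v in equal_steps(interval, n)]

def uniform_random(interval, n, rng=random):
    """Totally uniform random values in the interval."""
    lo, hi = interval
    return [rng.uniform(lo, hi) for _ in range(n + 1)]

print(equal_steps(interval_maxNonOrtho, NUMBER_OPTIMIZATIONS))
```

Reversing the stepping direction simply means iterating the returned list backwards.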

The following files are written by the script: a folder is created for each mesh in the constant directory of your OpenFOAM case with the following structure; assuming NUMBER_OPTIMIZATIONS = 3:

Mesh_0/ Mesh_1/ Mesh_2/ Mesh_3/

Inside each of these folders you'll find (taking Mesh_0 as an example):

    0/               cellVolume        faceZone       p
    aspectRatio      cellVolumeRatio   k              polyMesh/
    cellAspectRatio  cellZone          minPyrVolume   skewness
    cellDeterminant  checkMesh_0.log   minTetVolume   snappyHexMesh_0.log
    cellRegion       epsilon           nonOrthoAngle  U
    cellShapes       faceWeight        nut            wallDistance

Executing paraFoam will let you look at all these data and the generated meshes. All logs can be inspected for a deeper comparison. At the end, a graph is presented with the most important parameters for each mesh (again according to their number 0, 1, 2, 3, ...): Figure_1

-) maxNonOrtho_set, maxNonOrtho_result

-) maxSkewness_set, maxSkewness_result

-) layerFeatureAngle_set, layerPercentage_result (only the first entry from the first patch in the table is scanned)

-) num_cells_result

You will be able to choose the mesh you like best; very often this will be a compromise with snappyHexMesh. Even if you don't choose any of these meshes for your simulation, you'll know a lot more about how sHM is treating your .stl. You can narrow one or more parameter intervals for additional loops, modify other parameters in sHM, etc. This script is especially convenient if you absolutely don't know where to start with your meshing. This is work in progress: at the moment the capabilities of the script are quite limited, but it can be adapted to your needs in no time. The main advantage right now is that we can do some meshing overnight.

controlDict_automation.py

When I started with OpenFOAM, I was very annoyed by having to sit and wait in order to increase certain parameters (e.g. deltaT or maxCo) after launching a new simulation. So what this script basically does is hand a little more automation to your simulation. It can:

-) increase maxCo by a time trigger, e.g. in 10 steps from 0.0001 to 0.5 at 10-minute intervals.

-) increase maxCo by a timeStep trigger. For that we extract the timeStep the running simulation is at, then increase (mostly) maxCo according to a list of values determined in advance.

-) or use both. This is done in the shared script: in the beginning we use a time trigger, and later we switch over to timeStep triggering. This mostly makes sense in very big simulations.

-) increase writeInterval: in the beginning, when deltaT is small, we probably want to write at smaller writeIntervals; later, when all is going well with a large deltaT, we want to write less data.

-) increase nCorrectors when maxCo is increased and decrease it when a certain percentage of the timeStep is reached.

-) increase nCorrectors and nOuterCorrectors and decrease epsilonRelaxationFactor when p does not converge. Although the script can do that, I have never seen a case where it actually 'saves' a non-converging simulation; it is probably already too late when this is triggered.

-) decrease nCorrectors and nOuterCorrectors and increase epsilonRelaxationFactor when the last 30 timeSteps show converged pressure. When all is going well, these parameters are brought back to their defined minimum values to keep the iteration duration low.

-) calculate the achieved deltaT. pimpleFoam shows this value during the run, but we only have access to it via stdout of the running simulation; this script calculates the value from solverInfo data.

-) write collected data and results into controlDict_automation_results.csv. Restarting the script will make it write new data to controlDict_automation_results_1.csv and so on. These data can be evaluated by the controlDict_automation_plot.py script, which is also presented here; the plotting script can handle the enumerated results .csv files. No worries here.

-) write a log file named controlDict_automation.log. Log files are also enumerated if you interrupt the script, similar to the results .csv files.
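The achieved-deltaT calculation mentioned above boils down to differencing the Time column of the solverInfo output. A minimal sketch, assuming a whitespace-separated solverInfo.dat whose first column is the simulation time; the sample rows and the remaining columns are made up for illustration.

```python
# Hypothetical excerpt of a solverInfo.dat file (columns after Time are assumed).
SAMPLE_DAT = """\
# Time  U_solver  Ux_initial  p_initial  p_final  p_iters  p_converged
0.001   smoothSolver  1e-2  5e-3  1e-7  3  true
0.0025  smoothSolver  9e-3  4e-3  1e-7  3  true
0.0045  smoothSolver  8e-3  3e-3  1e-7  2  true
"""

def achieved_deltaT(dat_text):
    """Return the list of deltaT values between consecutive solverInfo rows."""
    times = [float(line.split()[0])
             for line in dat_text.splitlines()
             if line and not line.startswith("#")]
    return [t1 - t0 for t0, t1 in zip(times, times[1:])]

print(achieved_deltaT(SAMPLE_DAT))
```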

The script features a lot of functions to access OpenFOAM parameters, it can easily be adapted to other needs.

It needs a function implemented in the controlDict like:

functions
{
    solverInfo
    {
        type            solverInfo;
        libs            ("libutilityFunctionObjects.so");
        fields          (U p);
        writeResidualFields yes;
        executeControl      timeStep;
        executeInterval     1;
        writeControl        adjustableRunTime;
        writeInterval       25;
    }
}

If this function is not present in your controlDict, it will be added. For other solvers the script may need to be adapted; currently it is written for pimpleFoam. For the script to work, it needs to be run in your case directory.
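The "added if missing" step can be sketched as below. This is an assumption about how such a check could work, not the script's actual code; in particular it simply appends a complete functions block and therefore assumes your controlDict does not already contain a functions entry of its own.

```python
from pathlib import Path

# The function block from above, appended verbatim when missing.
SOLVER_INFO_BLOCK = """
functions
{
    solverInfo
    {
        type            solverInfo;
        libs            ("libutilityFunctionObjects.so");
        fields          (U p);
        writeResidualFields yes;
        executeControl      timeStep;
        executeInterval     1;
        writeControl        adjustableRunTime;
        writeInterval       25;
    }
}
"""

def ensure_solver_info(control_dict_path):
    """Append the solverInfo function object if no solverInfo entry exists."""
    path = Path(control_dict_path)
    text = path.read_text()
    if "solverInfo" in text:
        return False  # already configured, nothing to do
    path.write_text(text + SOLVER_INFO_BLOCK)
    return True
```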

An interesting feature would be to decrease maxCo or deltaT when max(mag(U)) increases.

controlDict_automation_plot.py

As already mentioned above, this script visualizes the results collected by controlDict_automation.py. A typical result may look like this: controlDict_automation_75

Looking at the results, we may judge whether our simulation is heading for a dead end (for example, deltaT could approach unsustainable values: with endTime = 10, deltaT = 1e-8 and 30 s per timeStep, the remaining 1e9 timeSteps would take centuries) or how it behaves in general. In this example, these are the results from the Francis turbine case that is also shared here. We obviously increased maxCo and, at the same time, nCorrectors. We decreased nCorrectors after 1.01*timeStep_maxCo had passed (this is decrease_nCorrectors_factor). Looking at achieved_deltaT, we clearly see a linear increase after the last change of maxCo. This is a transient simulation with constant maxCo, so an increasing deltaT is a good sign (initial transients are still fading away).

solverInfo_plot.py

This is an easy one; in fact, all the above scripts somehow started with this one. It accesses data generated by a solverInfo function like the one shown above. In conjunction with controlDict_automation_plot.py it gives pretty good coverage of what is going on in your running simulation. It can handle interrupted simulations: when you restart with latestTime in your controlDict, solverInfo will generate a new folder {timeStep}/solverInfo.dat with a new .dat file, and the script will catch that. If you interrupt twice at the same timeStep, OpenFOAM will write a {timeStep}/solverInfo_{timeStep}.dat. This script will NOT catch that! In this case, please delete the new {timeStep} folder before restarting the simulation. Catching the new .dat is also not implemented, because it implies something went wrong in the first place. If you want to keep these data for sentimental reasons, please copy them on your own.
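The restart handling described above can be sketched as follows: collect every time folder under postProcessing/solverInfo, sort the folders numerically, and concatenate their data rows. The path layout follows the description above; everything else is an assumption about how such a collector could look.

```python
from pathlib import Path

def collect_solver_info(case_dir="."):
    """Concatenate data rows from all <time>/solverInfo.dat folders, in time order."""
    root = Path(case_dir) / "postProcessing" / "solverInfo"
    rows = []
    # sort the time folders by their numeric value, not alphabetically
    for time_dir in sorted(root.iterdir(), key=lambda p: float(p.name)):
        dat = time_dir / "solverInfo.dat"
        if not dat.exists():
            continue  # e.g. a suffixed solverInfo_<time>.dat we do not handle
        for line in dat.read_text().splitlines():
            if line and not line.startswith("#"):
                rows.append(line.split())
    return rows
```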

A typical plot could look like this, in fact it is again from the Francis turbine at the same timeStep: solverInfo_75

Steady residuals, and residuals fading away after each increase of maxCo/deltaT. What else do we need in CFD?

About

Tools for OpenFOAM automation in Python are presented.
