Best version of snow partitioning to use? #403
-
I am trying to associate normalized 3D flow velocities from a different simulation engine (Palabos) with individual pores in Python (identified by SNOW). The dictionary produced by ps.networks.snow is very handy for this, because the output dictionary contains the pores (apart from the added boundary faces), but as far as I can see it is not parallel. On the other hand, ps.filters.snow can run in parallel but does not produce that handy dictionary of pores.

What I want to do is build a dictionary of velocities and add values to it for each pore, so that each key is a pore and each value is a list of flow velocities. I suppose I should just build a dictionary of pores from the filter code and its output regions...? Is the SNOW implementation the same across both of these methods? Is building a dictionary of pores from the filter code, so that it can run in parallel and also match my velocity simulation, the easiest way to do this? I would like to leverage the network framework, but that route seems more difficult... though I could be missing something obvious. This may simply require more coding on my end, which is fine; I just want to be as efficient as possible. Thanks in advance for your thoughts.
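Here is a minimal sketch of what I mean by building the velocity dictionary myself, assuming the labelled `regions` image comes from the SNOW partitioning filter and the Palabos velocity field has been resampled onto the same voxel grid (the toy arrays below are just stand-ins for those real inputs):

```python
import numpy as np

# Toy stand-ins for the real inputs:
#   regions : integer-labelled image, one label per pore, e.g. the
#             regions output of the SNOW partitioning filter
#   vel     : normalized velocity field from Palabos, resampled onto
#             the same voxel grid, with shape (3, *regions.shape)
regions = np.random.randint(0, 5, size=(50, 50, 50))
vel = np.random.rand(3, 50, 50, 50)

# Map each pore label to the velocity vectors of the voxels inside it
velocities = {}
for label in np.unique(regions):
    if label == 0:                      # label 0 is the solid/background phase
        continue
    mask = regions == label
    velocities[label] = vel[:, mask].T  # (N_voxels, 3) array for this pore
```

Each key would then be a pore label and each value the per-voxel velocity vectors for that pore, which I could average or otherwise reduce as needed.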
-
Shouldn't you be assigning velocities to the throats rather than pores?
-
BTW, we are in the process of 'tidying' and 'unifying' the network extraction codes in porespy as part of the transition to V2. These things always take 10x longer than planned, but the planned changes will address your comments.