
fix water balance error associated with transpiration #444

Merged

Conversation

martynpclark (Collaborator)
When operator splitting proceeds all the way down to individual soil layers, the transpiration sink term used in the soil hydrology routines is calculated only while the top layer is being processed. This is a special case.

This PR adds that special case to the computation of timestep-average fluxes in varSubstep.f90.
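The fix can be sketched as follows. This is an illustrative Python sketch, not SUMMA's actual Fortran: the function name, dictionary-based flux storage, and flag names (`layer_split`, `top_layer`) are all assumptions. The point it shows is that when the split operates on one soil layer at a time, the transpiration flux is only valid during the top-layer pass, so the timestep-average accumulation must mirror that condition.

```python
# Hypothetical sketch of accumulating substep fluxes into timestep averages.
# Names are illustrative; SUMMA's varSubstep.f90 differs in detail.

def accumulate_average_fluxes(avg, inst, dt_sub, dt_total, layer_split, top_layer):
    """Add one substep's instantaneous fluxes (inst) into running averages (avg).

    layer_split: True when the operator split processes single soil layers.
    top_layer:   True when the current pass handles the top soil layer.
    """
    w = dt_sub / dt_total  # weight of this substep in the timestep average
    for name, value in inst.items():
        if name == "mLayerTranspire" and layer_split and not top_layer:
            # Special case: transpiration is computed only on the top-layer
            # pass. Skipping it here avoids double counting; it must still be
            # accumulated exactly once (on the top-layer pass), otherwise the
            # soil water balance check fails by the transpiration amount.
            continue
        avg[name] = avg.get(name, 0.0) + w * value
    return avg
```

With four 900 s substeps in a 3600 s timestep, each pass contributes a weight of 0.25, and `mLayerTranspire` is accumulated only on the pass where `top_layer` is true.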

wknoben (Collaborator) commented Jan 7, 2021

Without this change, a water balance error occurs in coupled_em.f90 in rare cases (a split between soil ET and canopy transpiration was added to the printout for clarity):

solution method           =            2
data_step                 =       3600.0000000000
totalSoilCompress         =          0.0000000000
scalarTotalSoilLiq        =        881.7115625650
scalarTotalSoilIce        =        329.8913458868
balanceSoilWater0         =       1209.7726357183
balanceSoilWater1         =       1211.6029084519
balanceSoilInflux         =          1.8391640923
balanceSoilBaseflow       =          0.0000000000
balanceSoilDrainage       =          0.0089130253
balanceSoilET             =         -0.1470989161
balanceSoilET (cnpy tran) =         -0.1470989161
balanceSoilET (grnd evap) =          0.0000000000
scalarSoilWatBalError     =          0.1471205826
scalarSoilWatBalError     =          0.0001471206
absConvTol_liquid         =          0.0000100000


FATAL ERROR: run_oneGRU (gru index = 1)/run_oneHRU (hruId = 71027547)/coupled_em/soil hydrology does not balance
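The balance check behind this error can be reproduced from the numbers in the log. This is a sketch (the function name is illustrative; the actual check lives in the Fortran of coupled_em.f90): the residual is the storage change minus the net flux, and with the values above it equals the reported scalarSoilWatBalError, which is almost exactly the magnitude of balanceSoilET, i.e. the transpiration sink missing from the averaged fluxes.

```python
# Sketch of the soil water balance residual, using the values printed above.
# Variable names follow the log; the Fortran implementation may differ.

def soil_water_balance_error(water0, water1, influx, baseflow, drainage, et):
    # storage change minus net flux; ~0 when the budget closes
    return (water1 - water0) - (influx - baseflow - drainage + et)

err = soil_water_balance_error(
    water0=1209.7726357183, water1=1211.6029084519,
    influx=1.8391640923, baseflow=0.0000000000,
    drainage=0.0089130253, et=-0.1470989161,
)
# err matches scalarSoilWatBalError (~0.1471), far above
# absConvTol_liquid (1e-5), so the FATAL ERROR is raised.
```

Note that the residual (~0.1471206) differs from |balanceSoilET| (0.1470989) only by ~2e-5: the budget fails by essentially the transpiration term that was not included in the timestep-average fluxes.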

With this fix, this case completes as expected:

initial date/time = 2021-01-07  13:36:24.856
  final date/time = 2021-01-07  13:37:08.279

     elapsed init =   4.9000000E-02 s
    fraction init =   1.1284342E-03

    elapsed setup =   3.6000000E-02 s
   fraction setup =   8.2905373E-04

  elapsed restart =   4.0000000E-03 s
 fraction restart =   9.2117081E-05

     elapsed read =    3.117000     s
    fraction read =   7.1782235E-02

    elapsed write =    13.38000     s
   fraction write =   0.3081316

  elapsed physics =    26.43500     s
 fraction physics =   0.6087788

     elapsed time =    43.42300     s
       or             0.7237167     m
       or             1.2061944E-02 h
       or             5.0258102E-04 d

   number threads =          1

 FORTRAN STOP: finished simulation successfully.

Using Andrew's bit_4_bit script that is part of the test cases, the output files are identical for the time period over which both versions of the code can run (a range of output variables was checked):

(<xarray.Dataset>
 Dimensions:                         (depth: 8, gru: 1, hru: 1, ifcSoil: 9, ifcToto: 14, midSnow: 5, midSoil: 8, midToto: 13)
 Coordinates:
   * hru                             (hru) int64 71027547
   * gru                             (gru) int64 71027547
 Dimensions without coordinates: depth, ifcSoil, ifcToto, midSnow, midSoil, midToto
 Data variables:
     pptrate_mean                    (hru) float64 0.0
     airtemp_mean                    (hru) float64 0.0
     latitude                        (hru) float64 0.0
     longitude                       (hru) float64 0.0
     theta_sat                       (depth, hru) float64 0.0 0.0 0.0 ... 0.0 0.0
     theta_res                       (depth, hru) float64 0.0 0.0 0.0 ... 0.0 0.0
     nSnow                           (hru) int32 0
     nSoil                           (hru) int32 0
     nLayers                         (hru) int32 0
     scalarCanopyWat                 (hru) float64 0.0
     scalarSWE                       (hru) float64 0.0
     mLayerTemp                      (midToto, hru) float64 0.0 0.0 ... 0.0 0.0
     mLayerVolFracIce                (midToto, hru) float64 0.0 0.0 ... 0.0 0.0
     mLayerVolFracLiq                (midToto, hru) float64 0.0 0.0 ... 0.0 0.0
     mLayerVolFracWat                (midToto, hru) float64 0.0 0.0 ... 0.0 0.0
     scalarAquiferStorage            (hru) float64 0.0
     scalarSurfaceTemp_mean          (hru) float64 0.0
     mLayerDepth                     (midToto, hru) float64 0.0 0.0 ... 0.0 0.0
     mLayerHeight                    (midToto, hru) float64 0.0 0.0 ... 0.0 0.0
     iLayerHeight                    (ifcToto, hru) float64 0.0 0.0 ... 0.0 0.0
     mLayerFracLiqSnow               (midSnow, hru) float64 0.0 0.0 0.0 0.0 0.0
     mLayerVolFracAir                (midToto, hru) float64 0.0 0.0 ... 0.0 0.0
     mLayerCompress                  (midSoil, hru) float64 0.0 0.0 ... 0.0 0.0
     scalarTotalSoilWat              (hru) float64 0.0
     scalarSenHeatTotal_mean         (hru) float64 0.0
     scalarLatHeatTotal_mean         (hru) float64 0.0
     scalarCanopyTranspiration_mean  (hru) float64 0.0
     mLayerTranspire                 (midSoil, hru) float64 0.0 0.0 ... 0.0 0.0
     scalarRainPlusMelt_mean         (hru) float64 0.0
     scalarInfiltration_mean         (hru) float64 0.0
     scalarSurfaceRunoff_mean        (hru) float64 0.0
     iLayerLiqFluxSoil               (ifcSoil, hru) float64 0.0 0.0 ... 0.0 0.0
     mLayerBaseflow                  (midSoil, hru) float64 0.0 0.0 ... 0.0 0.0
     scalarSoilBaseflow_mean         (hru) float64 0.0
     scalarSoilDrainage_mean         (hru) float64 0.0
     scalarAquiferBaseflow_mean      (hru) float64 0.0
     scalarTotalET_mean              (hru) float64 0.0
     scalarTotalRunoff_mean          (hru) float64 0.0
     scalarNetRadiation_mean         (hru) float64 0.0
     hruId                           (hru) int64 0
     gruId                           (gru) int64 0,
 0.0)
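The all-zero values above are maximum absolute differences per variable. A minimal sketch of that kind of comparison (this is not Andrew's actual bit_4_bit script, and the dict-of-arrays interface is an assumption; the real script operates on xarray Datasets):

```python
# Illustrative bit-for-bit comparison: for each variable present in both
# runs, take the maximum absolute difference. All-zero maxima mean the two
# runs produced identical output over the overlapping period.
import numpy as np

def max_abs_diff(run_a, run_b):
    """run_a, run_b: dicts mapping variable name -> numpy array."""
    shared = sorted(set(run_a) & set(run_b))
    return {name: float(np.max(np.abs(run_a[name] - run_b[name]))) for name in shared}
```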

@arbennett arbennett merged commit 45af60b into CH-Earth:develop Jan 20, 2021