I want to simulate the recharge to a backfilled pit.
I've summarised the setup below:
- The initial water content is assumed to be 0.17
- The material is comparable to a loam in terms of saturated hydraulic conductivity (Ksat).
- Precipitation and evaporation are applied as daily data using the atmospheric BC with surface runoff and an hCritA value of -150 m. The annual precipitation is 830 mm/a and the potential evaporation is 1200 mm/a (a quick comparison of these rates is sketched after this list).
- The bottom boundary is assigned a free drainage BC (the water table is assumed to be deep).
- Simulation period of 10 years.
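For orientation, here is a back-of-envelope comparison of the atmospheric forcing from the values above. It simply spreads the annual totals evenly over the year for this check (the model itself uses the daily records):

```python
# Quick water-balance framing of the atmospheric BC (annual totals from the setup above).
# Even spreading over 365 days is only for this rough check, not how the model is driven.
P_annual, PE_annual = 830.0, 1200.0            # precipitation and potential evaporation [mm/a]
print(f"mean daily P  = {P_annual / 365:.2f} mm/d")
print(f"mean daily PE = {PE_annual / 365:.2f} mm/d")
print(f"annual deficit (PE - P) = {PE_annual - P_annual:.0f} mm/a")
```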
However, what I found was that the actual surface flux differs significantly between the two scenarios I ran (2 m and 5 m thick profiles). The 2 m profile has a much larger actual surface flux (vTop and sum(vTop)) than the 5 m profile. On closer inspection, I noticed that the difference in the cumulative surface flux (sum(vTop)) between the two scenarios is due to the actual evaporation: much more evaporation occurs in the simulation of the 5 m column than in the 2 m column.
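This is the kind of comparison I'm making, pulling the final cumulative fluxes from each run's T_Level.out. The file layout assumed here (a header row starting with "Time", a units row beneath it, and a trailing "end" line, with columns named sum(vTop), sum(Infil), sum(Evap)) is from my HYDRUS-1D output and may need adjusting for other versions; the folder names are placeholders:

```python
# Sketch: compare end-of-run cumulative surface fluxes between two HYDRUS-1D runs.
import pandas as pd

def read_tlevel(path):
    """Parse a T_Level.out file using the header row that starts with 'Time'."""
    with open(path) as f:
        lines = f.readlines()
    hdr = next(i for i, ln in enumerate(lines) if ln.split()[:1] == ["Time"])
    names = lines[hdr].split()
    rows = []
    for ln in lines[hdr + 2:]:                 # +2 skips the units row under the header
        toks = ln.split()
        if not toks or toks[0].lower().startswith("end"):
            continue
        rows.append(toks)
    return pd.DataFrame(rows, columns=names).astype(float)

for run in ["pit_2m", "pit_5m"]:               # placeholder run folders
    final = read_tlevel(f"{run}/T_Level.out").iloc[-1]
    print(run,
          "sum(vTop) =", final["sum(vTop)"],
          "sum(Infil) =", final["sum(Infil)"],
          "sum(Evap) =", final["sum(Evap)"])
```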
As a check, I evaluated columns of 10 and 20 m thickness; in each case the infiltration is the same, but the actual evaporation increases, and consequently the overall actual surface flux (vTop and sum(vTop)) reduces with increasing profile thickness.
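For context on the quantities involved, this is the water initially stored in each column at the initial water content of 0.17 (assuming, as I do in the model, that it is uniform with depth):

```python
# Back-of-envelope: initial water storage per column thickness at theta_i = 0.17.
theta_i = 0.17                                  # initial volumetric water content [-]
for depth_m in (2, 5, 10, 20):
    storage_mm = theta_i * depth_m * 1000.0     # stored water over the column [mm]
    print(f"{depth_m:>3} m column: {storage_mm:.0f} mm initially in storage")
# -> 340, 850, 1700 and 3400 mm respectively
```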
Why would the actual evaporation increase with increasing thickness of the soil profile?