I am modeling a hillslope with root water uptake enabled. I have supplied Hydrus with Tpot, the surface length associated with transpiration, and the root distribution function (beta). The water uptake reduction model is set to Feddes. The Feddes parameters are set to P0=-10 cm, POpt=-25 cm, P2H=-600 cm, P2L=-1200 cm, P3=-15000 cm.
After running Hydrus, the Actual Root Water Uptake is equal to the Potential Root Water Uptake, even though the pressure head in the rooting zone often lies outside the "optimal" RWU range and at times falls below the wilting point. This result does not seem correct to me. The reduction model appears to have no influence on root water uptake, and Hydrus seems to be using only the supplied Tpot to define root water uptake. Has anyone run into this before? Maybe I am missing something obvious?
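For reference, this is roughly the behavior I would expect from the Feddes stress response function alpha(h): uptake should be reduced linearly outside the optimal range and go to zero at the wilting point. The sketch below is my own simplified version (it takes a single fixed P2 rather than interpolating between P2H and P2L based on Tpot, as HYDRUS does), using the parameter values from my setup:

```python
def feddes_alpha(h, p0=-10.0, popt=-25.0, p2=-600.0, p3=-15000.0):
    """Piecewise-linear Feddes stress response alpha(h), 0 <= alpha <= 1.

    h and all parameters are pressure heads in cm (negative = suction).
    Simplification: a single fixed P2 is used instead of the
    Tpot-dependent interpolation between P2H and P2L.
    """
    if h >= p0 or h <= p3:
        return 0.0                    # too wet (h >= P0) or at/below wilting point (h <= P3)
    if h > popt:
        return (p0 - h) / (p0 - popt)  # ramp from 0 at P0 up to 1 at POpt
    if h >= p2:
        return 1.0                     # optimal range: no reduction
    return (h - p3) / (p2 - p3)        # ramp from 1 at P2 down to 0 at P3
```

With these parameters, a rooting-zone pressure head of, say, -5000 cm gives alpha well below 1, so Actual RWU should clearly fall short of Potential RWU there.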