Articles | Volume 21, issue 1
https://doi.org/10.5194/hess-21-393-2017
Research article | 24 Jan 2017

Improvement of hydrological model calibration by selecting multiple parameter ranges

Qiaofeng Wu, Shuguang Liu, Yi Cai, Xinjian Li, and Yangming Jiang

Abstract. The parameters of hydrological models are usually calibrated to achieve good performance, owing to the highly non-linear nature of hydrological process modelling. Calibration efficiency, however, depends directly on the parameter ranges, and range selection is in turn affected by the probability distribution of parameter values, parameter sensitivity, and parameter correlation. A new method is proposed to determine the optimal combination of multi-parameter ranges for improving the calibration of hydrological models. First, a probability distribution was specified for each model parameter based on genetic algorithm (GA) calibration. Then, several ranges were selected for each parameter according to the corresponding probability distribution, and the optimal range was determined by comparing the model results calibrated with the different selected ranges. Next, parameter correlation and sensitivity were evaluated by quantifying two indexes, RC_Y,X and SE, which are used to handle negatively correlated parameters when specifying the optimal combination of ranges of all parameters for calibrating models. The investigation shows that the probability distribution of the calibrated values of any particular parameter of the Xinanjiang model approximately follows a normal or exponential distribution. The multi-parameter optimal range selection method is superior to the single-parameter one for calibrating hydrological models with multiple parameters. The combination of the optimal ranges of all parameters is not itself the optimum, inasmuch as some parameters have negative effects on others. The application of the proposed methodology yields an increase of 0.01 in the minimum Nash–Sutcliffe efficiency (ENS) compared with the pure GA method.
The rise in the minimum ENS, with little change in the maximum, narrows the range of possible solutions, which can effectively reduce the uncertainty of the model performance.
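The two building blocks of the procedure can be illustrated with a minimal sketch (not the authors' code): the Nash–Sutcliffe efficiency used as the objective function, and candidate parameter ranges drawn as quantile intervals from the empirical distribution of values obtained over repeated GA calibrations. The function names and the particular quantile pairs are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency ENS: 1 for a perfect fit,
    0 when the model is no better than the observed mean."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def candidate_ranges(calibrated_values,
                     quantile_pairs=((0.05, 0.95), (0.10, 0.90), (0.25, 0.75))):
    """Derive candidate ranges for one parameter from the empirical
    distribution of its GA-calibrated values (quantile pairs are an
    illustrative choice). Each candidate range would then be tested by
    re-calibrating the model and comparing the resulting ENS values."""
    v = np.asarray(calibrated_values, dtype=float)
    return [(float(np.quantile(v, lo)), float(np.quantile(v, hi)))
            for lo, hi in quantile_pairs]

# Example: a constant simulation scores below zero against varying observations.
ens = nash_sutcliffe([1, 2, 3, 4], [2, 2, 2, 2])
ranges = candidate_ranges(np.random.default_rng(0).normal(0.5, 0.1, 500))
```

Narrower quantile pairs give more concentrated candidate ranges; the paper's point is that the best range per parameter, combined across parameters with attention to negative correlations, raises the minimum ENS attainable by calibration.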

Short summary
We propose a method to calibrate hydrological models by selecting parameter ranges. The results show that the probability distribution can be used to determine the optimal range of a single parameter. Analysis of parameter sensitivity and correlation helps to obtain the optimal combination of multi-parameter ranges, which contributes to a higher and more concentrated value of the objective function. The findings can serve as a reference for enhancing the precision of hydrological process modelling.