Volume 12, issue 2
https://doi.org/10.5194/hess-12-587-2008
19 Mar 2008

Stochastic simulation experiment to assess radar rainfall retrieval uncertainties associated with attenuation and its correction

R. Uijlenhoet and A. Berne

Abstract. As rainfall constitutes the main source of water for terrestrial hydrological processes, accurate and reliable measurement and prediction of its spatial and temporal distribution over a wide range of scales are an important goal for hydrology. We investigate the potential of ground-based weather radar to provide such measurements through a theoretical analysis of some of the associated observation uncertainties. A stochastic model of range profiles of raindrop size distributions is employed in a Monte Carlo simulation experiment to investigate the rainfall retrieval uncertainties associated with weather radars operating at X-, C-, and S-band. We focus in particular on the errors and uncertainties associated with rain-induced signal attenuation and its correction for incoherent, non-polarimetric, single-frequency, operational weather radars. The performance of two attenuation correction schemes, the (forward) Hitschfeld-Bordan algorithm and the (backward) Marzoug-Amayenc algorithm, is analyzed for both moderate (assuming a 50 km path length) and intense Mediterranean rainfall (for a 30 km path). A comparison shows that the backward correction algorithm is more stable and accurate than the forward algorithm (with a bias on the order of a few percent for the former, compared to tens of percent for the latter), provided reliable estimates of the total path-integrated attenuation are available. Moreover, the bias and root mean square error associated with each algorithm are quantified as a function of path-averaged rain rate and distance from the radar in order to provide a plausible order of magnitude for the uncertainty in radar-retrieved rain rates for hydrological applications.
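For readers unfamiliar with the two correction schemes compared in the abstract, the sketch below illustrates their mechanics on a single synthetic ray: the forward Hitschfeld-Bordan solution, which rescales measured reflectivity using the running path integral of attenuated reflectivity, and a backward, PIA-constrained correction of the Marzoug-Amayenc type, which anchors the attenuation profile to the total path-integrated attenuation at the far end of the ray. This is a minimal illustration and not the paper's simulation code; the power-law k-Z coefficients (A_COEF, B_COEF), the gate spacing, and the noise-free 40 dBZ toy profile are assumptions for demonstration only, so both schemes are nearly exact here, whereas the biases quantified in the paper arise under realistic drop size distribution variability and measurement error.

```python
import numpy as np

# Minimal sketch (not the paper's implementation) of forward (Hitschfeld-Bordan)
# and backward, PIA-constrained (Marzoug-Amayenc-type) attenuation correction for
# a single radar ray. The k-Z power law k = a*Z**b (k in dB/km, Z in mm^6 m^-3)
# and the coefficients below are illustrative assumptions only.

A_COEF = 1.4e-4   # hypothetical prefactor, roughly X-band order of magnitude
B_COEF = 0.78     # hypothetical exponent


def hitschfeld_bordan(z_meas, dr, a=A_COEF, b=B_COEF):
    """Forward correction of attenuated reflectivity z_meas (mm^6 m^-3),
    sampled every dr (km) outward from the radar."""
    c = 0.2 * np.log(10.0) * a * b                  # two-way attenuation constant
    integral = np.cumsum(z_meas ** b) * dr          # running path integral of Za^b
    denom = np.maximum(1.0 - c * integral, 1e-6)    # guard against the known blow-up
    return z_meas / denom ** (1.0 / b)


def backward_pia(z_meas, dr, pia_db, a=A_COEF, b=B_COEF):
    """Backward correction constrained by the two-way path-integrated
    attenuation pia_db (dB) estimated at the far end of the ray."""
    c = 0.2 * np.log(10.0) * a * b
    att_end = 10.0 ** (-pia_db / 10.0)              # attenuation factor at far gate
    zb = z_meas ** b
    tail = np.cumsum(zb[::-1])[::-1] * dr           # integral of Za^b to far end
    att = (att_end ** b + c * tail) ** (1.0 / b)    # attenuation profile along ray
    return z_meas / att


if __name__ == "__main__":
    # Toy profile: uniform 40 dBZ rain over 30 km, 250 m gates (values made up).
    dr = 0.25
    z_true = np.full(120, 10.0 ** (40.0 / 10.0))    # dBZ -> mm^6 m^-3
    k = A_COEF * z_true ** B_COEF                   # specific attenuation (dB/km)
    pia = 2.0 * np.sum(k) * dr                      # two-way PIA (dB)
    z_att = z_true * 10.0 ** (-2.0 * np.cumsum(k) * dr / 10.0)
    print("PIA (dB):", round(pia, 2))
    print("far-gate bias, forward  (dB):",
          round(10 * np.log10(hitschfeld_bordan(z_att, dr)[-1] / z_true[-1]), 2))
    print("far-gate bias, backward (dB):",
          round(10 * np.log10(backward_pia(z_att, dr, pia)[-1] / z_true[-1]), 2))
```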