63 results for "Gumbel distribution"
Search Results
2. On the variability of cold region flooding.
- Author
-
Matti, Bettina, Dahlke, Helen E., and Lyon, Steve W.
- Subjects
- METEOROLOGICAL precipitation, HYDROLOGIC models, CLIMATE change, CRYOSPHERE, TEMPERATURE effect, COLD regions - Abstract
Summary Cold region hydrological systems exhibit complex interactions with both climate and the cryosphere. Improving knowledge on that complexity is essential to determine drivers of extreme events and to predict changes under altered climate conditions. This is particularly true for cold region flooding where independent shifts in both precipitation and temperature can have significant influence on high flows. This study explores changes in the magnitude and the timing of streamflow in 18 Swedish Sub-Arctic catchments over their full record periods available and a common period (1990–2013). The Mann–Kendall trend test was used to estimate changes in several hydrological signatures (e.g. annual maximum daily flow, mean summer flow, snowmelt onset). Further, trends in the flood frequency were determined by fitting an extreme value type I (Gumbel) distribution to test selected flood percentiles for stationarity using a generalized least squares regression approach. Results highlight shifts from snowmelt-dominated to rainfall-dominated flow regimes with all significant trends (at the 5% significance level) pointing toward (1) lower magnitudes in the spring flood; (2) earlier flood occurrence; (3) earlier snowmelt onset; and (4) decreasing mean summer flows. Decreasing trends in flood magnitude and mean summer flows suggest widespread permafrost thawing and are supported by increasing trends in annual minimum daily flows. Trends in selected flood percentiles showed an increase in extreme events over the full periods of record (significant for only four catchments), while trends were variable over the common period of data among the catchments. An uncertainty analysis emphasizes that the observed trends are highly sensitive to the period of record considered. 
As such, no clear overall regional hydrological response pattern could be determined suggesting that catchment response to regionally consistent changes in climatic drivers is strongly influenced by their physical characteristics. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
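The Mann–Kendall trend test used in this study is simple to sketch. The snippet below is a generic illustration (not the authors' code) of the no-ties form of the test, applied to a made-up, steadily declining series like the spring-flood magnitudes reported above:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction). Returns the S
    statistic, the normal-approximation Z score, and the two-sided
    p-value."""
    n = len(x)
    # S = sum over all pairs i < j of sign(x[j] - x[i])
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return s, z, p

# A strictly decreasing series gives S = -n(n-1)/2 and a tiny p-value.
series = [100 - 2 * t for t in range(24)]
s, z, p = mann_kendall(series)
```

A significant negative Z (at the 5% level, |Z| > 1.96) corresponds to the decreasing trends reported for the Swedish catchments.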
3. A framework for assessing compound drought events from a drought propagation perspective
- Author
-
Dejian Zhang, Huaxia Yao, Xiaohong Chen, Jiefeng Wu, Gaoxu Wang, and Xiaoyan Bai
- Subjects
Water resources, Geography, Gumbel distribution, Climatology, Streamflow, Drainage basin, Environmental science, Precipitation, Structural basin, Water Science and Technology, Weibull distribution, Copula (probability theory) - Abstract
Compared with a single drought event, compound drought (i.e., two or more droughts occurring simultaneously) has more substantial effects on economic growth, water resources, and the ecological environment. A useful framework for assessing compound drought events from a drought propagation perspective was proposed. The framework includes the definition, identification, and risk assessment of compound drought. We defined compound drought as a drought event from a drought propagation perspective when meteorological drought and hydrological drought occur simultaneously. The Standardized Streamflow Index (SSI) and Standardized Precipitation Index (SPI) were used to represent hydrological and meteorological drought, respectively. By applying a certain truncation threshold of run theory, the compound drought events in the time series were identified and characterized according to their duration and severity. Five univariate distribution functions (i.e., Log-normal, Exponential, Gamma, Weibull, and Pareto) and four bivariate copula models (i.e., Gaussian, Clayton, Frank, and Gumbel) were used for fitting both the duration and severity of compound drought and their joint return periods. A 40-year monthly streamflow and precipitation dataset from the Dongjiang River Basin, which is located on the southern coast of China, was used as a case study. The characteristics of compound drought, including duration and severity, were captured by considering the time continuity of meteorological and hydrological droughts from a drought propagation perspective based on run theory. The most severe compound droughts in the study area were observed in 1963 and 2004–2005. The characteristics of compound drought duration and severity were related to the timescale of the drought index (i.e., SSI and SPI). For the study basin, the SPI at a 2–3-month timescale better matched the SSI on the monthly timescale compared with other timescales. 
The Gaussian-copula function better characterized the joint return periods between the duration and severity of compound drought compared with three other copula models. Although reservoir regulation played an important role in decreasing the duration (lower by 8.25%) and severity (lower by 9.27%) of compound drought, it had little effect on extreme compound drought events. Overall, the results indicated that the proposed method provides a useful tool for statistical assessments of compound drought from a drought propagation perspective. This framework could also be applied to other regions.
- Published
- 2022
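The joint return periods that the framework computes from a fitted copula can be illustrated directly. The sketch below implements the Gumbel copula CDF and the standard "or"/"and" return-period formulas; the marginal probabilities, theta, and mean inter-arrival time are made-up values, not results from the paper:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula CDF; theta >= 1, with theta = 1 giving
    independence (C(u, v) = u * v)."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_periods(u, v, theta, mu=1.0):
    """'Or' and 'and' return periods for marginal non-exceedance
    probabilities u (duration) and v (severity); mu is the mean
    inter-arrival time of drought events in years."""
    c = gumbel_copula(u, v, theta)
    t_or = mu / (1.0 - c)            # duration OR severity exceeded
    t_and = mu / (1.0 - u - v + c)   # duration AND severity exceeded
    return t_or, t_and

# 90th-percentile duration and severity, moderate upper-tail dependence.
t_or, t_and = joint_return_periods(0.9, 0.9, theta=2.0)
```

As the abstract notes, using only the "and" case understates risk relative to the "or" case: t_and is always the larger of the two.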
4. How extreme was the October 2015 flood in the Carolinas? An assessment of flood frequency analysis and distribution tails
- Author
-
S. Samadi, R. C. Phillips, and Michael E. Meadows
- Subjects
Hydrology, Return period, Flood, Shape parameter, Hydrology (agriculture), Gumbel distribution, Probability distribution, Environmental science, Water Science and Technology, Weibull distribution - Abstract
This paper examines the frequency, distribution tails, and peak-over-threshold (POT) behavior of extreme floods through an analysis that centers on the October 2015 flooding in North Carolina (NC) and South Carolina (SC), United States (US). The most striking features of the October 2015 flooding were a short time to peak (Tp) and a multi-hour continuous flood peak, which caused intensive and widespread damage to human lives, property, and infrastructure. The 2015 flooding was produced by a sequence of intense rainfall events which originated from Category 4 Hurricane Joaquin over a period of four days. Here, the probability distribution and distribution parameters (i.e., location, scale, and shape) of floods were investigated by comparing the upper part of the empirical distributions of the annual maximum flood (AMF) and POT series with light- to heavy-tailed theoretical distributions: Frechet, Pareto, Gumbel, Weibull, Beta, and Exponential. Specifically, four sets of U.S. Geological Survey (USGS) gauging data from the central Carolinas with record lengths of approximately 65–125 years were used. Analysis suggests that heavier-tailed distributions are in better agreement with the POT data, and to some extent the AMF data, than the more commonly used exponential (light-tailed) probability distributions. Further, the threshold selection and record length affect the heaviness of the tail and fluctuations of the parent distributions. The shape parameter and its evolution over the period of record play a critical and poorly understood role in determining the scaling of flood response to intense rainfall.
- Published
- 2018
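A peaks-over-threshold tail assessment of the kind described above can be sketched with scipy's generalized Pareto distribution. The data here are synthetic (a GPD with a positive, i.e. heavy-tailed, shape parameter), so the fitted values are illustrative only:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)

# Synthetic POT exceedances from a GPD with positive shape, i.e. a
# heavy (Frechet-type) tail, as the study found for the Carolina gauges.
true_shape, true_scale = 0.2, 100.0
exceedances = genpareto.rvs(true_shape, scale=true_scale,
                            size=2000, random_state=rng)

# Fit the GPD to the exceedances, location fixed at the threshold (0).
shape_hat, _, scale_hat = genpareto.fit(exceedances, floc=0)

# In scipy's convention, shape > 0 is heavy-tailed, while shape == 0
# recovers the exponential (light-tailed) model.
heavy_tailed = bool(shape_hat > 0.0)
```

The abstract's point about threshold selection can be explored by re-fitting after discarding the smallest exceedances and watching `shape_hat` move.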
5. Realistic sampling of anisotropic correlogram parameters for conditional simulation of daily rainfields
- Author
-
Yeboah Gyasi-Agyei
- Subjects
Meteorology & atmospheric sciences, Gaussian, Standard deviation, Copula (probability theory), Gumbel distribution, Joint probability distribution, Kriging, Statistics, Correlogram, Water Science and Technology, Quantile, Mathematics - Abstract
This paper establishes a link between the spatial structure of radar rainfall, which describes the spatial structure more robustly, and that of gauge rainfall, for improved daily rainfield simulation conditioned on limited gauged data in regions with or without radar records. A two-dimensional anisotropic exponential function, with parameters for the major and minor axis lengths and the direction, is used to describe the correlogram (spatial structure) of daily rainfall in the Gaussian domain. The link is a copula-based joint distribution of the radar-derived correlogram parameters that uses the gauge-derived correlogram parameters and maximum daily temperature as covariates of the Box-Cox power exponential margins and Gumbel copula. While the gauge-derived, radar-derived and copula-derived correlogram parameters reproduced the mean estimates similarly under leave-one-out cross-validation of ordinary kriging, the gauge-derived parameters yielded a higher standard deviation (SD) of the Gaussian quantile, which reflects uncertainty, in over 90% of cases. However, the distributions of the SD generated by the radar-derived and the copula-derived parameters could not be distinguished. For the validation case, the percentage of cases with higher SD from the gauge-derived parameter sets decreased to 81.2% and 86.6% for the non-calibration and the calibration periods, respectively. It has been observed that a 1% reduction in the Gaussian quantile SD can cause an over 39% reduction in the SD of the median rainfall estimate, the actual reduction depending on the distribution of rainfall on the day. Hence, the main advantage of using the most accurate radar correlogram parameters is to reduce the uncertainty associated with conditional simulations that rely on SD through kriging.
- Published
- 2018
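The two-dimensional anisotropic exponential correlogram described above has a compact closed form: rotate the separation vector into the major/minor axis frame, then apply an exponential decay with axis-specific length scales. A minimal sketch, with made-up axis lengths and direction:

```python
import math

def anisotropic_correlogram(dx, dy, a_major, a_minor, angle_deg):
    """Two-dimensional anisotropic exponential correlogram: rotate the
    separation vector (dx, dy) into the (major, minor) axis frame and
    apply exponential decay with length scales a_major and a_minor."""
    phi = math.radians(angle_deg)
    h_major = dx * math.cos(phi) + dy * math.sin(phi)
    h_minor = -dx * math.sin(phi) + dy * math.cos(phi)
    r = math.hypot(h_major / a_major, h_minor / a_minor)
    return math.exp(-r)

# Correlation decays more slowly along the major axis (50 km scale)
# than across it (20 km scale); the 30 km lags below are illustrative.
rho_along = anisotropic_correlogram(30.0, 0.0, a_major=50.0,
                                    a_minor=20.0, angle_deg=0.0)
rho_across = anisotropic_correlogram(0.0, 30.0, a_major=50.0,
                                     a_minor=20.0, angle_deg=0.0)
```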
6. Dynamic evolution and frequency analysis of hydrological drought from a three‐dimensional perspective
- Author
-
Haijiang Wu, Kai Feng, Gengxi Zhang, Xiaoling Su, Zezhong Zhang, Olusola O. Ayantobo, and Vijay P. Singh
- Subjects
Water resources, Frequency analysis, Gumbel distribution, Joint probability distribution, Climatology, Environmental science, Conditional probability, Structural basin, Water Science and Technology, Copula (probability theory) - Abstract
Drought events show a continuous spatiotemporal structure. The evolution of drought and its probabilistic modeling from a three-dimensional perspective are important for drought mitigation and risk management. This study therefore aimed to extract hydrological drought variables using a three-dimensional identification method in the Upstream of Heihe River Basin (UHHRB) in Northwest China during 1961–2014. Then, the spatiotemporal dynamic evolution of hydrological drought events was quantified, and drought frequency was evaluated using copula functions. Results showed that the spatiotemporal structure of drought can be satisfactorily reflected using the three-dimensional framework. The most severe drought originated in the middle part of the basin and propagated towards the eastern part with variable migration velocity. The Gumbel copula was the most suitable model for determining the joint distribution of drought duration, severity, and area. The conditional probability of the drought variables for a given conditioning factor decreases as the value of that factor increases, and the increase had no significant effect on reducing the probability of drought occurrence for relatively high values of the drought variables. Drought risks would be underestimated (or overestimated) if only the 'and' (or 'or') case return periods were considered. Drought frequency analysis considering the spatiotemporal drought features can be regarded as a reliable method. This study could help to better understand the spatiotemporal dynamic evolution of drought and to facilitate water resources allocation and drought mitigation.
- Published
- 2021
7. Historical floods in flood frequency analysis: Is this game worth the candle?
- Author
-
Ewa Bogdanowicz, Krzysztof Kochanek, and Witold G. Strupczewski
- Subjects
Return period, Mean squared error, Flood, Monte Carlo method, Gumbel distribution, Statistics, Econometrics, Extreme value theory, Water Science and Technology, Quantile, Mathematics, Weibull distribution - Abstract
In flood frequency analysis (FFA) the profit from inclusion of historical information on the largest historical pre-instrumental floods depends primarily on the reliability of the information, i.e. the accuracy of the magnitude and return period of the floods. This study is focused on the possible theoretical maximum gain in accuracy of estimates of upper quantiles that can be obtained by incorporating the largest historical floods of known return periods into the FFA. We assumed a simple case: N years of systematic records of annual maximum flows and either the one largest (XM1) or two largest (XM1 and XM2) flood peak flows in a historical M-year-long period. The problem is explored by Monte Carlo simulations with the maximum likelihood (ML) method. Both correct and false distributional assumptions are considered. In the first case the two-parameter extreme value models (Gumbel, log-Gumbel, Weibull) with various coefficients of variation serve as parent distributions. In the case of an unknown parent distribution, the Weibull distribution was assumed as the estimating model and the truncated Gumbel as the parent distribution. The return periods of XM1 and XM2 are determined from the parent distribution. The results are then compared with the case when the return periods of XM1 and XM2 are defined by their plotting positions. The results are presented in terms of bias, root mean square error and the probability of overestimation of the quantile with a 100-year return period. The results of the research indicate that the maximal profit of inclusion of pre-instrumental floods in the FFA may prove smaller than the cost of reconstruction of historical hydrological information.
- Published
- 2017
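The paper's Monte Carlo set-up, bias and RMSE of an ML-estimated upper quantile plus the probability of overestimating the 100-year flood, can be sketched for the simplest case of systematic records only. The parent parameters, record length, and replication count below are made-up:

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(7)
loc, scale = 1000.0, 300.0       # made-up Gumbel parent (e.g. m3/s)
n_years, n_sim = 50, 500         # record length, MC replications
p100 = 1.0 - 1.0 / 100.0         # non-exceedance probability of Q100
q_true = gumbel_r.ppf(p100, loc=loc, scale=scale)

estimates = []
for _ in range(n_sim):
    # Simulate one systematic record and re-estimate Q100 by ML.
    sample = gumbel_r.rvs(loc=loc, scale=scale, size=n_years,
                          random_state=rng)
    loc_hat, scale_hat = gumbel_r.fit(sample)    # ML fit by default
    estimates.append(gumbel_r.ppf(p100, loc=loc_hat, scale=scale_hat))

estimates = np.asarray(estimates)
bias = float(estimates.mean() - q_true)
rmse = float(np.sqrt(np.mean((estimates - q_true) ** 2)))
prob_over = float(np.mean(estimates > q_true))  # P(overestimating Q100)
```

Adding one or two censored historical maxima to the likelihood, as the paper does, would modify the fitting step; the error measures are computed the same way.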
8. Joint mapping of statistical streamflow descriptors
- Author
-
Gottschalk, Lars, Krasovskaia, Irina, Yu, Kun-xia, Leblois, Etienne, and Xiong, Lihua
- Subjects
- STATISTICAL methods in streamflow, EXTREME value theory, PROBABILITY theory, GAMMA distributions, WATERSHEDS, OUTLIERS (Statistics), DATA analysis - Abstract
Summary: An approach is presented for a joint mapping and regionalization of statistical descriptors of streamflow such as the mean value and the coefficient of variation of daily data, the flow duration curve, and moments and distributions of annual minima and maxima. Probability theory, and specifically extreme value theory, offers possible parametric relations to be tested, linking knowledge of the basic mean value and coefficient of variation with the other descriptors. This initial study is limited to theoretical results derived from the assumption that the original daily data follow the lognormal or the gamma distribution. With these standard distributions the statistical regularity found is surprisingly good when confronting the theory with the empirical data of daily streamflow from 35 catchments in the Moselle drainage basin in France. The lognormal distribution appears to be the best candidate for the flow duration curve, as well as for maximum and minimum streamflow. The tail behaviour of streamflow maxima is well described, while for minima the results are more difficult to interpret. One problem might be the relatively higher uncertainty in annual low flow data, and especially the sensitivity to outliers towards high values. [Copyright © Elsevier]
- Published
- 2013
- Full Text
- View/download PDF
9. Multivariate non-stationary hydrological frequency analysis
- Author
-
Fateh Chebana and Taha B. M. J. Ouarda
- Subjects
Multivariate statistics, Frequency analysis, Series (mathematics), Copula (probability theory), Gumbel distribution, Log-normal distribution, Econometrics, Water Science and Technology, Statistical hypothesis testing, Mathematics, Quantile - Abstract
To study hydrological events, such as floods and droughts, frequency analysis (FA) techniques are commonly employed. FA relies on some assumptions, in particular the stationarity of the data series. However, the stationarity assumption is not always fulfilled, for a variety of reasons such as climate change and human activities. Thus, it is essential either to check for stationarity or to develop models that take non-stationarity into account in a new risk assessment framework. On the other hand, a majority of hydrological phenomena are described by a number of correlated characteristics. To model the dependence structure between these hydrological variables, copulas are the most commonly employed tool. Generally in the literature, the multivariate model is assumed to be the same over time, even though multivariate stationarity should be verified. Considering the non-stationarity in the dependence structure is important because when the copula parameter changes, the multivariate quantile curve changes accordingly. Different scenarios can be considered when choosing a multivariate non-stationary model, since several variables and a dependence structure are involved. The objective of the present study is to construct a model that integrates multivariate and non-stationary aspects simultaneously, along with hypothesis testing. For the copula part, we consider so-called dynamic copulas, and series of association measures are obtained through rolling windows of the corresponding series. Adapted versions of the AIC criterion are employed to select the final model (margins and copula). The procedure is applied to a flood volume and peak dataset from Iran. The obtained model consists of a lognormal distribution for the margins, with a linear trend in the peak series, stationarity for the volume series, and a quadratic trend in the logistic Gumbel copula parameter for the dependence structure.
- Published
- 2021
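The rolling-window association series used to track a time-varying dependence structure can be sketched as follows; the synthetic peak/volume pair, whose dependence strengthens over time, stands in for the Iranian flood data:

```python
import numpy as np
from scipy.stats import kendalltau

def rolling_kendall_tau(x, y, window):
    """Kendall's tau over rolling windows: a simple diagnostic of a
    time-varying dependence structure between two flood variables."""
    taus = []
    for start in range(len(x) - window + 1):
        tau, _ = kendalltau(x[start:start + window],
                            y[start:start + window])
        taus.append(tau)
    return np.asarray(taus)

rng = np.random.default_rng(0)
n = 200
shared = rng.normal(size=n)
weight = np.linspace(0.1, 0.9, n)   # dependence strengthens in time
peak = weight * shared + (1.0 - weight) * rng.normal(size=n)
volume = weight * shared + (1.0 - weight) * rng.normal(size=n)

taus = rolling_kendall_tau(peak, volume, window=60)
```

A trend fitted to `taus` (linear or quadratic, as in the paper's copula parameter) would then feed the dynamic-copula model selection.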
10. Comparison of methods for analysis of extremes when records are fragmented: A case study using Amazon basin rainfall data
- Author
-
Clarke, Robin T., de Paiva, Rodrigo Dias, and Uvo, Cintia Bertacchi
- Subjects
- CLIMATE extremes, RAINFALL frequencies, DATA analysis, HYDROLOGY, GEOLOGICAL basins, CASE studies - Abstract
Summary: The analysis of annual extremes of hydrological and meteorological variables is frequently complicated by the presence of gaps in the record, and when records are not only fragmented but also short, it is necessary to utilize to the full the information contained in them. One method is to abstract for statistical analysis all extreme events whose peaks exceed a pre-selected threshold value, but the threshold must be carefully chosen if "clumps" of peaks are to be avoided. A common alternative is a statistical analysis of maxima in years that are complete, possibly including in the analysis values from incomplete years according to some empirical rule. A plausible probability distribution has been proposed by [Jones, D.A., 1997. Plotting positions via maximum likelihood for a non-standard situation. Hydrol. Earth Syst. Sci. 1, 357–366] for the extremes observed in incomplete years, which takes into account not only the proportion of record that is missing within an incomplete year, but also the effect of seasonality. As part of a larger study on the hydrology of the Amazon basin, this paper uses 484 records with length not less than 12 years from an extensive network of 750 rain gauges, to compare the method proposed by Jones (termed the DAJ method) with the following alternative procedures: (i) using only complete years of record and (ii) including years with less than 20% missing record, as if they were complete. Using the large-sample variance calculated for the annual maximum one-day rainfall with a 100-year return period (P100), the method proposed by Jones is shown to give smaller standard errors than either of the alternatives. Using the number of years in each record to calculate weighted mean variances over the 484 records, the mean standard errors of P100 obtained by methods (i) and (ii) were 1.25 and 1.06 times the mean standard error given by the DAJ method. The precision of estimates obtained by the latter method was therefore better than either alternative. [Copyright © Elsevier]
- Published
- 2009
- Full Text
- View/download PDF
11. An at-site flood estimation method in the context of nonstationarity II. Statistical analysis of floods in Quebec
- Author
-
Tamer A. Gado and Van-Thanh-Van Nguyen
- Subjects
Flood, Nonparametric statistics, Deviance (statistics), Standard deviation, Gumbel distribution, Bayesian information criterion, Statistics, Generalized extreme value distribution, Econometrics, Water Science and Technology, Mathematics - Abstract
Summary This paper, the second of a two-part paper, investigates the nonstationary behaviour of flood peaks in Quebec (Canada) by analyzing the annual maximum flow series (AMS) available for the common 1966–2001 period from a network of 32 watersheds. Temporal trends in the mean of flood peaks were examined by the nonparametric Mann–Kendall test. The significance of the detected trends over the whole province is also assessed by a bootstrap test that preserves the cross-correlation structure of the network. Furthermore, the LM–NS method (introduced in the first part) is used to parametrically model the AMS, investigating its applicability to real data, to account for temporal trends in the moments of the time series. In this study, two probability distributions (GEV & Gumbel) were selected to model four different types of time-varying moments of the historical time series considered, comprising eight competing models. The selected models are: two stationary models (GEV0 & Gumbel0), two nonstationary models in the mean as a linear function of time (GEV1 & Gumbel1), two nonstationary models in the mean as a parabolic function of time (GEV2 & Gumbel2), and two nonstationary models in the mean and the log standard deviation as linear functions of time (GEV11 & Gumbel11). The eight models were applied to flood data available for each watershed and their performance was compared to identify the best model for each location. The comparative methodology involves two phases: (1) a descriptive ability based on likelihood-based optimality criteria such as the Bayesian Information Criterion (BIC) and the deviance statistic; and (2) a predictive ability based on the residual bootstrap. According to the Mann–Kendall test and the LM–NS method, a quarter of the analyzed stations show significant trends in the AMS. All of the significant trends are negative, indicating decreasing flood magnitudes in Quebec.
It was found that the LM–NS method could provide accurate flood estimates in the context of nonstationarity. The results have indicated the importance of taking into consideration the nonstationary behaviour of the flood series in order to improve the quality of flood estimation. The results also provided a general impression on the possible impacts of climate change on flood estimation in the Quebec province.
- Published
- 2016
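Comparing a stationary Gumbel model (Gumbel0) against one with a linear trend in the location parameter (Gumbel1) by BIC, as in the paper's model set, can be sketched with a hand-rolled likelihood. The data below are synthetic, with a deliberately strong negative trend, so the model choice and fitted slope are illustrative only:

```python
import numpy as np
from scipy.optimize import minimize

def gumbel_nll(params, x, t, trend):
    """Negative log-likelihood of a Gumbel (EV1) model. With
    trend=True the location varies linearly: mu(t) = mu0 + mu1 * t."""
    if trend:
        mu0, mu1, sigma = params
        mu = mu0 + mu1 * t
    else:
        mu0, sigma = params
        mu = mu0
    if sigma <= 0:
        return np.inf
    z = (x - mu) / sigma
    return float(np.sum(np.log(sigma) + z + np.exp(-z)))

rng = np.random.default_rng(3)
t = np.arange(60, dtype=float)
# Synthetic annual-maximum series with a strong negative trend.
x = 500.0 - 4.0 * t + 80.0 * rng.gumbel(size=60)

fit0 = minimize(gumbel_nll, x0=[x.mean(), x.std()],
                args=(x, t, False), method="Nelder-Mead")
fit1 = minimize(gumbel_nll, x0=[x.mean(), 0.0, x.std()],
                args=(x, t, True), method="Nelder-Mead")

n = len(x)
bic0 = 2.0 * fit0.fun + 2 * np.log(n)   # Gumbel0: 2 parameters
bic1 = 2.0 * fit1.fun + 3 * np.log(n)   # Gumbel1: 3 parameters
trend_preferred = bool(bic1 < bic0)
```

With a real negative trend, BIC selects the nonstationary model and the fitted slope `fit1.x[1]` is negative, mirroring the decreasing flood magnitudes reported for Quebec.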
13. Copula based drought frequency analysis considering the spatio-temporal variability in Southwest China
- Author
-
Kai Xu, Huimin Lei, Dawen Yang, and Xiangyu Xu
- Subjects
Return period, Multivariate statistics, Gumbel distribution, Goodness of fit, Joint probability distribution, Climatology, Copula (probability theory), Statistics, Spatial ecology, Probability distribution, Water Science and Technology, Mathematics - Abstract
Summary Drought frequency analysis is a prerequisite for drought resistance planning and drought risk management. Drought is a spatio-temporal dynamic process, usually characterized by its duration, spatial extent, and severity. Copula based multivariate frequency analysis has been widely used to calculate drought frequency. However, the spatial extent is scarcely considered in previous studies, because drought events are usually identified either at a fixed spatial scale or at a fixed temporal scale. This study develops a regional drought frequency analysis model based on trivariate copulas that considers the spatio-temporal variations of drought events. Drought duration, drought-affected area, and drought severity are identified first, and their trivariate joint distribution is constructed afterwards. The model is applied for drought frequency analysis in Southwest China during 1961–2012. A variety of probability distribution functions and copula functions (including elliptical, symmetric and asymmetric Archimedean) are used as candidate choices, and the most appropriate ones are selected based on goodness of fit using different methods. The robustness of the drought frequency analysis is then evaluated and discussed. The results show that drought frequency analysis needs to fully consider the three characteristic parameters (duration, affected area, and severity) reflecting drought spatio-temporal variability. The drought return period estimated by the copula-based trivariate frequency analysis appropriately integrates the effects of drought duration, affected area and severity, and is a reliable drought statistical measure. The 2009–2010 drought, which has a return period of about 94 years, is the most severe one in Southwest China during the period 1961–2012. 
The Joe and Gumbel copulas are found to be the most suitable for estimating the joint distribution of drought duration, affected area and severity, and the asymmetric (nested) function forms perform better than the symmetric ones.
- Published
- 2015
14. Historical and future projected frequency of extreme precipitation indicators using the optimized cumulative distribution functions in China
- Author
-
Haixia Lin, De Li Liu, Ning Yao, Linchao Li, Xinguo Chen, Songbai Song, and Yi Li
- Subjects
Return period, Representative Concentration Pathways, Spatial distribution, Gumbel distribution, Generalized Pareto distribution, Climatology, Generalized extreme value distribution, Environmental science, Precipitation, Extreme value theory, Water Science and Technology - Abstract
The cumulative distribution functions (CDFs) of extreme precipitation indices (EPIs) characterize the occurrence of extreme precipitation events, but some CDFs have either exaggerated or underestimated them. In this study, we selected the most suitable CDFs from ten commonly used functions (i.e., Normal, two- and three-parameter log-normal, extreme value type-I, generalized extreme value (GEV), Weibull, two-parameter Gamma, Pearson type III, Poisson, and Pareto) for the site- and region-specific EPIs in China and projected their changes under the Representative Concentration Pathway (RCP) 4.5 and 8.5 scenarios combined with 28 general circulation models (GCMs) from Coupled Model Intercomparison Project phase 5. Using the most appropriate site-specific CDFs, the spatial and temporal variations of the nine EPIs for the historical and projected future periods at 10-, 20-, 50- and 100-year return periods were investigated. The results showed that the GEV, Pearson type III, and three- and two-parameter log-normal were generally the best CDFs for the nine EPIs in China. Comparing 2021–2100 with 1961–2017 showed that the return periods increased as the EPIs increased. This was especially apparent in southeast China, where the annual mean precipitation was above 1300 mm. The increasing trends were significant for several EPIs (especially in northwestern China), including the maximum 1-day precipitation, the number of heavy precipitation days, the number of very heavy precipitation days, extremely wet days and annual total wet-day precipitation. The simple daily intensity index increased in northeast China, decreased in western China, and had no significant change in other regions. The maximum 5-day precipitation and consecutive wet days decreased in most areas except western China. The consecutive dry days decreased in northern China and increased in southeast China. 
This study provides region- and site-specific optimal CDFs and return period information and reveals the spatiotemporal changes of 9 EPIs at different return periods of 10-, 20-, 50- and 100-years under the RCP 4.5 and RCP 8.5 scenarios for 2021–2100. The obtained results are useful for guiding disaster-prevention efforts in different regions of China.
- Published
- 2019
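Once a site-specific CDF has been selected and fitted, return periods and design values follow from the standard relations T = 1/(1 − F(x)) and x_T = F⁻¹(1 − 1/T). A sketch with scipy's GEV and made-up parameters:

```python
from scipy.stats import genextreme

# Illustrative GEV parameters for an extreme-precipitation index (mm).
# NB: scipy's shape c is the negative of the usual hydrological shape,
# so c < 0 here means a heavy (Frechet-type) upper tail.
c, loc, scale = -0.1, 50.0, 15.0

def return_period(x):
    """Return period (years) of annual value x: T = 1 / (1 - F(x))."""
    return 1.0 / genextreme.sf(x, c, loc=loc, scale=scale)

def design_value(T):
    """Value with return period T: the (1 - 1/T) quantile of F."""
    return genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

x100 = design_value(100.0)   # e.g. the 100-year daily maximum
```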
15. The suitability assessment of a generalized exponential distribution for the description of maximum precipitation amounts
- Author
-
Bartosz Kaźmierczak and Andrzej Kotowski
- Subjects
Exponential distribution ,Meteorology ,Gumbel distribution ,Generalized beta distribution ,Statistics ,Gamma distribution ,Generalized integer gamma distribution ,Natural exponential family ,Distribution fitting ,Water Science and Technology ,Mathematics ,Weibull distribution - Abstract
Summary The paper is a methodological extension of the current description of maximum precipitation amounts, based on the gamma, Gumbel, lognormal or Weibull distributions, to newly developed theoretical distributions, namely the two-parameter (GED2) and three-parameter (GED3) generalized exponential distributions. The verification is carried out on meteorological data from the Wroclaw–Strachowice meteorological station of the Institute of Meteorology and Water Management for the years 1960–2009.
- Published
- 2015
16. Copula-based frequency analysis of overflow and flooding in urban drainage systems
- Author
-
Guangtao Fu and David Butler
- Subjects
Hydrology ,Gumbel distribution ,Joint probability distribution ,Cumulative distribution function ,Monte Carlo method ,Copula (linguistics) ,Statistics ,Environmental science ,Combined sewer ,Marginal distribution ,Drainage ,Water Science and Technology - Abstract
Summary The performance evaluation of urban drainage systems is essentially based on accurate characterisation of rainfall events, where a particular challenge is development of the joint distributions of dependent rainfall variables such as duration and depth. In this study, the copula method is used to separate the dependence structure of rainfall variables from their marginal distributions and the different impacts of dependence structure and marginal distributions on system performance are analysed. Three one-parameter Archimedean copulas, including Clayton, Gumbel, and Frank families, are fitted and compared for different combinations of marginal distributions that cannot be rejected by statistical tests. The fitted copulas are used, through the Monte Carlo simulation method, to generate synthetic rainfall events for system performance analysis in terms of sewer flooding and Combined Sewer Overflow (CSO) discharges. The copula method is demonstrated using an urban drainage system in the UK, and the cumulative probability distributions of maximum flood depth at critical nodes and CSO discharge volume are calculated. The results obtained in this study highlight the importance of taking into account the dependence structure of rainfall variables in the context of urban drainage system evaluation and also reveal the different impacts of dependence structure and marginal distributions on the probabilities of sewer flooding and CSO volume.
- Published
- 2014
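The core copula step described in entry 16 above — separating the dependence structure of rainfall duration and depth from their marginals, then generating synthetic events by Monte Carlo — can be sketched for one Archimedean family. The Clayton parameter and both marginal choices are illustrative assumptions, not the paper's fitted values:

```python
# Sketch: sample dependent (duration, depth) rainfall pairs from a Clayton
# copula via the conditional-inversion method, then map the uniform margins
# through assumed marginal distributions.
import numpy as np
from scipy import stats

def sample_clayton(n, theta, rng):
    """Conditional-inversion sampler for the Clayton copula (theta > 0)."""
    u1 = rng.uniform(size=n)
    v = rng.uniform(size=n)
    # Invert the conditional copula C(u2 | u1) = v
    u2 = (u1 ** (-theta) * (v ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
    return u1, u2

rng = np.random.default_rng(42)
u_dur, u_dep = sample_clayton(10_000, theta=2.0, rng=rng)

duration_h = stats.expon.ppf(u_dur, scale=6.0)       # assumed marginal: mean 6 h
depth_mm = stats.gamma.ppf(u_dep, a=2.0, scale=5.0)  # assumed marginal

# Dependence survives the marginal transform; for Clayton,
# Kendall's tau = theta / (theta + 2) = 0.5 here
tau, _ = stats.kendalltau(duration_h, depth_mm)
print(f"sample Kendall tau: {tau:.2f}")
```

Swapping in a Gumbel or Frank copula changes only the sampler, which is the point the abstract makes about separating dependence from marginals.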
17. Derivation of low flow distribution functions using copulas
- Author
-
Lihua Xiong, Kun-xia Yu, and Lars Gottschalk
- Subjects
Frequency analysis ,Flow distribution ,Tail dependence ,law.invention ,Copula (probability theory) ,Gumbel distribution ,Heavy-tailed distribution ,law ,Statistics ,Statistical physics ,Marginal distribution ,Water Science and Technology ,Mathematics ,Weibull distribution - Abstract
Summary Derivation of low flow distributions using recession functions has been introduced in previous studies, but without taking into consideration the statistical dependence structure between the characteristics of a low flow event, i.e. the duration of dry spell t and the recession parameter k. Low flow data of three basins in China with different climates demonstrate that statistical dependence actually exists between t and k. A copula-based derived distribution is proposed in this paper to take full account of this internal dependence within the low flow event. The proposed derived distribution can be flexibly constructed using a wide variety of copula functions and marginal distributions. Four types of copula functions (i.e. Student, Clayton, Gumbel, and Frank), each with twelve combinations of marginal distributions, are employed to derive low flow distributions in order to find out which component, copula function or marginal distribution, has the most impact on the performance of derived low flow distributions in fitting the observed data. It turns out that the capability of this copula-based derived distribution is strongly influenced by the choice of marginal distribution, while different copula functions mainly affect the goodness-of-fit in the tails. The Student copula is preferred for modelling the chosen (t, k) samples with both lower and upper tail dependence, but the copula-based derived distribution is not recommended for describing low flow samples with long lower tails. The performance of the copula-based derived distributions is compared with that of the derived truncated Weibull distribution, whose parameters are also process-oriented but which does not consider the statistical dependence structure of (t, k) in low flow events. The results highlight that the copula-based derived distribution is more flexible and can more reasonably describe both the upper and lower tails of low flow series than the derived truncated Weibull distribution.
Two traditional fitted distributions, the fitted truncated Weibull distribution and the fitted Pearson Type III distribution, are also applied to describe the low flow series in order to evaluate the capability of the copula-based derived distribution. The fitted Pearson Type III distribution always provides the highest accuracy, while copula-based derived distributions perform comparably given appropriate marginal distributions and copula function. In general, the copula-based derived distribution can be an attractive alternative in low flow frequency analysis, as it can be used in studying the impacts of climate change and human activities on the frequency of low flows.
- Published
- 2014
18. Inland water bodies in Chile can locally increase rainfall intensity
- Author
-
Francina Dominguez, Roberto Pizarro, Francisco Balocchi, Peter F. Ffolliott, Rodrigo Valdés, Per Bro, Carolina Morales, Faisal Hossain, Pablo Garcia-Chevesich, and Claudio Olivares
- Subjects
Hydrology ,Gumbel distribution ,Elevation ,Mann–Whitney U test ,Environmental science ,Precipitation ,Longitude ,Intensity (heat transfer) ,Water Science and Technology ,Latitude - Abstract
Summary Analysis of precipitation observations from Chile indicated that man-made water reservoirs might be affecting the intensity of extreme precipitation events. Fifty rain gauges were used to evaluate rainfall intensities under different climates, using the Gumbel method (T = 5 and 100 years) and average maximum recorded rainfall intensities to construct IDF curves for each station. A spatial analysis of the stations was undertaken to establish graphical relationships between documented maximum annual rainfall intensities for 1 h and those obtained by the Gumbel method, as a function of latitude, longitude, elevation, and the distance from water bodies. The Mann–Whitney U test was applied at the 5% significance level. Values obtained from stations located close to water bodies were compared to those located away from them. The results show significant changes in drier climates.
- Published
- 2013
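The two analysis steps in entry 18 above — Gumbel-based intensity estimates at T = 5 and 100 years, and a Mann–Whitney U comparison of station groups — can be sketched together. The station data and group parameters below are synthetic assumptions:

```python
# Sketch: Gumbel return-level estimation per station group, then a
# Mann-Whitney U test between near-water and away-from-water groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical annual maximum 1-h intensities (mm/h) for two station groups
near_water = stats.gumbel_r.rvs(loc=22, scale=6, size=30, random_state=rng)
away = stats.gumbel_r.rvs(loc=18, scale=5, size=30, random_state=rng)

def gumbel_return_level(sample, T):
    """T-year intensity from a maximum-likelihood Gumbel fit."""
    loc, scale = stats.gumbel_r.fit(sample)
    return stats.gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)

for T in (5, 100):
    print(f"T={T:>3}: near={gumbel_return_level(near_water, T):.1f}, "
          f"away={gumbel_return_level(away, T):.1f} mm/h")

u_stat, p_value = stats.mannwhitneyu(near_water, away, alternative="two-sided")
print(f"Mann-Whitney U p-value: {p_value:.3f} (significant at 5%: {p_value < 0.05})")
```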
19. Frequency analysis of the 7–8 December 2010 extreme precipitation in the Panama Canal Watershed
- Author
-
Michael J. Murphy, Konstantine P. Georgakakos, and Eylon Shamir
- Subjects
Return period ,Gumbel distribution ,Generalized Pareto distribution ,Streamflow ,Statistics ,Generalized extreme value distribution ,Environmental science ,Extreme value theory ,Uncertainty analysis ,Water Science and Technology ,Event (probability theory) - Abstract
Summary The 7–8 December 2010 rainfall event in Panama produced record rainfall and streamflow that are about twice those of the largest previously observed event on record. In this study we ask whether, before the occurrence of this rainfall event, a return period estimate using the historical record and the commonly used asymptotic statistical distributions of extreme values could have indicated that such an event was probable. We examined the daily and 24-h mean areal rainfall over the entire Panama Canal Watershed with the Generalized Extreme Value (GEV), Gumbel, and Generalized Pareto distributions, using the maximum likelihood approach for parameter and uncertainty bound estimation. We found that the solutions that maximized the log likelihood for these three distributions yield return period estimates larger than 2000 years. These return periods imply that the 2010 rainfall event was practically unforeseen. Only a careful implementation of these distributions with full uncertainty analysis to define confidence intervals yields return period estimates with substantial probabilities for such an event to occur. The GEV was found to be the most adequate distribution for this analysis, while the commonly used Gumbel distribution, although it indicated a good fit to the annual maxima series, attributed an extremely low probability to the occurrence of this event.
- Published
- 2013
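The central question of entry 19 above — what return period a fitted extreme-value distribution assigns to a record-doubling event — can be sketched with maximum-likelihood fits. The record and the "twice the previous maximum" event are synthetic assumptions:

```python
# Sketch: fit GEV and Gumbel by maximum likelihood to a pre-event annual
# maxima series and compute the return period each assigns to a record event.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical pre-2010 annual maxima of 24-h areal rainfall (mm)
annual_max = stats.genextreme.rvs(c=-0.2, loc=90, scale=25, size=40, random_state=rng)
record_event = 2 * annual_max.max()   # "about twice the previous record"

for name, dist in [("GEV", stats.genextreme), ("Gumbel", stats.gumbel_r)]:
    params = dist.fit(annual_max)     # scipy's fit is maximum likelihood
    p_exceed = dist.sf(record_event, *params)
    T = np.inf if p_exceed == 0 else 1 / p_exceed
    print(f"{name:>6}: return period of record event ~ {T:.0f} years")
```

The light-tailed Gumbel typically assigns a far longer return period than the GEV here, which mirrors the abstract's finding that the Gumbel fit attributed an extremely low probability to the event.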
20. Joint mapping of statistical streamflow descriptors
- Author
-
Etienne Leblois, Lars Gottschalk, Kun-xia Yu, Irina Krasovskaia, and Lihua Xiong
- Subjects
Gumbel distribution ,Streamflow ,Log-normal distribution ,Statistics ,Outlier ,Gamma distribution ,Maxima ,Extreme value theory ,Water Science and Technology ,Mathematics ,Parametric statistics - Abstract
Summary An approach is presented for the joint mapping and regionalization of statistical descriptors of streamflow, such as the mean value and the coefficient of variation of daily data, the flow duration curve, and the moments and distributions of annual minima and maxima. Probability theory, and specifically extreme value theory, offers possible parametric relations to be tested, linking knowledge of the basic mean value and coefficient of variation with the other descriptors. This initial study is limited to theoretical results derived from the assumption that the original daily data follow the lognormal or the gamma distribution. With these standard distributions the statistical regularity found is surprisingly good when the theory is confronted with the empirical daily streamflow data from 35 catchments in the Moselle drainage basin in France. The lognormal distribution appears to be the best candidate for the flow duration curve, as well as for maximum and minimum streamflow. The tail behaviour of streamflow maxima is well described, while for minima the results are more difficult to interpret. One problem might be the relatively higher uncertainty in annual low flow data, and especially the sensitivity to outliers towards high values.
- Published
- 2013
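The parametric link used in entry 20 above — that under a lognormal assumption the mean and coefficient of variation of daily flow determine the whole flow duration curve — can be sketched directly. The mean flow and CV below are illustrative numbers:

```python
# Sketch: derive flow-duration-curve quantiles from only the mean and CV of
# daily streamflow, under a lognormal assumption.
import numpy as np
from scipy import stats

mean_q, cv = 12.0, 0.8            # assumed mean daily flow (m3/s) and CV

sigma2 = np.log(1 + cv**2)        # lognormal identity: CV^2 = exp(sigma^2) - 1
sigma = np.sqrt(sigma2)
mu = np.log(mean_q) - sigma2 / 2  # so that E[Q] = exp(mu + sigma^2/2) = mean_q

for exceed in (0.05, 0.50, 0.95):  # flow exceeded 5%, 50%, 95% of the time
    q = stats.lognorm.ppf(1 - exceed, s=sigma, scale=np.exp(mu))
    print(f"Q{int(exceed * 100):02d} = {q:.2f} m3/s")
```

The gamma-distribution variant in the paper would use the same two inputs with the gamma shape and scale solved from the mean and CV instead.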
21. Point and standard error estimation for quantiles of mixed flood distributions
- Author
-
John M. Grego and Philip A. Yates
- Subjects
Observed information ,Gumbel distribution ,Statistics ,Log-normal distribution ,Expectation–maximization algorithm ,Generalized extreme value distribution ,Applied mathematics ,Mixture distribution ,Mixture model ,Water Science and Technology ,Mathematics ,Quantile - Abstract
This paper explores the use of finite mixture models in the study of flood frequency distributions with multiple components. It focuses on further methodological developments for finite mixture models, including an accelerated version of the EM algorithm to derive both parameter estimates and the observed information matrix, estimation of the 0.99 quantile of the mixture distribution, and standard error estimation of that quantile. In case studies, the lognormal finite mixture model is compared to other models, specifically the widely used log-Pearson Type III, Gumbel, and GEV distributions. A multidimensional gradient plot and information criteria are discussed as diagnostics for the number of mixing components.
- Published
- 2010
22. Comparison of methods for analysis of extremes when records are fragmented: A case study using Amazon basin rainfall data
- Author
-
Rodrigo Dias de Paiva, Robin T. Clarke, and Cintia Bertacchi Uvo
- Subjects
Return period ,Standard error ,Gumbel distribution ,Threshold limit value ,Statistics ,medicine ,Probability distribution ,Variance (accounting) ,Seasonality ,medicine.disease ,Maxima ,Water Science and Technology ,Mathematics - Abstract
The analysis of annual extremes of hydrological and meteorological variables is frequently complicated by the presence of gaps in the record, and when records are not only fragmented but also short, it is necessary to utilize to the full the information contained in them. One method is to abstract for statistical analysis all extreme events whose peaks exceed a pre-selected threshold value, but the threshold must be carefully chosen if "clumps" of peaks are to be avoided. A common alternative is a statistical analysis of maxima in years that are complete, possibly including in the analysis values from incomplete years according to some empirical rule. A plausible probability distribution has been proposed by [Jones, D.A., 1997. Plotting positions via maximum likelihood for a non-standard situation. Hydrol. Earth Syst. Sci. 1, 357–366] for the extremes observed in incomplete years, which takes into account not only the proportion of record that is missing within an incomplete year, but also the effect of seasonality. As part of a larger study on the hydrology of the Amazon basin, this paper uses 484 records with length not less than 12 years, from an extensive network of 750 rain gauges, to compare the method proposed by Jones (termed the DAJ method) with the following alternative procedures: (i) using only complete years of record and (ii) including years with less than 20% missing record, as if they were complete. Using the large-sample variance calculated for the annual maximum one-day rainfall with 100-year return period (P-100), the method proposed by Jones is shown to give smaller standard errors than either of the alternatives. Using the number of years in each record to calculate weighted mean variances over the 484 records, the mean standard errors of P-100 obtained by methods (i) and (ii) were 1.25 and 1.06 times the mean standard error given by the DAJ method. The precision of estimates obtained by the latter method was therefore better than either alternative.
- Published
- 2009
23. Rainfall intensity–duration–frequency relationships derived from large partial duration series
- Author
-
Arie Ben-Zvi
- Subjects
Anderson–Darling test ,Standard error ,Goodness of fit ,Gumbel distribution ,Generalized Pareto distribution ,Log-normal distribution ,Statistics ,Generalized extreme value distribution ,Extreme value theory ,Water Science and Technology ,Mathematics - Abstract
Summary A procedure is proposed for basing intensity–duration–frequency (IDF) curves on partial duration series (PDS) which are substantially larger than those commonly used for this purpose. The PDS are derived from event maxima series (EMS), composed of the maximum average intensities, over a given duration, determined for all rainfall events recorded at a station. The generalized Pareto (GP) distribution is fitted to many PDS nested within the EMS and the goodness-of-fit is determined by the Anderson–Darling (AD) test. The best-fitted distribution is selected for predicting intensities associated with the given duration and with a number of recurrence intervals. This procedure was repeated for eleven rainfall durations, from 5 to 240 min, at four stations of the Israel Meteorological Service. For comparison, the GP and the generalized extreme value (GEV) distributions were fitted to annual maxima series (AMS), and the Gumbel and lognormal distributions were fitted to the PDS and to the AMS at these stations. In almost all cases, the GP distribution fits well to ranges of PDS within an EMS, while in a few cases the best fit is only fair. Another result is that the GP distribution does not fit the AMS and the EMS. The GEV distribution fits well to most AMS, and fairly to the others. The Gumbel and the lognormal distributions fit well to most of the AMS and to very few PDS. In most cases of good fits of different distributions, the values predicted by the different distributions are not much different from one another. This indicates the importance of a good fit of the distribution and of the power of the AD test used for determining it. In most cases the best fit of the GP distribution is to a PDS substantially larger than its corresponding AMS. In most cases, the standard error of the estimated 100-year intensity, through the best-fitted GP to PDS, is smaller than that estimated through the GEV fitted to the corresponding AMS.
All these make the proposed procedure superior to the current ones. It also enables interpolated predictions down to recurrence intervals of N/n years (N is number of years of complete records and n is PDS size). The use of large samples would reduce the sensitivity of predicted intensities to sampling variations.
- Published
- 2009
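The partial-duration-series approach of entry 23 above can be sketched with a generalized Pareto fit to threshold excesses and the standard peaks-over-threshold return-level formula. The threshold, exceedance rate, and data are illustrative assumptions:

```python
# Sketch: fit a generalized Pareto distribution to event-maximum intensities
# above a threshold and estimate the 100-year intensity from the fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
years = 30
threshold = 20.0                                # mm/h, assumed
# Hypothetical excesses over the threshold, about 5 events per year
excesses = stats.genpareto.rvs(c=0.1, scale=8.0, size=5 * years, random_state=rng)

c, loc, scale = stats.genpareto.fit(excesses, floc=0.0)  # location fixed at 0
rate = len(excesses) / years                    # mean exceedances per year

def pot_return_level(T):
    """x_T = u + (scale/c) * ((rate*T)**c - 1), the standard POT formula."""
    return threshold + scale / c * ((rate * T) ** c - 1)

print(f"100-yr intensity: {pot_return_level(100):.1f} mm/h")
```

The Anderson–Darling selection over nested PDS sizes in the paper would wrap this fit in a loop over candidate thresholds.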
24. Establishing acceptance regions for L-moments based goodness-of-fit tests by stochastic simulation
- Author
-
Yii-Chen Wu, Ke-Sheng Cheng, and Jun-Jih Liou
- Subjects
Goodness of fit ,Gumbel distribution ,Sample size determination ,Joint probability distribution ,Statistics ,Stochastic simulation ,Kurtosis ,Estimator ,Multivariate normal distribution ,Water Science and Technology ,Mathematics - Abstract
Before conducting a hydrological frequency analysis, the best-fit distribution for the hydrological variable of interest must be decided by a goodness-of-fit test or other appropriate methods. In recent years the L-moment-ratio diagram has been suggested as a useful tool for discrimination between candidate distributions. However, little research has been conducted on the effect of sample size on goodness-of-fit tests using the L-moment-ratio diagram. In this study, through stochastic simulation, statistical properties of two estimators, namely the probability-weighted-moment estimator and the plotting-position estimator, of the L-skewness and L-kurtosis of the normal and Gumbel distributions are discussed. The joint distribution of the sample L-skewness and L-kurtosis is found to be approximately bivariate normal for larger sample sizes. Consequently, a set of sample-size-dependent 95% acceptance regions for L-moments-based goodness-of-fit tests of the normal and Gumbel distributions was established using a stochastic simulation technique. These acceptance regions were further validated using simulated random samples, with regard to the consistency of the acceptance rate with the desired level of significance, and were found to be applicable for goodness-of-fit tests for random samples of any sample size between 20 and 1000.
- Published
- 2008
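The sample L-skewness and L-kurtosis underlying the acceptance regions of entry 24 above can be sketched via the unbiased probability-weighted-moment estimators. The Gumbel sample is synthetic; the printed population values are the standard Gumbel L-moment ratios:

```python
# Sketch: compute sample L-skewness and L-kurtosis from unbiased PWMs and
# compare against the Gumbel population values.
import numpy as np
from scipy import stats

def l_moment_ratios(x):
    """Return (t3, t4): sample L-skewness and L-kurtosis from unbiased PWMs."""
    x = np.sort(x)
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((j - 1) * (j - 2) * (j - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l2 = 2 * b1 - b0                      # L-scale
    l3 = 6 * b2 - 6 * b1 + b0             # third L-moment
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0 # fourth L-moment
    return l3 / l2, l4 / l2

x = stats.gumbel_r.rvs(loc=0, scale=1, size=5000, random_state=np.random.default_rng(5))
t3, t4 = l_moment_ratios(x)
# Gumbel population values: tau3 ~ 0.1699, tau4 ~ 0.1504
print(f"sample L-skewness {t3:.3f}, L-kurtosis {t4:.3f}")
```

Repeating this over many simulated samples of a fixed size, and taking a 95% contour of the (t3, t4) cloud, is the stochastic-simulation construction of the acceptance region.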
25. Regression equations of probability plot correlation coefficient test statistics from several probability distributions
- Author
-
Youn Woo Kho, Sooyoung Kim, Taesoon Kim, Hongjoon Shin, and Jun Haeng Heo
- Subjects
Probability plot ,Gumbel distribution ,Skewness ,Sample size determination ,Cramér–von Mises criterion ,Statistics ,Probability distribution ,Regression analysis ,Water Science and Technology ,Weibull distribution ,Mathematics - Abstract
Summary The probability plot correlation coefficient (PPCC) test has been known as a powerful but easy-to-use goodness-of-fit test. However, the application of PPCC test statistics is sometimes difficult since the test statistics are generally given in tabulated form and the number of test statistics is large. In this study, the PPCC test statistics for the normal, Gumbel, gamma, GEV, and Weibull distributions are derived, and regression equations of the PPCC test statistics for these models are formulated as a function of the significance levels, sample sizes, and skewness coefficients, depending on the model. Monte Carlo simulations for power tests were performed to compare the rejection capability of the PPCC test with those of the χ2, Cramér–von Mises (CVM), and Kolmogorov–Smirnov (K–S) tests for several probability distributions. The power test results indicated that the PPCC and χ2-tests had better rejection performance than the CVM and K–S tests when the parent and applied models were identical. Moreover, the PPCC test showed the most powerful rejection rate, followed by the χ2-test, while the CVM was the worst when the parent and applied models were different. In addition, the power of rejection increased with sample size when the parent and applied models were different. However, the rejection power did not vary appreciably with sample size when the parent and applied models were identical.
- Published
- 2008
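The PPCC statistic itself, for the Gumbel case of entry 25 above, can be sketched as the correlation between ordered observations and Gumbel quantiles at plotting positions. The Gringorten positions and both samples are illustrative assumptions; the paper's regression-equation critical values are not reproduced here:

```python
# Sketch: PPCC statistic for the Gumbel distribution at Gringorten plotting
# positions, evaluated on a Gumbel sample and a clearly non-Gumbel sample.
import numpy as np
from scipy import stats

def ppcc_gumbel(sample):
    x = np.sort(sample)
    n = len(x)
    i = np.arange(1, n + 1)
    p = (i - 0.44) / (n + 0.12)      # Gringorten plotting positions
    m = stats.gumbel_r.ppf(p)        # standard Gumbel quantiles
    return np.corrcoef(x, m)[0, 1]

rng = np.random.default_rng(11)
gumbel_sample = stats.gumbel_r.rvs(loc=10, scale=3, size=200, random_state=rng)
uniform_sample = rng.uniform(0, 20, size=200)  # non-Gumbel comparison sample

print(f"PPCC (Gumbel data):  {ppcc_gumbel(gumbel_sample):.4f}")
print(f"PPCC (uniform data): {ppcc_gumbel(uniform_sample):.4f}")
```

A sample is rejected when its PPCC falls below the critical value for the given significance level and sample size, which is what the paper's regression equations supply.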
26. Drought in the Netherlands – Regional frequency analysis versus time series simulation
- Author
-
Jules J. Beersma and T. Adri Buishand
- Subjects
Return period ,Standard error ,Gumbel distribution ,Resampling ,Statistics ,Econometrics ,Extrapolation ,Time series ,Jackknife resampling ,Water Science and Technology ,Mathematics ,Quantile - Abstract
Summary The distribution of the annual maximum precipitation deficit is studied for six districts within the Netherlands. Gumbel probability plots of this precipitation deficit show a common extraordinary curvature in the upper tail. A regional frequency analysis yields a regional growth curve that can be approximated by a spline consisting of two linear segments on the standard Gumbel scale and a smooth transition between them. Alternatively, the application of a time series model based on nearest-neighbour resampling is explored. To reproduce the persistence structure a 4-month memory term is needed in the resampling model. Using this memory term there is an enhanced positive correlation between past and future precipitation deficits during extremely dry summers, which seems to be responsible for the curvature in the precipitation deficit distributions. This term, however, also leads to a considerable increase of the standard error of large quantile estimates. Much attention is given to the use of the bootstrap and the jackknife to determine the standard errors of quantile estimates based on nearest-neighbour resampling. A simulation experiment with a first-order autoregressive time series model shows that these standard errors can be biased, in particular for the bootstrap. The relative standard errors of quantile estimates are large in the area of large curvature of the Gumbel probability plots. This holds both for nearest-neighbour resampling and regional frequency analysis. When the two methods are used for extrapolation, nearest-neighbour resampling clearly outperforms the regional frequency analysis. The latter then shows a strong increase in the relative standard error of quantile estimates with increasing return period due to the large uncertainty of the parameters in the spline approximation to the regional growth curve. 
Using nearest-neighbour resampling and the bootstrap, confidence intervals are constructed for the return periods of the largest observed precipitation deficit for each of the six districts. Although these confidence intervals are quite wide, they are on average a factor of two narrower than the interval expected from the size of the sample only.
- Published
- 2007
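The bootstrap standard-error machinery used in entry 26 above can be sketched for a large-quantile estimate of an annual precipitation-deficit series. The Gumbel fit, series, and return period are illustrative assumptions (the paper's resampling model is a more elaborate nearest-neighbour scheme):

```python
# Sketch: bootstrap standard error of a 50-year quantile estimate from a
# Gumbel fit to an annual-maximum precipitation-deficit series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
# Hypothetical annual maximum precipitation deficits (mm)
deficits = stats.gumbel_r.rvs(loc=120, scale=40, size=55, random_state=rng)

def q_T(sample, T=50):
    loc, scale = stats.gumbel_r.fit(sample)
    return stats.gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)

# Nonparametric bootstrap: refit on resampled series, collect the quantile
boot = np.array([
    q_T(rng.choice(deficits, size=len(deficits), replace=True))
    for _ in range(500)
])
print(f"50-yr deficit estimate: {q_T(deficits):.0f} mm, "
      f"bootstrap SE: {boot.std(ddof=1):.0f} mm")
```

The abstract's caution applies here too: bootstrap standard errors of large quantiles can be biased, so they should be read as indicative rather than exact.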
27. Consistency and normality of estimates of hydrological extremes derived from a model for fragmented data
- Author
-
Robin T. Clarke
- Subjects
Cumulative distribution function ,media_common.quotation_subject ,Estimator ,Asymptotic distribution ,Missing data ,Gumbel distribution ,Consistency (statistics) ,Statistics ,Econometrics ,Extreme value theory ,Normality ,Water Science and Technology ,Mathematics ,media_common - Abstract
Summary This paper deals with a non-standard application of frequency analysis of extremes of hydrologic variables in which some years of record are incomplete. If F(y) is the cumulative distribution function (cdf) of the extreme variable in a complete year (or 'block'), the cdf of the variable in an incomplete year is taken as F(y)^p, where 0 < p < 1. When the parameters of F(y) are estimated by maximum likelihood (ML), the usual large-sample characteristics of ML estimators (consistency, asymptotic Normality) may be modified. The paper examines the consistency, bias and approach to Normality of ML estimates of Gumbel parameters for position and scale, and of the Gumbel extreme event y100 with 100-year return period. In the non-standard model F(y)^p, consistency of ML estimates of Gumbel position and scale parameters is considerably modified, but the consistency of the estimated y100 much less so. Estimates of y100 are negatively biased, but the bias is similar to that found in the standard (no missing data) case. The results are relevant where hydrologic records are short and incomplete, such that all existing data must be fully utilized.
- Published
- 2007
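The non-standard likelihood of entry 27 above can be sketched for the Gumbel case: an annual maximum from a year with fraction p of the record observed has cdf F(y)^p, hence density p F(y)^(p-1) f(y). Data, completeness fractions, and the optimizer choice below are illustrative assumptions:

```python
# Sketch: maximum-likelihood Gumbel fit when some annual maxima come from
# incomplete years, using the F(y)**p model for incomplete-year maxima.
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(4)
p = np.where(rng.uniform(size=40) < 0.3, 0.6, 1.0)  # fraction of year observed
u = rng.uniform(size=40)
# Maxima consistent with cdf F(y)**p: F(y) = u**(1/p), true loc=100, scale=20
y = stats.gumbel_r.ppf(u ** (1 / p), loc=100, scale=20)

def neg_log_lik(theta):
    loc, log_scale = theta
    scale = np.exp(log_scale)        # keep scale positive
    logf = stats.gumbel_r.logpdf(y, loc=loc, scale=scale)
    logF = stats.gumbel_r.logcdf(y, loc=loc, scale=scale)
    # log of p * F**(p-1) * f, summed over years
    return -np.sum(np.log(p) + (p - 1) * logF + logf)

res = minimize(neg_log_lik, x0=[y.mean(), np.log(y.std())], method="Nelder-Mead")
loc_hat, scale_hat = res.x[0], np.exp(res.x[1])
y100 = stats.gumbel_r.ppf(0.99, loc=loc_hat, scale=scale_hat)
print(f"loc={loc_hat:.1f}, scale={scale_hat:.1f}, y100={y100:.1f}")
```

Setting all p = 1 recovers the standard complete-record Gumbel likelihood, which is the baseline the paper compares against.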
28. Effect of percent non-detects on estimation bias in censored distributions
- Author
-
Umed Singh Panu, William C. Lennox, and Z. Zhang
- Subjects
Heaviside step function ,Computer science ,Monte Carlo method ,Inference ,Probability density function ,Censoring (statistics) ,symbols.namesake ,Gumbel distribution ,Statistics ,symbols ,Econometrics ,Probability distribution ,Uniqueness ,Water Science and Technology - Abstract
The unique problems posed by non-detects have long been a concern to researchers and statisticians dealing with summary statistics in the analysis of censored data. To incorporate non-detects in the estimation process, a simple substitution by the MDL (method detection limit) and the maximum likelihood estimation method are routinely implemented as standard methods by US-EPA laboratories. In situations where numerical standards are set at or near the MDL by regulatory agencies, it is prudent and important to closely investigate both the variability in test measurements and the estimation bias, because an inference based on biased estimates could entail significant liabilities. Variability is understood to be not only inevitable but also an inherent and integral part of any chemical analysis or test. In situations where regulatory agencies fail to account for the inherently present variability of test measurements, regulated facilities may need to seek remedial action merely as a consequence of an inadequate statistical procedure. This paper utilizes a mathematical approach to derive the bias functions, and the resulting bias curves are developed to investigate censored samples from a variety of probability distributions, such as the normal, log-normal, gamma, and Gumbel distributions. Finally, the bias functions and bias curves are also compared to the results obtained by using Monte Carlo simulations.
- Published
- 2004
29. Regional rainfall intensity formulas based on scaling property of rainfall
- Author
-
Pao Shan Yu, Chin Sheng Lin, and Tao Chang Yang
- Subjects
Data set ,Hydrology ,Series (mathematics) ,Gumbel distribution ,Calibration (statistics) ,Statistics ,Piecewise ,Exponent ,Scaling ,Intensity (heat transfer) ,Water Science and Technology ,Mathematics - Abstract
This work developed regional Intensity–Duration–Frequency (IDF) formulas for non-recording sites based on scaling theory. Forty-six recording rain gauges over northern Taiwan provide the data set for analysis. The temporal scaling properties of annual maximum rainfall series for various durations were first investigated. The hypothesis of piecewise simple scaling, combined with the Gumbel distribution, was used to develop the IDF scaling formulas. Since most of the parameters in the IDF scaling formulas can be estimated from the annual 1-day maximum series, these parameters were regionalized based on the relationship between the scaling exponent and the average of the annual 1-day maximum rainfall. Three scaling-homogeneous regions were classified by their different scaling regimes, and regional IDF scaling formulas were developed in each region. The results reveal that the proposed regional IDF scaling formulas produce reasonable simulation and verification performance.
- Published
- 2004
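The simple-scaling hypothesis of entry 29 above can be sketched in one formula: quantiles of duration-d intensity scale as I(d, T) = I(d0, T) * (d/d0)^η from a reference duration d0. The exponent and the reference Gumbel parameters below are illustrative assumptions, not the regionalized values of the paper:

```python
# Sketch: simple-scaling IDF — a Gumbel fit at one reference duration plus a
# scaling exponent yields intensities for all other durations.
import numpy as np
from scipy import stats

eta = -0.65                  # assumed scaling exponent (intensity falls with duration)
loc24, scale24 = 6.0, 1.5    # assumed Gumbel parameters of 24-h intensity (mm/h)

def idf(duration_h, T, d0=24.0):
    """Intensity (mm/h) for a given duration and return period under simple scaling."""
    i0 = stats.gumbel_r.ppf(1 - 1 / T, loc=loc24, scale=scale24)
    return i0 * (duration_h / d0) ** eta

for d in (1, 6, 24):
    print(f"d={d:>2} h: T=10 -> {idf(d, 10):.1f} mm/h, T=100 -> {idf(d, 100):.1f} mm/h")
```

The "piecewise" part of the hypothesis means η is allowed to change between duration ranges; the sketch uses a single regime.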
30. Sampling variances of regional flood quantiles affected by intersite correlation
- Author
-
M. Bayazit and Bihrat Önöz
- Subjects
Gumbel distribution ,Flood myth ,Coefficient of variation ,Statistics ,Econometrics ,Sampling (statistics) ,Multivariate normal distribution ,Variance (accounting) ,Water Science and Technology ,Quantile ,Weibull distribution ,Mathematics - Abstract
Flood flows at the sites of a homogeneous region usually have significant cross-correlations. The effect of intersite correlation on the sampling variance of regional growth factors is studied. An analytical expression is derived for the asymptotic sampling variance of the regional average coefficient of variation for multivariate normal floods. The results for the variance of flood quantiles are checked by simulation. For other two-parameter distributions such as the Gumbel, Weibull and gamma, a simulation study is performed to obtain the sampling variances of regional growth factors.
- Published
- 2004
31. Alternative PWM-estimators of the Gumbel distribution
- Author
-
Peter F. Rasmussen and Navin Gautam
- Subjects
Statistics::Theory ,Standard error ,Mean squared error ,Gumbel distribution ,Statistics ,Generalized extreme value distribution ,Applied mathematics ,Estimator ,Probability distribution ,Extreme value theory ,Water Science and Technology ,Mathematics ,Quantile - Abstract
Probability weighted moments (PWM) are widely used in hydrology for estimating parameters of statistical distributions, including the Gumbel distribution. The classical PWM approach considers the moments β_i = E[X F^i] with i = 0, 1 for estimation of the Gumbel scale and location parameters. However, there is no reason why these probability weights (F^0 and F^1) should provide the most efficient PWM estimators of Gumbel parameters and quantiles. We explore an extended class of PWMs that does not impose arbitrary restrictions on the values of i. Estimation based on the extended class of PWMs is called the generalized method of probability weighted moments (GPWM) to distinguish it from the classical procedure. In fact, our investigation demonstrates that it may be advantageous to use weight functions that are not of the form F^i. We propose an alternative PWM estimator of the Gumbel distribution that maintains the computational simplicity of the classical PWM method, but provides slightly more accurate quantile estimates in terms of mean square error of estimation. A simple empirical formula for the standard error of the proposed quantile estimator is presented.
- Published
- 2003
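The classical PWM fit that entry 31 above generalizes can be sketched directly: with unbiased estimates of b0 = E[X] and b1 = E[X F(X)], the Gumbel scale and location follow from α = (2·b1 − b0)/ln 2 and ξ = b0 − γ·α, where γ is Euler's constant. The sample is synthetic:

```python
# Sketch: classical probability-weighted-moment estimation of Gumbel
# location (xi) and scale (alpha) from a sample.
import numpy as np
from scipy import stats

def gumbel_pwm(sample):
    x = np.sort(sample)
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n   # unbiased estimator of E[X F(X)]
    alpha = (2 * b1 - b0) / np.log(2)        # Gumbel scale
    xi = b0 - np.euler_gamma * alpha         # Gumbel location
    return xi, alpha

x = stats.gumbel_r.rvs(loc=50, scale=10, size=2000,
                       random_state=np.random.default_rng(9))
xi_hat, alpha_hat = gumbel_pwm(x)
print(f"location ~ {xi_hat:.1f} (true 50), scale ~ {alpha_hat:.1f} (true 10)")
```

The GPWM extension of the paper replaces the F^0 and F^1 weights with other weight functions; the moment-matching structure stays the same.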
32. The relationship between annual varve thickness and maximum annual discharge (1909–1971)
- Author
-
Ingemar Cato, Mikkel Sander, Barbara Wohlfarth, Lars Bengtsson, and Björn Holmquist
- Subjects
Return time ,Hydrology ,Varve ,Gumbel distribution ,Discharge ,Physical geography ,Geology ,Water Science and Technology - Abstract
Annually laminated (varved) sediments from the River Angermanalven, mid-central Sweden, have been used to construct an annual 2000-year long record of varve thickness. Maximum daily annual discharge and mean varve thickness for the years 1909–1971 are significantly correlated (r = 0.87), and a relationship between the two was determined for the observed period. The return times of two exceptionally thick varves in the 2000-year record, at the years 658 and 492 AD, were estimated and their likelihood assessed based on a Gumbel frequency analysis.
- Published
- 2002
33. Flood frequency estimation by a derived distribution procedure
- Author
-
Athanasios Loukas
- Subjects
Hydrology ,Watershed ,Meteorology ,Flood myth ,Gumbel distribution ,Monte Carlo method ,Environmental science ,Probability distribution ,Hydrograph ,Extreme value theory ,Time of concentration ,Water Science and Technology - Abstract
An event rainfall-runoff simulation procedure based on the method of derived distributions is proposed for estimating flood frequency in ungauged watersheds. The procedure uses a stochastic rainfall generation model and a rainfall-runoff watershed model. The results of previous research on rainfall characteristics and watershed response are incorporated into the two models. These rainfall characteristics are storm depth, storm duration, and spatial and temporal distribution. The simplified watershed model used in the procedure has previously been tested and shown to simulate watershed response well. Some of the rainfall and watershed model parameters are stochastic in nature and are assumed to follow various probability distributions. Monte Carlo simulation is used to generate the parameter values and simulate the flood hydrographs. After 5000 realizations, the frequencies of the hourly and daily peak flows and the flood volume are estimated. The proposed procedure is applied to eight coastal British Columbia watersheds and the results compare well with the observed data and with the fitted Extreme Value type I (EVI or Gumbel) probability distribution. The method is easy to apply, requires limited regional data, and is shown to be reliable for small and medium forested watersheds with areas ranging from 10 to 600 km². Sensitivity analysis shows that the procedure is stable and not sensitive to the number of realizations. It is suggested that, given an appropriate adjustment of the rainfall generation model and further testing and validation, the procedure could be used in areas with climates other than that of coastal British Columbia.
- Published
- 2002
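The derived-distribution procedure (random storm characteristics propagated through a watershed model, with flood quantiles read from the simulated peaks) can be illustrated with a deliberately simplified sketch; the rational-method runoff model and every parameter distribution below are placeholder assumptions, not the paper's models.

```python
import random

def simulate_peaks(n=5000, seed=0):
    """Derived-distribution sketch: sample storm characteristics and a
    stochastic watershed parameter, route each storm through a toy
    rational-method model, and collect the simulated peak flows."""
    rng = random.Random(seed)
    area_km2 = 50.0
    peaks = []
    for _ in range(n):
        depth = rng.expovariate(1.0 / 40.0)          # storm depth, mm (assumed)
        duration = 1.0 + rng.expovariate(1.0 / 6.0)  # storm duration, h (assumed)
        runoff_coeff = rng.uniform(0.2, 0.6)         # stochastic watershed parameter
        intensity = depth / duration                 # mean intensity, mm/h
        # Rational-method peak Q = C * i * A, converted to m^3/s
        peaks.append(runoff_coeff * intensity * area_km2 / 3.6)
    return sorted(peaks)

def empirical_quantile(sorted_vals, p):
    """Flood quantile of non-exceedance probability p from the realizations."""
    return sorted_vals[min(int(p * len(sorted_vals)), len(sorted_vals) - 1)]
```

The frequency curve is then read directly from the sorted realizations, e.g. `empirical_quantile(peaks, 0.99)` for the 100-year event.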
34. Regionalization of extreme precipitation distribution using the principal components of the topographical environment
- Author
-
G. Wotling, Ch Bouvier, J Danloux, and J.-M Fritsch
- Subjects
Gumbel distribution ,Meteorology ,Estimation theory ,Kriging ,Principal component analysis ,Regression analysis ,Precipitation ,Spatial distribution ,Digital elevation model ,Geology ,Water Science and Technology - Abstract
This paper deals with the regionalization of extreme rainfall intensities on the volcanic island of Tahiti (French Polynesia), and focuses on how the method automatically takes topographical relief features into account. Principal component analysis of a digital elevation model supplies a limited set of variables describing the topographical environment. These synthetic descriptors are linked to the parameters of the Gumbel distribution of rainfall intensity by a stepwise regression adjusted on 20 point-rainfall records. The model is then applied on a regular grid of 300 nodes and interpolated using a spline function to approximate the pluviometric risk over the whole island. In the case of Tahiti, the relationship between the rainfall distribution parameters and the topographical descriptors is very strong, and the method supplies a direct estimation, in space, of the rainfall statistics through the regression model. Validation of the results and comparison with simple kriging interpolation show the relevance of the approach. However, more data are needed for better confidence in the parameter estimation in some areas of the island. This automatic and objective method could be applied in any mountainous area where topography has a major influence on precipitation, to characterize the non-stationarity of point-rainfall statistics in space.
- Published
- 2000
35. The Gumbel mixed model for flood frequency analysis
- Author
-
Pierre Legendre, Pierre Bruneau, Bernard Bobée, Taha B. M. J. Ouarda, and Sheng Yue
- Subjects
Hydrology ,Flood myth ,Gumbel distribution ,Joint probability distribution ,100-year flood ,Generalized extreme value distribution ,Conditional probability ,Environmental science ,Marginal distribution ,Extreme value theory ,Water Science and Technology - Abstract
Many hydrological engineering planning, design, and management problems require a detailed knowledge of flood event characteristics, such as flood peak, volume and duration. Flood frequency analysis often focuses on flood peak values, and hence, provides a limited assessment of flood events. This paper proposes the use of the Gumbel mixed model, the bivariate extreme value distribution model with Gumbel marginals, to analyze the joint probability distribution of correlated flood peaks and volumes, and the joint probability distribution of correlated flood volumes and durations. Based on the marginal distributions of these random variables, the joint distributions, the conditional probability functions, and the associated return periods are derived. The model is tested and validated using observed flood data from the Ashuapmushuan river basin in the province of Quebec, Canada. Results indicate that the model is suitable for representing the joint distributions of flood peaks and volumes, as well as flood volumes and durations.
- Published
- 1999
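The Gumbel mixed model used above has a closed-form joint CDF; a minimal sketch, assuming the two Gumbel marginals are already fitted and taking the association parameter θ ∈ [0, 1] (θ = 0 recovers independence):

```python
import math

def gumbel_cdf(x, loc, scale):
    """Univariate Gumbel CDF."""
    return math.exp(-math.exp(-(x - loc) / scale))

def gumbel_mixed_joint_cdf(x, y, mx, my, theta):
    """Gumbel mixed model joint CDF:
    F(x, y) = Fx * Fy * exp(-theta * (1/ln Fx + 1/ln Fy)**-1).

    mx, my are (loc, scale) tuples of the two Gumbel marginals;
    theta in [0, 1] measures the positive association.
    """
    fx = gumbel_cdf(x, *mx)
    fy = gumbel_cdf(y, *my)
    return fx * fy * math.exp(-theta / (1.0 / math.log(fx) + 1.0 / math.log(fy)))
```

The joint return period of the event {X > x and/or Y > y} follows from this CDF, e.g. T = 1 / (1 - F(x, y)) for the "either variable exceeded" case.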
36. Transformation of point rainfall to areal rainfall: Intensity-duration-frequency curves
- Author
-
Günter Blöschl and Murugesu Sivapalan
- Subjects
Hydrology ,Return period ,Gumbel distribution ,Generalized extreme value distribution ,Range (statistics) ,Catchment area ,Spatial dependence ,Extreme value theory ,Atmospheric sciences ,Standard deviation ,Water Science and Technology ,Mathematics - Abstract
Current approaches to constructing catchment intensity-duration-frequency (IDF) curves are dominated by the use of empirically-derived areal reduction factors (ARFs). In this paper we present an alternative methodology which is based on the spatial correlation structure of rainfall. It represents an attempt to link current scientific theories of space-time rainfall fields with design methods. The starting point is to derive the parent distribution of catchment average rainfall intensity from that of point rainfall intensity. The parameters of the two parent distributions are related through a variance reduction factor which is a function of the spatial correlation structure of rainfall and catchment area. Assuming that the parent distribution is of the “exponential type”, it is then transformed to an extreme value distribution of the Gumbel type. The crucial step is to match the parameters of the extreme rainfall distribution derived above, for the particular case of zero catchment area, with those of empirical point IDF curves which have also been fitted to the Gumbel distribution. With this match, the proposed theory then naturally generalises to yield catchment IDF curves for catchments of any size, and for rainfall of any spatial correlation structure. The new catchment IDF curves have the attractive property that, with a minimum number of assumptions, they can reproduce a range of observed properties of catchment rainfall. For example, not only the mean and the standard deviation of extreme rainfall, but also its coefficient of variation, decrease with increasing catchment area. We also find that computed ARFs using the new approach depend not only on catchment area and storm duration, but also on the return period. We estimate ARFs using the new methodology for two major observed storms in Austria, and find that these estimates compare favourably with our understanding of the rainfall generating mechanisms associated with these two particular storm types.
- Published
- 1998
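The variance reduction factor central to this methodology is the average inter-point correlation over the catchment; the brute-force sketch below assumes a square catchment and an exponential correlogram (both assumptions are mine, not the paper's):

```python
import math
import random

def variance_reduction_factor(area_km2, corr_length_km, n=50_000, seed=0):
    """Monte Carlo estimate of VRF = E[rho(|P1 - P2|)] for two points
    uniform in a square catchment, with an exponential correlogram
    rho(d) = exp(-d / lambda).  Tends to 1 as the area shrinks to a point."""
    rng = random.Random(seed)
    side = math.sqrt(area_km2)
    acc = 0.0
    for _ in range(n):
        dx = (rng.random() - rng.random()) * side
        dy = (rng.random() - rng.random()) * side
        acc += math.exp(-math.hypot(dx, dy) / corr_length_km)
    return acc / n
```

The parent-distribution parameters for catchment-average rainfall are then obtained by scaling the point-rainfall variance with this factor.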
37. Conditional expectation for evaluation of risk groundwater flow and solute transport: one-dimensional analysis
- Author
-
Hund-Der Yeh and Tai Sheng Liou
- Subjects
Gumbel distribution ,Statistics ,Log-normal distribution ,Monte Carlo method ,Conditional probability ,Probability density function ,Statistical physics ,Conditional expectation ,Uncertainty analysis ,Water Science and Technology ,Weibull distribution ,Mathematics - Abstract
A one-dimensional groundwater transport equation with two uncertain parameters, groundwater velocity and longitudinal dispersivity, is investigated in this paper. The analytical uncertainty of the predicted contaminant concentration is derived by the first-order mean-centered uncertainty analysis. The risk of the contaminant transport is defined as the probability that the concentration exceeds a maximum acceptable upper limit. Five probability density functions including the normal, lognormal, gamma, Gumbel, and Weibull distributions are chosen as the models for predicting the concentration distribution. The risk for each distribution is derived analytically based on the conditional probability. The mean risk and confidence interval are then computed by Monte Carlo simulation where the groundwater velocity and longitudinal dispersivity are assumed to be lognormally and normally distributed, respectively. Results from the conditional expectation of an assumed damage function show that the unconditional expectation generally underestimates the damage for low risk events. It is found from the sensitivity analysis that the mean longitudinal dispersivity is the most sensitive parameter and the variance of longitudinal dispersivity is the least sensitive one among those distribution models except the gamma and Weibull distributions.
- Published
- 1997
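The paper's risk measure, the probability that concentration exceeds a maximum acceptable limit, is an upper-tail probability under the chosen model; the sketch below compares two of the five candidate distributions matched by mean and standard deviation (the parameterisation is illustrative, not the paper's):

```python
import math
from statistics import NormalDist

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def risk_normal(limit, mean, sd):
    """P(C > limit) under a normal concentration model."""
    return 1.0 - NormalDist(mean, sd).cdf(limit)

def risk_gumbel(limit, mean, sd):
    """P(C > limit) under a Gumbel model with the same mean and sd
    (moment matching: scale = sd * sqrt(6) / pi, loc = mean - gamma * scale)."""
    scale = sd * math.sqrt(6.0) / math.pi
    loc = mean - EULER_GAMMA * scale
    return 1.0 - math.exp(-math.exp(-(limit - loc) / scale))
```

Far in the upper tail the moment-matched Gumbel model assigns a noticeably larger risk than the normal, which is one reason the choice among the five candidate densities matters.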
38. Homogeneity tests based upon Gumbel distribution and a critical appraisal of Dalrymple's test
- Author
-
Jery R. Stedinger and Heinz D. Fill
- Subjects
Score test ,Normality test ,Exact test ,Sampling distribution ,Gumbel distribution ,Homogeneity (statistics) ,Statistics ,Chi-square test ,Goldfeld–Quandt test ,Water Science and Technology ,Mathematics - Abstract
Homogeneity tests are an important component of many regional flood frequency analysis methods, particularly index flood methods. The index flood method was first suggested by Dalrymple, who also proposed a homogeneity test. Dalrymple's test has been widely used in hydrologic practice for more than 30 years. This paper analyzes the relative performance of Dalrymple's test, a normalized quantile test based upon L-moment parameter estimation (X-10 test), and a method-of-moments Cv test (MoM-Cv test). Dalrymple's original test is shown to be in error and a corrected version is developed. A Monte Carlo study compares the power of these three tests, after applying correction factors to the test statistics to achieve a 5% Type I error. The L-moment X-10 test was always more powerful than Dalrymple's test or the MoM-Cv test. Moreover, the X-10 test needs much less correction to achieve the specified Type I error. An analytical analysis of the power of the tests is consistent with the Monte Carlo results. The superior performance of the X-10 test is due to its L-moment estimation procedure, which yields a smaller sampling variance and a sampling distribution closer to normality than classical product moments. The relationship of the X-10 test to other L-moment-based tests proposed by other workers is discussed. The original version of Dalrymple's test should not be used.
- Published
- 1995
39. Bivariate exponential model applied to intensities and durations of extreme rainfall
- Author
-
Gianfranco Becciu, Nath T. Kottegoda, and Baldassare Bacchi
- Subjects
Exponential distribution ,Meteorology ,Bivariate analysis ,Exponential function ,symbols.namesake ,Gumbel distribution ,Joint probability distribution ,Statistics ,symbols ,Probability distribution ,Poisson regression ,Random variable ,Water Science and Technology ,Mathematics - Abstract
The Poisson model for rainfall occurrences in which storm intensity and duration are represented by two independent random variables is extended to consider intensity and duration as bivariate random variables, each with a marginal exponential distribution. A numerical optimization method using annual maxima is adopted for parameter estimation. Comparison is made with the results of a numerical procedure which uses the Gumbel distribution as an approximation to the probability distribution of the extremes of the bivariate exponential model. A case study is presented using data from 18 raingauge stations in northern Italy. For rainfall durations of practical interest the theoretically derived relationships between probabilities and intensities compare favourably with observed relationships.
- Published
- 1994
40. Likelihood-based confidence intervals for estimating floods with given return periods
- Author
-
Robin T. Clarke and Eduardo Sávio Passos Rodrigues Martins
- Subjects
Gumbel distribution ,Statistics ,Confidence distribution ,Econometrics ,Likelihood function ,CDF-based nonparametric confidence interval ,Confidence interval ,Robust confidence intervals ,Water Science and Technology ,Confidence and prediction bands ,Mathematics ,Confidence region - Abstract
This paper discusses aspects of the calculation of likelihood-based confidence intervals for T-year floods, with particular reference to (1) the two-parameter gamma distribution; (2) the Gumbel distribution; (3) the two-parameter log-normal distribution, and other distributions related to the normal by Box-Cox transformations. Calculation of the confidence limits is straightforward using the Nelder-Mead algorithm with a constraint incorporated, although care is necessary to ensure convergence either of the Nelder-Mead algorithm, or of the Newton-Raphson calculation of maximum-likelihood estimates. Methods are illustrated using records from 18 gauging stations in the basin of the River Itajai-Acu, State of Santa Catarina, southern Brazil. A small and restricted simulation compared likelihood-based confidence limits with those given by use of the central limit theorem; for the same confidence probability, the confidence limits of the simulation were wider than those of the central limit theorem, which failed more frequently to contain the true quantile being estimated. The paper discusses possible applications of likelihood-based confidence intervals in other areas of hydrological analysis.
- Published
- 1993
41. Floodflow frequency model selection in Australia
- Author
-
Richard M. Vogel, Thomas A. McMahon, and Francis H. S. Chiew
- Subjects
Distribution (mathematics) ,Gumbel distribution ,Flood frequency analysis ,Generalized Pareto distribution ,Model selection ,Statistics ,Econometrics ,Generalized extreme value distribution ,Water Science and Technology ,Exponential function ,Mathematics - Abstract
Uniform flood frequency guidelines in Australia and the United States recommend the use of the log Pearson type 3 (LP3) distribution in flood frequency investigations. Many investigators have suggested alternate models such as the Generalized Extreme Value (GEV) distribution as an improvement over the LP3 distribution. Using floodflow data at 61 sites across Australia, we explore the suitability of various flood frequency models using L-moment diagrams. We also repeat the experiment performed in the original US Water Resource Council report (Bulletin 17B) which led to the LP3 mandate in the United States. Our evaluations reveal that among the models tested, the GEV and Wakeby distributions provide the best approximation to floodflow data in the regions of Australia that are dominated by rainfall during the winter months, such as southwest Western Australia and Tasmania. For the remainder of the continent, the Generalized Pareto (GPA) and Wakeby distributions provide the best approximation to floodflow data. The two- and three-parameter log-normal models and the LP3 distribution performed satisfactorily, yet not as well as either the GEV or GPA distributions. Other models such as the Gumbel, log-normal, normal, Pearson, exponential, and uniform distributions are shown to perform poorly. Recent research indicates that regional index-flood type procedures should be more accurate and more robust than the type of at-site procedures evaluated here. Nevertheless, this study reveals that index-flood procedures should not be restricted to a single distribution such as the GEV distribution because other distributions such as the GPA distribution perform significantly better in the most densely populated regions of Australia.
- Published
- 1993
42. Variance of two- and three-parameter GEV/PWM quantile estimators: formulae, confidence intervals, and a comparison
- Author
-
Jery R. Stedinger and Li-Hsiung Lu
- Subjects
Gumbel distribution ,Estimation theory ,Statistics ,Generalized extreme value distribution ,Estimator ,Trimmed estimator ,Extreme value theory ,Shape parameter ,Water Science and Technology ,Mathematics ,Quantile - Abstract
Simple formulae are developed for the sampling variances of quantile estimators for generalized extreme value (GEV) distributions when probability weighted moments (PWM) or L-moments are used to estimate all three parameters, or just two parameters given the GEV shape parameter. Sampling variances of three-parameter 100-year flood estimators can be reduced by a factor of 2–3 if the GEV shape parameter is specified. For distributions with realistic shape parameters, the two-parameter 100-year flood estimator with a fixed regional shape parameter (such as a Gumbel estimator) generally has a smaller m.s.e. than does a three-parameter estimator, even if the shape parameter is misrepresented. On the other hand, for less extreme quantiles with exceedance probabilities greater than 0.10, when k ⩽ 0.0, the three-parameter quantile estimators generally had smaller m.s.e.
- Published
- 1992
43. Evaluation of the usefulness of historical and palaeological floods in quantile estimation
- Author
-
S.L. Guo and Conleth Cunnane
- Subjects
Gumbel distribution ,Mean squared error ,Maximum likelihood ,Statistics ,Monte Carlo method ,Econometrics ,Estimator ,Time series ,Censoring (statistics) ,Water Science and Technology ,Mathematics ,Quantile - Abstract
The methods of incorporation of historical floods and palaeological information into flood frequency analysis, and the usefulness of doing so, have been evaluated by many hydrologists. These evaluations are not in complete agreement. The results of a Monte Carlo study are presented comparing different simulation procedures and assessing the value of historical floods for at-site flood frequency analysis on the assumption of a Gumbel (EVI) distribution. It is shown that historical floods and palaeological information provide a useful source of information additional to the recorded series, and have great value in flood frequency analysis when floods are drawn from the Gumbel distribution. Simulation procedures based on type II censoring result in the largest bias and root mean square error in quantile estimation. This may be due to their assumption of type II censoring in the production of their simulated samples, an assumption that has some limitations. In the present work it was found that the type I censored-data maximum likelihood estimator is a robust model for the Gumbel distribution and that the type II censored-data maximum likelihood estimator performs poorly when the data are in fact obtained by type I censoring.
- Published
- 1991
44. Unbiased plotting positions for historical flood information
- Author
-
Quan J. Wang
- Subjects
Gumbel distribution ,Flood myth ,Position (vector) ,Statistics ,Econometrics ,Standard deviation ,Water Science and Technology - Abstract
It is explained why the existing plotting position formulae for historical floods introduce large bias. It is proposed that the exceedance probability of historical floods in the exceedance based formula can be estimated initially by an analytical method using information about magnitudes of both the systematic and historical floods and assuming an appropriate parent distribution. It is shown that, for the Gumbel distribution, such a modification to the exceedance based formula reduces the bias to an almost negligible amount.
- Published
- 1991
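For the systematic record alone, a nearly unbiased plotting position for the Gumbel distribution is given by the Gringorten formula; the sketch below shows it together with the Gumbel reduced variate, but does not attempt to reproduce the paper's exceedance-based correction for historical floods.

```python
import math

def gringorten_positions(n):
    """Gringorten plotting positions p_i = (i - 0.44) / (n + 0.12),
    nearly unbiased for the Gumbel distribution (ranks i = 1..n,
    data sorted in ascending order)."""
    return [(i - 0.44) / (n + 0.12) for i in range(1, n + 1)]

def gumbel_reduced_variate(p):
    """Reduced variate y = -ln(-ln p) used on Gumbel probability paper."""
    return -math.log(-math.log(p))
```

Plotting the sorted annual maxima against the reduced variates of these positions should give an approximately straight line when the data are Gumbel-distributed.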
45. Bias error in maximum likelihood estimation
- Author
-
S.P. Koch
- Subjects
Error function ,Distribution (mathematics) ,Gumbel distribution ,Statistics ,Margin of error ,Magnitude (mathematics) ,Extreme value theory ,Expression (mathematics) ,Water Science and Technology ,Weibull distribution ,Mathematics - Abstract
A study of bias error in the maximum likelihood estimates of parameters for the Gumbel, Frechet, and Weibull distributions is documented. The study suggests that a single polynomial expression can be used to correct the bias error of all three distributions. The expression is used to develop an error function for each distribution. The error function allows study of the impact of corrections for bias error on extreme value predictions for reasonable values of the distributional parameters. The results indicate that the impact for most data is negligible.
- Published
- 1991
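The maximum-likelihood estimates whose bias is corrected here solve a one-dimensional fixed-point equation in the Gumbel scale; a plain iteration is sketched below (the bias-correction polynomial itself is not reproduced):

```python
import math

def gumbel_mle(sample, tol=1e-10, max_iter=500):
    """Gumbel ML estimates via fixed-point iteration on the scale:
    alpha = mean(x) - sum(x * w) / sum(w), with w = exp(-x / alpha),
    then loc = -alpha * ln(mean(exp(-x / alpha))).
    Starts from the method-of-moments scale estimate."""
    n = len(sample)
    mean = sum(sample) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / n)
    alpha = sd * math.sqrt(6.0) / math.pi  # initial guess
    for _ in range(max_iter):
        w = [math.exp(-x / alpha) for x in sample]
        new_alpha = mean - sum(x * wi for x, wi in zip(sample, w)) / sum(w)
        if abs(new_alpha - alpha) < tol:
            alpha = new_alpha
            break
        alpha = new_alpha
    loc = -alpha * math.log(sum(math.exp(-x / alpha) for x in sample) / n)
    return loc, alpha
```

The small-sample bias the paper corrects would then be applied to the `loc` and `alpha` returned by such an iteration.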
46. Explicit expressions for the censored means and variances
- Author
-
Saralees Nadarajah
- Subjects
Gumbel distribution ,Log-normal distribution ,Statistics ,Zhang ,Variance (accounting) ,Water Science and Technology ,Mathematics
Summary Explicit expressions are derived for the mean and the variance of the censored distributions considered by Zhang et al. [Zhang, Z., Lennox, W.C., Panu, U.S. (2004). Effect of percent non-defects on estimation bias in censored distributions. Journal of Hydrology, 297, 74–94].
- Published
- 2008
48. Corrigendum to 'The Gumbel mixed model for flood frequency analysis'
- Author
-
Pierre Legendre, Bernard Bobée, Pierre Bruneau, Taha B. M. J. Ouarda, and Sheng Yue
- Subjects
Mixed model ,Flood frequency analysis ,Gumbel distribution ,Statistics ,Environmental science ,Water Science and Technology - Published
- 2000
49. A correction for the bias of maximum-likelihood estimators of Gumbel parameters — Comment
- Author
-
Jonathan R. M. Hosking
- Subjects
Gumbel distribution ,Maximum likelihood ,Statistics ,Econometrics ,Estimator ,Type-1 Gumbel distribution ,Water Science and Technology ,Mathematics - Abstract
Fiorentino and Gabriele's results for the bias of maximum-likelihood estimators of Gumbel parameters are derived, and made more accurate, by a theoretical approach.
- Published
- 1985
50. An unbiased plotting position formula for the general extreme value distribution
- Author
-
Van-Thanh-Van Nguyen and Nophadol In-na
- Subjects
Moment (mathematics) ,Distribution (number theory) ,Gumbel distribution ,Skewness ,Simple (abstract algebra) ,Position (vector) ,Generalized extreme value distribution ,Calculus ,Applied mathematics ,Development (differential geometry) ,Water Science and Technology ,Mathematics - Abstract
This paper introduces a new unbiased plotting position formula for the General Extreme Value (GEV) distribution. The probability weighted moment (PWM) method is used to estimate the exact plotting positions. For practical purposes, a simple formula representing a very reliable approximation to the exact plotting positions is proposed. The suggested formula is found to agree with the exact plotting positions better than several existing formulas. Further, the proposed formula is conceptually more flexible and computationally more convenient because it can explicitly take into account the skewness coefficient of the underlying distribution. It can be concluded that the plotting position formula developed in this study is the most appropriate for the GEV distribution. Finally, the development of probability papers for the GEV distribution for various skewness values is presented. Results of a numerical example demonstrate the advantages of using these special probability papers in engineering practice.
- Published
- 1989