21,904 results on '"Extreme value theory"'
Search Results
2. Relative Error Streaming Quantiles.
- Author
-
CORMODE, GRAHAM, KARNIN, ZOHAR, LIBERTY, EDO, THALER, JUSTIN, and VESELÝ, PAVEL
- Subjects
APPROXIMATION algorithms, DATA structures, EXTREME value theory, LINEAR orderings, APPROXIMATION error, TASK analysis - Abstract
Estimating ranks, quantiles, and distributions over streaming data is a central task in data analysis and monitoring. Given a stream of n items from a data universe equipped with a total order, the task is to compute a sketch (data structure) of size polylogarithmic in n. Given the sketch and a query item y, one should be able to approximate its rank in the stream, i.e., the number of stream elements smaller than or equal to y. Most works to date focused on additive εn error approximation, culminating in the KLL sketch that achieved optimal asymptotic behavior. This article investigates multiplicative (1 ± ε)-error approximations to the rank. Practical motivation for multiplicative error stems from demands to understand the tails of distributions, and hence for sketches to be more accurate near extreme values. The most space-efficient algorithms due to prior work store either O(log(ε²n)/ε²) or O(log³(εn)/ε) universe items. We present a randomized sketch storing O(log^1.5(εn)/ε) items that can (1 ± ε)-approximate the rank of each universe item with high constant probability; this space bound is within an O(√log(εn)) factor of optimal. Our algorithm does not require prior knowledge of the stream length and is fully mergeable, rendering it suitable for parallel and distributed computing environments. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
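The additive-versus-multiplicative distinction in this abstract is easy to see numerically. The toy sketch below (illustrative only; it is not the data structure from the paper, and all names and parameters are invented) compares the width of the allowed estimate interval under each guarantee:

```python
# Toy comparison of additive vs. multiplicative rank-error guarantees.
# Illustrative only; this is not the sketch data structure from the paper.

def additive_band(rank, n, eps):
    """Interval a rank estimate may fall in under additive eps*n error."""
    return (rank - eps * n, rank + eps * n)

def multiplicative_band(rank, eps):
    """Interval a rank estimate may fall in under (1 +/- eps) relative error."""
    return ((1 - eps) * rank, (1 + eps) * rank)

n, eps = 10**6, 0.01
for r in (10, 1_000, 100_000):        # tail ranks vs. bulk ranks
    lo_a, hi_a = additive_band(r, n, eps)
    lo_m, hi_m = multiplicative_band(r, eps)
    print(r, hi_a - lo_a, hi_m - lo_m)
# At rank 10 the additive band is 2*eps*n = 20000 wide (useless in the tail),
# while the multiplicative band is only 2*eps*10 = 0.2 wide.
```

This is exactly why multiplicative error matters for tails: the relative guarantee tightens as the query moves toward the extremes, while the additive one does not.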
3. The Basel 2.5 capital regulatory framework and the COVID-19 crisis: evidence from the ethical investment market
- Author
-
Ben Ayed, Wassim and Ben Hassen, Rim
- Published
- 2024
- Full Text
- View/download PDF
4. Evaluation of daily precipitation modeling performance from different CMIP6 datasets: A case study in the Hanjiang River basin.
- Author
-
Pengxin, Deng, Jianping, Bing, Jianwei, Jia, and Dong, Wang
- Subjects
- *
CLIMATE change adaptation, *EXTREME value theory, *PRECIPITATION probabilities, *HYDROLOGICAL forecasting, *CLIMATE research, *WATERSHEDS - Abstract
• A method is devised to enable the quantitative assessment of precipitation precision among GCMs.
• GCP6 demonstrates benefits in simulating precipitation probability, correlation, extreme values, and spatial variability.
• Significant variations in the precision of precipitation simulations are evident across models, especially the CP6 models.
• The top ten models are selected for providing suitable rainfall simulations in the Hanjiang River Basin.
To effectively compare and analyze daily precipitation modeling capabilities across different CMIP6 datasets, our study introduces a novel comparison method for the Hanjiang River Basin (HRB). We quantify indicators such as precipitation distribution, temporal correlation, wet-dry detection, extreme value error, and spatio-temporal variability, enabling a comprehensive rating of precipitation accuracy. While both CMIP6 (CP6) and NEX-GDDP-CMIP6 (GCP6) models show similar overall simulation accuracy, GCP6 excels in several aspects such as distribution, temporal correlation, extreme value simulation, and spatial variability, yet lags in wet-dry detection and temporal change. Notably, the comprehensive rating score (CRS) analysis shows that significant differences in precipitation simulation accuracy exist between models, particularly the CP6 models, with variations of up to 0.22 (51.2%) between the highest and lowest scores. Among the top ten models, GCP6 occupies four positions (MRI-ESM2-0, GFDL-CM4, MPI-ESM1-2-HR, and TaiESM1), while CP6 holds the remaining six (CanESM5, EC-Earth3-Veg, MPI-ESM1-2-HR, GFDL-CM4, MRI-ESM2-0, and IPSL-CM6A-LR). These findings not only offer a clear understanding of the simulation performance of CMIP6 datasets across various precipitation characteristics, but also quantitatively compare the modeling capabilities of different models for watershed precipitation through the CRS.
This aids climate adaptation research, hydrological forecasting, and flood management in the basin. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Floods of Egypt's Nile in the 21st century.
- Author
-
Badawy, Ahmed, Sultan, Mohamed, Abdelmohsen, Karem, Yan, Eugene, Elhaddad, Hesham, Milewski, Adam, and Torres-Uribe, Hugo E.
- Subjects
- *
GENERAL circulation model, *WATER management, *EXTREME value theory, *RUNOFF analysis, *ATMOSPHERIC models - Abstract
Extreme precipitation and flooding events are rising globally, necessitating a thorough understanding and sustainable management of water resources. One such setting is the Nile River's source areas, where high precipitation has led to the filling of Lake Nasser (LN) twice (1998–2003; 2019–2022) in the last two decades and the diversion of overflow to depressions west of the Nile, where it is lost mainly to evaporation. Using temporal satellite-based data, climate models, and continuous rainfall-runoff models, we identified the primary contributor to increased runoff that reached LN in the past two decades and assessed the impact of climate change on the LN's runoff throughout the twenty-first century. Findings include: (1) the Blue Nile subbasin (BNS) is the primary contributor to increased downstream runoff, (2) the BNS runoff was simulated in the twenty-first century using a calibrated (1965–1992) rainfall-runoff model with global circulation models (GCMs), CCSM4, HadGEM3, and GFDL-CM4.0, projections as model inputs, (3) the extreme value analysis for projected runoff driven by GCMs' output indicates extreme floods are more severe in the twenty-first century, (4) one adaptation for the projected twenty-first century increase in precipitation (25–39%) and flood (2%-20%) extremes is to recharge Egypt's fossil aquifers during high flood years. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. Quality Control for Second-Level Radiosonde Data Based on Bezier Curve Fitting.
- Author
-
Lai, Huixia, Chen, Lian, Zhang, Hualin, Tian, Ye, Zhang, Weijie, Wang, Bo, and Zhang, Shi
- Subjects
- *
CURVE fitting, *WEATHER forecasting, *EXTREME value theory, *CLIMATE research, *HUMIDITY, *ATMOSPHERIC temperature - Abstract
Balloon-borne radiosonde observations provide high-resolution profiles of pressure, temperature, relative humidity, and winds from the surface to the middle stratosphere. These observations help validate space-based data and are used in climate research and weather forecasting. For the large volume of second-level radiosonde data, manual quality control (QC) is tedious and time-consuming. Furthermore, varying experience and differing judgment standards may lead to inconsistent decisions on abnormal data. To address these issues, we propose a two-stage QC method for second-level radiosonde data based on Bezier curve fitting. In stage QC1, gross errors are filtered out according to the measurement range of the sensors, the change rates, and extreme temperature values based on pressure segmentation. In addition, the longest descending sequence (LDS) algorithm is used to identify the moment of sounding termination and eliminate items after that moment. In stage QC2, we score each item using deviations calculated from the Bezier curve fit, and then use a decision tree model, CART, to identify anomalies in the second-level radiosonde data. The experimental results first demonstrate the efficacy of QC at each step, and then validate the rationality of our method by comparing the statistical characteristics before and after QC. After QC, erroneous items are greatly reduced, and the percentile profile distributions of temperature, pressure, and relative humidity become more reasonable. The overlap between items identified by manual QC and automatic QC reaches 86%, verifying the effectiveness of our method. This research significantly boosts QC efficiency and unifies QC standards, providing quality assurance for various applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
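The deviation-scoring idea in stage QC2 can be sketched with a least-squares Bezier fit built from the Bernstein basis. The snippet below is an illustrative reconstruction, not the authors' code: the function names, degree, and scoring rule are assumptions.

```python
import numpy as np
from math import comb

def bernstein_basis(t, degree):
    """Matrix B[i, k] = C(n, k) * t_i**k * (1 - t_i)**(n - k)."""
    t = np.asarray(t)
    return np.stack([comb(degree, k) * t**k * (1 - t)**(degree - k)
                     for k in range(degree + 1)], axis=1)

def bezier_fit(y, degree=5):
    """Least-squares Bezier curve fit to a 1-D profile sampled on [0, 1]."""
    B = bernstein_basis(np.linspace(0.0, 1.0, len(y)), degree)
    ctrl, *_ = np.linalg.lstsq(B, y, rcond=None)
    return B @ ctrl                      # fitted values at the sample points

def deviation_scores(y, degree=5):
    """Standardized absolute residuals; large values flag suspect items."""
    resid = y - bezier_fit(y, degree)
    return np.abs(resid) / (resid.std() + 1e-12)

# Smooth temperature-like profile with one injected gross error.
rng = np.random.default_rng(0)
y = 20 * np.sin(np.linspace(0, np.pi, 200)) + rng.normal(0, 0.1, 200)
y[120] += 8.0
print(int(np.argmax(deviation_scores(y))))   # -> 120
```

Because the Bezier curve is smooth and fit globally, a single spiked item stands out as a large standardized residual, which is the property the QC2 scoring exploits.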
7. Higher oxygen content and transport characterize high-altitude ethnic Tibetan women with the highest lifetime reproductive success.
- Author
-
Shenghao Ye, Jiayang Sun, Craig, Sienna R., Di Rienzo, Anna, Witonsky, David, Yu, James J., Moya, Esteban A., Simonson, Tatum S., Powell, Frank L., Basnyat, Buddha, Strohl, Kingman P., Hoit, Brian D., and Beall, Cynthia M.
- Subjects
- *
BLOOD viscosity, *OXYGEN saturation, *BIOLOGICAL fitness, *PHYSIOLOGY of women, *EXTREME value theory - Abstract
We chose the "natural laboratory" provided by high-altitude native ethnic Tibetan women who had completed childbearing to examine the hypothesis that multiple oxygen delivery traits were associated with lifetime reproductive success and had genomic associations. Four hundred seventeen (417) women aged 46 to 86 y residing at ≥3,500 m in Upper Mustang, Nepal, provided information on reproductive histories, sociocultural factors, physiological measurements, and DNA samples for this observational cohort study. Simultaneously assessing multiple traits identified combinations associated with lifetime reproductive success measured as the number of livebirths. Women with the most livebirths had distinctive hematological and cardiovascular traits. A hemoglobin concentration near the sample mode and a high percent of oxygen saturation of hemoglobin raised arterial oxygen concentration without risking elevated blood viscosity. We propose ongoing stabilizing selection on hemoglobin concentration because extreme values predicted fewer livebirths and directional selection favoring higher oxygen saturation because higher values had more predicted livebirths. EPAS1, an oxygen homeostasis locus with strong signals of positive natural selection and a high frequency of variants occurring only among populations indigenous to the Tibetan Plateau, associated with hemoglobin concentration. High blood flow into the lungs, wide left ventricles, and low hypoxic heart rate responses aided effective convective oxygen transport to tissues. Women with physiologies closer to unstressed, low altitude values had the highest lifetime reproductive success. This example of ethnic Tibetan women residing at high altitudes in Nepal links reproductive fitness with trait combinations increasing oxygen delivery under severe hypoxic stress and demonstrates ongoing natural selection. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. The expectation-maximization algorithm for autoregressive models with normal inverse Gaussian innovations.
- Author
-
Dhull, Monika S., Kumar, Arun, and Wyłomańska, Agnieszka
- Subjects
- *
MONTE Carlo method, *INVERSE Gaussian distribution, *GAS prices, *AUTOREGRESSIVE models, *EXTREME value theory - Abstract
In this paper, we study the autoregressive (AR) model with normal inverse Gaussian (NIG) innovations. The NIG distribution is semi-heavy-tailed and is helpful in capturing the extreme observations present in the data. The expectation-maximization (EM) algorithm is used to estimate the parameters of the considered AR(p) model. The efficacy of the estimation procedure is shown on simulated data for AR(2) and AR(1) models. A comparative study is presented in which the classical estimation algorithms, namely the Yule-Walker and conditional least squares methods, are incorporated alongside the EM method for model parameter estimation. In the simulation study, maximum likelihood estimation (MLE) of the NIG distribution by the EM algorithm and by the iterative Newton-Raphson method are also compared. The real-life applications of the introduced model are demonstrated on the NASDAQ stock market index data and US gasoline price data. The studies show that the AR(1) model with NIG residuals is a good fit for financial data with extreme values as well as for gasoline price data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
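Of the estimators compared in this abstract, Yule-Walker is the simplest to sketch, since it needs only sample autocovariances. Below is a minimal illustration; Student-t innovations stand in for NIG (both are heavy-tailed with finite variance), and the paper's EM procedure is not reproduced.

```python
import numpy as np

def yule_walker(x, p):
    """Yule-Walker AR(p) coefficient estimates from sample autocovariances."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    gamma = np.array([x[:n - k] @ x[k:] / n for k in range(p + 1)])
    R = np.array([[gamma[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, gamma[1:p + 1])

# Simulate an AR(2); heavy-tailed Student-t innovations stand in for NIG.
rng = np.random.default_rng(1)
phi = np.array([0.5, -0.3])
e = rng.standard_t(df=5, size=20_000)
x = np.zeros_like(e)
for t in range(2, len(x)):
    x[t] = phi[0] * x[t - 1] + phi[1] * x[t - 2] + e[t]
print(yule_walker(x, 2))   # close to [0.5, -0.3]
```

Yule-Walker only uses second moments, which is why it remains consistent under non-Gaussian innovations; the EM approach in the paper additionally recovers the NIG distribution parameters.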
9. Hierarchical clustering with spatial adjacency constraints in heavy-tailed areal data.
- Author
-
Mousavi, Seyedeh Somayeh, Mohammadpour, Adel, and Baghishani, Hossein
- Subjects
- *
EXTREME value theory, *AUTOREGRESSIVE models, *CITIES & towns, *MULTIVARIATE analysis, *DATA structures, *RANDOM fields - Abstract
Some natural phenomena with areal/lattice data structures include extreme values or outliers. In such situations, the assumption of Gaussianity for the random field may not be reasonable, and no Gaussian transformation can be found because the data exhibit heavy tails. A non-Gaussian stable random field, which is heavy-tailed, can be a more appropriate choice in these cases. This article introduces a sub-Gaussian α-stable (SGαS) random field for spatial analysis of multivariate areal data using a multivariate conditional autoregressive model. We specifically focus on the spatial clustering of such areal data. To address it, we develop methods based on adjacency constraints. To group the data, we offer an adjacency-constrained hierarchical agglomerative clustering (HAC) technique that considers both the spatial and non-spatial attributes of the multivariate areal data. The proposed clustering algorithm incorporates spatial adjacency constraint criteria into the HAC technique. We apply the developed algorithm to group Luxembourg communes based on simulated areal data from SGαS distributions, and French cities along the Gironde estuary based on the estuary areal dataset. We compare the results of the proposed method under the mentioned adjacency criteria and various dissimilarity and linkage measures in clustering these two datasets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
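The adjacency-constrained HAC idea can be illustrated on a toy one-dimensional chain of areal units, where only neighbouring clusters are allowed to merge. This is a minimal sketch with centroid linkage and a hand-made chain; it is not the authors' algorithm, their SGαS model, or a general adjacency graph.

```python
import numpy as np

def adjacency_constrained_hac(values, n_clusters):
    """Agglomerative clustering of a 1-D chain of areal units in which only
    spatially adjacent clusters may merge (centroid linkage)."""
    clusters = [[i] for i in range(len(values))]
    while len(clusters) > n_clusters:
        cents = [np.mean([values[i] for i in c]) for c in clusters]
        gaps = [abs(cents[j + 1] - cents[j]) for j in range(len(clusters) - 1)]
        j = int(np.argmin(gaps))         # merge the closest adjacent pair
        clusters[j:j + 2] = [clusters[j] + clusters[j + 1]]
    labels = np.empty(len(values), dtype=int)
    for lab, members in enumerate(clusters):
        labels[members] = lab
    return labels

# Seven areal units along a chain, with three obvious value regimes.
chain = np.array([0.0, 0.1, 0.2, 10.0, 10.1, 20.0, 20.2])
labels = adjacency_constrained_hac(chain, 3)
print(labels)   # -> [0 0 0 1 1 2 2]
```

The constraint guarantees every cluster is spatially contiguous, which is the property that distinguishes this family of methods from ordinary HAC on feature space alone.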
10. Weighted Asymmetry Index: A New Graph-Theoretic Measure for Network Analysis and Optimization.
- Author
-
Koam, Ali N. A., Nadeem, Muhammad Faisal, Ahmad, Ali, and Eshaq, Hassan A.
- Subjects
- *
MOLECULAR graphs, *EXTREME value theory, *COMPUTER science, *SOCIAL networks, *MATHEMATICS - Abstract
Graph theory is a crucial branch of mathematics in fields like network analysis, molecular chemistry, and computer science, where it models complex relationships and structures. Many indices are used to capture the specific nuances in these structures. In this paper, we propose a new index, the weighted asymmetry index, a graph-theoretic metric quantifying the asymmetry in a network using the distances of the vertices connected by an edge. This index measures how uneven the distances from each vertex to the rest of the graph are when considering the contribution of each edge. We show how the index can capture the intrinsic asymmetries in diverse networks and is an important tool for applications in network analysis, optimization problems, social networks, chemical graph theory, and modeling complex systems. We first identify its extreme values and describe the corresponding extremal trees. We also give explicit formulas for the weighted asymmetry index for path, star, complete bipartite, complete tripartite, generalized star, and wheel graphs. At the end, we propose some open problems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
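The abstract does not give the index's exact formula. One plausible reading of "how uneven the distances from each vertex to the rest of the graph are" per edge is the sum over edges of the absolute difference of the endpoints' transmissions (distance sums); the sketch below implements that assumed form, and the definition is an assumption, not the paper's.

```python
from collections import deque

def transmissions(adj):
    """Sum of shortest-path distances from each vertex (BFS, unweighted)."""
    n = len(adj)
    sigma = [0] * n
    for s in range(n):
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        sigma[s] = sum(dist.values())
    return sigma

def weighted_asymmetry_index(adj):
    """Assumed form of the index: sum over edges uv of |sigma(u) - sigma(v)|."""
    sigma = transmissions(adj)
    return sum(abs(sigma[u] - sigma[v])
               for u in range(len(adj)) for v in adj[u] if u < v)

path4 = [[1], [0, 2], [1, 3], [2]]       # path P4: sigma = [6, 4, 4, 6]
star4 = [[1, 2, 3], [0], [0], [0]]       # star K_{1,3}: sigma = [3, 5, 5, 5]
print(weighted_asymmetry_index(path4))   # -> 4
print(weighted_asymmetry_index(star4))   # -> 6
```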
11. Classification and Identification of Frequency-Hopping Signals Based on Jacobi Salient Map for Adversarial Sample Attack Approach.
- Author
-
Zhu, Yanhan, Li, Yong, and Wei, Tianyi
- Subjects
- *
ARTIFICIAL neural networks, *ELECTRONIC countermeasures, *EXTREME value theory, *JACOBI method, *CLASSIFICATION - Abstract
Frequency-hopping (FH) communication adversarial research is a key area in modern electronic countermeasures. To address the challenge posed by interfering parties that use deep neural networks (DNNs) to classify and identify multiple intercepted FH signals—enabling targeted interference and degrading communication performance—this paper presents a batch feature point targetless adversarial sample generation method based on the Jacobi saliency map (BPNT-JSMA). This method builds on the traditional JSMA to generate feature saliency maps, selects the top 8% of salient feature points in batches for perturbation, and increases the perturbation limit to restrict the extreme values of single-point perturbations. Experimental results in a white-box environment show that, compared with the traditional JSMA method, BPNT-JSMA not only maintains a high attack success rate but also enhances attack efficiency and improves the stealthiness of the adversarial samples. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. Characterization of Electric Field Fluctuations in the High-Latitude Ionosphere Using a Dynamical Systems Approach: CSES-01 Observations.
- Author
-
Quattrociocchi, Virgilio, De Michelis, Paola, Alberti, Tommaso, Papini, Emanuele, D'Angelo, Giulia, and Consolini, Giuseppe
- Subjects
- *
ELECTRIC field strength, *EXTREME value theory, *ELECTRIC fields, *DEGREES of freedom, *DYNAMICAL systems - Abstract
We present an analysis of the ionospheric electric field dynamics at high latitudes during periods of quiet and disturbed geomagnetic activity by exploiting recent advancements in dynamical systems and extreme value theory. Specifically, we employed two key indicators: the instantaneous dimension d, which evaluates the degrees of freedom within the system, and the extremal index θ, which quantifies the system's persistence in a given state. Electric field measurements were obtained from the CSES-01 satellite at mid- and high latitudes in the Southern Hemisphere. Our analysis revealed that the instantaneous dimension increases upon crossing specific ionospheric regions corresponding to the auroral oval boundaries. Outside these regions, the instantaneous dimension fluctuates around the state-space dimension, suggesting an ergodic nature of the system. As geomagnetic activity intensifies, differences in the properties of various ionospheric regions persist, albeit with an increased system instability characterized by higher θ values, thus indicating the externally driven nature of the electric field response to geomagnetic activity. This study provides new insights into the spatial and temporal variability of electric field fluctuations in the ionosphere, highlighting the complex interplay between geomagnetic conditions and ionospheric dynamics. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
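The instantaneous dimension d used in this entry has a standard extreme-value estimator: exceedances of the negative log-distances to a reference state above a high quantile are approximately exponential with mean 1/d. A minimal sketch on synthetic data (not CSES-01 measurements; the quantile choice is an assumption):

```python
import numpy as np

def instantaneous_dimension(traj, i, q=0.98):
    """Local dimension at state traj[i]: exceedances of g = -log(distance)
    above a high quantile are ~exponential with mean 1/d."""
    dists = np.linalg.norm(traj - traj[i], axis=1)
    g = -np.log(dists[dists > 0])        # drop the zero self-distance
    thresh = np.quantile(g, q)
    return 1.0 / (g[g > thresh] - thresh).mean()

# States on a noisy circle (a one-dimensional set embedded in 2-D),
# so the local dimension should come out near 1.
rng = np.random.default_rng(6)
t = rng.uniform(0, 2 * np.pi, 20_000)
traj = np.column_stack([np.cos(t), np.sin(t)]) + rng.normal(0, 1e-3, (20_000, 2))
print(instantaneous_dimension(traj, 0))   # roughly 1
```

Applied along an orbit, this per-state estimate is what lets the authors track how the effective degrees of freedom change as the satellite crosses the auroral oval boundaries.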
13. Global evaluation of sentinel 2 level 2A Sen2Cor aerosol optical thickness retrievals.
- Author
-
Kumar, Akhilesh and Mehta, Manu
- Subjects
- *
EXTREME value theory, *BODIES of water, *LAND cover, *CITIES & towns, *SPATIAL resolution - Abstract
The present study aims to conduct the first known comprehensive global evaluation of Aerosol Optical Thickness (AOT) retrievals derived from the Sentinel 2 Sen2Cor algorithm between 2018 and 2022, using data from over 400 AERONET stations. The results indicate that Sentinel 2 tends to underestimate AOT, especially at higher aerosol loadings. Although there appears to be a good overall correlation (r = 0.57) with low RMSE (0.16) and relative mean bias (RMB = -2.46%) between the AERONET and Sentinel 2 AOT datasets, regional analysis reveals significant variation in performance, with areas like Europe and the Americas exhibiting stronger correlations and lower RMSE than others like the Indian subcontinent and Southern Africa. Temporal analysis also suggests an improvement in Sentinel 2 performance, particularly in capturing extreme AOT values in 2022. While elevation does not appear to have any noticeable impact on AOT estimation, slight variations could be observed over certain land cover types like sparse vegetation and permanent water bodies. Though improvements are certainly required over certain geographical regions like the Indian subcontinent, Sentinel 2 AOT can nevertheless be reliably used for aerosol studies in urban areas or at a finer spatial resolution, especially in Europe, the mainland U.S.A., and East Asia. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. High‐sensitivity C‐reactive protein to high‐density lipoprotein cholesterol ratio predicts long‐term adverse outcomes in patients who underwent percutaneous coronary intervention: A prospective cohort study.
- Author
-
Dai, Xin‐Ya, Xue, Zheng‐Kai, Wang, Xiao‐Wen, Chen, Kang‐Yin, Hu, Su‐Tao, Tse, Gary, Rha, Seung‐Woon, and Liu, Tong
- Subjects
- *
PERCUTANEOUS coronary intervention, *EXTREME value theory, *CORONARY artery disease, *SENSITIVITY analysis, *C-reactive protein - Abstract
High‐sensitivity C‐reactive protein (hsCRP) to high‐density lipoprotein cholesterol (HDL‐C) ratio (CHR) is associated with coronary artery disease (CAD), but its predictive value for long‐term adverse outcomes in patients with CAD following percutaneous coronary intervention (PCI) remains unexplored and is the subject of this study. Patients with CAD who underwent PCI at the Korea University Guro Hospital‐Percutaneous Coronary Intervention (KUGH‐PCI) Registry since 2004 were included. Patients were categorized into tertiles according to their CHR. The end points were all‐cause mortality (ACM), cardiac mortality (CM) and major adverse cardiac events (MACEs). Kaplan–Meier analysis, multivariate Cox regression, restricted cubic spline (RCS) and sensitivity analyses were performed. A total of 3260 patients were included and divided into Group 1 (CHR <0.830, N = 1089), Group 2 (CHR = 0.830–3.782, N = 1085) and Group 3 (CHR >3.782, N = 1086). Higher CHR tertiles were associated with progressively greater risks of ACM, CM and MACEs (log‐rank, p < 0.001). Multivariate Cox regression showed that patients in the highest tertile had greater risks of ACM (HR: 2.127 [1.452–3.117]), CM (HR: 3.575 [1.938–6.593]) and MACEs (HR: 1.337 [1.089–1.641]) than those in the lowest tertile. RCS analyses did not reveal a significant non‐linear relationship between CHR and ACM, CM or MACEs. The significant associations remained significant in the sensitivity analyses, RCS analyses with or without extreme values, subgroup analyses and multiple imputations for missing data. Elevated CHR is a novel, independent risk factor for long‐term ACM, CM and MACEs in CAD patients following PCI. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. Comparison of higher degree stop-loss transforms.
- Author
-
Arab, Idir, Oliveira, Paulo Eduardo, and Santos, Beatriz
- Subjects
- *
EXTREME value theory, *PROBABILITY theory, *EQUILIBRIUM - Abstract
We regard high degree stop-loss transforms as iterated equilibrium distributions to establish comparison criteria with respect to star-shape and convex transform orders. As an application, we derive monotonicity properties for exceedance probabilities, providing information on how likely these distributions are to be far from the extreme values of their support. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. Foresight and forecasting of socio-economic development of rural territories.
- Author
-
Kovshov, Vitaliy, Stovba, Eugene, Lukyanova, Milyausha, Zalilova, Zariya, and Sitdikova, Guzalia
- Subjects
RURAL development, ECONOMIC indicators, EXTREME value theory, ECONOMIC development, SOCIOECONOMIC factors - Abstract
The development of rural territorial formations is a topical area of agro-economic science and an essential component of the functioning of state and society. The modern context of agrarian economic development is characterized by unprecedented turbulence. The study aims to apply foresight to design target indicators reflecting the socio-economic development of rural territorial entities. The research underscores the importance of foresight technologies as an instrument for forming strategic priorities for developing rural territorial entities. The methodological features of applying foresight technologies are revealed, and the extreme values of target rural development indicators are designed using cluster mapping. The step-by-step algorithm for foresight research of rural development involved seven stages. Twenty-two rural territories were clustered on five to six target indicators of economic development, yielding four clusters. The research results can be used in practice for strategic planning and forecasting of the development of rural territorial formations. The main results include the clarification of methodological provisions on foresight forecasting of the development of territorial entities at the regional level, and the design of critical indicators and parameters reflecting the socio-economic development of rural territories. The paper shows that applying foresight in the management practice of strategic planning improves the processes of making, implementing, and controlling decisions aimed at harmonious and balanced economic and social development of rural territorial entities. A distinctive feature of foresight is the active interaction between experts, government, business, and the population to coordinate actions and develop a common vision for the future development of rural territorial entities. 
The paper concludes that the results of foresight forecasting determine the "mainstream" directions for rural development. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. Anthropogenic and meteorological effects on the counts and sizes of moderate and extreme wildfires.
- Author
-
Lawler, Elizabeth S. and Shaby, Benjamin A.
- Subjects
EXTREME value theory, PARETO distribution, WILDFIRES, AT-risk behavior, ATMOSPHERIC models, WILDFIRE prevention - Abstract
The growing frequency and size of wildfires across the US necessitates accurate quantitative assessment of evolving wildfire behavior to predict risk from future extreme wildfires. We build a joint model of wildfire counts and burned areas, regressing key model parameters on climate and demographic covariates. We use extended generalized Pareto distributions to model the full distribution of burned areas, capturing both moderate and extreme sizes, while leveraging extreme value theory to focus particularly on the right tail. We model wildfire counts with a zero‐inflated negative binomial model, and join the wildfire counts and burned areas sub‐models using a temporally‐varying shared random effect. Our model successfully captures the trends of wildfire counts and burned areas. By investigating the predictive power of different sets of covariates, we find that fire indices are better predictors of wildfire burned area behavior than individual climate covariates, whereas climate covariates are influential drivers of wildfire occurrence behavior. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
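The zero-inflated negative binomial component of the counts sub-model is straightforward to write down: a point mass at zero is mixed with an ordinary negative binomial. The sketch below shows only the pmf, as an illustration rather than the authors' fitted model (their regression structure and shared random effect are not reproduced).

```python
import numpy as np
from scipy import stats

def zinb_pmf(k, pi, r, p):
    """Zero-inflated negative binomial pmf: mix a point mass pi at zero
    with an ordinary negative binomial NB(r, p) count distribution."""
    k = np.asarray(k)
    base = stats.nbinom.pmf(k, r, p)
    return np.where(k == 0, pi + (1 - pi) * base, (1 - pi) * base)

# Zero inflation lifts P(0) without reshaping the positive counts.
print(float(zinb_pmf(0, 0.3, 2, 0.5)))   # 0.3 + 0.7 * 0.25 = 0.475
print(float(zinb_pmf(1, 0.3, 2, 0.5)))   # 0.7 * 0.25 = 0.175
```

The extra zero mass is what lets the model represent grid cells or periods with no fires at all, separately from the count process that governs how many fires occur when any do.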
18. CONSTRUCTION METHOD OF INFORMATION SECURITY DETECTION BASED ON CLUSTERING ALGORITHM.
- Author
-
SHAOBO CHEN
- Subjects
INFORMATION technology security, EXTREME value theory, GLOBAL optimization, HYBRID securities, ALGORITHMS - Abstract
First, a K-prototypes clustering method, which can process mixed data types, is proposed, and an information security evaluation algorithm for hybrid clusters is constructed on top of it. The method exploits the strong global optimization performance of particle swarm optimization (PSO) to overcome the tendency of the K-prototypes objective function to become trapped in local optima. Simulation results show that the proposed method effectively avoids local extreme values and improves overall performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. Regional frequency analysis of extreme wind in Pakistan using robust estimation methods.
- Author
-
Ahmad, Ishfaq, Salman, Muhammad, Almanjahie, Ibrahim Mufrah, Alshahrani, Fatimah, ul Rehman Khan, Muhammad Shafeeq, Fawad, Muhammad, and Haq, Ehtasham ul
- Subjects
- *
EXTREME value theory, *DISTRIBUTION (Probability theory), *STANDARD deviations, *MONTE Carlo method, *WIND speed - Abstract
Quantile estimates of extreme wind speed, obtained using regional frequency analysis (RFA) and extreme value theory, are needed for various areas of interest. These calculations are crucial for the codification of wind speed. The data were taken from the official NASA website, measured at a height of 10 m in meters per second (m/s). RFA of annual maximum wind speed (AMWS) using L-moments is performed on AMWS data from sixteen (16) sites in Pakistan's Khyber Pakhtunkhwa province. No sites are found to be discordant. Ward's method is used to construct homogeneous regions, grouping the 16 sites into two regions. The heterogeneity test confirms that both clusters are homogeneous. The most appropriate probability distribution among the Generalized Normal (GNO), Generalized Logistic (GLO), Pearson Type-3 (P3), Generalized Pareto (GPA), and Generalized Extreme Value (GEV) distributions is chosen to calculate regional quantiles. According to the L-moments diagram and Z statistics, the GEV for Cluster-I and the GLO for Cluster-II are the best choices. The robustness of both clusters' fits is measured using relative bias (RB) and relative root mean square error (RRMSE). Overall, the GEV distribution fits Cluster-I and the GLO distribution fits Cluster-II. Using the site mean and median as index parameters, at-site quantiles can also be obtained from the regional quantiles. The study's quantile estimates can be employed in codified structural designs, with policy consequences. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
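The L-moments machinery behind this kind of RFA starts from sample probability-weighted moments. Below is a minimal sketch on synthetic annual-maximum data (not the Pakistani station data); the first four sample L-moments follow the standard unbiased PWM construction.

```python
import numpy as np
from math import comb

def sample_lmoments(data):
    """First four sample L-moments via probability-weighted moments b_r."""
    x = np.sort(np.asarray(data, dtype=float))
    n = len(x)
    b = [np.mean([comb(i, r) / comb(n - 1, r) * x[i] for i in range(n)])
         for r in range(4)]
    l1 = b[0]
    l2 = 2 * b[1] - b[0]
    l3 = 6 * b[2] - 6 * b[1] + b[0]
    l4 = 20 * b[3] - 30 * b[2] + 12 * b[1] - b[0]
    return l1, l2, l3, l4

# Synthetic annual-maximum wind speeds from a Gumbel distribution
# (the theoretical L-skewness l3/l2 for a Gumbel is about 0.17).
rng = np.random.default_rng(2)
amws = rng.gumbel(loc=20.0, scale=4.0, size=5_000)
l1, l2, l3, l4 = sample_lmoments(amws)
print(round(l1, 2), round(l3 / l2, 3))
```

Ratios such as the L-CV (l2/l1), L-skewness (l3/l2), and L-kurtosis (l4/l2) are the quantities plotted on the L-moments diagram and fed into the discordancy, heterogeneity, and Z-statistic goodness-of-fit tests mentioned in the abstract.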
20. Strong Linkage Between Observed Daily Precipitation Extremes and Anthropogenic Emissions Across the Contiguous United States.
- Author
-
Nanditha, J. S., Villarini, Gabriele, Kim, Hanbeen, and Naveau, Philippe
- Subjects
- *
PARETO distribution, *GENERAL circulation model, *CLIMATE extremes, *EXTREME value theory, *TEMPERATE climate - Abstract
The results of probabilistic event attribution studies depend on the choice of the extreme value statistics used in the analysis, particularly with the arbitrariness in the selection of appropriate thresholds to define extremes. We bypass this issue by using the Extended Generalized Pareto Distribution (ExtGPD), which jointly models low precipitation with a generalized Pareto distribution and extremes with a different Pareto tail, to conduct daily precipitation attribution across the contiguous United States (CONUS). We apply the ExtGPD to 12 general circulation models from the Coupled Model Intercomparison Project Phase 6 and compare counterfactual scenarios with and without anthropogenic emissions. Observed precipitation by the Climate Prediction Center is used for evaluating the GCMs. We find that greenhouse gases rather than natural variability can explain the observed magnitude of extreme daily precipitation, especially in the temperate regions. Our results highlight an unambiguous linkage of anthropogenic emissions to daily precipitation extremes across CONUS. Plain Language Summary: We investigate how human‐induced emissions affect daily rainfall extremes across the United States. The attribution of an extreme event to human‐induced emissions depends on the selected extreme event statistics, with setting a threshold to define what counts as an extreme event remaining a major challenge. To overcome this, we used the Extended Generalized Pareto Distribution (ExtGPD) that jointly models both low and heavy rainfall events without defining a threshold, providing a more complete picture of the full distribution including extremes. We fitted the ExtGPD to 12 general circulation models and compared scenarios with and without human‐induced emissions. 
Our findings suggest that human emissions are responsible for the observed intensity of daily rainfall extremes across the United States, especially in regions with temperate climates, and that these extremes would have been smaller without greenhouse gases. Key Points: We apply the Extended Generalized Pareto Distribution for probabilistic event attribution to bypass issues with threshold specification. Anthropogenic emissions alone could exacerbate the observed magnitude of extreme daily precipitation across the United States. The study underscores the urgent need for mitigation, revealing a clear link between anthropogenic activities and extreme precipitation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
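As a reading aid for this entry, the generalized Pareto tail underlying the ExtGPD can be sketched in a few lines. This is a generic peaks-over-threshold model, not the authors' ExtGPD or their fitted values; the threshold `u`, scale `sigma`, shape `xi`, and exceedance `rate` below are illustrative assumptions.

```python
import math

def gpd_survival(x, sigma, xi):
    """P(excess > x) for a generalized Pareto excess over the threshold,
    with scale sigma and shape xi (xi = 0 gives the exponential limit)."""
    if xi == 0.0:
        return math.exp(-x / sigma)
    return max(0.0, 1.0 + xi * x / sigma) ** (-1.0 / xi)

def gpd_return_level(u, sigma, xi, rate, T):
    """T-year return level when exceedances over threshold u occur
    'rate' times per year on average."""
    if xi == 0.0:
        return u + sigma * math.log(rate * T)
    return u + (sigma / xi) * ((rate * T) ** xi - 1.0)
```

By construction, the survival probability of the excess at the T-year return level equals 1/(rate × T), which is a quick internal consistency check on any implementation.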
21. Applying Stationary and Nonstationary Generalized Extreme Value Distributions in Modeling Annual Extreme Temperature Patterns.
- Author
-
Kyojo, Erick A., Osima, Sarah E., Mirau, Silas S., Masanja, Verdiana G., and Lin, Gwo-Fong
- Subjects
- *
DISTRIBUTION (Probability theory) , *EXTREME value theory , *HEAT waves (Meteorology) , *ATMOSPHERIC models , *TREND analysis - Abstract
This study applies both stationary and nonstationary generalized extreme value (GEV) models to analyze annual extreme temperature patterns at four stations in the Southern Highlands region of Tanzania (Iringa, Mbeya, Rukwa, and Ruvuma) over a 30-year period. Parameter estimates reveal varied distribution characteristics, with the location parameter μ ranging from 28.98 to 33.44 and the shape parameter ξ indicating both bounded and heavy-tailed distributions. These results highlight the potential for extreme temperature conditions, such as heatwaves and droughts, particularly in regions with heavy-tailed distributions. Return level estimates show increasing temperature extremes, with 100-year return levels reaching 33.95 °C in Ruvuma. Nonstationary models that incorporate time-varying location and scale parameters significantly improve model fit, particularly in Mbeya, where such a model outperforms the stationary model (p value = 0.0092). Trend analyses identify significant temperature trends in Mbeya (p value = 0.0123) and Ruvuma (p value = 0.0015), emphasizing the need for adaptive climate strategies. These findings underscore the importance of accounting for nonstationarity in climate models to better understand and predict temperature extremes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
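The T-year return levels quoted in this abstract follow from the GEV quantile function. A minimal sketch (the parameter values in the usage example are illustrative, not the study's fitted estimates):

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level of a GEV(mu, sigma, xi) fitted to annual maxima:
    the level exceeded with probability 1/T in any one year."""
    y = -math.log(1.0 - 1.0 / T)   # reduced Gumbel variate
    if xi == 0.0:                  # Gumbel limit
        return mu - sigma * math.log(y)
    return mu - (sigma / xi) * (1.0 - y ** (-xi))
```

Plugging the return level back into the GEV CDF, exp(−(1 + ξ(z − μ)/σ)^(−1/ξ)), must give 1 − 1/T, which is how such a routine is usually sanity-checked.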
22. Assessing financial risk with extreme value theory: US financial indemnity loss data analysis.
- Author
-
Aljadani, Abdussalam
- Subjects
EXTREME value theory ,FINANCIAL risk ,CHOICE (Psychology) ,CORPORATE finance ,VALUE at risk - Abstract
In this paper, we present a financial analysis of real loss data in light of a set of indicators for measuring financial risk. The value-at-risk (VaR), tail mean–variance (TMV), tail-VaR (TVaR), tail variance (TV), peaks-over-a-random-threshold value-at-risk (PORT-VaR), and mean of order P (MOOP) indicators are used to identify and describe important events and outliers within US financial indemnity loss data. Some extreme financial value theory (EFVT) models are compared on the financial indemnity loss data at several confidence levels. The paper provides a clear financial framework to help financial institutions avoid large, sudden losses. Accordingly, financial data with a long right tail were chosen, and several financial risk measures that study and analyze the behavior of the long tail were used. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
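Two of the tail measures this entry names, VaR and TVaR, have simple empirical counterparts. A minimal sketch (a generic empirical estimator, not the paper's EFVT model fits; the data and confidence level below are illustrative):

```python
import math

def var_tvar(losses, level):
    """Empirical value-at-risk and tail value-at-risk at a confidence
    level (e.g. 0.99): VaR is the level-quantile of the losses, and TVaR
    is the mean of the losses at or above it."""
    xs = sorted(losses)
    idx = min(len(xs) - 1, math.ceil(level * len(xs)) - 1)
    var = xs[idx]
    tail = [x for x in xs if x >= var]
    return var, sum(tail) / len(tail)
```

For heavy right-tailed data like indemnity losses, TVaR always sits at or above VaR, which is why tail-focused measures are preferred for describing the long tail.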
23. Adaptive projection plane and reference point strategy for multi-objective particle swarm optimization.
- Author
-
Zhang, Yansong, Liu, Yanmin, Zhang, Xiaoyan, Song, Qian, and Yang, Jie
- Subjects
SURFACE plates ,CLUSTERING of particles ,EXTREME value theory ,PROBLEM solving ,PARTICLE swarm optimization ,ALGORITHMS - Abstract
Achieving a balance between convergence and diversity, and their mutual enhancement, is a complex task in algorithm improvement. It is crucial because it directly determines how effectively an algorithm obtains accurate and uniformly distributed Pareto frontiers. Although significant progress has been made in particle swarm algorithms, exploring new approaches is necessary. In this paper, we construct a projection plane (a projection line in 2D) based on the extreme values of the non-dominated solutions, select a set of uniform reference points on the projection plane, and then project the non-dominated solutions onto the constructed plane to form projection points. The reference points and projection points on the projection plane are used to guide the updating of the population as well as the maintenance of the external archive, a strategy that enhances the algorithm's global exploration and local exploitation capabilities. Secondly, we aggregate the objective values of each particle into a single scalar and combine this with the idea of particle fusion to design a scheme for selecting individual optimal particles. The algorithm's overall performance is further improved by using information shared between populations to select individual optimal particles. Lastly, the proposed algorithm is evaluated on 22 test problems against a number of well-performing multi-objective algorithms currently in use. The findings demonstrate that the proposed algorithm performs better when solving multi-objective problems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Evaluation of flood metrics across the Mississippi-Atchafalaya River Basin and their relation to flood damages.
- Author
-
Schilling, Keith E., Anderson, Elliot S., Mount, Jerry, Suttles, Kelly, Gassman, Philip W., Cerkasova, Natalja, White, Michael J., and Arnold, Jeffrey G.
- Subjects
- *
FLOOD damage , *WATERSHEDS , *EXTREME value theory , *HYDROLOGIC models , *STREAMFLOW , *FLOOD risk - Abstract
Societal risks from flooding are evident at a range of spatial scales, and climate change will exacerbate these risks in the future. Assessing flood risks across broad geographical regions is a challenge, and it is often done using streamflow time-series records or hydrologic models. In this study, we used a national-scale hydrological model to identify, assess, and map 16 different streamflow metrics that could be used to describe flood risks across 34,987 HUC12 subwatersheds within the Mississippi-Atchafalaya River Basin (MARB). A clear spatial difference was observed between two classes of metrics. Watersheds in the eastern half of the MARB exhibited higher overall flows as characterized by the mean, median, and maximum daily values, whereas western MARB watersheds were associated with metrics indicative of extreme high flows, such as skewness, the standardized streamflow index, and top days. Total agricultural and building losses within HUC12 watersheds were related to the flood metrics, and those focused on higher overall flows were more strongly correlated with expected annual losses (EAL) than the extreme value metrics. Results from this study are useful for identifying continental-scale patterns of flood risk within the MARB and should be considered a launching point from which to improve the connections between watershed-scale risks and the potential use of natural infrastructure practices to reduce these risks. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
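The two classes of metrics contrasted here (central-tendency summaries vs. tail-sensitive ones like skewness) are straightforward to compute from a daily streamflow series. A minimal sketch, not the study's full 16-metric set:

```python
import statistics

def flow_metrics(q):
    """A few daily-streamflow summaries used to characterize flood
    behavior: central metrics plus a tail-sensitive (population) skewness."""
    n = len(q)
    mean = sum(q) / n
    sd = statistics.pstdev(q)
    skew = sum((x - mean) ** 3 for x in q) / (n * sd ** 3) if sd else 0.0
    return {"mean": mean, "median": statistics.median(q),
            "max": max(q), "skewness": skew}
```

A series dominated by a few large flood peaks yields a strongly positive skewness even when its mean flow is modest, which is exactly the east/west distinction the abstract describes.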
25. Statistics of unidirectional wave groups with and without freak waves observed in the Norwegian Sea.
- Author
-
Fu, Ruili, Cai, Huayi, Wang, Gang, Zheng, Jinhai, and Tao, Aifeng
- Subjects
- *
ROGUE waves , *OCEAN waves , *OFFSHORE structures , *EXTREME value theory , *WAVE energy - Abstract
The statistical properties of observed wave groups are essential for designing marine structures. However, the characteristics of group energy, length, and profiles remain unclear. This paper analyzes more than 1 million unidirectional ocean wave groups measured in deep water of the Norwegian Sea over a decade. By classifying wave groups into ordinary and extreme categories based on the presence of a freak wave, it is found that the distributions of both the non-dimensional group energy and the group duration follow generalized extreme value (GEV) distributions. Moreover, the statistics of wave groups are significantly influenced by the spectral width, with wave steepness having negligible effects. The ratio of the average group duration between the extreme and ordinary categories varies only slightly, from 1.4 to 1.8, although the energy of extreme wave groups can reach 3.0–4.5 times that of ordinary wave groups. Furthermore, unlike the typical shape of a freak wave, with a high crest or deep trough significantly larger than the surrounding waves, consecutive large waves resembling the "three sisters" are quite common at this location. However, NewWave theory generally underestimates the wave amplitudes surrounding a freak wave, so the predicted energy of the most likely extreme wave groups is only about 50–80% of the measured values. Finally, a new modified model is proposed to predict the average shapes of extreme wave groups. After testing numerous wave cases, the model accurately captures the mean morphology of extreme wave groups in the Norwegian Sea. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. Perspectives on local thermal non-equilibrium (LTNE) Darcy–Bénard convection: Variable permeability and viscosity effects.
- Author
-
Latha, N., Shankar, B. M., Naveen Kumar, S. B., and Shivakumara, I. S.
- Subjects
- *
HEAT transfer coefficient , *EXTREME value theory , *GALERKIN methods , *PERMEABILITY , *VISCOSITY - Abstract
The interplay between variations in permeability and viscosity on the onset of local thermal non-equilibrium in Darcy–Bénard convection has been investigated. Specifically, permeability is modeled as decreasing linearly with depth, while viscosity decreases exponentially. The validity of the principle of exchange of stabilities is confirmed. A linear instability analysis of the quiescent state is conducted through normal mode decomposition of disturbances, with threshold values for instability onset computed numerically using the Galerkin method. The individual and combined effects of increasing the variable permeability and viscosity parameters on the instability characteristics of the system are examined in detail, highlighting both commonalities and distinctions. It is observed that increasing each parameter individually hastens the onset of convection. However, their combined influence produces both stabilizing and destabilizing effects under certain parametric conditions. In all scenarios, an increase in the scaled interphase heat transfer coefficient consistently delays the onset of convection, whereas a higher ratio of porosity-modified conductivities has the opposite effect. Furthermore, the size of the convection cells remains unchanged at the extreme values of the scaled interphase heat transfer coefficient. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. HIV deathrate prediction by Gaidai multivariate risks assessment method.
- Author
-
Gaidai, Oleg
- Subjects
- *
HIV , *COMMUNICABLE diseases , *EXTREME value theory , *SPATIOTEMPORAL processes , *AIDS - Abstract
Objectives: HIV is a contagious disease with reportedly high transmissibility, spread worldwide, with appreciable mortality, presenting a burden to public health worldwide. The main objective of this study was to determine excessive HIV death risks at any time within any region or country of interest. Study design: The current study presents a novel multivariate public health bio-risk assessment approach that is particularly applicable to environmental multi-regional, biological, and public health systems observed over a representative period of time, yielding reliable long-term HIV deathrate assessment. Hence, a new bio-statistical approach was developed that is population-based, multicenter, and medical-survey-based. The expansion of extreme value statistics from the univariate to the bivariate situation meets with numerous challenges. Firstly, the univariate extreme value types theorem cannot be directly extended to the bivariate (2D) case, not to mention the challenges when system dimensionality is higher than 2D. Methods: Existing bio-statistical methods that process spatiotemporal clinical observations of multinational bio-processes often cannot efficiently deal with high regional dimensionalities and complex nonlinear inter-correlations between different national raw datasets. Hence, this study advocates the direct application of the novel bio-statistical Gaidai method to a raw, unfiltered clinical data set. Results: This investigation described the successful application of a novel bio-risk assessment approach, yielding reliable long-term HIV mortality risk assessments. Conclusions: The suggested risk assessment methodology may be utilized in various public health and clinical applications based on available raw patient survey datasets.
Key points: A novel public health bio-risk assessment methodology was developed and successfully applied to recorded HIV (Human Immunodeficiency Virus) clinical death rates. The disease outbreak's multidimensional spatiotemporal risks were estimated. Confidence bands were assessed for predicted future HIV deathrates. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. A Methodological Approach to Improving Extreme Precipitation Reanalysis Data Using the Clausius-Clapeyron Relationship: A Case Study in a Mediterranean City.
- Author
-
Papadopoulos-Zachos, Alexandros and Anagnostopoulou, Christina
- Subjects
- *
ATMOSPHERIC models , *RAINFALL , *EXTREME value theory , *TWENTY-first century , *CLIMATE change - Abstract
Climate change is a crucial issue of the 21st century, leading to more frequent and severe extreme precipitation events globally. These events result in significant social and economic disruptions, including flooding, loss of life, and damage to infrastructure. Projections suggest that extreme rainfall will intensify in the latter half of the century, underscoring the need for accurate and timely forecasting. Despite advancements in meteorological and climate models that offer high accuracy for various weather parameters, these models still struggle to detect extreme values, particularly for precipitation. This research examines the sensitivity of extreme precipitation events to temperature, based on the Clausius-Clapeyron relationship, focusing on Thessaloniki, Greece. It also evaluates the effectiveness of reanalysis data in identifying extreme precipitation and explores how rainfall-temperature relationships can enhance prediction accuracy. The findings are vital for improving the estimation of extreme rainfall events and informing the design of flood-resilient infrastructure. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
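The Clausius-Clapeyron relationship this entry builds on can be made concrete with a short sketch. This is the textbook constant-latent-heat approximation (anchored at 611 Pa / 273.15 K), not the authors' rainfall-temperature methodology:

```python
import math

L_V = 2.5e6   # latent heat of vaporization, J kg^-1
R_V = 461.5   # gas constant for water vapour, J kg^-1 K^-1

def saturation_vapour_pressure(t_kelvin):
    """Clausius-Clapeyron with constant latent heat, anchored at
    611 Pa at the triple point (273.15 K)."""
    return 611.0 * math.exp((L_V / R_V) * (1.0 / 273.15 - 1.0 / t_kelvin))

def cc_scaling_per_kelvin(t_kelvin):
    """Fractional increase of saturation vapour pressure per extra kelvin:
    the ~7 %/K figure commonly used for extreme-precipitation scaling."""
    return (saturation_vapour_pressure(t_kelvin + 1.0)
            / saturation_vapour_pressure(t_kelvin) - 1.0)
```

Near typical surface temperatures this gives roughly a 6-7% increase in moisture-holding capacity per kelvin, which is the benchmark against which observed extreme-rainfall sensitivities are usually compared.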
29. Weibull-like bivariate probability density function and associated estimation algorithms.
- Author
-
Saâdaoui, Foued
- Subjects
- *
OPTIMIZATION algorithms , *EXTREME value theory , *PROBABILITY density function , *MARGINAL distributions , *CHARACTERISTIC functions - Abstract
We propose to introduce a new class of bivariate probability distributions, which we believe is of great interest to statisticians and data scientists. However much it differs from the conventional Weibull, the density function posited herein generalizes its properties to two dimensions (2D). This new function has structural characteristics and properties different from those of the various bivariate Weibull-type functions found in the literature. The main features of this bivariate density, such as its marginal distributions, moments, and characteristic functions, are defined. Two related maximum likelihood estimation algorithms are also explicated, tested, and compared. Numerical simulations show the practicality of these algorithms as well as the value of the new density in several areas of data analysis and extreme value modeling. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. An Ensemble Learning Prediction Model for the Rapeseed Flowering Period Incorporating Virtual Sample Generation.
- Author
-
谢乾伟, 薛丰昌, and 陈剑飞
- Subjects
- *
EXTREME value theory , *RANDOM forest algorithms , *DECISION trees , *MACHINE learning , *RAPESEED - Abstract
Linear regression cannot fully capture the complex non-linear relationships among the factors influencing the flowering period, and samples are scarce. In this study, ensemble learning combined with virtual sample generation was proposed to predict the flowering period of rapeseed. Full-bloom and meteorological data for rapeseed in Longyou County, Quzhou City, Zhejiang Province, China from 1998 to 2023 were utilized. The original samples were expanded using Gaussian Mixture Model-based Virtual Sample Generation (GMM-VSG) and cubic spline interpolation, yielding two new datasets of 985 samples each. Models were established using eight machine learning methods: Random Forest (RF), Kernel Ridge Regression (KRR), Ridge Regression (RR), Least Absolute Shrinkage and Selection Operator (Lasso), Support Vector Regression (SVR), Extreme Gradient Boosting (XGBoost), Light Gradient Boosting Machine (LightGBM), and Gradient Boosting Decision Tree (GBDT). Hyperparameters were optimized with a Bayesian optimizer. Finally, a prediction model for the rapeseed flowering period was established using stacking ensemble learning. The vast majority of models performed better on the cubic-interpolation dataset than on the original and GMM-VSG datasets. Specifically, the RF model achieved an RMSE of 0.679 d, an MAE of 0.351 d, and an R² of 0.990, a significant improvement over the original dataset (RMSE 6.286 d, MAE 5.028 d, R² 0.201) and the GMM-VSG dataset (RMSE 2.680 d, MAE 1.588 d, R² 0.881). The SVR model also performed better on the cubic dataset, with an RMSE of 0.849 d, an MAE of 0.333 d, and an R² of 0.984. Among the ensemble learners, LightGBM performed best on the cubic dataset, with the lowest RMSE of 0.613 d, an MAE of 0.336 d, and the highest R² of 0.992. 
Its strong feature learning and noise resistance were verified to capture the complex relationships within the dataset. In contrast, the Lasso and RR models showed no significant improvement on the cubic dataset. For instance, Lasso exhibited an RMSE of 3.879 d and an MAE of 3.054 d on the cubic dataset, a relative decrease in error compared with the original RMSE of 6.329 d and MAE of 5.567 d, but a substantial gap remained relative to the other models. Five models were developed using the stacking ensemble learning approach: SRX_L, All_L, SLL_L, SRL_L, and SRK_L. Among them, the SRX_L model performed best across the various metrics, achieving the highest R² of 0.9997 with the lowest RMSE and MAE among all models, at 0.1227 d and 0.1056 d, respectively. The actual and predicted flowering trends were generally consistent, and high predictive accuracy was obtained in most years, particularly 2001, 2011, and 2014, when the predictions closely matched the actual data with minimal discrepancies, sometimes less than 0.01 or even approaching zero. However, some years showed larger differences, such as 1999 and 2023; the year 1999 had the largest discrepancy, with an error of 0.4421 d. The maximum actual flowering period occurred in 2005, reaching 92 days, with an error between the predicted and actual values of 0.0416 d. The minimum actual flowering period was observed in 2020, at 63 days, with an error of 0.1325 d. The model can therefore be expected to predict extreme values with high accuracy. Virtual sample generation is also well suited to small datasets, and the improved model's predictive accuracy and generalizability were significantly enhanced, reducing the cost and difficulty of data collection. 
Compared with single machine learning models, stacking ensemble learning can substantially improve predictive performance and is well suited to complex tasks with nonlinear relationships, such as the flowering period of rapeseed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
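The stacking idea used in this entry can be illustrated in miniature: base learners each predict the target, and a meta-learner learns how to weight their predictions. A deliberately toy sketch (two hand-rolled base learners and an in-sample least-squares meta-fit, not the paper's Bayesian-tuned RF/SVR/LightGBM pipeline with out-of-fold stacking):

```python
def fit_linear(xs, ys):
    """Least-squares line for a single feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return lambda x: my + slope * (x - mx)

def fit_stack(xs, ys):
    """Toy stacking: two base learners (global mean, straight line) whose
    predictions are combined by a least-squares meta-learner.  Assumes the
    two base prediction vectors are not collinear."""
    mean_y = sum(ys) / len(ys)
    base = [lambda x: mean_y, fit_linear(xs, ys)]
    p0 = [base[0](x) for x in xs]
    p1 = [base[1](x) for x in xs]
    # solve the 2x2 normal equations for the meta-weights w0, w1
    a00 = sum(v * v for v in p0)
    a01 = sum(u * v for u, v in zip(p0, p1))
    a11 = sum(v * v for v in p1)
    c0 = sum(v * y for v, y in zip(p0, ys))
    c1 = sum(v * y for v, y in zip(p1, ys))
    det = a00 * a11 - a01 * a01
    w0 = (c0 * a11 - a01 * c1) / det
    w1 = (a00 * c1 - c0 * a01) / det
    return lambda x: w0 * base[0](x) + w1 * base[1](x)
```

In practice the meta-learner is fitted on out-of-fold base predictions to avoid leakage; the in-sample fit here only keeps the sketch short.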
31. Marine Radar Constant False Alarm Rate Detection in Generalized Extreme Value Distribution Based on Space-Time Adaptive Filtering Clutter Statistical Analysis.
- Author
-
Wen, Baotian, Lu, Zhizhong, and Zhou, Bowen
- Subjects
- *
DISTRIBUTION (Probability theory) , *ADAPTIVE filters , *EXTREME value theory , *FALSE alarms , *CLUTTER (Radar) , *DETECTION alarms - Abstract
The performance of the marine radar constant false alarm rate (CFAR) detection method is significantly influenced by the modeling of the sea clutter distribution and by the detector decision rules, so the false alarm rate and detection rate are unstable. To address low CFAR detection performance and the difficulty of modeling the non-uniform, non-Gaussian, and non-stationary sea clutter distribution in marine radar images, this paper proposes a CFAR detection method that models the background clutter obtained by marine radar space-time filtering with a generalized extreme value distribution. Initially, a three-dimensional (3D) frequency-wavenumber (space-time) domain adaptive filter is employed to filter the original radar image so as to obtain uniform and stable background clutter. Subsequently, the generalized extreme value (GEV) distribution is introduced to integrally model the filtered background clutter. Finally, Inclusion/Exclusion (IE), the best-performing decision rule under the GEV distribution, is selected as the clutter range profile CFAR (CRP-CFAR) detector decision rule in the final detection. The proposed method is verified using real marine radar image data. The results indicate that with Pfa set at 0.0001, the proposed method improves PD by an average of 2.3% compared to STAF-RCBD-CFAR and 6.2% compared to STCS-WL-CFAR; with Pfa set at 0.001, the improvements are 6.9% and 9.6%, respectively. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
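For readers unfamiliar with CFAR, the basic mechanism can be sketched with the classic cell-averaging variant. This is the generic textbook scheme, not the paper's CRP-CFAR with the IE rule under a GEV clutter model; `guard`, `train`, and `scale` are illustrative parameters:

```python
def ca_cfar(signal, guard, train, scale):
    """Cell-averaging CFAR over a 1-D signal: for each cell under test,
    estimate the clutter level from 'train' cells on each side (skipping
    'guard' cells next to it) and declare a detection when the cell
    exceeds scale * estimate."""
    hits = []
    n = len(signal)
    for i in range(n):
        cells = []
        for j in range(i - guard - train, i - guard):
            if j >= 0:
                cells.append(signal[j])
        for j in range(i + guard + 1, i + guard + 1 + train):
            if j < n:
                cells.append(signal[j])
        if cells and signal[i] > scale * (sum(cells) / len(cells)):
            hits.append(i)
    return hits
```

The "constant false alarm rate" property comes from choosing `scale` so that, under the assumed clutter distribution (GEV in this entry), the probability of a clutter cell crossing the adaptive threshold equals the target Pfa.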
32. Use of proxy observations to evaluate the accuracy of precipitation spatial gridding.
- Author
-
McGrath, Ray and Nolan, Paul
- Subjects
- *
EXTREME value theory , *TREND analysis , *SEASONS , *COMPUTER software - Abstract
A WRF‐based high‐resolution reanalysis of the Irish climate (1981–2010) is used to create proxy daily precipitation observations at the locations of climatological sites used for precipitation monitoring; the data are statistically representative of the real precipitation climate both for mean (over monthly, seasonal and annual periods) and extreme values. The proxy observations are spatially interpolated to the original WRF grid using a typical gridding package and compared against the original data to assess gridding errors. The errors are more complex than the estimates provided by the gridding software; systematic biases are evident which by the inclusion of strategically placed additional observing sites are shown to be greatly reduced. There is also evidence of systematic differences in trend analyses of extreme precipitation over the period. The method provides independent estimates of the errors that arise from actual gridding applications. It also facilitates the testing of the optimality of a network by highlighting possible inadequacies in an existing station layout and suggesting new observing site locations to fill gaps. Uncertainties regarding the errors in real precipitation observations, and possible spurious impacts linked to temporal changes in the real observing network, are avoided by this method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
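The gridding step evaluated in this entry (the paper uses "a typical gridding package") can be illustrated with the simplest common interpolator, inverse-distance weighting. This is a stand-in for illustration only, not the package the study used:

```python
def idw(stations, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from
    (sx, sy, value) station triples; a common simple gridding scheme."""
    num = den = 0.0
    for sx, sy, v in stations:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return v              # exactly on a station
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den
```

The proxy-observation idea in the abstract amounts to interpolating model-derived "observations" back onto the model grid and differencing against the original field, so every gridding error is known exactly rather than estimated.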
33. Statistical behaviour of laser-induced plasma and its complementary characteristic signals.
- Author
-
Buday, Jakub, Holub, Daniel, Pořízka, Pavel, and Kaiser, Jozef
- Subjects
- *
DISTRIBUTION (Probability theory) , *DATA distribution , *LASER plasmas , *EXTREME value theory , *STATISTICS - Abstract
In this work, we present a study of the statistical distribution of the characteristic signals of laser-induced plasmas, focusing on repeated measurements of spectra, plasma plume imaging, and sound intensity. These were captured using various laser irradiances, spanning 1.72 to 6.25 GW cm^-2 for a 266 nm laser. The distributions were fitted with Gaussian, generalized extreme value (GEV), and Burr distributions, as typical representation models used in LIBS, and compared using the Kolmogorov-Smirnov (KS) test, whose null hypothesis assesses whether each model adequately describes the statistical distribution of the data. The behavior of the data distribution showed a certain connection to the plasma plume temperature, observed for all the ablation energies used. The performances of the statistical models were further compared in the outlier filtering process, where the relative standard deviation of the filtered data was observed. The results presented in this work suggest that an appropriate selection of a statistical model for the data representation can improve LIBS performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
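The one-sample KS comparison used in this entry reduces to computing the largest gap between the empirical CDF and a candidate model CDF. A minimal sketch of the statistic itself (the critical-value / p-value step is omitted):

```python
def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic: the largest gap between
    the empirical CDF of the sample and a candidate model CDF, checked
    on both sides of each jump."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        fx = cdf(x)
        d = max(d, abs((i + 1) / n - fx), abs(i / n - fx))
    return d
```

To compare Gaussian, GEV, and Burr fits as in the abstract, one evaluates this statistic with each fitted model's CDF and rejects models whose statistic exceeds the critical value for the sample size.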
34. Multispectral two-dimensional temperature field reconstruction using particle swarm optimization and the Broyden–Fletcher–Goldfarb–Shanno algorithm with multiple extreme value optimization.
- Author
-
Tong, Wei, Xiaojian, Hao, Shenxiang, Feng, Pan, Pei, Chenyang, Xu, Hongkai, Wei, and Xining, Wang
- Subjects
- *
PARTICLE swarm optimization , *FLAME temperature , *SPECTRAL sensitivity , *EXTREME value theory , *CAMERA calibration - Abstract
The flame temperature is a crucial parameter in combustion, indicating heat generation and transfer. However, determining the correct temperature becomes challenging when the flame emissivity is unknown. This article presents a multispectral 2D temperature field reconstruction method using particle swarm optimization and the Broyden–Fletcher–Goldfarb–Shanno algorithm, combined with multivariate extreme-value optimization. The method establishes an objective function based on the correlation between the true temperature and the spectral data. The objective function was minimized by adjusting the emissivity, enabling the reconstruction of 2D temperature and emissivity images without assuming an emissivity model. Calibration with a multispectral camera yielded the spectral response coefficients, and cubic spline interpolation was used to verify the calibration data. The reconstructed 2D temperature field showed a mean absolute error (MAE) of 4.61 and a structural similarity (SSIM) of 0.995. The reconstructed emissivity image had an MAE range of 0.04 to 0.053 and an SSIM range of 0.897 to 0.915. The system, tested on a butane flame, showed a temperature range of 1001 K to 1356 K, with an average temperature of 1194 K and emissivity values ranging from 0.3166 to 0.8621. The results align with the expected combustion process and the radiation characteristics distribution of the butane flame. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. Modeling non-stationarity in significant wave height over the Northern Indian Ocean.
- Author
-
Dhanyamol, P., Agilan, V., and KV, Anand
- Subjects
- *
DISTRIBUTION (Probability theory) , *EXTREME value theory , *SOUTHERN oscillation , *OCEAN engineering , *STRUCTURAL design - Abstract
Statistical descriptions of extreme met-ocean conditions are essential for the safe and reliable design and operation of structures in marine environments. The significant wave height (HS) is one of the most essential wave parameters for coastal and offshore structural design. Recent studies have reported that a time-varying component exists globally in HS. Therefore, the non-stationary behavior of an annual maximum series of HS is important for various ocean engineering applications. This study aims to analyze the frequency of HS over the northern Indian Ocean by modeling the non-stationarity in the HS series using a non-stationary Generalized Extreme Value (GEV) distribution. The hourly maximum HS data (with a spatial resolution of 0.5° longitude × 0.5° latitude) collected from the global atmospheric reanalysis dataset of the European Centre for Medium-Range Weather Forecasts (ECMWF) are used for the study. To model the annual maximum series of HS using a non-stationary GEV distribution, two physical covariates (the El-Niño Southern Oscillation (ENSO) and the Indian Ocean Dipole (IOD)) and time covariates are introduced into the location and scale parameters of the GEV distribution. The return levels of various frequencies of HS are estimated under non-stationary conditions. From the results, average increases of 13.46%, 13.66%, 13.85%, and 14.02% are observed over the study area for the 25-year, 50-year, 100-year, and 200-year return periods, respectively. A maximum percentage decrease of 33.3% and a maximum percentage increase of 167% are observed in the return levels of the various return periods. The changes in the non-stationary return levels over time highlight the importance of modeling the non-stationarity in HS. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
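Introducing covariates into the GEV location parameter, as this entry does with ENSO and IOD, changes only the likelihood being maximized. A minimal sketch of the non-stationary negative log-likelihood with a single covariate in the location (shape ξ ≠ 0 assumed; the covariate and parameter values in the test are illustrative, not the study's):

```python
import math

def gev_nll(data, covs, mu0, mu1, sigma, xi):
    """Negative log-likelihood of annual maxima under a GEV whose location
    varies with a covariate: mu(t) = mu0 + mu1 * cov(t).  Setting mu1 = 0
    recovers the stationary model, so comparing the two minimized NLLs
    (e.g. via a likelihood-ratio test) assesses the non-stationarity."""
    nll = 0.0
    for z, c in zip(data, covs):
        mu = mu0 + mu1 * c
        s = (z - mu) / sigma
        arg = 1.0 + xi * s
        if arg <= 0.0 or sigma <= 0.0:
            return float("inf")   # parameter outside the support
        nll += math.log(sigma) + (1.0 + 1.0 / xi) * math.log(arg) + arg ** (-1.0 / xi)
    return nll
```

In a full analysis this function is handed to a numerical optimizer for each candidate covariate structure, and return levels are then computed from the fitted, time-varying parameters.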
36. Genetic parameters, genome-wide association study, and selection perspective on gestation length in 16 French cattle breeds.
- Author
-
Jourdain, Jeanlin, Capitan, Aurélien, Saintilan, Romain, Hozé, Chris, Fouéré, Corentin, Fritz, Sébastien, Boichard, Didier, and Barbat, Anne
- Subjects
- *
GENOME-wide association studies , *CATTLE breeding , *EXTREME value theory , *STANDARD deviations , *PREGNANCY , *CATTLE breeds - Abstract
In this paper, we present a comprehensive study of gestation length (GL) in 16 cattle breeds by using large genotype and animal record databases. Data included over 20 million gestations since 2000 and genotypes from one million calves. The study addressed the GL variability within and between breeds, estimation of its direct and maternal heritability coefficients, association with fitness and several economic traits, and QTL detection. The breed average GL varied from 279.7 to 294.4 d in the Holstein and Blonde d'Aquitaine breeds, respectively. Standard deviations per breed were similar and ranged from 5.2 to 5.8 d. Direct heritability (i.e., for GL defined as a trait of the calf) was moderate to high (h2 = 0.40–0.67), whereas maternal heritability was low (0.04–0.06). Extreme breeding values for GL were strongly associated with higher mortality during the first 2 d of life, and were associated with milk production of dams for dairy breeds and precocity of females. Finally, several QTL affecting GL were detected, with cumulated effects of up to a few days, and at least 2 QTL were found to be shared between different breeds. Our study highlights the risks that would be associated with selection toward a reduced GL. Further genomic studies are needed to identify the causal variants and their association with juvenile mortality and other economic traits. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
37. How to define the threshold of takeover response ability of different drivers in conditional automated driving.
- Author
-
Chen, Haolin, Zhao, Xiaohua, Chen, Chen, Li, Zhenlong, Li, Haijian, and Gong, Jianguo
- Subjects
- *
EXTREME value theory , *PARETO distribution , *AUTOMOBILE driving simulators , *CONDITIONED response , *TRAFFIC accidents - Abstract
• This study introduces extreme value theory (EVT) and the Peaks Over Threshold (POT) method into the calculation of takeover time thresholds; a methodology for calculating and verifying takeover time thresholds has been developed. • This study calculates the takeover time thresholds of drivers with different attributes, which can provide support for differentiated management of driver qualifications. • This study can also help regulatory authorities assess drivers' takeover response ability and support the division of responsibility in automated vehicle accidents. In conditional automated driving, a takeover response ability threshold is necessary for driver qualification assessment and the liability division of automated vehicle accidents. The primary objective of this study is to establish a clear and quantifiable threshold for drivers' takeover response ability in conditional automated driving scenarios. This threshold aims to serve as a benchmark for evaluating drivers' readiness and for developing safety regulations in automated driving. We designed 18 takeover events, invited 42 drivers to participate in a driving simulation experiment, and obtained their takeover time data. First, we analyze the differences in takeover time among drivers with different attributes (gender, age, years of driving). Second, based on the Peaks Over Threshold method and the generalized Pareto distribution (GPD) model, we use the graphical method to calculate the range of the takeover time threshold for drivers with different attributes. The results show that the difference in the takeover time threshold range between male and female drivers is small. The threshold range differs across driver ages, and the threshold is negatively correlated with age. Within a safe range, drivers with more driving experience are allowed longer takeover times. 
Finally, the rationality of the takeover time thresholds for drivers with different attributes was verified. The return level curves are approximately linear (R² > 0.77), indicating that the GPD model can capture the overall trend of the return level as it changes with the probability level; this supports the reasonableness of the takeover time threshold. This study uses the minimum time-to-collision (TTCmin) to calibrate takeover safety, and the takeover time threshold classifies takeover safety well (accuracy > 85%). The contribution of this study is the calculation of takeover time thresholds for drivers with different attributes, which can help regulatory authorities assess drivers' takeover response ability and support the liability division of automated vehicle accidents. [ABSTRACT FROM AUTHOR]
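The POT workflow described above (fit a generalized Pareto distribution to exceedances over a threshold, then read off return levels) reduces to a standard closed-form formula. A minimal sketch follows; the parameter values are hypothetical illustrations, not the fits from the study:

```python
import math

def gpd_return_level(u, sigma, xi, zeta_u, m):
    """Level exceeded on average once every m observations, from a
    generalized Pareto fit to exceedances over threshold u.
    zeta_u is the empirical exceedance probability P(X > u)."""
    if abs(xi) < 1e-9:                      # xi -> 0: exponential tail
        return u + sigma * math.log(m * zeta_u)
    return u + (sigma / xi) * ((m * zeta_u) ** xi - 1.0)

# Hypothetical takeover-time fit: 4 s threshold, bounded (xi < 0) upper tail.
level = gpd_return_level(u=4.0, sigma=1.2, xi=-0.1, zeta_u=0.05, m=1000)
```

A negative shape parameter gives a bounded tail, consistent with the intuition that human takeover times cannot grow without limit.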
- Published
- 2024
- Full Text
- View/download PDF
38. Dynamic response of deep-buried circular loess tunnel under P-wave action.
- Author
-
Cheng, Xuansheng, Sun, Haodong, Zhang, Shanglong, Ding, Kai, and Xia, Peiyan
- Subjects
- *
TUNNEL lining , *GROUND motion , *ANALYTICAL solutions , *EXTREME value theory , *EARTHQUAKES , *TUNNELS , *ARCHES - Abstract
• An analytical solution for the internal forces of the lining structure under P-wave action was obtained using the analytical method. • The influence of the hardness of the surrounding rock and of the peak acceleration on the analytical solution was analyzed. • Through an engineering example, the reasons for the large error between the numerical and analytical solutions were analyzed. • The most unfavorable position of the lining structure under P-wave seismic load was identified. In an earthquake, the strong interaction between the surrounding rock and the lining structure leaves the lining structure susceptible to extrusion or shear damage, so predicting the internal force distribution of the lining structure by an analytical method is advantageous for the preliminary design of the tunnel structure. In this work, the quasi-static method was used to approximate the displacement and deformation caused by the P-wave seismic load as a far-field compressive stress. The analytical solution of the internal force of the lining structure was obtained by the analytical method, and the reliability of the analytical method was further verified. The dynamic response law of the lining structure was analyzed using numerical solutions in conjunction with engineering examples. The results show that the analytical method can be used to predict the trend of the internal force distribution in the lining structure under seismic action, which has important theoretical guiding significance for the preliminary design of the tunnel structure. As the hardness of the surrounding loess decreases, the relative deviation among the theoretical, numerical, and literature results gradually decreases. As the ground motion intensity increases, the interaction between the surrounding loess and the lining structure becomes more intense, and the internal force of the lining structure gradually increases. 
The numerical results were greater than the analytical results because the numerical model accounts for the influence of initial stress. Under P-wave earthquake action, the extreme values of the internal force in the lining structure occurred mainly at the vault, arch waist, and invert, and the most unfavorable positions of circular loess tunnel lining structures under P-wave seismic load were identified. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Modeling Extreme Precipitation Data in a Mining Area.
- Author
-
Lymperi, Ourania-Anna and Varouchakis, Emmanouil A.
- Abstract
In recent decades, extreme precipitation events have increased in frequency and intensity in Greece and across regions of the Mediterranean, with significant environmental and socioeconomic impacts. Therefore, extensive statistical analysis of the extreme rainfall characteristics on a dense temporal scale is crucial for areas with important economic activity. For this reason, this paper uses the daily precipitation measurements of four meteorological stations in a mining area of northeastern Chalkidiki peninsula from 2006 to 2021. Three statistical approaches were carried out to develop the best-fitting probability distribution for annual extreme precipitation conditions, using the maximum likelihood method for parameter estimation: the block maxima of the generalized extreme value (GEV) distribution and the peak over threshold of the generalized Pareto distribution (GPD) based on extreme value theory (EVT), and the gamma distribution. Based upon this fitting distribution procedure, return periods for the extreme precipitation values were calculated. Results indicate that EVT distributions satisfactorily fit extreme precipitation, with GPD being the most appropriate, and lead to similar conclusions regarding extreme events. [ABSTRACT FROM AUTHOR]
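The return periods computed from the block-maxima GEV fit described above follow from the standard GEV quantile formula. A minimal sketch, with made-up parameter values rather than the station fits from the paper:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-block return level of a GEV(mu, sigma, xi) fitted to block
    (e.g. annual) maxima: the level exceeded once every T blocks on
    average."""
    y = -math.log(1.0 - 1.0 / T)            # -log of non-exceedance prob.
    if abs(xi) < 1e-9:                      # Gumbel limit
        return mu - sigma * math.log(y)
    return mu - (sigma / xi) * (1.0 - y ** (-xi))

# Hypothetical annual-maximum daily precipitation: 100-year return level.
z100 = gev_return_level(mu=50.0, sigma=12.0, xi=0.15, T=100.0)
```

The same function covers both the heavy-tailed (xi > 0) and Gumbel (xi = 0) cases the comparison in the paper distinguishes.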
- Published
- 2024
- Full Text
- View/download PDF
40. Testing for Sufficient Follow‐Up in Censored Survival Data by Using Extremes.
- Author
-
Xie, Ping, Escobar‐Bach, Mikael, and Van Keilegom, Ingrid
- Abstract
In survival analysis, it often happens that some individuals, referred to as cured individuals, never experience the event of interest. When analyzing time‐to‐event data with a cure fraction, it is crucial to check the assumption of "sufficient follow‐up," which means that the right extreme of the censoring time distribution is larger than that of the survival time distribution for the noncured individuals. However, the available methods to test this assumption are limited in the literature. In this article, we study the problem of testing whether follow‐up is sufficient for light‐tailed distributions and develop a simple novel test. The proposed test statistic compares an estimator of the noncure proportion under sufficient follow‐up to one without the assumption of sufficient follow‐up. A bootstrap procedure is employed to approximate the critical values of the test. We also carry out extensive simulations to evaluate the finite sample performance of the test and illustrate the practical use with applications to leukemia and breast cancer data sets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
41. Design and optimization of low-energy transfer trajectories to the Earth–Moon L1 point.
- Author
-
乔琛远 and 杨乐平
- Subjects
ORBITS (Astronomy) ,EXTREME value theory ,INVARIANT manifolds ,LAGRANGIAN points ,PROBLEM solving ,COMPUTER simulation - Abstract
Copyright of Systems Engineering & Electronics is the property of Journal of Systems Engineering & Electronics Editorial Department and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
42. Extreme Space Weather Impacts on GNSS Timing Signals for Electricity Grid Management.
- Author
-
Etchells, T., Aplin, K. L., Berthoud, L., Kalavana, A., and Larkins, A.
- Subjects
GLOBAL Positioning System ,SOLAR radio bursts ,EXTREME value theory ,EXTREME weather ,SPACE environment - Abstract
Extreme space weather events can have serious impacts on critical infrastructure, including Global Navigation Satellite Systems (GNSS). The use of GNSS, particularly as sources of accurate timing signals, is becoming more widespread, with one example being the measurement of electricity grid frequency and phase information to aid grid management and stability. Understanding the likelihood of extreme space weather impacts on GNSS timing signals is therefore becoming vital to maintain national electricity grid resilience. This study determines critical intensity thresholds above which the complete failure of a GNSS-based timing system may occur. Solar radio bursts are identified as a simple example to investigate in more detail. The probability of occurrence of an extreme space weather event with an intensity equal to or greater than the critical intensity is estimated. Both a power law and extreme value theory were used to evaluate recurrence probabilities based on historical event frequencies. The probability of an event large enough to cause the complete failure of any GNSS-based timing system was estimated at 3%–12% per decade. Plain Language Summary: Society is increasingly reliant on satellite technologies for a wide range of applications. If a huge space weather event were to impact the Earth today, it would likely have catastrophic impacts across many modern technologies including satellites and satellite systems, communications, and electricity grids. Here we assess the probabilities that intense solar events may affect timings derived from Global Navigation Satellite Systems (of which one commonly used example is the Global Positioning System, or GPS). These timing signals are increasingly used in electricity grid management. Solar radio bursts are used as an example, since they can overwhelm the weak GNSS signal. Statistical methods were employed to assess 46 years of solar radio burst data. 
Our findings suggest a 3%–12% probability per decade of an event large enough to disrupt the UK electricity grid. Key Points: • Extreme space weather can disrupt Global Navigation Satellite Systems (GNSS) timing signals, an increasingly critical aspect of the electricity distribution network. • First case study illustrating degradation of GNSS timing from solar radio bursts. • Likelihood of a significantly disruptive event is constrained at 3%–12% per decade. [ABSTRACT FROM AUTHOR]
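Under a homogeneous-Poisson assumption, the per-decade recurrence probabilities quoted above reduce to a one-line calculation from historical event counts. A sketch with illustrative numbers, not the actual 46-year burst catalogue:

```python
import math

def prob_per_decade(n_events, record_years):
    """P(at least one event in 10 years), with the Poisson rate
    estimated as events-per-year over the historical record."""
    rate = n_events / record_years          # events per year
    return 1.0 - math.exp(-10.0 * rate)

# e.g. one qualifying burst observed in a 46-year record
p = prob_per_decade(1, 46.0)
```

Power-law or EVT tail fits, as used in the study, replace the raw count with an extrapolated exceedance rate for intensities beyond the observed record, but feed into the same Poisson expression.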
- Published
- 2024
- Full Text
- View/download PDF
43. Intensity-Duration-Frequency equations (IDF) for the state of Paraíba, Brazil, and regionalization of its parameters.
- Author
-
de Aragão, Ricardo, F. da Costa, Fagner, A. A. Rufino, Iana, Ramos Filho, Rivaildo da S., Srinivasan, Vajapeyam S., and do B. Truta Neto, José
- Subjects
RUNOFF models ,NONLINEAR regression ,HYDRAULIC structures ,EXTREME value theory ,TIME series analysis - Abstract
Copyright of Revista Brasileira de Engenharia Agricola e Ambiental - Agriambi is the property of Revista Brasileira de Engenharia Agricola e Ambiental and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
44. Extreme Value Index Estimation for Pareto-Type Tails under Random Censorship and via Generalized Means.
- Author
-
Gomes, M. Ivette, Henriques-Rodrigues, Lígia, Neves, M. Manuela, and Penalva, Helena
- Subjects
INFERENTIAL statistics ,SURVIVAL analysis (Biometry) ,UNIVARIATE analysis ,DATA analysis ,CENSORSHIP - Abstract
The field of statistical extreme value theory (EVT) focuses on estimating parameters associated with extreme events, such as the probability of exceeding a high threshold or determining a high quantile that lies at or beyond the observed data range. Typically, the assumption for univariate data analysis is that the sample is complete, independent, identically distributed, or weakly dependent and stationary, drawn from an unknown distribution F. However, in the context of lifetime data, censoring is a common issue. In this work, we consider the case of random censoring for data with a heavy-tailed, Pareto-type distribution. As is common in applications of EVT, the estimation of the extreme value index (EVI) is critical, as it quantifies the tail heaviness of the distribution. The EVI has been extensively studied in the literature. Here, we discuss several classical EVI-estimators and reduced-bias (RB) EVI-estimators within a semi-parametric framework, with a focus on RB EVI-estimators derived from generalized means, which will be applied to both simulated and real survival data. [ABSTRACT FROM AUTHOR]
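For Pareto-type tails, the classical EVI estimator referred to above is the Hill estimator, an average of log-spacings of the top order statistics. A minimal sketch (ignoring random censoring and the reduced-bias refinements the paper focuses on):

```python
import math

def hill_estimator(sample, k):
    """Hill estimator of a positive extreme value index from the k
    largest observations (requires positive data, 1 <= k < n)."""
    xs = sorted(sample, reverse=True)
    return sum(math.log(xs[i] / xs[k]) for i in range(k)) / k

# Geometric data gives an exactly computable Hill estimate.
data = [2.0 ** j for j in range(10)]
evi = hill_estimator(data, k=3)             # equals 2*log(2) here
```

Generalized-means EVI estimators replace the arithmetic mean of log-spacings with other power means; the Hill estimator is the special case most often used as a baseline.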
- Published
- 2024
- Full Text
- View/download PDF
45. Counter-plying of two-ply yarn: inner structure changes and external response.
- Author
-
Vysanska, Monika
- Subjects
EXTREME value theory ,FIBERS ,YARN ,GEOMETRY - Abstract
The paper introduces a principle of iso-quantities for investigating the distribution of the number of fibers in a two-ply yarn cross-section on a regular grid. The main aim is to map the two-ply yarn inner structure and the changes of this structure at various ply twists. So far, modifications of two-ply yarn at different ply twists have been explained only indirectly, from a macroscopic point of view, by observation of the process of two-ply yarn retraction. This paper applies cross-sections and the method of iso-quantities to find the causes of these macroscopic changes. The ratio of the twist coefficients of the twisted and single yarns, αs/αj, was used to indicate the internal changes of the two-ply yarn. Using the iso-quantities method and the x–y difference curves of the fiber distribution in the two perpendicular directions of the cross-section, it was shown that counter-plying causes a change in the fiber distribution in the cross-section, resulting in the occurrence of negative values of the retraction, insignificant changes in the external geometry of the two-ply yarn, and a constant breaking strain behavior. These manifestations appear only up to values of the ratio αs/αj of 1.2 or less. After this limit, the extreme value of the x–y difference settles down, the retraction begins to take on positive values, and the breaking strain of the two-ply yarn begins to increase as the two-ply yarn twist increases. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
46. Dynamic Response Prediction Model for Jack-Up Platform Pile Legs Based on Random Forest Algorithm.
- Author
-
Cui, Xiaohui, Liu, Hui, Lin, Xiang, Zou, Jiahe, Wang, Yu, and Zhou, Bo
- Subjects
MACHINE learning ,RANDOM forest algorithms ,OPTIMIZATION algorithms ,EXTREME value theory ,PREDICTION models - Abstract
Jack-up offshore platforms are widely used in many fields, and it is of great importance to predict the dynamic response of platform pile leg structures quickly, accurately, and in real time. Current analysis techniques are founded upon numerical modelling of the platform structure. Although these methods can accurately analyze the dynamic response of the platform, they require substantial computational resources and cannot meet the requirements of real-time prediction. A predictive model for the dynamic response of the pile legs of a jack-up platform based on the random forest algorithm is proposed. Firstly, a pile leg dynamic response database is established from high-fidelity numerical model simulations. The data are cleaned and reduced in dimensionality to facilitate the training of the random forest model. Cross-validation and Bayesian optimization are used to select the random forest parameters. The results show that the prediction model can output response results for new environmental load inputs within a few milliseconds, and the predictions remain highly accurate and perform well at extreme values. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
47. A Study of Free Surface Agitation in a Shipyard Using Numerical Modeling.
- Author
-
Herrera, Israel E., Galván, Arturo, Moreno-Martínez, Jatziri Y., and Gamiño, Edith A.
- Subjects
FREE surfaces ,EXTREME value theory ,NULL hypothesis ,SHIPYARDS ,STATISTICS - Abstract
In recent years, the Port of Topolobampo, Sinaloa, Mexico, has experienced unusual free sea surface elevations, particularly during the months of November and December, affecting the shipyard areas, service docks, and berthing locations. This study focuses on analyzing the oscillatory behavior of free surface elevations in shipyard regions. A hydrodynamic model was employed to simulate the circulation and sea surface agitation, aiming to quantify the elevation magnitudes based on oceanographic and meteorological data from November of the preceding year. A 30-day numerical simulation was conducted, revealing the velocity fields associated with coastal currents and tides during November, as well as the interaction between incident waves and wave transformations due to protective structures. The results demonstrated accurate behavior in 95% of the simulation period, while anomalous elevations exceeding those specified in the design and operational guidelines of the Port of Topolobampo were observed during the final five days of the simulation. An ANOVA test was performed between the surface elevation and vertically integrated velocity to assess whether the deviations in the last five days were statistically significant compared to the rest of the simulation period. With a P-value of less than 0.05, the null hypothesis of no difference was rejected, confirming a significant variation. These findings suggest that the extreme values recorded should be considered for the potential redesign of shipyard infrastructure. [ABSTRACT FROM AUTHOR]
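The ANOVA comparison used above to flag the final five days reduces to the usual between-group versus within-group variance ratio. A self-contained sketch with toy numbers rather than the simulated elevation series:

```python
def one_way_anova_F(groups):
    """F statistic of a one-way ANOVA over a list of samples:
    between-group mean square over within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Two clearly separated toy groups give a large F (small P-value).
F = one_way_anova_F([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
```

The P-value is then obtained from the F distribution with (k − 1, n − k) degrees of freedom; rejecting at P < 0.05, as in the study, indicates the flagged period differs significantly.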
- Published
- 2024
- Full Text
- View/download PDF
48. Local Path Planning of Unmanned Surface Vehicles' Formation Based on Vector Field and Flow Field Traction.
- Author
-
Liu, Yiping, Zhang, Jianqiang, Zhang, Yuanyuan, and Li, Zhixiao
- Subjects
DYNAMICAL systems ,EXTREME value theory ,AUTONOMOUS vehicles ,ALGORITHMS ,VELOCITY ,VECTOR fields - Abstract
Formation obstacle avoidance is an essential attribute of cooperative tasks in unmanned surface vehicle (USV) formations. In real-world scenarios involving multiple USVs, both formation obstacle avoidance and formation recovery after obstacle avoidance play a critical role in ensuring the success of collaborative missions. In this study, an Interfered Fluid Dynamic System (IFDS) algorithm was used for obstacle avoidance due to its excellent robustness, high computational efficiency, and path smoothness. The algorithm can provide good local path planning for USVs. However, when applied to USVs the IFDS still suffers from local extrema; the algorithm was therefore modified to obtain an enhanced IFDS (EIFDS). In the formation, based on the leader–follower method, a virtual leader was used to determine the desired positions of the USVs in formation, and the streamlines generated by the EIFDS guided the USVs. In order to make the formation converge better to the desired formation, the vector and scalar parts of the EIFDS algorithm were decoupled and designed separately to achieve convergence to the desired formation. The interference residual of the IFDS is not suitable for addressing collision avoidance between USVs in practice. Therefore, the vector field method was employed to tackle this issue, with some enhancements made to optimize its performance. Subsequently, a weighted separation method was applied to combine the vector field and the EIFDS, resulting in a composite field solution. Finally, a formation obstacle avoidance strategy based on composite fields was formed. The feasibility of this scheme was verified by simulation; compared with the single-IFDS formation method, the pairwise spacing of USVs behind obstacles could be increased, and the reliability of formation obstacle avoidance was improved. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
49. Features of extreme PM2.5 pollution and its influencing factors: evidence from China.
- Author
-
Deng, Lu and Liu, Xinzhu
- Subjects
CITIES & towns ,LOGISTIC regression analysis ,EXTREME value theory ,PARTICULATE matter ,GREENHOUSE gas mitigation - Abstract
Extreme PM2.5 pollution has become a significant environmental problem in China in recent years and is hazardous to human health and daily life. Noting the importance of investigating the causes of extreme PM2.5 pollution, this paper classifies cities across China into eight categories (four groups plus two scenarios) based on the generalized extreme value (GEV) distribution using hourly station-level PM2.5 concentration data, and a series of multi-choice models are employed to assess the probabilities that cities fall into different categories. Various factors such as precursor pollutants and socio-economic factors are considered after controlling for meteorological conditions in each model. It turns out that SO2 concentration, NO2 concentration, and population density are the top three factors contributing most to the log ratios. Moreover, in both left- and right-skewed cases, the influence of a one-unit increase in SO2 concentration on the relative probability of cities falling into different groups shows an increasing trend, while that of NO2 concentration shows a decreasing trend. At the same time, the higher the extreme pollution level, the bigger the effect of SO2 and NO2 concentrations on the probability of cities falling into normalized scenarios. The multivariate logit model is used for prediction and policy simulations. In summary, by analyzing the influences of various factors and the heterogeneity of their influence patterns, this paper provides valuable insights into formulating effective emission reduction policies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
50. Bayesian Modelling of Tail Risk Using Extreme Value Theory with Application to Currency Exchange.
- Author
-
Adesina, Olumide Sunday and Obokoh, Lawrence Ogechukwu
- Subjects
EXTREME value theory ,PARETO distribution ,FINANCIAL risk ,INVESTMENT risk ,INVESTORS - Abstract
Modelling financial tail risk, such as investment or financial risk, is important to avoid severe financial shocks. This study adopted Bayesian techniques to complement classical extreme value theory (EVT) models in modelling the exchange rate risk of the Nigerian naira against the South African rand (ZAR). The study fitted the Bayesian Generalized Extreme Value (BGEV) model, the Bayesian Generalized Pareto distribution (BGPD), the Bayesian Gumbel (BG) model, and the classical Generalized Pareto distribution (GPD) to one hundred and four exchange rate return observations. Model selection criteria were used to determine the best model and favoured the BGEV model. The Value-at-Risk (VaR) and the Expected Shortfall (ES) were obtained from the estimated parameters. The results show that the Nigerian naira will experience losses against the ZAR at both the 95% and 99% quantiles. This study recommends that investors watch the market closely before making financial or investment decisions. The study aligns with Sustainable Development Goal 8 (promote sustained, inclusive and sustainable economic growth), in particular target 8.1 (sustainable economic growth). [ABSTRACT FROM AUTHOR]
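Given the fitted tail parameters, the VaR and ES reported in such studies follow standard GPD tail formulas (valid for shape xi < 1). A sketch with invented parameter values, not the naira/ZAR fit:

```python
def gpd_var_es(u, sigma, xi, zeta_u, p):
    """Value-at-Risk and Expected Shortfall at confidence level p from
    a GPD fit to losses above threshold u; zeta_u = P(loss > u), with
    0 < xi < 1 so that the ES exists."""
    var = u + (sigma / xi) * (((1.0 - p) / zeta_u) ** (-xi) - 1.0)
    es = (var + sigma - xi * u) / (1.0 - xi)
    return var, es

# Hypothetical loss tail: 1% daily-loss threshold, moderately heavy tail.
var99, es99 = gpd_var_es(u=1.0, sigma=0.5, xi=0.2, zeta_u=0.1, p=0.99)
```

ES always exceeds VaR at the same level, which is why it is preferred as a coherent measure of the losses beyond the VaR cutoff.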
- Published
- 2024
- Full Text
- View/download PDF