915 results on '"Statistical distribution"'
Search Results
2. Revealing switching statistics and artificial synaptic properties of Bi2S3 memristor
- Author
-
Terdalkar, Priya, Kumbhar, Dhananjay D., Pawar, Somnath D., Nirmal, Kiran A., Kim, Tae Geun, Mukherjee, Shaibal, Khot, Kishorkumar V., and Dongale, Tukaram D.
- Published
- 2025
- Full Text
- View/download PDF
3. Evaluation of tensile strength variability in fiber reinforced composite rods using statistical distributions.
- Author
-
Qin, Hao, Ka, Thierno Aliou, Li, Xiang, Sun, Kangxin, Qin, Kaiqiang, Noor E Khuda, Sarkar, and Tafsirojjaman, T.
- Subjects
CARBON fiber-reinforced plastics, FIBROUS composites, DISTRIBUTION (Probability theory), TENSILE strength, WEIBULL distribution
- Abstract
Fiber Reinforced Polymer (FRP) composites are known for their exceptional resistance to harsh conditions, impressive durability, and high tensile strength, making them increasingly popular in structural applications. However, the inherent variability of composite materials poses a critical challenge, particularly in tensile strength, which directly impacts the safety and durability of structures. This study evaluated the tensile strength of 395 specimens, including 103 carbon fiber-reinforced polymer (CFRP) rods and 293 hybrid glass-carbon FRP (HFRP) rods, tested according to the GB 30022–2013 standard. To analyze the data, four statistical distributions—normal, lognormal, Weibull, and Gamma—were applied, and a goodness-of-fit test identified the Weibull distribution as the most suitable model. The study further proposed standardized tensile strength values of 2,912.40 MPa for 5 mm CFRP rods and 2,230.98 MPa, 2,385.12 MPa, and 2,517.44 MPa for 6, 7, and 8 mm HFRP rods, respectively. These findings provide valuable insights into the tensile performance of FRP rods, contributing to enhanced design and safety standards for FRP-based structural elements and offering practical references for mitigating material variability in construction applications. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
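The study above fits four candidate models (normal, lognormal, Weibull, and gamma) to tensile-strength data and selects the best by a goodness-of-fit test. A minimal sketch of that workflow in Python follows; the Kolmogorov-Smirnov statistic is used here as the selection criterion (the abstract does not name the specific test), and synthetic strengths stand in for the real 395-specimen data set.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for the CFRP rod tensile strengths (MPa); not the paper's data.
strength = stats.weibull_min.rvs(c=20, scale=2900, size=103, random_state=rng)

candidates = {
    "normal": stats.norm,
    "lognormal": stats.lognorm,
    "Weibull": stats.weibull_min,
    "gamma": stats.gamma,
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(strength)                       # maximum-likelihood fit
    ks = stats.kstest(strength, dist.cdf, args=params)
    results[name] = (ks.statistic, ks.pvalue)

best = min(results, key=lambda k: results[k][0])      # smallest K-S statistic wins
print("best-fitting model:", best, results[best])
```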
4. Influence of the Thermoplastic Fiber Ratio on the Mechanical Properties of Recycled Carbon Fibers During the Carding Process.
- Author
-
Ivars, Jean, Labanieh, Ahmad Rashed, and Soulat, Damien
- Subjects
DISTRIBUTION (Probability theory), CARBON fibers, FIBERS, UNIFORMITY
- Abstract
This study investigates the impact of carding and blending recycled carbon fibers (rCF) with crimped thermoplastic polypropylene (PP) fibers on the mechanical properties of rCF, using a Weibull statistical approach. Tensile properties of rCF were evaluated before and after carding with varying rCF/PP blend ratios (100/0%, 85/15%, 70/30%, and 50/50%). A comparison between the two-parameter and three-parameter Weibull models showed that the two-parameter model provided a better fit for rCF properties before carding. The results show that adding crimped PP fibers during carding helps to decrease the stress-at-break disparity and move their distribution to higher values. Furthermore, a slight increase in tensile modulus was observed in carded rCF, with higher PP ratios associated with smaller scatter modulus distributions. Elongation at break remained consistent, with the Weibull modulus increasing slightly with carding and the inclusion of PP fibers, indicating improved consistency. Overall, carding rCF with PP fibers helped in the mechanical property uniformity of the resulting carded webs without compromising tensile performance. This work shows the potential of the carding process with or without thermoplastic fibers to efficiently realign and give continuity to discontinuous recycled carbon fibers. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
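The two-parameter versus three-parameter Weibull comparison made above can be reproduced in scipy by either fixing the location parameter at zero or letting it float; the fibre stress values below are synthetic stand-ins, and AIC is used here as one possible comparison criterion.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic single-fibre stress-at-break values (GPa); not the measured rCF data.
stress = stats.weibull_min.rvs(c=4.5, scale=3.2, size=200, random_state=rng)

fits = {
    "2-parameter": (2, stats.weibull_min.fit(stress, floc=0)),  # location fixed at zero
    "3-parameter": (3, stats.weibull_min.fit(stress)),          # location (threshold) estimated too
}

for label, (k, params) in fits.items():
    loglik = np.sum(stats.weibull_min.logpdf(stress, *params))
    aic = 2 * k - 2 * loglik
    print(label, "shape=%.2f scale=%.2f AIC=%.1f" % (params[0], params[2], aic))
```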
5. Quantifying Uncertainty of Insurance Claims Based on Expert Judgments.
- Author
-
Handoko, Budhi, Franty, Yeny Krista, and Indrayatna, Fajar
- Subjects
DISTRIBUTION (Probability theory), INSURANCE claims, JUDGMENT (Psychology), LIFE insurance, INSURANCE companies
- Abstract
In Bayesian statistics, prior specification has an important role in determining the quality of posterior estimates. We use expert judgments to quantify uncertain quantities and produce an appropriate prior distribution. The aim of this study was to quantify the uncertainty of life insurance claims, especially the policy owner's age, as it is the main factor determining the insurance premium. A one-day workshop was conducted to elicit expert judgments from those who have experience in accepting claims. Four experts from different insurance companies were involved in the workshop. The elicitation protocol used in this study was the Sheffield Elicitation Framework (SHELF), which produced four different statistical distributions, one for each expert. A linear pooling method was used to aggregate the distributions to obtain the consensus distribution among experts. The consensus distribution suggested that the majority of policy owners will make a claim at the age of 54. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
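Linear pooling, the aggregation step named above, is just a weighted average of the experts' elicited densities. The normal distributions below are hypothetical placeholders for the four SHELF-elicited distributions, which the abstract does not report.

```python
import numpy as np
from scipy import stats

ages = np.linspace(30, 80, 501)

# Hypothetical elicited distributions for the claim age, one per expert (placeholders).
experts = [stats.norm(52, 6), stats.norm(55, 4), stats.norm(57, 8), stats.norm(53, 5)]
weights = np.full(len(experts), 1 / len(experts))     # equal-weight linear pool

consensus = sum(w * e.pdf(ages) for w, e in zip(weights, experts))
print("consensus mode of claim age: %.1f years" % ages[np.argmax(consensus)])
```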
6. A Developed Computational Code to Build a 3D Fracture Network to Reduce the Uncertainty of Fracture Parameter Generation (A Case Study of the Emamzadeh Hashem Tunnel).
- Author
-
Etemadifar, Mahin, Shoaei, Gholamreza, Javadi, Morteza, and Hashemnejad, Arash
- Subjects
DISTRIBUTION (Probability theory), ROCK deformation, ENGINEERS
- Abstract
Rock masses comprise intact rock and discontinuities, such as fractures, which significantly influence their mechanical and hydraulic properties. Uncertainty in constructing the fracture network can notably affect the outcomes of sensitive analyses, including tunnel stability simulations. Thus, accurately determining specific parameters of rock joints, including orientation and trace length, is essential. A discrete fracture network (DFN) is one technique used to simulate jointed rock. However, engineers often face challenges due to the inherent uncertainty in building a fracture network using statistical distribution functions. This study analyzed the fracture network of the Emamzadeh Hashem tunnel using MATLAB-developed code and 3DEC software. It focused on the impact of statistical distribution functions on the uncertainty of fracture network construction. The results reveal that using a negative exponential distribution can introduce significant errors in constructing the fracture network, especially when generating the dip direction. The parametric study shows that employing statistical distribution functions that account for data variance in the Probability Distribution Function (PDF) can enhance the accuracy of generating fracture parameters, such as dip, dip direction, and trace length, thereby reducing uncertainty in fracture network construction. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
7. Development of a novel equation for estimating the average volume of rock blocks in a rock mass with non-persistent joints.
- Author
-
Mahdavirad, Mahsa, Saeidi, Ali, Shahbazi, Alireza, and Noël, Jean-François
- Abstract
Accurate estimation of rock block size is crucial in geotechnical engineering, yet it often encounters challenges due to the complexity of modeling methods or the limitations of oversimplified approaches. This study introduces a novel equation that enhances the accuracy of rock block size estimation by incorporating joint persistence, an important factor frequently overlooked in existing models, which describes the extent to which discontinuities split the rock mass. To this end, a three-dimensional discrete fracture network (DFN) was developed using 3DEC v.7.0 to model rock masses containing three non-persistent joint sets. The DFN model was carefully calibrated by removing boundary blocks and optimizing model sizes. The analysis of 125 distinct models led to the development of a practical correlation for estimating the size of rock blocks with non-persistent joints from an existing method for rock masses containing fully persistent discontinuities. The new equation, validated through cross-validation, offers a more reliable tool for practitioners, improving accuracy in rock block size estimation and supporting better decision-making in the field. The application of the newly developed equation to the Burgo Dam spillway in Australia was also shown to result in more accurate volume estimates than the existing methods. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
8. Some regularities of transaction statistics of cryptocurrency Ethereum: opportunities to study the impact of space weather on human economic behavior on a global scale.
- Author
-
Vitulyova, Yelizaveta, Moldakhan, Inabat, Grigoriev, Pavel, and Suleimenov, Ibragim
- Subjects
SPACE environment, DISTRIBUTION (Probability theory), ECONOMIC impact, CRYPTOCURRENCIES, BITCOIN
- Abstract
It is shown that the statistics of Ethereum cryptocurrency transactions obey well-defined patterns. The dependence of ln N, where N is the number of users who carried out n transactions with the Ethereum cryptocurrency during a specific month, on ln n is linear to high accuracy: ln N = b_t ln n + a_t. Similar statistical patterns are obtained for bitcoin transactions. It has also been established that the behavior of the coefficient b appearing in this dependence corresponds with high accuracy to the Bass diffusion model, which describes the dynamics of innovation implementation. It is shown that after the completion of the initial stage of the implementation of the Ethereum and bitcoin cryptocurrencies (since the beginning of 2018), the values of the coefficients a and b approach constants. On this basis, a method is proposed for identifying the influence of space weather factors on the economic behavior of the human population. In particular, it is shown that the analysis of the cross-correlation between the ratio a/b for the Ethereum cryptocurrency and the Ap-index of geomagnetic activity provides an additional tool for revealing the influence of space weather factors on the economic behavior of people on a global scale. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
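The linear log-log relationship reported above, ln N = b_t ln n + a_t, can be estimated for a single month by ordinary least squares on the logged counts. The power-law counts below are made up for illustration; the authors' monthly blockchain aggregates are not reproduced here.

```python
import numpy as np

# Hypothetical monthly aggregation: n = transactions per user, N = number of users with n transactions.
n = np.arange(1, 200)
rng = np.random.default_rng(2)
N = np.maximum(rng.poisson(1e6 * n ** -1.8), 1)       # noisy counts, kept positive for the log

b_t, a_t = np.polyfit(np.log(n), np.log(N), deg=1)    # slope and intercept of ln N versus ln n
print("ln N = %.2f ln n + %.2f" % (b_t, a_t))
```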
9. Features of the Method of Iteration of Means in Studying Populations.
- Author
-
Sukhorukikh, Yu. I. and Biganova, S. G.
- Abstract
This article considers the features of the mean iteration method when studying populations the quantitative indicators of which have different statistical distributions. The studies were conducted in the central part of the Northwest Caucasus. The published and field data used by the authors came from eight sample plots laid out in the lower mountain, forest–steppe, and steppe zones of the central part of the region, where various indicators were studied for seven plant species. The sample size in the sample plots was 122–485 individuals. Statistical data processing was carried out using the Stadia8.0 and Microsoft Excel for Windows programs. The values of inter-iteration means and gradations were established using known and original methods. It was revealed that, with a normal statistical distribution, the inter-iteration means have close values (difference of 0–4.23%) with the values of the means increased by 0.5–2 standard deviations. To correct extreme inter-iteration values, where the sample is insignificant (1–4 observations), it is recommended to use forecast models, which should be calculated separately for each option. The allocation of gradations of quantitative traits in populations by the iteration of means method ensures an adequate distribution of indicators in three or five gradations compared to methods focused on the average increased by the value of the standard deviation or dividing the indicators into equal values. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Global Distribution of Martian Upstream Magnetosonic Waves Observed by MAVEN.
- Author
-
Pang, Shuyue, Fu, Song, Yun, Xiaotong, Jin, Taifeng, Cao, Xing, Du, Hengle, and Ma, Xin
- Subjects
DYNAMIC pressure, DISTRIBUTION (Probability theory), MARTIAN atmosphere, WIND pressure, SOLAR wind, MAGNETIC fields
- Abstract
Utilizing Mars Atmosphere and Volatile EvolutioN (MAVEN) observations from October 2014 to May 2023, we perform a detailed survey of magnetosonic waves generated in the solar wind (referred to as upstream MS waves), with frequencies near the proton gyrofrequency in the solar wind environment. The distribution of the solar wind‐generated MS waves has been carefully investigated, including in the solar wind and in the Martian magnetosphere by propagation. The results show that these MS waves are widely distributed below the Martian bow shock but are more concentrated below the magnetic pileup boundary, particularly in the subsolar region. The waves possess higher occurrence rates on the dayside with larger amplitudes; the occurrence rates also show dusk‐side‐preferred asymmetry. The Martian crustal magnetic field can prevent MS waves from penetrating into lower altitudes, while higher solar dynamic pressure benefits their penetration. The wave amplitudes exhibit a weak positive correlation with the solar wind dynamic pressure. These obtained global distribution features of Martian upstream MS waves observed by MAVEN are valuable to improve current understanding of the dynamic variations of Martian charged particles and the underlying contribution of wave‐particle interactions driven by MS waves. Plain Language Summary: Magnetosonic waves with frequencies near the proton gyrofrequency in the solar wind environment play an important role in altering the plasma and atmospheric environment of the planet through interactions with charged particles. In this study, we statistically investigate, using Mars Atmosphere and Volatile EvolutioN observations from October 2014 to May 2023, how these waves, referred to as upstream MS waves, are distributed in Martian space. They are found to occupy the space downstream of the bow shock but are more concentrated below the magnetic pileup boundary. Also, they are seen more frequently on the dayside and dusk side than on the other halves. The penetration of upstream MS waves becomes difficult when encountering the Martian crustal field, or when the solar dynamic pressure is lower than usual. The amplitude of waves tends to grow larger when the solar dynamic pressure increases. These results can greatly benefit studies on how charged particles can be accelerated by the waves and escape from Mars, which could profoundly change the climate of this planet. Key Points: Upstream MS waves are predominantly below the dayside magnetic pileup boundary of Mars, with a dawn‐dusk asymmetry in distribution. Upstream MS waves can more easily propagate to lower altitudes in dayside regions with a weak crustal magnetic field of Mars. Solar wind dynamic pressure can influence the occurrence of upstream MS waves. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. Physico-chemical Characteristics of Soils of Phansidewa Village of Darjeeling in Perspective of Seasonal Variation, Statistical Distribution and Correlation Matrix.
- Author
-
Sarkar, Lovely
- Subjects
SOILS, MANGANESE, PHOSPHORUS, POTASSIUM
- Abstract
The article focuses on analyzing the physico-chemical characteristics of soils in Phansidewa Village, Darjeeling, India, with a specific emphasis on seasonal variations, statistical distributions, and correlations of soil parameters. It discusses the critical role of pH, organic carbon, macronutrients like nitrogen, phosphorus, potassium, and micronutrients such as iron, manganese, copper, and zinc in determining soil health and nutrient availability.
- Published
- 2024
- Full Text
- View/download PDF
12. A Numerical Assessment of the Effect of Concatenating Arbitrary Uncoupled Multicore Fiber Segments on Intercore Crosstalk in Long-Haul Communication Links.
- Author
-
Rebola, João L. and Cartaxo, Adolfo V. T.
- Subjects
DISTRIBUTION (Probability theory), GAUSSIAN distribution, STANDARD deviations, COUPLINGS (Gearing), DOUBLE standard, KURTOSIS
- Abstract
Random core dependent loss (CDL) has been shown to increase the direct average intercore crosstalk (ICXT) power in long-haul uncoupled multicore fiber (MCF) links. Longer links are composed of multiple MCF segments, and random CDL may arise on these links from manufacturing imperfections. During link implementation, other random effects may arise and enhance the ICXT power. In this work, the effect of concatenating MCF segments with random characteristics on the direct average ICXT power in long-haul links is assessed numerically by studying the influence of the randomness of segment length, coupling coefficient, and random CDL on the mean, standard deviation, relative spread, and excess kurtosis of the ICXT power. The numerical results show that the segment length randomness marginally affects the ICXT power. For 2000 km long links and a 6 dB maximum random variation of the coupling coefficients, the mean almost doubles and the standard deviation almost triples, relative to considering only random CDL. However, the effect of the coupling coefficients randomness on the relative spread and excess kurtosis is reduced, not affecting significantly the nearly Gaussian distribution of the direct average ICXT power and the excess of direct average ICXT power (less than a 0.26 dB increase relative to considering only random CDL). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. Radiation Belt Electron Acceleration Inside the Plasmasphere
- Author
-
Man Hua and Jacob Bortnik
- Subjects
radiation belt electron fluxes, electron acceleration, inward radial transport, statistical distribution, Geophysics. Cosmic physics, QC801-809
- Abstract
We report a new population of outer belt electron acceleration events ranging from hundreds of keV to ∼1.5 MeV that occurred inside the plasmasphere, which we named "Inside Events" (IEs). Based on 6-year observations from the Van Allen Probes, we compare the statistical distributions of IEs with electron acceleration events outside the plasmasphere (OEs). We find that most IEs were observed at L 200), compared to stronger but less frequently occurring (peaking event numbers only reaching ∼80) OEs that were mostly observed at L > 4.0. The evolution of the electron phase space density of a typical IE shows a signature of inward radial diffusion or transport. Our study provides a feasible mechanism for IEs, which result from the inward radial transport of electron acceleration in the outer region of the outer belt.
- Published
- 2025
- Full Text
- View/download PDF
14. Evaluation of tensile strength variability in fiber reinforced composite rods using statistical distributions
- Author
-
Hao Qin, Thierno Aliou Ka, Xiang Li, Kangxin Sun, Kaiqiang Qin, Sarkar Noor E Khuda, and T. Tafsirojjaman
- Subjects
fiber reinforced polymer composites, statistical distribution, Weibull, hybrid glass-carbon FRP (HFRP) rod, tensile strength, Engineering (General). Civil engineering (General), TA1-2040, City planning, HT165.5-169.9
- Abstract
Fiber Reinforced Polymer (FRP) composites are known for their exceptional resistance to harsh conditions, impressive durability, and high tensile strength, making them increasingly popular in structural applications. However, the inherent variability of composite materials poses a critical challenge, particularly in tensile strength, which directly impacts the safety and durability of structures. This study evaluated the tensile strength of 395 specimens, including 103 carbon fiber-reinforced polymer (CFRP) rods and 293 hybrid glass-carbon FRP (HFRP) rods, tested according to the GB 30022–2013 standard. To analyze the data, four statistical distributions—normal, lognormal, Weibull, and Gamma—were applied, and a goodness-of-fit test identified the Weibull distribution as the most suitable model. The study further proposed standardized tensile strength values of 2,912.40 MPa for 5 mm CFRP rods and 2,230.98 MPa, 2,385.12 MPa, and 2,517.44 MPa for 6, 7, and 8 mm HFRP rods, respectively. These findings provide valuable insights into the tensile performance of FRP rods, contributing to enhanced design and safety standards for FRP-based structural elements and offering practical references for mitigating material variability in construction applications.
- Published
- 2025
- Full Text
- View/download PDF
15. ROTI-based statistical regression models for GNSS precise point positioning errors associated with ionospheric plasma irregularities.
- Author
-
Jia, Haoyang, Yang, Zhe, and Li, Bofeng
- Abstract
Global Navigation Satellite System (GNSS) signals are susceptible to ionospheric plasma irregularities and associated scintillations, causing large deviations in the positioning solutions. This study aims to develop statistical regression models to estimate kinematic three-dimensional (3D) precise point positioning errors associated with ionospheric plasma irregularities based on the Rate Of Total electron content Index (ROTI). By assuming that the positioning errors follow the Laplace distribution, we perform nonlinear regression using the Levenberg–Marquardt algorithm on a collection of experimental data from 700+ Trimble receivers deployed in the NOAA Continuously Operating Reference Stations (CORS) Network. Three ROTI-based regression models are identified by curve fitting with nonlinear functions, i.e., third-degree polynomial (Poly3), two-term exponential (Exp2) and two-term power (Power2) models. A goodness-of-fit test suggests the models fit the relationship between ROTI and the 3D positioning errors well, with the adjusted coefficient of determination above 0.97. The regression models are subsequently employed to predict the 3D positioning errors for a given set of ROTI. Evaluation analysis using the observations from four CORS networks across different geographical regions indicates that the Exp2 model demonstrates encouraging prediction performance, with bias and root mean square error within −0.14 m and 0.34 m, respectively, and the correct prediction ratio consistently surpasses 60.3%. The ROTI-based regression models have great potential in predicting the degradation in GNSS positioning due to ionospheric space weather effects. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
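The two-term exponential (Exp2) regression described above maps ROTI to the 3D positioning error; scipy's curve_fit with method="lm" uses the Levenberg-Marquardt algorithm named in the abstract. The coefficients, noise level, and data below are placeholders rather than the paper's fitted values.

```python
import numpy as np
from scipy.optimize import curve_fit

def exp2(roti, a, b, c, d):
    """Two-term exponential model: error = a*exp(b*ROTI) + c*exp(d*ROTI)."""
    return a * np.exp(b * roti) + c * np.exp(d * roti)

rng = np.random.default_rng(3)
roti = np.sort(rng.uniform(0, 5, 300))                                     # synthetic ROTI values
err3d = 0.05 * np.exp(0.9 * roti) + 0.1 + rng.laplace(0, 0.03, roti.size)  # synthetic 3D errors (m)

popt, _ = curve_fit(exp2, roti, err3d, p0=[0.1, 0.5, 0.1, 0.1], method="lm")
rmse = np.sqrt(np.mean((exp2(roti, *popt) - err3d) ** 2))
print("fitted coefficients:", np.round(popt, 3), "RMSE: %.3f m" % rmse)
```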
16. GENERALIZATION OF BURR DISTRIBUTION AND INTRODUCTION OF A NEW FAMILY OF STATISTICAL DISTRIBUTIONS.
- Author
-
Hossein, Iravani and Gholamhossein, Yari
- Subjects
DISTRIBUTION (Probability theory), DIFFERENTIAL equations, MAXIMUM likelihood statistics, POISSON distribution, ECONOMICS, INSURANCE
- Abstract
The family of Burr distributions consists of twelve different distributions that result from solving a differential equation. This family is part of the family of continuous distributions, and its applications have been investigated in various topics such as survival functions, simulation problems, and economic and insurance analyses. Since generalized distributions are often more flexible than the distributions themselves, generalizing the distributions of this family is of great interest, and, owing to the diversity of distribution types, various generalizations of the Burr distribution have been presented. As to the importance of generalized distributions in this family, it suffices that the family of Burr distributions can itself be considered a parametric generalized family. This article presents a generalization of the Burr distribution that contains the type II Burr distribution as a special case: a parameter is added to the type II Burr distribution structure, and by changing this parameter we obtain different Burr distributions, including the type II Burr distribution. This parameter, along with the other distribution parameters, is estimated by the maximum likelihood method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
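scipy ships the type XII Burr distribution as `burr12` (type II is not available directly), so the sketch below only illustrates maximum-likelihood estimation of Burr-family parameters on synthetic data; it is not the authors' new generalized family.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
sample = stats.burr12.rvs(c=2.0, d=3.0, size=500, random_state=rng)   # synthetic Burr XII data

# Maximum-likelihood fit; fixing loc=0 keeps the standard two-shape Burr XII form.
c_hat, d_hat, loc, scale = stats.burr12.fit(sample, floc=0)
print("estimated parameters: c=%.2f d=%.2f scale=%.2f" % (c_hat, d_hat, scale))
```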
17. Size Effect on the Statistical Distribution of Stress and Strain in Microforming
- Author
-
Feng, Z. Y., Li, H., Zhang, D., Fu, M. W., Chaari, Fakher, Series Editor, Gherardini, Francesco, Series Editor, Ivanov, Vitalii, Series Editor, Haddar, Mohamed, Series Editor, Cavas-Martínez, Francisco, Editorial Board Member, di Mare, Francesca, Editorial Board Member, Kwon, Young W., Editorial Board Member, Trojanowska, Justyna, Editorial Board Member, Xu, Jinyang, Editorial Board Member, Mocellin, Katia, editor, Bouchard, Pierre-Olivier, editor, Bigot, Régis, editor, and Balan, Tudor, editor
- Published
- 2024
- Full Text
- View/download PDF
18. A Bayesian model for online customer reviews data in tourism research: a robust analysis
- Author
-
Emilio Gómez-Déniz, María Martel-Escobar, and Francisco-José Vázquez-Polo
- Subjects
Bayesian statistics, online consumer reviews, e-WOM, statistical distribution, Bayesian robustness, Business, HF5001-6182, Management. Industrial management, HD28-70
- Abstract
The Bayesian approach to data analysis is useful when the variables considered are already subjective or abstract, as is the case with online consumer reviews and ratings in tourism research. The Bayesian framework provides a method for combining observed data from prominent e-commerce platforms with other prior information, such as expert knowledge. Also, Bayesian statistical modelling has several advantages when the sample size of observed data is small. However, a source of uncertainty is introduced into the analysis by eliciting a unique prior distribution that adequately represents the expert's judgement. We focus on the problem in a formal Bayesian robustness context by assuming that the hospitality manager is unable to choose a functional form for the prior distribution but that he or she may be able to restrict the possible priors to a class that is suitable for quantifying the practitioner's uncertainty. Our interest is threefold: we propose a new distribution that is suitable for fitting the rating data; we show how the practitioner can introduce his or her judgements about the feeling parameter using an appropriate prior distribution; and we develop a Bayesian robust methodology to manage hospitality managers' uncertainty using a class of prior distributions suitable for quantifying the practitioner's uncertainty. These ideas are illustrated using real data. We demonstrate that the proposed Bayesian robustness methodology allows us to manage this uncertainty in our model by using classes of prior distributions, and show how the measures of interest are transformed into intervals of interest that allow the manager to make decisions.
- Published
- 2024
- Full Text
- View/download PDF
19. Some regularities of transaction statistics of cryptocurrency Ethereum: opportunities to study the impact of space weather on human economic behavior on a global scale
- Author
-
Yelizaveta Vitulyova, Inabat Moldakhan, Pavel Grigoriev, and Ibragim Suleimenov
- Subjects
space weather, Ethereum cryptocurrency, blockchain, statistical distribution, transactions, noosphere, Information technology, T58.5-58.64
- Abstract
It is shown that the statistics of Ethereum cryptocurrency transactions obey well-defined patterns. The dependence of ln N, where N is the number of users who carried out n transactions with the Ethereum cryptocurrency during a specific month, on ln n is linear to high accuracy: ln N = b_t ln n + a_t. Similar statistical patterns are obtained for bitcoin transactions. It has also been established that the behavior of the coefficient b appearing in this dependence corresponds with high accuracy to the Bass diffusion model, which describes the dynamics of innovation implementation. It is shown that after the completion of the initial stage of the implementation of the Ethereum and bitcoin cryptocurrencies (since the beginning of 2018), the values of the coefficients a and b approach constants. On this basis, a method is proposed for identifying the influence of space weather factors on the economic behavior of the human population. In particular, it is shown that the analysis of the cross-correlation between the ratio a/b for the Ethereum cryptocurrency and the Ap-index of geomagnetic activity provides an additional tool for revealing the influence of space weather factors on the economic behavior of people on a global scale.
- Published
- 2024
- Full Text
- View/download PDF
20. Methodical Approach to Selecting the Appropriate Distribution for Reliability Analysis: Automotive Application
- Author
-
Bella Naoufal, Salhi Nohaila, and Lagrat Ismail
- Subjects
statistical distribution, Kolmogorov-Smirnov K-S test, self-diagnosis, reliability of complex automotive systems, Environmental sciences, GE1-350
In this study, we propose a methodical approach to selecting an appropriate statistical distribution for reliability analysis. The approach defines a methodology for testing reliability distributions based on the Kolmogorov-Smirnov (K-S) test, applied to MTBF data collected through self-diagnosis of a sample of 50 critical components of a complex automotive system. Finally, we propose two solutions: the first involves migrating from one distribution to another according to the intervals, and the second selects the distribution that is representative over the maximum number of intervals. These strategies were developed from the analysis of the results after applying the K-S test to the candidate distributions. This approach will contribute to the reliability analysis of complex systems and, as a result, to improving the models used to analyze complex systems through behavioral analogies such as Petri nets or Markov chains.
- Published
- 2025
- Full Text
- View/download PDF
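The second solution described above (keeping the distribution that best represents the largest number of intervals) can be sketched by running the K-S test per interval and per candidate and counting wins; the MTBF values and the three candidate families below are assumptions, not the study's 50-component data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Synthetic MTBF readings (hours) from self-diagnosis, split into consecutive observation intervals.
mtbf = stats.weibull_min.rvs(c=1.4, scale=900, size=240, random_state=rng)
intervals = np.array_split(mtbf, 6)

candidates = {"exponential": stats.expon, "Weibull": stats.weibull_min, "lognormal": stats.lognorm}
wins = dict.fromkeys(candidates, 0)

for data in intervals:
    pvalues = {}
    for name, dist in candidates.items():
        params = dist.fit(data)
        pvalues[name] = stats.kstest(data, dist.cdf, args=params).pvalue
    wins[max(pvalues, key=pvalues.get)] += 1          # best-fitting candidate for this interval

print("intervals best represented by each candidate:", wins)
```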
21. Uncertainty Evaluation for Autonomous Vehicles: A Case Study of AEB System
- Author
-
Duan, Shunchang, Bai, Xianxu, Shi, Qin, Li, Weihan, and Zhu, Anding
- Published
- 2024
- Full Text
- View/download PDF
22. Advancements and development trend in statistical damage constitutive models for rock: a comprehensive review
- Author
-
Liu, Wei, Yin, Shangxian, Thanh, Hung Vo, Soltanian, Mohamad Reza, Yu, Qingyang, Yang, Songlin, Li, Yarui, and Dai, Zhenxue
- Published
- 2024
- Full Text
- View/download PDF
23. Generalized inverse transformation method via representative points in statistical simulation.
- Author
-
Li, Yinan, Sun, Zhihua, and Fang, Kai-Tai
- Abstract
Discrete approximations of continuous random variables play a crucial role in various areas of research and application, offering advantages in computational efficiency, interpretability, and modeling flexibility. This paper investigates discrete representations of continuous random variables using the mean-squared error criterion (MSE-RPs) and the inverse transformation method. We introduce a novel discrete approximation to the normal distribution that surpasses conventional MSE-RPs obtained from the normal density, particularly in matching lower-order moments. Furthermore, we propose a two-step generalized inverse transformation method to generate approximate MSE-RPs of random variables, inspired by the remarkable performance of the inverse transformation method in statistical simulation. Overall, the generalized inverse transformation method offers a more efficient and reliable alternative for obtaining discrete approximations to target continuous distributions, especially in scenarios where explicit computation and derivation of density functions are challenging or computationally expensive. Moreover, we extend our investigation to the case where the target distribution is a convolution of two random variables, thereby expanding the applicability of our proposed method. Furthermore, our findings hold potential applications in Monte Carlo simulation and resampling techniques. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
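The inverse transformation method at the core of the paper above pushes a set of probabilities through the target quantile function. Below is the plain equal-probability version for a standard normal, not the authors' MSE-optimal representative points; note how the discrete support matches the mean but understates the variance, which is the kind of moment mismatch the paper addresses.

```python
import numpy as np
from scipy import stats

m = 10
probs = (np.arange(m) + 0.5) / m          # midpoints of m equal-probability strata
points = stats.norm.ppf(probs)            # discrete support points via the quantile function
weights = np.full(m, 1 / m)

mean = np.sum(weights * points)
var = np.sum(weights * points ** 2) - mean ** 2
print("approximate mean %.3f and variance %.3f (target: 0 and 1)" % (mean, var))
```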
24. An intelligent threshold selection method to improve orbital angular momentum-encoded quantum key distribution under turbulence.
- Author
-
Li, Jia-Hao, Tang, Jie, Wang, Xing-Yu, Xue, Yang, Yu, Hui-Cun, Deng, Zhi-Feng, Cao, Yue-Xiang, Liu, Ying, Wu, Dan, Hu, Hao-Ran, Wang, Ya, Lun, Hua-Zhi, Wei, Jia-Hua, Zhang, Bo, Liu, Bo, and Shi, Lei
- Subjects
ATMOSPHERIC turbulence, DISTRIBUTION (Probability theory), TURBULENCE, ROOT-mean-squares
- Abstract
High-dimensional quantum key distribution (HD-QKD) encoded by orbital angular momentum (OAM) presents significant advantages in terms of information capacity. However, perturbations caused by free-space atmospheric turbulence decrease the performance of the system by introducing random fluctuations in the transmittance of OAM photons. Currently, there is a gap in the theoretical performance analysis of OAM-encoded QKD systems concerning the statistical distribution under the free-space link. In this article, we analyzed the security of QKD systems by combining the probability distribution of the transmission coefficient (PDTC) of OAM with the decoy-state BB84 method. To address the problem that an invalid key rate is calculated over part of the transmittance interval in the post-processing stage, an intelligent threshold method based on a neural network is proposed to improve OAM-encoded QKD, which aims to conserve computing resources and enhance system efficiency. Our findings reveal that the ratio of the root mean square (RMS) OAM-beam radius to the Fried constant plays a crucial role in ensuring secure key generation. Meanwhile, the training error of the neural network is on the order of 10^-3, indicating the ability to predict optimization parameters quickly and accurately. Our work contributes to the advancement of parameter optimization and prediction for free-space OAM-encoded HD-QKD systems. Furthermore, it provides valuable theoretical insights to support the development of free-space experimental setups. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. Time stationarity, shape and ordinal ranking bias of RCA indexes: a new set of measures.
- Author
-
Stellian, Rémi, Ojeda-Joya, Jair N., and Danna-Buitrago, Jenny P.
- Subjects
DISTRIBUTION (Probability theory), MEASURING instruments, STATISTICAL correlation, ECONOMIC policy
- Abstract
A new set of tools to measure the time stationarity, shape and ordinal ranking bias of Revealed Comparative Advantage (RCA) indexes is suggested. The aim is to help select the most consistent RCA index for a given set of countries, products and time periods, which is especially relevant for economic policy based on comparative advantages. The GMM estimation of an AR(1) process based on RCA indexes provides three measures of time stationarity that are more rational than the measures available in the literature. Furthermore, we revise the statistics that are commonly used to capture shape. Finally, with respect to ordinal ranking bias, we modify the use of Spearman's rank correlation coefficient in Leromain and Orefice (Int Econ 139:48–70, 2014) and generalize the non-parametric measure of Stellian and Danna-Buitrago (J Appl Econ 22(1):349–379, 2019). We discuss different methods of ranking RCA indexes according to all proposed measures. An application to 33 RCA indexes, 67 trade areas and three product classifications shows that, on average, the most accurate RCA indexes are Contribution-to-the-Trade-Balance indexes, Revealed Competitiveness indexes and regression-based indexes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. Generative adversarial network: a statistical-based deep learning paradigm to improve detecting breast cancer in thermograms.
- Author
-
Shojaedini, Seyed Vahab, Abedini, Mehdi, and Monajemi, Mahsa
- Subjects
GENERATIVE adversarial networks, DISTRIBUTION (Probability theory), STATISTICAL learning, CONCEPT learning, BREAST cancer, DEEP learning
- Abstract
Thermography, as a harmless modality, thanks to its low equipment complexity in parallel with quick and cheap access, has been able to come up as a method with significant potential in the diagnosis of some cancers in recent years. However, the complexity of the images resulting from this method has caused the use of deep learning to interpret thermograms. A limiting factor in this process is the strong dependence of deep learning methods on the number of training data, which is a serious challenge in thermography due to the young age of this technology and the lack of available images. In this paper, an attempt is made to reduce the above challenge by utilizing the concept of statistical learning in such a way that the statistical distribution of the original data is estimated by using generative adversarial networks (i.e., GAN). Then, several fake images are reconstructed based on the estimated distribution in order to increase the training thermograms. Since the fake images are reconstructed based on similar statistics of real thermograms in each class, the effective features of each class are preserved to a significant extent in the reconstruction process. The use of this method indicates a significant improvement in the separation of healthy and cancerous thermograms compared to the benchmark method which does not use the concept of GAN in such a way that characteristics of sensitivity and accuracy are improved in ranges of 3–9% and 3–7%, respectively. In terms of specificity, although we have seen an improvement of up to 9%, in some cases, small drops of up to 2% have also been observed, which can still be justified due to the significant improvement in sensitivity and accuracy. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Switching Analysis in Hybrid OHL-Submarine Cable 500-kV Transmission System
- Author
-
Binhot P. Nababan, Kevin Marojahan Banjar-Nahor, Musa Partahi Marbun, Suwarno, and Nanang Hariyanto
- Subjects
Line de-energization, line energization, statistical distribution, submarine cable, transient overvoltages, TRV, Distribution or transmission of electric power, TK3001-3521, Production of electric energy or power. Powerplants. Central stations, TK1001-1841
- Abstract
This study focuses on analyzing switching transients in the upcoming 500 kV Java-Bali Connection (JBC) hybrid OHL and submarine cable project using DIgSILENT PowerFactory software, based on a realistic power system model. Distributed-parameter models with constant parameters of the Bergeron model are utilized. Analysis of the traveling wave effect on the line is conducted, and the integration time step based on the time of the traveling wave is carefully selected. Statistical distributions of energization are produced by varying the circuit configuration and system short-circuit power. It is found that the Switching Withstand Voltage (SWV) during the energization process remains below 1175 kV. The probability distribution is fitted to a normal distribution; the skewness and kurtosis show that it is skewed to the right and has a lower peak than the normal distribution, respectively. When a three-phase short-circuit at the line breaker is induced, the rate of rise of recovery voltage (RRRV) exceeds the IEC standard envelope if only one circuit is operating. In this contribution, switching analysis during no-load energization and de-energization in the planning stage of the mixed OHL-submarine cable is examined.
- Published
- 2024
- Full Text
- View/download PDF
28. Enhancing the accuracy of tropospheric ozone prediction using probability distribution
- Author
-
Muhammad Ismail Jaffar, Hazrul Abdul Hamid, Riduan Yunus, and Ahmad Fauzi Raffee
- Subjects
air pollution modelling, tropospheric ozone, return period, statistical distribution, Technology, Technology (General), T1-995, Science, Science (General), Q1-390
- Abstract
Tropospheric ozone or ground-level ozone, mainly found near ground level, has adverse effects on human health. Distribution fitting is useful for predicting the probability, or forecasting the frequency of recurrence, of a phenomenon in a specific period of time. This study aimed to find the best fit distribution of ground-level ozone for specific industrial, rural, and suburban areas of monitoring locations in Malaysia, which were Kuala Terengganu, Jerantut, and Banting. Secondary data from 2017 to 2020 used in this study were obtained from the Department of Environment Malaysia (DoE). This study employed eight probability distributions namely Weibull, gamma, lognormal, logistic, log-logistic, Birnbaum–Saunders, Nakagami, and inverse Gaussian. The method of moments was used to estimate the parameters for each distribution and the best distribution can be used for predicting the return period of the concentration. The descriptive statistics analysis showed that ground-level ozone reached the highest peak at 1400 and 1500 hours, due to the UV radiation from sunlight, while the lowest concentration reading was at 0700 hours at all monitoring locations. By comparing the analysis of the eight distributions, Nakagami was found to be the best fit distribution to the actual monitoring data for Kuala Terengganu, Jerantut, and Banting stations from 2017 to 2020. As a result, this study suggests that the Nakagami distribution be used to predict exceedances and return periods, based on the performance indicators. Thus, it can take the place of the typical distributions employed in fitting the distribution of air pollutants, such as the lognormal distribution and the gamma distribution.
- Published
- 2023
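Parameter estimation by the method of moments, as used above, equates sample moments with their theoretical counterparts; for two of the eight candidates the closed forms are short, as sketched below on synthetic hourly ozone concentrations (the real DoE monitoring data are not reproduced).

```python
import numpy as np

rng = np.random.default_rng(6)
ozone = rng.gamma(shape=3.0, scale=8.0, size=2000)    # synthetic hourly O3 concentrations (ppb)

m, v = ozone.mean(), ozone.var()

# Gamma: mean = k*theta, variance = k*theta^2  ->  k = m^2/v, theta = v/m
k_hat, theta_hat = m ** 2 / v, v / m

# Lognormal: mean = exp(mu + s^2/2), variance = (exp(s^2) - 1)*exp(2*mu + s^2)
s2_hat = np.log(1 + v / m ** 2)
mu_hat = np.log(m) - s2_hat / 2

print("gamma MoM: k=%.2f theta=%.2f" % (k_hat, theta_hat))
print("lognormal MoM: mu=%.2f sigma=%.2f" % (mu_hat, np.sqrt(s2_hat)))
```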
29. Short-term forecasting of natural gas consumption by determining the statistical distribution of consumption data
- Author
-
Ivan Smajla, Domagoj Vulin, and Daria Karasalihović Sedlar
- Subjects
Forecasting methods, Natural gas consumption, Statistical distribution, Smart metering, Remote reading, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
The development of gas smart meters has enabled the collection of data on daily natural gas consumption which can be used to develop and improve methods and models for natural gas consumption forecasting. This paper presents the development of a model for the short-term forecasting of total natural gas consumption, which is applicable in different distribution areas where smart meters are installed in large numbers. The advantages of this model are the use of only two input parameters (daily natural gas consumption and average daily temperature), forecasting the total consumption in the determined area by analyzing the consumption data of less than 10% of the total consumers as well as robustness to consumer types. Daily natural gas consumption data collected from the more than 3300 gas smart meters over a period of six months was used for the determination of correlations between lognormal distribution variables and temperature. The defined correlations between distribution variables and temperature were used for upscaling consumption to a specific number of final consumers, i.e., to obtain the total consumption of natural gas in the observed area. Best results were achieved using the “two-day rolling average temperature” in the consumption scenario up to 250 m3 per day (MAPE was 7.26%). When compared to using “average temperature” as an input parameter, “two-day rolling average temperature” and “shaving peaks temperature” produced better results due to the mitigated impact of sudden temperature changes that significantly affected the simulated consumption in the model while the actual consumption is a little more inert. Also, consumption scenarios up to 250 m3 can be considered the most representative for forecasting total natural gas consumption since it achieved the best results.
- Published
- 2023
- Full Text
- View/download PDF
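The model above relates daily lognormal-distribution parameters of per-meter consumption to temperature and then upscales to the whole distribution area. A compressed sketch of that idea, with simulated smart-meter readings and a hypothetical consumer count in place of the real 3300-meter data set and the "two-day rolling average temperature" preprocessing:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
temps = np.linspace(-5, 18, 120)                      # average daily temperatures over ~4 months

mu_fit, sigma_fit = [], []
for t in temps:
    # Simulated daily consumption (m3) of 300 sampled meters; colder days mean higher consumption.
    daily = rng.lognormal(mean=1.8 - 0.05 * t, sigma=0.5, size=300)
    shape, loc, scale = stats.lognorm.fit(daily, floc=0)
    mu_fit.append(np.log(scale))                      # lognormal mu parameter
    sigma_fit.append(shape)                           # lognormal sigma parameter

slope, intercept = np.polyfit(temps, mu_fit, deg=1)   # correlation of mu with temperature
n_consumers = 40000                                   # hypothetical total consumers in the area
t_forecast = 2.0
mean_per_meter = np.exp(slope * t_forecast + intercept + np.mean(sigma_fit) ** 2 / 2)
print("forecast total at %.1f degC: %.0f m3/day" % (t_forecast, n_consumers * mean_per_meter))
```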
30. Topp-Leone Cauchy Family of Distributions with Applications in Industrial Engineering
- Author
-
Mintodê Nicodème Atchadé, Mahoulé Jude Bogninou, Aliou Moussa Djibril, and Melchior N’bouké
- Subjects
Topp-Leone, Statistical distribution, Cauchy, Moments, Entropy, Maximum likelihood estimation, Probabilities. Mathematical statistics, QA273-280
- Abstract
Abstract The goal of this research is to create a new general family of Topp-Leone distributions called the Topp-Leone Cauchy Family (TLC), which is exceedingly versatile and results from a careful merging of the Topp-Leone and Cauchy distribution families. Some of the new family’s theoretical properties are investigated using specific results on stochastic functions, quantile functions and associated measures, generic moments, probability weighted moments, and Shannon entropy. A parametric statistical model is built from a specific member of the family. The maximum likelihood technique is used to estimate the model’s unknown parameters. Furthermore, to emphasize the new family’s practical potential, we applied our model to two real-world data sets and compared it to existing rival models.
- Published
- 2023
- Full Text
- View/download PDF
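A commonly used Topp-Leone-G generator is F(x) = [1 - (1 - G(x))^2]^alpha with G the baseline CDF; whether this is exactly the form adopted for the TLC family above is an assumption, so the sketch below (with a Cauchy baseline) is purely illustrative.

```python
import numpy as np
from scipy import stats

def tl_cauchy_cdf(x, alpha, loc=0.0, scale=1.0):
    """Assumed Topp-Leone-G CDF with a Cauchy baseline G."""
    G = stats.cauchy.cdf(x, loc=loc, scale=scale)
    return (1.0 - (1.0 - G) ** 2) ** alpha

def tl_cauchy_ppf(u, alpha, loc=0.0, scale=1.0):
    """Quantile function: invert the generator, then the Cauchy CDF."""
    G = 1.0 - np.sqrt(1.0 - u ** (1.0 / alpha))
    return stats.cauchy.ppf(G, loc=loc, scale=scale)

u = np.random.default_rng(8).uniform(size=5)
print(tl_cauchy_cdf(tl_cauchy_ppf(u, alpha=1.5), alpha=1.5))   # round-trip check recovers u
```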
31. Which distribution to choose for deriving a species sensitivity distribution? Implications from analysis of acute and chronic ecotoxicity data
- Author
-
Miina Yanagihara, Kyoshiro Hiki, and Yuichi Iwasaki
- Subjects
Ecological risk assessment, Hazard assessment, Model selection, Statistical distribution, Environmental pollution, TD172-193.5, Environmental sciences, GE1-350
- Abstract
Species sensitivity distributions (SSDs) estimated by fitting a statistical distribution to ecotoxicity data are indispensable tools used to derive the hazardous concentration for 5 % of species (HC5) and thereby a predicted no-effect concentration in environmental risk assessment. Whereas various statistical distributions are available for SSD estimation, the fundamental question of which statistical distribution should be used has received limited systematic analysis. We aimed to address this knowledge gap by applying four frequently used statistical distributions (log-normal, log-logistic, Burr type III, and Weibull distributions) to acute and chronic SSD estimation using aquatic toxicity data for 191 and 31 chemicals, respectively. Based on the differences in the corrected Akaike’s information criterion (AICc) as well as visual inspection of the fitting of the lower tails of SSD curves, the log-normal SSD was generally better or equally good for the majority of chemicals examined. Together with the fact that the ratios of HC5 values of other alternative SSDs to those of log-normal SSDs generally fell within the range 0.1–10, our findings indicate that the log-normal distribution can be a reasonable first candidate for SSD derivation, which does not contest the existing widespread use of log-normal SSDs.
- Published
- 2024
- Full Text
- View/download PDF
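Deriving an HC5 from a log-normal SSD, as recommended above, amounts to fitting the distribution to species-level toxicity values and taking its 5th percentile; an AICc comparison against an alternative (here the log-logistic, scipy's `fisk`) can be bolted on. The EC50 values are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
ec50 = rng.lognormal(mean=3.0, sigma=1.2, size=25)    # synthetic species EC50 values (ug/L)

def aicc(dist, data, **fixed):
    params = dist.fit(data, **fixed)
    k = len(params) - len(fixed)                      # number of free parameters
    loglik = np.sum(dist.logpdf(data, *params))
    n = len(data)
    return 2 * k - 2 * loglik + 2 * k * (k + 1) / (n - k - 1), params

aicc_ln, params_ln = aicc(stats.lognorm, ec50, floc=0)
aicc_ll, _ = aicc(stats.fisk, ec50, floc=0)           # fisk is the log-logistic distribution
print("AICc lognormal %.1f vs log-logistic %.1f" % (aicc_ln, aicc_ll))

hc5 = stats.lognorm.ppf(0.05, *params_ln)             # hazardous concentration for 5% of species
print("HC5 = %.2f ug/L" % hc5)
```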
32. Sturgeon Parasites: A Review of Their Diversity and Distribution.
- Author
-
Deák, György, Holban, Elena, Sadîca, Isabela, and Jawdhari, Abdulhusein
- Subjects
DISTRIBUTION (Probability theory), STURGEONS, PARASITES, AQUATIC biodiversity, SUSTAINABLE aquaculture, AGRICULTURAL intensification, NEMATODES, TREMATODA
- Abstract
Sturgeon species have inhabited the world's seas and rivers for more than 200 million years and are of significant taxonomic importance, representing a strong conservation interest in aquatic biodiversity as well as in the economic sector, as their meat and eggs (caviar) are highly valuable goods. Currently, sturgeon products and byproducts can be legally obtained from aquaculture as a sustainable source. Intensive farming practices are accompanied by parasitic infestations, while several groups of parasites have a significant impact on both wild and farmed sturgeons. The present article is a review of common sturgeon parasites from the following groups: Protozoa, Trematoda, Crustacea, Nematodes, Monogenea, Hirudinea, Copepoda, Acanthocephala, Cestoda, Polypodiozoa, and Hyperoartia, while also addressing their pathology and statistical distribution. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. Distribution and concentration pathway of particulate pollution during pandemic-induced lockdown in metropolitan cities in India.
- Author
-
Anand, A., Garg, V. K., Agrawal, A., Mangla, S., and Pathak, A.
- Abstract
To characterize the pollutant dispersal across major metropolitan cities in India, daily particulate matter (PM10 and PM2.5) data for the study areas were collected from the National Air Quality Monitoring stations database provided by the Central Pollution Control Board (CPCB) of India. The data were analysed for three temporal ranges, i.e. before the pandemic-induced lockdown, during the lockdown, and after the upliftment of lockdown restrictions. For the purpose, the time scale ranged from 1st April to 31st May for the years 2019 (pre), 2020, and 2021 (post). Statistical distributions (lognormal, Weibull, and Gamma), aerosol optical thickness, and back trajectories were assessed for all three time periods. Most cities followed the lognormal distribution for PM2.5 during the lockdown period except Mumbai and Hyderabad. For PM10, all the regions followed the lognormal distribution. Delhi and Kolkata observed a maximum decline in particulate pollution of 41% and 52% for PM2.5 and 49% and 53% for PM10, respectively. Air mass back trajectory suggests local transmission of air mass during the lockdown period, and an undeniable decline in aerosol optical thickness was observed from the MODIS sensor. It can be concluded that statistical distribution analysis coupled with pollution models can be a counterpart in studying the dispersal and developing pollution abatement policies for specific sites. Moreover, incorporating remote sensing in pollution study can enhance the knowledge about the origin and movement of air parcels and can be helpful in taking decisions beforehand. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
34. A statistics view of contact pressure distribution for normal contact of fractal surfaces.
- Author
-
Yang, Huiyi, Xu, Yang, and Xu, Chao
- Abstract
Due to inherent nonlinear stiffness and damping characteristics, dry friction interfaces have a significant impact on the dynamics of jointed structures. When two surfaces are brought into purely normal contact, contact pressure distribution is of high concern. This work focused on the statistics of the contact pressure distribution between a rough surface and a smooth rigid plane, which provides new insight into the interface contact behaviour. First, the fractal rough surface is generated using the Weierstrass–Mandelbrot function with measured roughness parameters. With meshed rough surfaces, an elastic–plastic finite element contact analysis is performed to determine the contact pressure distribution. Then, the results of contact pressure are statistically analysed. The effects of roughness and contact load on contact area, contact stiffness and mean contact pressure are thoroughly investigated. The probability distribution of contact pressure is determined by fitting a continuous function using a twofold Weibull mixture model. The proposed probability distribution function is found to be capable of describing contact pressure. The contact pressure distribution is affected by the surface fractal characteristics and evolves with the contact load. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. Tone Density Based Sentiment Lexicon for Turkish
- Author
-
Karaşlar, Muazzez Şule, Sağlam, Fatih, Genç, Burkay, Xhafa, Fatos, Series Editor, Hemanth, D. Jude, editor, Yigit, Tuncay, editor, Kose, Utku, editor, and Guvenc, Ugur, editor
- Published
- 2023
- Full Text
- View/download PDF
36. A Numerical Assessment of the Effect of Concatenating Arbitrary Uncoupled Multicore Fiber Segments on Intercore Crosstalk in Long-Haul Communication Links
- Author
-
João L. Rebola and Adolfo V. T. Cartaxo
- Subjects
intercore crosstalk, long-haul transmission, multicore fibers, random core dependent loss, statistical distribution, Applied optics. Photonics, TA1501-1820
- Abstract
Random core dependent loss (CDL) has been shown to increase the direct average intercore crosstalk (ICXT) power in long-haul uncoupled multicore fiber (MCF) links. Longer links are composed of multiple MCF segments, and random CDL may arise on these links from manufacturing imperfections. During link implementation, other random effects may arise and enhance the ICXT power. In this work, the effect of concatenating MCF segments with random characteristics on the direct average ICXT power in long-haul links is assessed numerically by studying the influence of the randomness of segment length, coupling coefficient, and random CDL on the mean, standard deviation, relative spread, and excess kurtosis of the ICXT power. The numerical results show that the segment length randomness marginally affects the ICXT power. For 2000 km long links and a 6 dB maximum random variation of the coupling coefficients, the mean almost doubles and the standard deviation almost triples, relative to considering only random CDL. However, the effect of the coupling coefficients randomness on the relative spread and excess kurtosis is reduced, not affecting significantly the nearly Gaussian distribution of the direct average ICXT power and the excess of direct average ICXT power (less than a 0.26 dB increase relative to considering only random CDL).
- Published
- 2024
- Full Text
- View/download PDF
37. Quantitative and qualitative interpretation of community partitions by map overlaying and calculating the distribution of related geographical features
- Author
-
Haitao Zhang, Kang Ji, Huixian Shen, Rui Song, Yuan Liu Jin, and Xin Yang Yu
- Subjects
spatial interaction network, community partitions, geographical features, statistical distribution, Geology, QE1-996.5
- Abstract
Applying community detection algorithms in spatial interaction networks constructed from modern human communication records is an essential means of evaluating urban territorial subdivisions. Previous studies have usually involved qualitative rather than quantitative interpretations of community detection results. This article proposes a method of quantitatively and qualitatively interpreting community partition results by map overlaying the spatial regions corresponding to the detected communities with the related geographical features and by calculating the distribution of the geographical features contained in the regions and the entropy value of each distribution. The interpretation of the communities detected from the spatial interaction networks is carried out from the perspective of multi-temporal and multi-spatial scales and multi-geographical features. Extensive experiments were conducted with Milan, Italy, as the study area. The spatial interaction records reflected by telephone calls, land use, and point of interest (POI) data were used as the experimental data. Experimental results demonstrated the effectiveness of our method, and the specific results include: (1) Qualitative interpretation of multi-spatial resolution scale communities detected from the long-term aggregated spatial interaction network. The cohesiveness, homogeneity, and heterogeneity of the detected communities were qualitatively interpreted by the spatial distribution patterns of the land use dataset and the POI dataset. (2) Quantitative interpretation of multi-spatial resolution scale communities detected from the long-term aggregated spatial interaction network. The low spatial resolution scale community partitions and the high spatial resolution scale community partitions were interpreted through the statistical distribution of the land use dataset and the POI dataset, respectively. (3) Qualitative interpretation of the stable and active regions discovered from the community time series. Regardless of the community partitions’ spatial resolution scales, the stable and active regions were distinguished with the statistical distributions of the land use dataset and the POI dataset.
- Published
- 2023
- Full Text
- View/download PDF
38. Topp-Leone Cauchy Family of Distributions with Applications in Industrial Engineering.
- Author
-
Atchadé, Mintodê Nicodème, Bogninou, Mahoulé Jude, Djibril, Aliou Moussa, and N'bouké, Melchior
- Subjects
CAUCHY sequences, INDUSTRIAL engineering, UNCERTAINTY (Information theory), MAXIMUM likelihood statistics, QUANTILES
- Abstract
The goal of this research is to create a new general family of Topp-Leone distributions called the Topp-Leone Cauchy Family (TLC), which is exceedingly versatile and results from a careful merging of the Topp-Leone and Cauchy distribution families. Some of the new family's theoretical properties are investigated using specific results on stochastic functions, quantile functions and associated measures, generic moments, probability weighted moments, and Shannon entropy. A parametric statistical model is built from a specific member of the family. The maximum likelihood technique is used to estimate the model's unknown parameters. Furthermore, to emphasize the new family's practical potential, we applied our model to two real-world data sets and compared it to existing rival models. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
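The sketch below shows how such a family can be fitted numerically, assuming the generic Topp-Leone-G construction F(x) = [1 - (1 - G(x))^2]^alpha with a Cauchy baseline G; the authors' exact parameterisation may differ, and the function names and synthetic sample are illustrative only:

```python
import numpy as np
from scipy import stats, optimize

def tl_cauchy_logpdf(x, alpha, loc, scale):
    """log-density of F(x) = [1 - (1 - G(x))^2]^alpha with a Cauchy(loc, scale) baseline G (assumed form)."""
    G = stats.cauchy.cdf(x, loc, scale)
    g = stats.cauchy.pdf(x, loc, scale)
    return (np.log(alpha) + np.log(2.0) + np.log(g) + np.log1p(-G)
            + (alpha - 1.0) * np.log1p(-(1.0 - G) ** 2))

def fit_tl_cauchy(data):
    """Maximum likelihood fit; alpha and scale are optimised on the log scale to stay positive."""
    nll = lambda p: -np.sum(tl_cauchy_logpdf(data, np.exp(p[0]), p[1], np.exp(p[2])))
    start = [0.0, np.median(data), np.log(stats.iqr(data))]
    res = optimize.minimize(nll, start, method="Nelder-Mead")
    return np.exp(res.x[0]), res.x[1], np.exp(res.x[2])

# demo on synthetic heavy-tailed data standing in for a real industrial sample
sample = stats.cauchy.rvs(loc=2.0, scale=1.0, size=500, random_state=0)
print(fit_tl_cauchy(sample))   # (alpha, location, scale)
```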
39. Review of the Monothematic Series of Publications Concerning Research on Statistical Distributions of Navigation Positioning System Errors.
- Author
-
Specht, Mariusz
- Subjects
- *
DISTRIBUTION (Probability theory) , *LOGNORMAL distribution , *GLOBAL Positioning System , *BETA distribution , *GAUSSIAN distribution , *ROOT-mean-squares , *LATITUDE - Abstract
This review presents the main results of the author's study, obtained as part of the post-doctoral (habilitation) dissertation entitled "Research on Statistical Distributions of Navigation Positioning System Errors", which constitutes a series of five thematically linked scientific publications. The main scientific aim of this series is to answer the question of which statistical distributions the position errors of navigation systems follow, such as the Differential Global Positioning System (DGPS), the European Geostationary Navigation Overlay Service (EGNOS), the Global Positioning System (GPS), and others. All of the positioning systems under study (Decca Navigator, DGPS, EGNOS, and GPS) are characterised by Position Random Walk (PRW), which means that latitude and longitude errors do not occur randomly, as would be characteristic of the normal distribution. The research showed that the Gaussian distribution is not an optimal distribution for modelling navigation positioning system errors. A better fit to the 1D and 2D position errors was exhibited by distributions such as the beta, gamma, and lognormal distributions. Moreover, it was shown that the Twice the Distance Root Mean Square (2DRMS(2D)) measure, which assumes a priori a normal distribution of position errors in latitude and longitude, was 10–14% smaller than the position error value below which 95% of fixes fall (known as the R95(2D) measure). [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
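A minimal sketch of the two accuracy measures compared above, assuming a series of north/east position errors in metres; the simulated Gaussian errors only stand in for a real fix record:

```python
import numpy as np

def accuracy_measures(d_north, d_east):
    """2DRMS(2D) and R95(2D) from series of north/east position errors (metres)."""
    drms2 = 2.0 * np.sqrt(np.var(d_north) + np.var(d_east))   # 2DRMS(2D): assumes zero-mean errors a priori
    r95 = np.percentile(np.hypot(d_north, d_east), 95)        # R95(2D): empirical 95th percentile of radial error
    return drms2, r95

# with Gaussian errors the two measures nearly coincide; heavier-tailed errors
# pull R95 above 2DRMS, which is the effect reported in the review above
rng = np.random.default_rng(1)
d_n, d_e = rng.normal(0.0, 1.2, 100_000), rng.normal(0.0, 0.9, 100_000)
print(accuracy_measures(d_n, d_e))
```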
40. Fitting of Rainfall Data in Erbil City Using Statistical Distribution Techniques.
- Author
-
ABBAS, Khalid A. and Ibrahim, Hassan Sadi
- Subjects
- *
DISTRIBUTION (Probability theory) , *WEIBULL distribution , *METEOROLOGICAL stations , *GAMMA distributions , *GAMMA functions , *RAINFALL - Abstract
Statistical distribution fitting of rainfall data is essential for the design of water management systems and water-related infrastructure. If the statistical distribution is known, future rainfall-related events such as floods and droughts can be predicted for a given study area. In this paper, 48 years of rainfall data from the Erbil meteorological station in Erbil city were analyzed and fitted to several types of statistical distributions to find the best one. The Normal, Gamma, and Weibull distribution models were applied to the annual and monthly rainfall, and the goodness of fit was assessed using the Anderson-Darling test and its p-value. The results show that the Weibull and Gamma distributions are successful in all cases, while the Normal distribution failed in a number of months but performed very well in others. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
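A sketch of this kind of comparison, assuming maximum-likelihood fits and an Anderson-Darling statistic computed directly from the fitted CDF (scipy.stats.anderson itself covers only a few distributions); the synthetic rainfall series is a stand-in for the observed record, and p-values for fitted parameters would in practice need simulation or published critical values:

```python
import numpy as np
from scipy import stats

def anderson_darling(data, dist, params):
    """Anderson-Darling statistic of `data` against a fitted scipy.stats distribution."""
    x = np.sort(np.asarray(data, dtype=float))
    n = len(x)
    u = np.clip(dist.cdf(x, *params), 1e-12, 1 - 1e-12)
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(u) + np.log(1 - u[::-1])))

# synthetic stand-in for the 48-year annual rainfall series (mm); replace with the observed record
rainfall = stats.gamma.rvs(a=4.0, scale=100.0, size=48, random_state=0)

candidates = {"normal": stats.norm, "gamma": stats.gamma, "weibull": stats.weibull_min}
for name, dist in candidates.items():
    params = dist.fit(rainfall, floc=0) if name in ("gamma", "weibull") else dist.fit(rainfall)
    a2 = anderson_darling(rainfall, dist, params)
    print(f"{name:8s}  A^2 = {a2:.3f}")   # smaller statistic = closer fit
```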
41. Enhancing the accuracy of tropospheric ozone prediction using probability distribution.
- Author
-
Jaffar, Muhammad Ismail, Hamid, Hazrul Abdul, Yunus, Riduan, and Raffee, Ahmad Fauzi
- Subjects
- *
DISTRIBUTION (Probability theory) , *WEIBULL distribution , *LOGNORMAL distribution , *TROPOSPHERIC ozone , *GAMMA distributions , *AIR pollutants - Abstract
Tropospheric ozone, or ground-level ozone, found mainly near ground level, has adverse effects on human health. Distribution fitting is useful for predicting the probability, or forecasting the frequency of recurrence, of a phenomenon over a specific period of time. This study aimed to find the best-fit distribution of ground-level ozone for industrial, rural, and suburban monitoring locations in Malaysia, namely Kuala Terengganu, Jerantut, and Banting. Secondary data from 2017 to 2020 used in this study were obtained from the Department of Environment Malaysia (DoE). The study employed eight probability distributions, namely the Weibull, gamma, lognormal, logistic, log-logistic, Birnbaum-Saunders, Nakagami, and inverse Gaussian distributions. The method of moments was used to estimate the parameters of each distribution, and the best distribution can then be used to predict the return period of the concentration. Descriptive statistics showed that ground-level ozone peaked at 1400 and 1500 hours, due to UV radiation from sunlight, while the lowest concentration reading occurred at 0700 hours at all monitoring locations. Comparing the eight distributions, the Nakagami distribution was found to fit the actual monitoring data best for the Kuala Terengganu, Jerantut, and Banting stations from 2017 to 2020. Based on the performance indicators, this study therefore suggests that the Nakagami distribution be used to predict exceedances and return periods, and that it can replace the distributions typically employed for fitting air pollutant data, such as the lognormal and gamma distributions. [ABSTRACT FROM AUTHOR]
- Published
- 2023
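The method-of-moments step for the Nakagami distribution reduces to two sample moments of the squared data: Omega = E[X^2] and m = Omega^2 / Var(X^2). A sketch, with a synthetic series standing in for the ozone record and a hypothetical 0.1 ppm threshold:

```python
import numpy as np
from scipy import stats

def nakagami_mom(x):
    """Method-of-moments estimates (m, Omega) for the Nakagami distribution."""
    x = np.asarray(x, dtype=float)
    omega = np.mean(x ** 2)                              # Omega = E[X^2]
    m = omega ** 2 / np.mean((x ** 2 - omega) ** 2)      # m = Omega^2 / Var(X^2)
    return m, omega

# synthetic stand-in for an hourly ozone concentration record (ppm)
ozone = stats.nakagami.rvs(nu=2.5, scale=0.04, size=5000, random_state=0)

m_hat, omega_hat = nakagami_mom(ozone)
fitted = stats.nakagami(nu=m_hat, scale=np.sqrt(omega_hat))   # scipy's scale equals sqrt(Omega)
p_exceed = fitted.sf(0.1)                                     # exceedance of a hypothetical 0.1 ppm limit
print(m_hat, omega_hat, p_exceed, 1.0 / p_exceed)             # last value ~ return period in hours
```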
42. Frequency Analysis of Maximum Daily Rainfall and Determining the Best Appropriate Distribution Functions in the Bostan Watershed, Golestan Province
- Author
-
S. Yaghobi, Ch.B. Komaki, M. Hosseinalizadeh, A. Najafinejad, H.R. Pourghasemi, and M. Faramarzi
- Subjects
maximum daily rainfall ,frequency analysis ,statistical distribution ,meteorological and satellite data ,bostan watershed ,Agriculture ,Agriculture (General) ,S1-972 - Abstract
Frequency analysis of daily rainfall, that is, of the return period of rainfall and flooding events, is very important in water resources management given the complexity of hydrological behavior, because ignoring it can lead to destructive urban floods. In the present research, three distribution functions, Pearson, Beta, and Gamma, were compared in order to select the most appropriate distribution function for the precipitation data acquired from meteorological stations and the CHIRPS satellite at seven stations in the watershed of the Bostan Dam. Statistical analyses showed that the satellite data were ineffective for estimating daily precipitation because of large errors in terms of the RMSE, MAD, and NASH criteria, so the meteorological data were used to identify the best distribution. Google Earth Engine and the Python programming language were used. The selected distribution function was then used to determine the maximum daily rainfall, the frequency probability, and the return periods of 2, 10, 50, 100, and 200 years. The results of the goodness-of-fit tests, the Error Sum of Squares, the Bayesian Information Criterion, and the Akaike Information Criterion, as well as the Kullback-Leibler Divergence, showed that at five stations (Kalaleh, Qarnaq, Golestan National Park, Golestan Dam, and Glidagh) the Pearson function is the most suitable distribution function. At the other two stations (Gonbad and Tamar), the Beta function was recognized as suitable, while the Gamma distribution was not efficient in the study area. It can therefore be concluded that heavy and irregular rainfall can influence the choice of the best distribution function at each station. It is accordingly recommended to take the maximum probable rainfall, and the floods that may result from it, into account through principled and accurate management in order to prevent human and financial losses in susceptible areas, especially in the study area.
- Published
- 2023
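The return-period step described above amounts to evaluating the fitted distribution's quantile function at the non-exceedance probability 1 - 1/T. A sketch using a Pearson type III fit (maximum likelihood here, whereas the study ranked candidates with several information criteria; the synthetic annual-maximum series is a stand-in):

```python
import numpy as np
from scipy import stats

# synthetic stand-in for an annual maximum daily rainfall series (mm) at one station
annual_max = stats.pearson3.rvs(skew=1.0, loc=45.0, scale=15.0, size=40, random_state=0)

params = stats.pearson3.fit(annual_max)                 # Pearson type III fitted by maximum likelihood
for T in (2, 10, 50, 100, 200):                         # return periods in years
    design_rain = stats.pearson3.ppf(1.0 - 1.0 / T, *params)
    print(f"T = {T:3d} yr  ->  design rainfall ~ {design_rain:.1f} mm")
```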
43. Criteria for selecting plus trees for protective forestry
- Author
-
Yu. I. Sukhorukikh, S. G. Biganova, A. P. Glinushkin, and L. L. Sviridova
- Subjects
protective forest stands ,plus trees ,selection ,selection criteria ,height ,excess ,statistical distribution ,breeding differential ,breeding categories of trees ,sanitary condition ,Technology - Abstract
Protective forest strips are the basis of the environmental framework in sparsely wooded areas. To create such highly productive objects, the selection of an appropriate gene pool is required, and plus trees are the representatives of this gene pool. The aim of the research is to develop criteria for selecting plus trees for protective forestry, focused on the creation of plantings whose main parameter is the working height. Sixteen trial areas of pedunculate oak (Quercus robur L.), black locust (Robinia pseudoacacia L.), green ash (Fraxinus lanceolata B.), European ash (Fraxinus excelsior L.), thorny locust (Gleditsia triacanthos L.), and walnut (Juglans regia L.) were laid out. A complete enumeration of the heights of 100–142 individuals was carried out on each trial area, and their statistical indicators were determined. Tree height followed a normal or near-normal statistical distribution at the studied sites. Data processing was carried out using the licensed Stadia 8.0/Prof program for Windows. A method of selecting plus trees has been proposed in which the tree height should exceed the average height by 25% or more. A comparison of the proposed and the well-known method revealed that the proposed one can increase the breeding differential by 48.25–53.78% with instrumental selection and by 31.15–41.39% with eye-instrumental selection. Criteria for trees of various selection categories have been developed. Because of the differing conditions, it is recommended to select plus trees separately in the edge and middle protective forest strips. During breeding inventory, the sanitary condition of the trees must also be taken into account.
- Published
- 2023
- Full Text
- View/download PDF
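The proposed selection criterion itself is simple arithmetic; a minimal sketch with hypothetical measured heights:

```python
import numpy as np

def select_plus_trees(heights, factor=1.25):
    """Indices of trees whose height exceeds the plot mean by 25% or more."""
    heights = np.asarray(heights, dtype=float)
    return np.flatnonzero(heights >= factor * heights.mean())

# hypothetical heights (m) measured on one trial area
heights = np.array([14.2, 15.0, 18.9, 13.5, 21.4, 16.1, 22.8, 15.7])
print(select_plus_trees(heights))   # candidate plus trees under the proposed criterion
```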
44. Generation of irregular particle packing with prescribed statistical distribution, spatial arrangement, and volume fraction
- Author
-
Libing Du, Xinrong Liu, Yafeng Han, and Zhiyun Deng
- Subjects
Minkowski sum ,Optimised advance front method (OAFM) ,Spatial arrangement ,Irregular particle packing ,Statistical distribution ,Engineering geology. Rock mechanics. Soil mechanics. Underground construction ,TA703-712 - Abstract
A method for packing irregular particles with a prescribed volume fraction is proposed. Furthermore, the generated granular material adheres to the prescribed statistical distribution and satisfies the desired complex spatial arrangement. First, the irregular geometries of the realistic particles were obtained from the original particle images. Second, the Minkowski sum was used to check the overlap between irregular particles and place an irregular particle in contact with other particles. Third, the optimised advance front method (OAFM) generated irregular particle packing with the prescribed statistical distribution and volume fraction based on the Minkowski sum. Moreover, the signed distance function was introduced to pack the particles in accordance with the desired spatial arrangement. Finally, seven biaxial tests were performed using the UDEC software, which demonstrated the accuracy and potential usefulness of the proposed method. It can model granular material efficiently and reflect the mesostructural characteristics of complex granular materials. This method has a wide range of applications where discrete modelling of granular media is necessary.
- Published
- 2023
- Full Text
- View/download PDF
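The Minkowski-sum overlap check mentioned above can be illustrated for convex outlines: two convex particles intersect exactly when the origin lies inside the Minkowski sum of one particle and the reflection of the other. The sketch below shows only that test, not the OAFM placement or the signed-distance arrangement, and assumes convex polygonal outlines:

```python
import numpy as np
from scipy.spatial import ConvexHull

def convex_overlap(poly_a, poly_b, tol=1e-9):
    """True if two convex polygons (vertex arrays of shape (n, 2)) overlap.
    Test: A and B intersect iff the origin lies inside the Minkowski sum A (+) (-B),
    whose convex hull is spanned by all pairwise vertex differences a - b."""
    diff = (poly_a[:, None, :] - poly_b[None, :, :]).reshape(-1, 2)
    hull = ConvexHull(diff)
    # a point p lies inside the hull iff normals @ p + offsets <= 0 for every facet;
    # for p = the origin this reduces to checking the facet offsets alone
    return bool(np.all(hull.equations[:, -1] <= tol))

square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
print(convex_overlap(square, square + 0.5))   # True: overlapping placement
print(convex_overlap(square, square + 2.0))   # False: separated placement
```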
45. Statistically Inspired Passivity Preserving Model Order Reduction
- Author
-
Namra Akram, Mehboob Alam, Rashida Hussain, and Yehia Massoud
- Subjects
Statistical distribution ,model order reduction ,passivity preserving ,spectral zeros ,on-chip interconnects ,computer aided design ,Electrical engineering. Electronics. Nuclear engineering ,TK1-9971 - Abstract
The continuous scaling of on-chip devices and interconnects increases the complexity of the design space and has become a crucial factor in the fabrication of modern integrated circuits. The ever-decreasing interconnect pitch, along with process scaling into the nanometer regime, has shifted the paradigm from a device-dominated to an interconnect-dominated design methodology. Within this methodology, Model Order Reduction (MOR) reduces the size of large-scale on-chip interconnect simulations to speed up design tools and chip validation. In approximating the original system, the passivity-preserving MOR technique of using spectral zeros as positive real interpolation points preserves the stability and passivity of the system. In this work, statistical distribution techniques are proposed for the selection of spectral zeros. The proposed method uses the Gaussian, uniform, binomial, and Weibull distributions to select spectral zeros that better match moments with the least absolute error between the original and reduced-order systems. The results show that the reduced-order model developed using the Gaussian distributed Spectral zeros Projection (GSP) method offers higher accuracy and numerical stability compared to the other distributions.
- Published
- 2023
- Full Text
- View/download PDF
46. Statistical methods and distribution theory with applications to finance and cryptocurrencies
- Author
-
Zhang, Yuanyuan and Boshnakov, Georgi
- Subjects
Financial risk measure ,Backtesting VaR ,Stylised facts ,Statistical distribution ,Cryptocurrencies ,Bitcoin - Abstract
The whole thesis consists of seven chapters, and its main theme is the development of statistical methods and distribution theory, with applications to finance and cryptocurrencies. Chapter 1 contains an introduction and background to the thesis. This is followed by Chapters 2 through 7, which provide the main contributions. Many backtesting methods have been proposed for Value at Risk (VaR), yet they have rarely been applied in practice. Chapter 2 provides a comprehensive review of the recent backtesting methods for VaR. This review could encourage applications and also the development of further backtesting methods. A longstanding open problem in statistics is: what is the exact distribution of the sum of independent generalized Pareto or Pareto random variables? In Chapter 3, we derive single integral representations for the exact distribution, with the integrand involving the incomplete and complementary incomplete gamma functions. Applications to insurance and catastrophe bonds are described. The term 'stylised facts' has been extensively researched through the analysis of many different financial datasets. More recently, cryptocurrencies have been investigated as a new type of financial asset, and provide an interesting example, with a current market value of over $500 billion. Chapter 4 analyses the stylised facts of the four most popular cryptocurrencies ranked according to their market capitalisation. The analysis is conducted on high-frequency returns data with varying lags. In addition to using the Hurst exponent, our analysis also considers features of dependence between different cryptocurrencies. Ethereum was the first decentralised platform to support smart contracts. It has attracted significant publicity and captured the interest of a wide range of institutions, enthusiasts, and even world leaders. In Chapter 5, we analyse the market price index for all exchanges trading in Ethereum versus three global currencies (the Korean Won, the Euro, and the US Dollar) and Bitcoin, as well as the Global Price Index for Ethereum, through the fitting of the Generalised Hyperbolic distribution and its subclasses. Our results show that returns are clearly non-normal and that the Generalised Hyperbolic distribution and its subclasses fit all of the indices well. We also analyse the long-term memory effect for the returns of Ether, compare the Value at Risk and Expected Shortfall based on historical Ether data and other financial instruments, and perform backtesting of the extreme tails. The market for cryptocurrencies has experienced extremely turbulent conditions in recent times, and we can clearly identify strong bull and bear market phenomena over the past year. In Chapter 6, we utilise algorithms for detecting turning points to identify both bull and bear phases in the high-frequency markets for the three largest cryptocurrencies: Bitcoin, Ethereum, and Litecoin. We also examine the market efficiency and liquidity of the selected cryptocurrencies during these periods using high-frequency data. Our findings show that the hourly returns of the three cryptocurrencies during a bull market indicate market efficiency when the DFA method is used to analyse the Hurst exponent with a rolling window. However, when the conditions turn and there is a bear market period, we see signs that the market starts to become inefficient. Furthermore, we illustrate the effect on liquidity during the bull and bear markets for the chosen cryptocurrencies.
Chapter 7 investigates the adaptive market hypothesis (AMH) with respect to the high-frequency markets of the two largest cryptocurrencies, Bitcoin and Ethereum, versus the Euro and the US Dollar. Our findings are consistent with the AMH and show that the efficiency of the markets varies over time. We also discuss possible news and events which coincide with significant changes in market efficiency. Furthermore, we analyse the effect of the sentiment of this news, and of other factors (events), on market efficiency in the high-frequency setting, and provide a simple event analysis to investigate whether specific factors affect market efficiency or inefficiency. The results show that the sentiment and types of news and events may not be significant factors in determining the efficiency of cryptocurrency markets.
- Published
- 2020
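Chapter 6's rolling-window Hurst analysis relies on detrended fluctuation analysis (DFA); one common variant of the estimator can be sketched as follows (the window scales, polynomial order, and white-noise demo series are assumptions for illustration):

```python
import numpy as np

def dfa_exponent(returns, scales=(16, 32, 64, 128, 256), order=1):
    """Detrended fluctuation analysis: the slope of log F(n) vs log n (Hurst-like exponent).
    Values near 0.5 are consistent with uncorrelated (efficient-market) returns."""
    x = np.asarray(returns, dtype=float)
    profile = np.cumsum(x - x.mean())                         # integrated, demeaned series
    flucts = []
    for n in scales:
        sq_res = []
        for s in range(len(profile) // n):
            seg = profile[s * n:(s + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, order), t)  # local polynomial detrending
            sq_res.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(sq_res)))               # fluctuation function F(n)
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(0)
print(dfa_exponent(rng.normal(size=10_000)))                  # white noise -> roughly 0.5
```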
47. Experiment‐based statistical distribution of buckling loads of cylindrical shells.
- Author
-
Li, Zheng, Pasternak, Hartmut, and Geißler, Karsten
- Subjects
CYLINDRICAL shells ,DISTRIBUTION (Probability theory) ,MECHANICAL buckling ,MAXIMUM entropy method ,LASER measurement ,NONLINEAR analysis - Abstract
A silo structure is usually constructed in the form of a cylindrical steel shell. It has the advantages of being lightweight, having a short construction period, and providing a large storage space, and it has been widely used in many fields of industry. Thin-walled cylindrical shells often exhibit buckling failure, and the experimental buckling load is usually lower than results calculated from classical theory or from simulations without geometric imperfections. Moreover, test results from carefully fabricated, nominally identical specimens still show substantial scatter because of imperfection sensitivity. Nonlinear FEM analysis can reproduce experimental results with high precision if the geometric parameters of the shell are fully known; however, this is almost impossible in practical engineering. The initial geometric imperfections and the shell thickness of cylindrical shells are complex, random properties, and theoretically these geometric imperfections can be described by a random field. This paper presents an experimental investigation of the buckling of cylindrical shells under axial compression, considering the randomness of geometric imperfections and thickness. A total of 12 cylindrical shell specimens were fabricated and tested. Based on the test results, the optimal statistical distribution is obtained by the maximum entropy fitting method, and the obtained results are compared with geometric imperfections based on laser scan measurements. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
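The maximum entropy fitting mentioned above selects, among all densities matching the first few sample moments on a bounded support, the one with maximal differential entropy; such a density has the form p(x) proportional to exp(-sum_k lambda_k x^k), and the lambdas can be found by minimising the convex dual log Z(lambda) + sum_k lambda_k mu_k. A numerical sketch, where the moment order, the support bounds, and the standardised synthetic "buckling load" sample are all assumptions:

```python
import numpy as np
from scipy import optimize, integrate

def max_entropy_fit(sample, order=4, support=None, n_grid=2001):
    """Fit p(x) ~ exp(-sum_k lam_k x^k), k = 1..order, matching the sample moments."""
    x = np.asarray(sample, dtype=float)
    lo, hi = support if support is not None else (x.min() - 0.5, x.max() + 0.5)
    grid = np.linspace(lo, hi, n_grid)
    powers = np.vstack([grid ** k for k in range(1, order + 1)])   # shape (order, n_grid)
    mu = np.array([np.mean(x ** k) for k in range(1, order + 1)])  # sample moments to match

    def dual(lam):
        log_unnorm = -lam @ powers
        shift = log_unnorm.max()
        log_z = np.log(integrate.trapezoid(np.exp(log_unnorm - shift), grid)) + shift
        return log_z + lam @ mu        # convex dual; its minimiser reproduces the moments

    res = optimize.minimize(dual, np.zeros(order), method="Nelder-Mead",
                            options={"maxiter": 20000, "fatol": 1e-12, "xatol": 1e-10})
    dens = np.exp(-res.x @ powers)
    dens /= integrate.trapezoid(dens, grid)
    return grid, dens, res.x

# stand-in for 12 experimental buckling loads, standardised to keep the lambdas well-scaled
rng = np.random.default_rng(0)
loads = rng.normal(0.65, 0.05, size=12)
z = (loads - loads.mean()) / loads.std()
xs, pdf, lam = max_entropy_fit(z, order=2, support=(-4.0, 4.0))
print(lam)   # with order=2 this recovers (approximately) the standard normal density
```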
48. A Method of Iteration of Averages in Ecological Research.
- Author
-
Sukhorukikh, Yu. I. and Biganova, S. G.
- Abstract
The study and conservation of biological diversity is one of the major contemporary environmental problems of global significance. The goal of the study is to propose a method for identifying promising plants based on the iteration of average values. For this purpose, nine trial plots were laid out in protective and forest stands in the steppe, forest-steppe, and mountain zones of the Northwestern Caucasus. Using well-known methods, the height, trunk diameter, crown radius, abundance of fruiting, and the weight and selection value of nuts were studied, and the change in the average values of the indicators at different numbers of iterations was examined. A method for identifying promising individuals based on the average values at iterations 4 and 5 is proposed and compared with the known method based on the sum (or difference) of the mean and twice the standard deviation. With a statistically normal distribution of indicators, the known and proposed methods give close values in the direction of increase or decrease of the indicator. With a statistical distribution that differs from normal, the known method in some cases gives meaningless values, whereas the proposed method, based on the average value at iterations 4 and 5 in the direction of increase or decrease of the studied indicator, can still be used to select objects. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
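The abstract does not spell out the iteration rule; one plausible reading, used purely for illustration, is that each iteration recomputes the mean over the individuals lying beyond the previous mean in the direction of interest, and the level reached at iterations 4–5 is then used as the selection threshold:

```python
import numpy as np

def iterated_mean(values, iterations=5, direction="increase"):
    """One plausible reading of the 'iteration of averages' rule: each step recomputes
    the mean over the individuals lying beyond the previous mean in the chosen direction."""
    v = np.asarray(values, dtype=float)
    threshold = v.mean()
    for _ in range(iterations - 1):
        kept = v[v >= threshold] if direction == "increase" else v[v <= threshold]
        if kept.size == 0:
            break
        threshold = kept.mean()
    return threshold

# hypothetical nut weights (g) on one trial plot; individuals above the iteration-4/5 level
# would be flagged as promising
weights = np.array([8.1, 9.4, 10.2, 7.6, 12.5, 11.1, 9.9, 13.2, 8.8, 10.7])
print(iterated_mean(weights, iterations=4), iterated_mean(weights, iterations=5))
```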
49. Reliability of Ensemble Climatological Forecasts.
- Author
-
Huang, Zeqing, Zhao, Tongtiegang, Tian, Yu, Chen, Xiaohong, Duan, Qingyun, and Wang, Hao
- Subjects
PROBABILITY density function ,CUMULATIVE distribution function ,GAMMA distributions ,PARETO distribution ,BINOMIAL distribution ,FORECASTING ,EXTREME value theory ,DISTRIBUTION (Probability theory) ,CENSORING (Statistics) - Abstract
Ensemble climatological forecasts play a critical part in benchmarking the predictive performance of hydroclimatic forecasts. Accounting for the skewness and censoring characteristics of hydroclimatic variables, ensemble climatological forecasts can be generated by the log, Box‐Cox and log‐sinh transformations, by the combinations of the Bernoulli distribution with the Gaussian, Gamma, log‐normal, generalized extreme value, generalized logistic and Pearson type III distributions and by the non‐parametric resampling, empirical cumulative distribution function and kernel density estimation methods. This paper is concentrated on the reliability of the 12 types of ensemble climatological forecasts. Specifically, mathematical formulations are presented and large‐sample tests are devised to verify the forecast reliability for the Multi‐Source Weighted‐Ensemble Precipitation version 2 across the globe. Climatological forecasts of monthly precipitation over 18,425 grid cells are generated for 30 years under leave‐one‐year‐out cross validation, leading to 6,633,000 (12 × 18425 × 30) sets of ensemble climatological forecasts. The results point out that the reliability of climatological forecasts considerably varies across the 12 methods, particularly in regions with high hydroclimatic variability. One observation is that climatological forecasts tend to deviate from the distributions of observations when there is inadequate flexibility to fit precipitation data. Another observation is that ensemble spreads can be overly wide when there exist overfits of sample‐specific noises in cross validation. Through the tests of global precipitation, the robustness of the log‐sinh transformation and the Bernoulli‐Gamma distribution is highlighted. Overall, the investigations can serve as a guidance on the uses of transformations, distributions and non‐parametric methods in generating climatological forecasts. Plain Language Summary: Ensemble climatological forecasts have been extensively used as the benchmark to evaluate forecast skill. That is, forecasts generated by a certain forecasting model are skillful when they outperform climatological forecasts and otherwise they are not. In practice, ensemble climatological forecasts are generated by different methods, including the log, Box‐Cox and log‐sinh transformations, the combinations of the Bernoulli distribution with the Gaussian, Gamma, log‐normal, generalized extreme value, generalized logistic and Pearson type III distributions and the non‐parametric resampling, empirical cumulative distribution function and kernel density estimation methods. It is important to investigate pros and cons of different types of climatological forecasts. Focusing on the reliability, that is, statistical consistency between forecasts and observations, this paper has devised large‐sample tests of global monthly precipitation. The results show that owing to hydroclimatic variability, different types of climatological forecasts exhibit varying characteristics of reliability. On the one hand, climatological forecasts can deviate from observations when there is inadequate flexibility to fit precipitation data, especially for the Bernoulli‐Gaussian distribution. On the other hand, ensemble spreads can be too wide when there exist overfits of sample‐specific noises in cross validation. Among the 12 methods, the robustness of the log‐sinh transformation and the Bernoulli‐Gamma distribution is highlighted. 
Key Points: (1) Climatological forecasts can be generated by using data transformations, statistical distributions, and non-parametric methods. (2) The reliability of climatological forecasts generated by different methods is shown to vary considerably in large-sample tests. (3) The robustness of the log-sinh transformation and the Bernoulli-Gamma distribution is illustrated through the tests of global precipitation. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
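One of the twelve benchmark generators discussed above, the Bernoulli-Gamma combination, can be sketched as follows; the fitting choices, ensemble size, and stand-in precipitation record are assumptions for illustration:

```python
import numpy as np
from scipy import stats

def bernoulli_gamma_climatology(monthly_precip, n_members=100, seed=None):
    """Ensemble climatological forecast from a Bernoulli-Gamma fit:
    occurrence ~ Bernoulli(p_wet); wet amounts ~ Gamma fitted to the nonzero totals."""
    rng = np.random.default_rng(seed)
    x = np.asarray(monthly_precip, dtype=float)
    wet = x[x > 0]
    p_wet = wet.size / x.size
    a, _, scale = stats.gamma.fit(wet, floc=0)           # MLE with the location fixed at zero
    occurrence = rng.random(n_members) < p_wet
    amounts = stats.gamma.rvs(a, scale=scale, size=n_members, random_state=rng)
    return np.where(occurrence, amounts, 0.0)

# stand-in for 30 historical January totals (mm) at one grid cell
history = np.array([0., 12., 45., 0., 30., 22., 5., 60., 0., 18., 33., 27., 8., 0., 41.,
                    19., 25., 0., 55., 14., 9., 38., 21., 0., 47., 16., 29., 11., 3., 36.])
print(bernoulli_gamma_climatology(history, n_members=10, seed=1))
```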
50. New horizon in fuzzy distributions: statistical distributions in continuous domains generated by Choquet integral.
- Author
-
Mehri-Dehnavi, Hossein, Agahi, Hamzeh, and Mesiar, Radko
- Subjects
- *
CONTINUOUS distributions , *DISTRIBUTION (Probability theory) , *GOLD sales & prices , *GAMMA distributions , *FUZZY integrals , *FUZZY measure theory , *HORIZON - Abstract
In this paper, some statistical properties of the Choquet integral are discussed. As an interesting application of the Choquet integral and fuzzy measures, we introduce a new class of exponential-like distributions related to monotone set functions, called Choquet exponential distributions, by combining the properties of the Choquet integral with the exponential distribution. We show that some well-known statistical distributions, such as the gamma, logistic, exponential, and Rayleigh distributions, are special cases of Choquet distributions. We then show that the proposed Choquet exponential distribution performs better in the analysis of daily gold price data. A real dataset of the daily number of people newly infected with coronavirus in the USA over the period from 2020/02/29 to 2020/10/19 is also analyzed. The method presented in this article opens a new horizon for future research. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
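The discrete Choquet integral underlying these distributions has a compact form: sort the integrand values and accumulate the increments weighted by the capacity of the set of elements at or above each level. A sketch with a small hypothetical fuzzy measure:

```python
def choquet_integral(values, capacity):
    """Discrete Choquet integral of nonnegative `values` (dict name -> value)
    with respect to a monotone set function `capacity` (dict frozenset -> weight)."""
    names = sorted(values, key=values.get)                # ascending by value
    total, prev = 0.0, 0.0
    for i, name in enumerate(names):
        coalition = frozenset(names[i:])                  # elements whose value is >= the current one
        total += (values[name] - prev) * capacity[coalition]
        prev = values[name]
    return total

# a small hypothetical fuzzy measure on three criteria, with interaction between a and b
caps = {frozenset(): 0.0, frozenset("a"): 0.3, frozenset("b"): 0.4, frozenset("c"): 0.5,
        frozenset("ab"): 0.6, frozenset("ac"): 0.8, frozenset("bc"): 0.9, frozenset("abc"): 1.0}
print(choquet_integral({"a": 0.2, "b": 0.5, "c": 0.9}, caps))   # 0.67
```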