Search Results
4,630 results for "BETA distribution"
2. Human Mastication Analysis—A DEM Based Numerical Approach.
- Author
- Mishra, Rajat, Deb, Sagar Kumar, Chakrabarty, Swasti, Das, Manojit, Das, Monalisa, Panda, Sushanta Kumar, Tiwary, Chandra Shekhar, and Arora, Amit
- Subjects
- MOISTURE content of food, DISCRETE element method, BETA distribution, MASTICATION, POLYHEDRA
- Abstract
Mastication is an essential and preliminary step of the digestion process involving fragmentation and mixing of food. Controlled muscle movement of jaws with teeth executes crushing, leading towards fragmentation of food particles. Understanding the various parameters involved in the process is essential to solving biomedical complications in the area of interest. However, exploring and analyzing such a process flow through an experimental route is challenging and inefficient. Computational techniques such as discrete element numerical modeling can effectively address such problems. The current work employs the Discrete Element Method (DEM) as a numerical modeling technique to simulate the human mastication process. Tavares and Ab-T10 breakage models coupled with Gaudin–Schumann and Incomplete Beta fragment distribution models have been implemented to analyze the fragment distribution of food particles. The effect of particle shape (spherical, polyhedron, and faceted cylinder), size (aspect ratio), and orientation (vertical and horizontal) on breakage and fragment distribution is analyzed. To account for the elastic–plastic behavior and moisture content of food particles, modifications have been made to the breakage models by incorporating a numerical softening factor and an adhesion force. The study demonstrates how numerical modeling techniques can be utilized to analyze the mastication process involving multiple process parameters. [ABSTRACT FROM AUTHOR]
- Published
- 2024
3. On the Log-Concavity of the Wright Function.
- Author
- Ferreira, Rui A. C. and Simon, Thomas
- Subjects
- BETA distribution, PROBLEM solving, ENTROPY, LOGICAL prediction
- Abstract
We investigate the log-concavity on the half-line of the Wright function ϕ(−α, β, −x), in the probabilistic setting α ∈ (0, 1) and β ≥ 0. Applications are given to the construction of generalized entropies associated to the corresponding Mittag-Leffler function. A natural conjecture for the equivalence between the log-concavity of the Wright function and the existence of such generalized entropies is formulated. The problem is solved for β ≥ α and in the classical case β = 1 − α of the Mittag-Leffler distribution, which exhibits a certain critical parameter α* = 0.771667... defined implicitly in terms of the Gamma function and characterizing the log-concavity. We also prove that the probabilistic Wright functions are always unimodal, and that they are multiplicatively strongly unimodal if and only if β ≥ α, or α ≤ 1/2 and β = 0. [ABSTRACT FROM AUTHOR]
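For reference, the Wright function appearing in this abstract is usually defined by the series below; this is the standard textbook definition, not a formula quoted from the paper.

```latex
\[
  \phi(\rho,\beta;z) \;=\; \sum_{k=0}^{\infty}\frac{z^{k}}{k!\,\Gamma(\rho k+\beta)},
  \qquad \rho > -1 .
\]
% The probabilistic case studied above is $\phi(-\alpha,\beta;-x)$ with
% $\alpha \in (0,1)$, $\beta \ge 0$ and $x \ge 0$.
```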
- Published
- 2024
4. The improbability of detecting trade-offs and some practical solutions.
- Author
- Johnson, Marc T J and Nassrullah, Zain
- Subjects
- SKEWNESS (Probability theory), BETA distribution, RESOURCE allocation, BIODIVERSITY, PLANT defenses
- Abstract
Trade-offs are a fundamental concept in evolutionary biology because they are thought to explain much of nature's biological diversity, from variation in life-histories to differences in metabolism. Despite the predicted importance of trade-offs, they are notoriously difficult to detect. Here we contribute to the existing rich theoretical literature on trade-offs by examining how the shape of the distribution of resources or metabolites acquired in an allocation pathway influences the strength of trade-offs between traits. We further explore how variation in resource distribution interacts with two aspects of pathway complexity (i.e., the number of branches and hierarchical structure) to affect trade-offs. We simulate variation in the shape of the distribution of a resource by sampling 10^6 individuals from a beta distribution with varying parameters to alter the resource shape. In a simple "Y-model" allocation of resources to two traits, any variation in a resource leads to slopes less than −1, with left-skewed and symmetrical distributions leading to negative relationships between traits, and highly right-skewed distributions associated with positive relationships between traits. Adding more branches further weakens negative and positive relationships between traits, and the hierarchical structure of pathways typically weakens relationships between traits, although in some contexts hierarchical complexity can strengthen positive relationships between traits. Our results further illuminate how variation in the acquisition and allocation of resources, and particularly the shape of a resource distribution and how it interacts with pathway complexity, makes it challenging to detect trade-offs. We offer several practical suggestions on how to detect trade-offs given these challenges. [ABSTRACT FROM AUTHOR]
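A minimal numerical sketch of the "Y-model" idea described above: a Beta-distributed resource is split between two traits and the resulting trait correlation is examined. The allocation rule and parameter values here are illustrative assumptions, not the authors' exact simulation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000  # the abstract samples 10^6 individuals

def y_model_corr(a: float, b: float) -> float:
    """Split a Beta(a, b) resource between two traits via an individual
    allocation fraction and return the correlation between the traits."""
    r = rng.beta(a, b, size=n)        # acquired resource per individual
    f = rng.uniform(0, 1, size=n)     # allocation fraction (illustrative choice)
    return np.corrcoef(f * r, (1 - f) * r)[0, 1]

# Left-skewed, symmetric and right-skewed resource distributions
for a, b in [(5, 1), (2, 2), (1, 5)]:
    print(f"resource ~ Beta({a},{b}): trait correlation = {y_model_corr(a, b):+.3f}")
```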
- Published
- 2024
5. User Behavior in Fast Charging of Electric Vehicles: An Analysis of Parameters and Clustering.
- Author
- Capeletti, Marcelo Bruno, Hammerschmitt, Bruno Knevitz, Silva, Leonardo Nogueira Fontoura da, Knak Neto, Nelson, Passinato Sausen, Jordan, Barriquello, Carlos Henrique, and Abaide, Alzenira da Rosa
- Subjects
- PROBABILITY density function, ELECTRIC charge, RENEWABLE energy sources, BETA distribution, DATA mining
- Abstract
The fast charging of electric vehicles (EVs) has stood out prominently as an alternative for long-distance travel. These charging events typically occur at public fast charging stations (FCSs) within brief timeframes, which creates a substantial demand for power and energy in a short period. To adequately prepare the system for the widespread adoption of EVs, it is imperative to comprehend and establish standards for user behavior. This study employs agglomerative clustering, kernel density estimation, beta distribution, and data mining techniques to model and identify patterns in these charging events. It utilizes telemetry data from charging events on highways, which are public and cost-free. Critical parameters such as state of charge (SoC), energy, power, time, and location are examined to understand user dynamics during charging events. The findings of this research provide clear insight into user behavior by separating charging events into five groups, which significantly clarifies user behavior and allows for mathematical modeling. The results also show that the FCSs have varying patterns according to location. They serve as a basis for future research, including topics for further investigation, such as integrating charging events with renewable energy sources, establishing load management policies, and generating accurate load forecasting models. [ABSTRACT FROM AUTHOR]
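A small sketch of the kind of modelling the abstract describes: arrival state-of-charge values (hypothetical data here) are rescaled to (0, 1) and fitted with a beta distribution, alongside a kernel density estimate. Variable names and parameter values are assumptions for illustration, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical arrival SoC values (percent) at a fast charging station
soc_arrival = rng.beta(2.0, 4.5, size=500) * 100

x = np.clip(soc_arrival / 100, 1e-6, 1 - 1e-6)    # rescale to the open interval (0, 1)
a, b, loc, scale = stats.beta.fit(x, floc=0, fscale=1)
print(f"Beta fit: a = {a:.2f}, b = {b:.2f}")

kde = stats.gaussian_kde(soc_arrival)              # nonparametric alternative
grid = np.linspace(0, 100, 101)
density = kde(grid)                                # density over the SoC range
print("KDE mode at SoC ≈", grid[density.argmax()], "%")
```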
- Published
- 2024
6. Application and improvement of continuous monitoring methods for artificial radionuclides based on Bayesian statistics.
- Author
- Li, Xiang, Huang, Qianhong, Xie, Yuxi, and Gong, Xueyu
- Subjects
- BETA distribution, BINOMIAL distribution, RADIOISOTOPES, STATISTICS
- Abstract
This paper delves into the Bayesian statistics applications of three preeminent models, the Poisson, Gaussian, and binomial distributions, in the continuous surveillance of artificial radionuclides. It introduces a slide-window method to accelerate the updating of the prior distribution of the model parameters, and the performances of the three models are compared before and after utilizing this method. Experimental results demonstrate a marked enhancement in the performance of all models. [ABSTRACT FROM AUTHOR]
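A hedged sketch of what a slide-window prior update could look like for the Poisson case mentioned above: a conjugate Gamma prior is refreshed using only the counts inside a fixed-length window. The window length, priors, and simulated counts are illustrative assumptions, not the paper's settings.

```python
from collections import deque
import numpy as np

def sliding_gamma_poisson(counts, window=60, a0=1.0, b0=1.0):
    """Yield the posterior mean count rate after each new observation, using only
    the counts inside the sliding window (conjugate Gamma-Poisson update)."""
    buf = deque(maxlen=window)
    for c in counts:
        buf.append(c)
        a_post = a0 + sum(buf)        # shape: prior + windowed counts
        b_post = b0 + len(buf)        # rate: prior + number of counting intervals
        yield a_post / b_post         # posterior mean of the Poisson rate

rng = np.random.default_rng(3)
background = rng.poisson(5, 300)                  # stable background counts
spike = rng.poisson(12, 50)                       # simulated activity increase
rates = list(sliding_gamma_poisson(np.concatenate([background, spike])))
print(f"estimated rate before spike ≈ {rates[299]:.2f}, after ≈ {rates[-1]:.2f}")
```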
- Published
- 2024
7. Sparse-group boosting: Unbiased group and variable selection.
- Author
- Obster, Fabian and Heumann, Christian
- Subjects
- RANDOM variables, BETA distribution, REGULARIZATION parameter, DEGREES of freedom, ORGANIZATIONAL research
- Abstract
For grouped covariates, we propose a framework for boosting that allows for sparsity within and between groups. By using component-wise and group-wise gradient ridge boosting simultaneously with adjusted degrees of freedom or penalty parameters, a model with similar properties to the sparse-group lasso can be fitted through boosting. We show that within-group and between-group sparsity can be controlled by a mixing parameter, and discuss similarities and differences to the mixing parameter in the sparse-group lasso. Furthermore, we show under which conditions variable selection on a group or individual variable basis happens, and provide selection bounds for the regularization parameters depending solely on the singular values of the design matrix in a boosting iteration of linear ridge-penalized boosting. In special cases, we characterize the selection chance of an individual variable vs. a group of variables through a generalized beta prime distribution. With simulations as well as two real datasets from ecological and organizational research, we show the effectiveness and predictive competitiveness of this novel estimator. The results suggest that in the presence of grouped variables, sparse-group boosting is associated with less biased variable selection and higher predictability compared to component-wise or group-component-wise boosting. [ABSTRACT FROM AUTHOR]
- Published
- 2024
8. New alpha power transformed beta distribution with its properties and applications.
- Author
- Agegnehu, Adimias Wendimagegn, Goshu, Ayele Taye, Arero, Butte Gotu, Odoi, Benjamin, and Nascimento, Abraao
- Subjects
- DISTRIBUTION (Probability theory), PROBABILITY density function, BETA distribution, STATISTICS, PROBABILITY theory
- Abstract
The main purpose of this paper is to introduce a new alpha power transformed beta probability distribution that reveals interesting properties. The study provides a comprehensive explanation of the statistical characteristics of this innovative model. Various properties of the new distribution were derived using the baseline beta distribution, statistical techniques, and probabilistic axioms. These include the probability density, cumulative distribution, survival function, hazard function, moments about the origin, moment generating function, and order statistics. For parameter estimation, the maximum likelihood estimation method with the Newton–Raphson numerical technique is employed. To evaluate the performance of the estimation method, the mean squared errors of the estimated parameters for different simulated sample sizes are used. In addition, simulation studies of the new distribution are conducted to demonstrate the behavior of the probability model. To demonstrate the practical utility and flexibility of the alpha power transformed beta distribution, it is fitted to two real-life datasets and compared to commonly known probability distributions such as the Weibull, exponential Weibull, Beta, and Kumaraswamy beta distributions. It offers a superior fit to the data considered. The new distribution reveals a wide range of probability density function shapes and flexible hazard rates. The distribution is a new contribution to the field of statistical and probability theory. The findings of the study can be used as a basis for future research in the area of statistical science and health. [ABSTRACT FROM AUTHOR]
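The abstract does not reproduce the new CDF. Assuming the authors apply the standard alpha power transformation to a beta baseline G(x; a, b) with density g, the construction would take the form below; this is an assumption based on the usual alpha power transform, not a formula quoted from the paper.

```latex
\[
  F(x;\alpha,a,b)=
  \begin{cases}
    \dfrac{\alpha^{\,G(x;a,b)}-1}{\alpha-1}, & \alpha>0,\ \alpha\neq 1,\\[1.2ex]
    G(x;a,b), & \alpha=1,
  \end{cases}
  \qquad
  f(x;\alpha,a,b)=\frac{\ln\alpha}{\alpha-1}\,g(x;a,b)\,\alpha^{\,G(x;a,b)}
  \quad(\alpha\neq 1),
\]
% where g and G are the density and CDF of the baseline beta distribution on (0, 1).
```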
- Published
- 2024
9. Wind speed and direction on predicting wind energy availability for a sustainable ecosystem.
- Author
- Baranitharan, B., Sivakumar, D., and Selvan, M. Perarul
- Subjects
- RENEWABLE energy sources, CLEAN energy, BETA distribution, WIND power plants, WEIBULL distribution, WIND power
- Abstract
BACKGROUND AND OBJECTIVES: Overusing renewable resources for various purposes is making it necessary to use fewer non-renewable ones to generate energy. Finding alternative renewable energy sources is essential for energy production. This study concentrated on using wind direction and speed to produce wind energy among renewable energy sources. Data on wind direction and speed were statistically analyzed to determine the current distribution pattern, which is then used to project the amount of wind energy that will be available in the future. METHODS: This study concentrated on choosing wind direction and speed to minimize the potential for current electricity generation from wind turbines, using data collected between 1981 and 2023. The wind speed and direction distribution pattern was assessed through the Weibull distribution, the beta distribution, and the three-parameter Weibull distribution. The Anderson-Darling test and the Kolmogorov-Smirnov test were employed in this study to determine the goodness-of-fit of a specific distribution. The forecasting analysis was extended from 2024 to 2050 based on the three-parameter Weibull distribution and the Anderson-Darling test results for future sustainable wind energy production. FINDINGS: The average wind speed was found to be 6.51 meters per second, with a standard deviation of 0.280 meters per second, between 1981 and 2023. The wind direction varied between a minimum of 3.56 and a maximum of 356.44 degrees for the same duration. The study found that the three-parameter Weibull distribution produced less error in the wind speed data distribution pattern than both the Weibull and beta distributions, based on the results of the Anderson-Darling and Kolmogorov-Smirnov tests. From both tests on the Weibull, beta, and three-parameter Weibull distributions, this study found that the Anderson-Darling test was the most appropriate for forecasting the wind speed corresponding to the wind direction for the period between 2024 and 2050 to produce sustainable wind energy from the wind turbine. CONCLUSION: The outcomes of this study demonstrate that there is a good likelihood that the three-parameter Weibull distribution and the Anderson-Darling test will be used in other nations to aid in the complementary integration of wind energy. This research has the potential to significantly reduce the amount of environmentally hazardous energy sources used to meet societal requirements. This study offers a trustworthy technique for assessing wind direction and speed, which helps design sustainable wind power plants, construct engineering curricula, and estimate clean, environmentally friendly energy sources. [ABSTRACT FROM AUTHOR]
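A sketch of the distribution-fitting step described above, using synthetic wind-speed data. Only Kolmogorov–Smirnov statistics are computed here; the Anderson–Darling comparison used in the study is omitted, and all parameter values are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
wind = rng.weibull(2.2, size=5000) * 7.3        # synthetic hourly wind speeds (m/s)

# Two-parameter (location fixed at zero) and three-parameter Weibull fits
c2, loc2, s2 = stats.weibull_min.fit(wind, floc=0)
c3, loc3, s3 = stats.weibull_min.fit(wind)

# A beta fit needs the speeds rescaled onto the open interval (0, 1)
u = np.clip((wind - wind.min()) / (wind.max() - wind.min()), 1e-6, 1 - 1e-6)
a, b, _, _ = stats.beta.fit(u, floc=0, fscale=1)

for name, args in [("2-parameter Weibull", (c2, loc2, s2)),
                   ("3-parameter Weibull", (c3, loc3, s3))]:
    ks = stats.kstest(wind, "weibull_min", args=args)
    print(f"{name}: KS statistic = {ks.statistic:.4f}")
print(f"beta fit on rescaled speeds: a = {a:.2f}, b = {b:.2f}")
```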
- Published
- 2024
10. Applications of the prediction of satisfaction design for monitoring single-arm phase II trials.
- Author
- Djeridi, Zohra, Ghouar, Ahlem, Boulares, Hamid, and Bouye, Mohamed
- Subjects
- FALSE positive error, BETA distribution, SATISFACTION, FRUSTRATION, LUNG cancer
- Abstract
Prediction of satisfaction design, with binary endpoints, is an innovative strategy for phase II trials. We explain this hybrid frequentist-Bayesian strategy with an adept statistical plan and thorough findings, incorporating a description of study design features such as the sample size and the beta prior distribution, to simplify the Bayesian design. We also provide a set of tables and figures ranging from the stopping boundary for futility to the prediction of satisfaction, performance (type I error, power, and the probability of early termination, PET), and sensitivity analysis for the prediction of satisfaction. The statistical plan includes the operating characteristics obtained through a simulation study. Several trial examples from phase II lung cancer studies demonstrate the approach's practical use. The prediction of satisfaction design presents a flexible method for clinical studies. This statistical study adds value to the application by broadening its scope. [ABSTRACT FROM AUTHOR]
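A hedged sketch of the Bayesian ingredients such a design relies on: a Beta prior on the response rate, its conjugate posterior after an interim look, and the beta-binomial predictive distribution of future responses. The prior and the trial numbers are illustrative, not taken from the article.

```python
from scipy import stats

a0, b0 = 0.5, 0.5        # illustrative Beta prior on the response rate
n1, x1 = 20, 5           # interim look: patients enrolled and responses observed
n2 = 15                  # patients still to be enrolled

# Posterior after the interim data (conjugate Beta update)
a1, b1 = a0 + x1, b0 + (n1 - x1)
print("posterior mean response rate:", round(a1 / (a1 + b1), 3))

# Predictive distribution of future responses is beta-binomial
pred = stats.betabinom(n2, a1, b1)
print("P(at least 6 further responses) =", round(1 - pred.cdf(5), 3))
```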
- Published
- 2024
11. On Extended Beta Function and Related Inequalities.
- Author
- Parmar, Rakesh K., Pogány, Tibor K., and Teofanov, Ljiljana
- Subjects
- BETA functions, DISTRIBUTION (Probability theory), INTEGRAL functions, BETA distribution, INTEGRAL representations
- Abstract
In this article, we consider a unified generalized version of the extended Euler Beta function's integral form involving a Macdonald function in the kernel. Moreover, we establish functional upper and lower bounds for this extended Beta function. Here, we consider the most general case of the four-parameter Macdonald function K_{ν+1/2}(p t^{−λ} + q (1 − t)^{−μ}) when λ ≠ μ in the argument of the kernel. We prove related bounding inequalities, simultaneously complementing the recent results by Parmar and Pogány in which the extended Beta function case λ = μ is resolved. The main mathematical tools are integral representations and fixed-point iterations that are used for obtaining the stationary points of the argument of the Macdonald kernel function K_{ν+1/2}. [ABSTRACT FROM AUTHOR]
- Published
- 2024
12. Quantum properties of classical Pearson random variables.
- Author
- Accardi, Luigi, Ella, Abdon Ebang, Ji, Un Cig, and Lu, Yun Gang
- Subjects
- POLYNOMIAL operators, PSEUDODIFFERENTIAL operators, DIFFERENTIAL operators, OPERATOR equations, BETA distribution
- Abstract
This paper discusses the properties of the canonical quantum decomposition of the classical Pearson random variables. We show that this leads to the problem of representing the creation–annihilation–preservation (CAP) operators canonically associated to a real-valued random variable X with all moments as (normally ordered) differential operators with polynomial coefficients — a problem already studied in the literature (see references in the introduction). We deduce a formula for the polynomial coefficients in the representation of the CAP operators of X as pseudo-differential operators that is more explicit than the one existing in the literature. We give a new characterization of the Pearson distributions in terms of the Hermitianity of the associated Sturm–Liouville operators. In the second part of the paper, we introduce the notion of a finite type random variable X and characterize type-2 and type-3 real-valued random variables. We prove that a necessary condition for X to be of finite type is the polynomial growth of the corresponding principal Jacobi sequence. This allows us to single out three classes of random variables of infinite type and to prove that the Beta and the uniform distributions are of infinite type. [ABSTRACT FROM AUTHOR]
- Published
- 2024
13. Modeling the Liberation of Iron Ores with Different Grain Sizes Considering Intergranular Fracture.
- Author
- Chen, Keqiang, Yin, Wanzhong, Zuo, Weiran, and Fu, Yafeng
- Subjects
- OXIDE minerals, FERRIC oxide, IRON ores, BETA distribution, GRAIN size
- Abstract
Predicting the liberation distribution of minerals in comminuted ore particles is one of the fundamental problems in mineral processing. However, there is still no widely accepted liberation model due to the complexity of the mineral fracture mechanism. In this study, the shape constants of the cumulative beta distribution were further optimized, based on an investigation of the liberation distribution and fracture mechanism of iron oxide minerals in three types of iron ores with different grain sizes under locked-cycle grinding and batch grinding, to model the liberation distribution of iron oxide minerals with different grinding methods. The results showed that the liberation distribution of iron oxide minerals in the three types of iron ores with locked-cycle grinding and batch grinding was related to the intergranular fracture proportion and the closeness of ore particle size and mineral grain size. By relating the shape constants of the cumulative beta distribution to ore particle size, mineral grain size, and the proportion of intergranular fracture, the developed liberation model predicted the liberation distribution of iron oxide minerals with different grinding methods well, with a coefficient of determination (R²) between 0.85 and 0.93. [ABSTRACT FROM AUTHOR]
- Published
- 2024
14. Beta Distribution Function for Cooperative Spectrum Sensing against Byzantine Attack in Cognitive Wireless Sensor Networks.
- Author
- Wu, Jun, Liu, Tianle, and Zhao, Rui
- Subjects
- DISTRIBUTION (Probability theory), BETA distribution, CONSUMER cooperatives, BETA functions, ERROR probability
- Abstract
In order to explore more spectrum resources to support sensors and their related applications, cognitive wireless sensor networks (CWSNs) have emerged to identify available channels being underutilized by the primary user (PU). To improve the detection accuracy of the PU signal, cooperative spectrum sensing (CSS) among sensor paradigms is proposed to make a global decision about the PU status for CWSNs. However, CSS is susceptible to Byzantine attacks from malicious sensor nodes due to its open nature, resulting in wastage of spectrum resources or harmful interference to PUs. To suppress the negative impact of Byzantine attacks, this paper proposes a beta distribution function (BDF) for CSS among multiple sensors, which includes a sequential process, a beta reputation model, and weight evaluation. Based on the sequential probability ratio test (SPRT), we integrate the proposed beta reputation model into the SPRT, improving the positive impact of reliable sensor nodes on the global decision while reducing the negative impact of unreliable ones. The numerical simulation results demonstrate that, compared to the SPRT and the weighted sequential probability ratio test (WSPRT), the proposed BDF performs outstandingly in terms of the error probability and the average number of samples under various attack ratios and probabilities. [ABSTRACT FROM AUTHOR]
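A minimal sketch of a beta reputation weight of the kind the abstract describes, assuming each sensor's past reports have been judged correct s times and incorrect f times; the exact weighting used inside the proposed BDF is not given in the abstract, so the fusion step here is only indicative.

```python
def beta_reputation(s: int, f: int) -> float:
    """Expected value of a Beta(s + 1, f + 1) reputation distribution."""
    return (s + 1) / (s + f + 2)

# A reliable sensor versus a Byzantine sensor that often reports falsified results
sensors = {"node_A": (95, 5), "node_B": (40, 60)}
weights = {k: beta_reputation(s, f) for k, (s, f) in sensors.items()}
total = sum(weights.values())
weights = {k: w / total for k, w in weights.items()}   # normalise for data fusion
print(weights)   # node_A dominates the global decision
```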
- Published
- 2024
15. Normal-beta exponential stochastic frontier model: Maximum simulated likelihood approach.
- Author
- Nigusie, Misgan Desale
- Subjects
- DISTRIBUTION (Probability theory), PROBABILITY density function, BETA distribution, LIKELIHOOD ratio tests, STOCHASTIC models
- Abstract
This paper considers the beta exponential distribution as the distribution of the inefficiency score in a stochastic frontier model. The beta exponential distribution is a three-parameter distribution, and it is more flexible than the probability density functions commonly used in a stochastic frontier model (SFM). This new model, a "Normal-Beta Exponential SFM", nests five other SFMs. This paper presents a simulated log-likelihood function and a simulated inefficiency estimator for the normal-beta exponential SFM, a closed-form log-likelihood function and a closed-form inefficiency estimator for the normal-weighted exponential SFM, and an empirical study using the normal-beta exponential SFM. In our empirical study, we have used a likelihood ratio test to compare the performance of the SFMs, and the normal-beta exponential SFM fits the data better than the other nested special-case SFMs. Furthermore, the empirical results show that the parameters of a normal-beta exponential SFM can be estimated with smaller standard errors, i.e., higher certainty, than those of a normal-gamma SFM. [ABSTRACT FROM AUTHOR]
- Published
- 2024
16. Propagation of Non-Adiabatic Heterogeneous Solid Combustion Waves with Effect to Internal Microstructure and Heat Release.
- Author
- Bharat, Naine Tarun and Mishra, Debi Prasad
- Subjects
- HEAT release rates, DISTRIBUTION (Probability theory), INTERNAL waves, BETA distribution, HEAT losses
- Abstract
The propagation of a combustion front subject to external heat loss is studied using two-dimensional discrete periodic and disordered systems. Discrete periodic systems are modeled by a regular arrangement of burnt and unburnt point heat sources characterized by an identical rate of heat release and different heat loss parameters. Discrete disordered systems, namely S1, S2, and S3, are considered with a concatenation of regular burnt and unburnt point heat sources distributed by different shaping parameters of the beta distribution function. The unburnt point heat sources in a discrete disordered system are characterized by different heat loss parameters and the incorporation of two categories of distributions for the rate of heat release, namely (i) gamma (a = 20) and (ii) gamma (a = 2). Oscillations in ignition time delay, with the manifestation of a bifurcation pattern, have been detected for discrete periodic systems. In a discrete disordered system, the external heat loss plays a crucial role in developing fingering instabilities and determining the early onset of unstable combustion, quenching, and the combustion limit. The degree of randomness in the distribution of heat release for the disordered system affects the duration of combustion and the burn rate without affecting the combustion limit. Computed values of the burn rate and combustion limit from the discrete combustion model have been compared with available experimental data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
17. The dynamics of evolutionary branching in an ecological model.
- Author
- Cropp, Roger and Norbury, John
- Subjects
- CLIMATE change models, BETA distribution, ECOSYSTEM dynamics, EVOLUTION equations, ECOLOGICAL models
- Abstract
Eco-evolutionary modelling involves the coupling of ecological equations to evolutionary ones. The interaction between ecological dynamics and evolutionary processes is essential to simulating evolutionary branching, a precursor to speciation. The creation and maintenance of biodiversity in models depends upon their ability to capture the dynamics of evolutionary branching. Understanding these systems requires low-dimension models that are amenable to analysis. The rapid reproduction rates of marine plankton ecosystems and their importance in determining the fluxes of climatically important gases between the ocean and atmosphere suggest that the next generation of global climate models needs to incorporate eco-evolutionary models in the ocean. This requires simple population-level models, that can represent such eco-evolutionary processes with orders of magnitude fewer equations than models that follow the dynamics of individual phenotypes. We present a general framework for developing eco-evolutionary models and consider its general properties. This framework defines a fitness function and assumes a beta distribution of phenotype abundances within each population. It simulates the change in total population size, the mean trait value, and the trait differentiation, from which the variance of trait values in the population may be calculated. We test the efficacy of the eco-evolutionary modelling framework by comparing the dynamics of evolutionary branching in a six-equation eco-evolutionary model that has evolutionary branching, with that of an equivalent one-hundred equation model that simulates the dynamics of every phenotype in the population. The latter model does not involve a population fitness function, nor does it assume a distribution of phenotype abundance across trait values. The eco-evolutionary population model and the phenotype model produce similar evolutionary branching, both qualitatively and quantitatively, in both symmetric and asymmetric fitness landscapes. In order to better understand the six-equation model, we develop a heuristic three-equation eco-evolutionary model. We use the density-independent mortality parameter as a convenient bifurcation parameter, so that differences in evolutionary branching dynamics in symmetric and asymmetric fitness landscapes may be investigated. This model shows that evolutionary branching of a stable population is flagged by a zero in the local trait curvature; the trait curvature then changes sign from negative to positive and back to negative, along the solution. It suggests that evolutionary branching points may be generated differently, with different dynamical properties, depending upon, in this case, the symmetry of the system. It also suggests that a changing environment, that may change attributes such as mortality, could have profound effects on an ecosystem's ability to adapt. Our results suggest that the properties of the three-dimensional model can provide useful insights into the properties of the higher-dimension models. In particular, the bifurcation properties of the simple model predict the processes by which the more complicated models produce evolutionary branching points. The corresponding bifurcation properties of the phenotype and population models, evident in the dynamics of the phenotype distributions they predict, suggest that our eco-evolutionary modelling framework captures the essential properties that underlie the evolution of phenotypes in populations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
18. Process accident prediction using Bayesian network based on IT2Fs and Z-number: A case study of spherical tanks.
- Author
- Aliabadi, Mostafa Mirzaei, Abbassi, Rouzbeh, Kalatpour, Omid, Ahmadi, Omran, and Moshiran, Vahid Ahmadi
- Subjects
- SOFT sets, BAYESIAN analysis, FAULT trees (Reliability engineering), BETA distribution, SYSTEM identification
- Abstract
This study aimed to propose a novel method for dynamic risk assessment using a Bayesian network (BN) based on fuzzy data to decrease uncertainty compared to traditional methods, by integrating Interval Type-2 Fuzzy Sets (IT2FS) and Z-numbers. A bow-tie diagram was constructed by employing the System Hazard Identification, Prediction, and Prevention (SHIPP) approach, the Top Event Fault Tree, and the Barriers Failure Fault Tree. The experts then provided their opinions and confidence levels on the prior probabilities of the basic events, which were quantified utilizing the IT2FS and combined using the Z-number to reduce the uncertainty of the prior probability. The posterior probability of the critical basic events (CBEs) was obtained using the beta distribution, based on recorded data on their requirements and failure rates over five years. This information was then fed into the BN. Updating the BN allowed calculating the posterior probability of barrier failure and consequences. Spherical tanks were used as a case study to demonstrate and confirm the significant benefits of the methodology. The results indicated that the overall posterior probability of the consequences after barrier failure displayed an upward trend over the 5-year period. This rise in the IT2FS-Z calculation outcomes exhibited a shallower slope compared to the IT2FS mode, attributed to the impact of experts' confidence levels in the IT2FS-Z mode. These differences became more evident when considering the 10^−4 variance compared to the 10^−5 one. This study offers industry managers a more comprehensive and reliable understanding of achieving the most effective accident prevention performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
19. The Exact Density of the Eigenvalues of the Wishart and Matrix-Variate Gamma and Beta Random Variables.
- Author
- Mathai, A. M. and Provost, Serge B.
- Subjects
- SYMMETRIC functions, BETA distribution, GAMMA distributions, DIFFERENTIAL equations, BETA functions, GAMMA functions
- Abstract
The determination of the distributions of the eigenvalues associated with matrix-variate gamma and beta random variables of either type proves to be a challenging problem. Several of the approaches utilized so far yield unwieldy representations that, for instance, are expressed in terms of multiple integrals, functions of skew symmetric matrices, ratios of determinants, solutions of differential equations, zonal polynomials, and products of incomplete gamma or beta functions. In the present paper, representations of the density functions of the smallest, largest and j th largest eigenvalues of matrix-variate gamma and each type of beta random variables are explicitly provided as finite sums when certain parameters are integers and, as explicit series, in the general situations. In each instance, both the real and complex cases are considered. The derivations initially involve an orthonormal or unitary transformation whereby the wedge products of the differential elements of the eigenvalues can be worked out from those of the original matrix-variate random variables. Some of these results also address the distribution of the eigenvalues of a central Wishart matrix as well as eigenvalue problems arising in connection with the analysis of variance procedure and certain tests of hypotheses in multivariate analysis. Additionally, three numerical examples are provided for illustration purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
20. Asymptotic approximations of expectations of power means.
- Author
- Burić, Tomislav and Mihoković, Lenka
- Subjects
- DISTRIBUTION (Probability theory), CONTINUOUS distributions, FINANCIAL statistics, BETA distribution, LOGNORMAL distribution
- Abstract
In this paper we study how the expectations of power means behave asymptotically as some relevant parameter approaches infinity and how to approximate them accurately for general nonnegative continuous probability distributions. We derive approximation formulae for such expectations as distribution mean increases, and apply them to some commonly used distributions in statistics and financial mathematics. By numerical computations we demonstrate the accuracy of the proposed formulae which behave well even for smaller sample sizes. Furthermore, analysis of behavior depending on sample size contributes to interesting connections with the power mean of probability distribution. [ABSTRACT FROM AUTHOR]
- Published
- 2024
21. A testability evaluation method for naval gun guided munitions based on data fusion across development stages.
- Author
- 应文健, 程雨森, 王 旋, and 孙世岩
- Subjects
- DISTRIBUTION (Probability theory), BETA distribution, MULTISENSOR data fusion, INFORMATION resources, EVALUATION methodology
- Abstract
Copyright of Systems Engineering & Electronics is the property of Journal of Systems Engineering & Electronics Editorial Department and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
22. Integrating renewable energy sources and electric vehicles in dynamic economic emission dispatch: an oppositional-based equilibrium optimizer approach.
- Author
- Soni, Jatin and Bhattacharjee, Kuntal
- Subjects
- RENEWABLE energy sources, WEIBULL distribution, BETA distribution, SUSTAINABILITY, ELECTRIC vehicles
- Abstract
This article proposes a solution to the Dynamic Economic Emission Dispatch (DEED) problem, which incorporates wind, solar and plug-in electric vehicles (PEVs) into the optimization challenge. The new model, called Wind-Solar-Plug in Electric Vehicle (WSPEV) DEED, utilizes an optimization technique called the Oppositional-based Equilibrium Optimizer (OEO) method with the Weibull and Beta distributions to model wind and solar resources. The charging and discharging patterns of PEVs are also considered in the model. The proposed approach is evaluated through several scenarios involving Renewable Energy Sources (RESs) and PEVs, and the simulation results demonstrate the effectiveness of the proposed model in achieving a sustainable and cost-effective power system. The WSPEV DEED model provides a valuable solution to the DEED problem, which is crucial for successfully integrating RESs and PEVs into the power system for a sustainable future. [ABSTRACT FROM AUTHOR]
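A short sketch of how wind and solar availability are typically sampled for dispatch scenarios of this kind, with Weibull-distributed wind speed and Beta-distributed solar irradiance. The turbine power curve and all parameter values are illustrative assumptions; the abstract does not list the authors' values.

```python
import numpy as np

rng = np.random.default_rng(42)

def wind_power(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2.0):
    """Piecewise wind-turbine power curve in MW (illustrative parameters)."""
    p = np.where((v >= v_in) & (v < v_rated),
                 p_rated * (v - v_in) / (v_rated - v_in), 0.0)
    return np.where((v >= v_rated) & (v <= v_out), p_rated, p)

v = rng.weibull(2.0, size=24) * 8.0          # hourly wind speeds (m/s), Weibull
g = rng.beta(2.5, 3.0, size=24) * 1000.0     # hourly irradiance (W/m^2), Beta
solar_power = 0.5 * (g / 1000.0)             # 0.5 MW plant at reference irradiance

print("daily wind energy  (MWh):", round(wind_power(v).sum(), 2))
print("daily solar energy (MWh):", round(solar_power.sum(), 2))
```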
- Published
- 2024
23. On Point Estimators for Gamma and Beta Distributions.
- Author
- Papadatos, Nickos D.
- Abstract
Let X1, ..., Xn be a random sample from the gamma distribution with density f(x) = λ^α x^{α−1} e^{−λx}/Γ(α), x > 0, where both α > 0 (the shape parameter) and λ > 0 (the reciprocal scale parameter) are unknown. The main result shows that the uniformly minimum variance unbiased estimator (UMVUE) of the shape parameter α exists if and only if n ≥ 4; moreover, it has finite variance if and only if n ≥ 6. More precisely, the form of the UMVUE is given for all parametric functions α, λ, 1/α, and 1/λ. Furthermore, a highly efficient estimating procedure for the two-parameter beta distribution is also given. This is based on a Stein-type covariance identity for the beta distribution, followed by an application of the theory of U-statistics and the delta-method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
24. Optimal repetitive reliability inspection of manufactured lots for lifetime models using prior information.
- Author
- Pérez-González, Carlos J., Fernández, Arturo J., Giner-Bosch, Vicent, and Carrión-García, Andrés
- Subjects
- INTEGRATED circuits manufacturing, BETA distribution, LOCATION problems (Programming), NONLINEAR programming, PRODUCTION planning, ACCELERATED life testing, RELIABILITY in engineering, NONLINEAR equations, TEST reliability
- Abstract
Repetitive group inspection of production lots is considered to develop the failure censored plan with minimal expected sampling effort using prior information. Optimal reliability test plans are derived for the family of log-location-scale lifetime distributions, whereas a limited beta distribution is assumed to model the proportion nonconforming, p. A highly efficient and quick step-by-step algorithm is proposed to solve the underlying mixed nonlinear programming problem. Conventional repetitive group plans are often very effective in reducing the average sample number with respect to other inspection schemes, but sample sizes may increase under certain conditions such as high censoring. The inclusion of previous knowledge from past empirical results contributes to drastically reduce the amount of sampling required in life testing. Moreover, the use of expected sampling risks significantly improves the assessment of the actual producer and consumer sampling risks. Several tables and figures are presented to analyse the effect of the available prior evidence about p. The results show that the proposed lot inspection scheme clearly outperforms the standard repetitive group plans obtained under the traditional approach based on conventional risks. Finally, an application to the manufacture of integrated circuits is included for illustrative purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2023
25. Classical and Bayesian estimations of performance measures in Geo/Geo/1 queue.
- Author
- Song, Yang and Liu, Xinying
- Subjects
- MARKOV chain Monte Carlo, BETA distribution, HYPERGEOMETRIC functions, GAUSSIAN function, COMPUTER simulation
- Abstract
In this article, we consider an early arrival Geo/Geo/1 queue from the statistical perspective. The estimation of parameters and system performance measures under the Classical and Bayesian frameworks is considered. Based on queue length data, we attempt to study the uniformly minimum-variance unbiased estimators (UMVUEs) and the closed-form Bayesian estimators for various queueing characteristics. We compare their properties using Monte Carlo (MC) simulations. As a comparison and improvement, based on arrival and departure data over a period, we use a bivariate Beta distribution as the joint prior and solve for the performance estimators using the properties of the Gaussian hypergeometric function. We derive Bayesian estimators and conduct numerical simulations using the Markov Chain Monte Carlo (MCMC) method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
26. A Bayesian approach with double group sampling plan to estimate quality regions for proportion of nonconforming products in industry based on beta prior.
- Author
- Hafeez, Waqar, Du, Jianguo, Aziz, Nazrina, Ullah, Khalil, Wong, Wing-Keung, Imran, Muhammad, and Abbas, Zameer
- Subjects
- MANUFACTURING defects, ACCEPTANCE sampling, STATISTICAL process control, BETA distribution, PRODUCT acceptance, BINOMIAL distribution, ESTIMATES
- Abstract
To succeed in the global market, firms must prioritize quality over individual goals and preferences. One of the two primary approaches for ensuring quality is acceptance sampling, which is employed in statistical process control to inspect attributes of the product for acceptance. In acceptance sampling, the lot is either accepted or not accepted based on predetermined acceptance criteria for inspection. This paper presents a proposed Bayesian double group sampling plan (BDGSP) for estimating quality regions. The binomial distribution is used to construct the likelihood function for both nonconforming and conforming products based on the acceptance criteria. To calculate the average probability of acceptance, the beta distribution is used as a suitable prior distribution for the binomial distribution. Four distinct quality regions are predicted for various indicated producer's and consumer's risk levels. Based on various combinations of values for the design parameters, the suggested plan generates variation point values. Risks for producers and consumers are related to the acceptable quality level and the limiting quality level. To track the effects of changes in the values of the specified parameters, operating characteristic curves are utilized. The applicability of the proposed plan for current industrial strategies is demonstrated using a real dataset. [ABSTRACT FROM AUTHOR]
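For a single-stage simplification of the plan described above, the average probability of acceptance under a Beta(a, b) prior on the fraction nonconforming reduces to a beta-binomial tail probability, which scipy evaluates directly. The double group plan itself needs the two-stage acceptance rules, which the abstract does not spell out, so the numbers below are purely illustrative.

```python
from scipy import stats

def average_prob_acceptance(n: int, c: int, a: float, b: float) -> float:
    """P(accept) = E_p[ P(X <= c | p) ] with X ~ Binomial(n, p), p ~ Beta(a, b).
    Marginally X is beta-binomial, so this is just its CDF at c."""
    return stats.betabinom.cdf(c, n, a, b)

# Illustrative plan: sample 50 items, accept if at most 2 are nonconforming,
# with prior belief about the process fraction nonconforming ~ Beta(1, 24)
print(round(average_prob_acceptance(n=50, c=2, a=1.0, b=24.0), 4))
```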
- Published
- 2024
27. Facilitating clinical use of the Amsterdam Instrumental Activities of Daily Living Questionnaire: Normative data and a diagnostic cutoff value.
- Author
- Postema, Merel C., Dubbelman, Mark A., Claesen, Jürgen, Ritchie, Craig, Verrijp, Merike, Visser, Leonie, Visser, Pieter-Jelle, Zwan, Marissa D., van der Flier, Wiesje M., and Sikkes, Sietske A.M.
- Subjects
- ALZHEIMER'S disease, ACTIVITIES of daily living, BETA distribution, BRAIN research, REFERENCE values
- Abstract
Objective: The Amsterdam Instrumental Activities of Daily Living Questionnaire (A-IADL-Q) is well validated and commonly used to assess difficulties in everyday functioning regarding dementia. To facilitate interpretation and clinical implementation across different European countries, we aim to provide normative data and a diagnostic cutoff for dementia. Methods: Cross-sectional data from the Dutch Brain Research Registry (N = 1,064; mean (M) age = 62 ± 11 years; 69.5% female), the European Medical Information Framework-Alzheimer's Disease 90+ (N = 63; Mage = 92 ± 2 years; 52.4% female), and the European Prevention of Alzheimer's Dementia Longitudinal Cohort Study (N = 247; Mage = 63 ± 7 years; 72.1% female) were used. The generalized additive models for location, scale, and shape framework was used to obtain normative values (Z-scores). The beta distribution was applied, and combinations of age, sex, and educational attainment were modeled. The optimal cutoff for dementia was calculated using the area under the receiver operating characteristic curve (AUC-ROC) and the Youden Index, using data from the Amsterdam Dementia Cohort (N = 2,511, Mage = 64 ± 8 years, 44.4% female). Results: The best normative model accounted for a cubic-like decrease of IADL performance with age that was more pronounced in low compared to medium/high educational attainment. The cutoff for dementia was 1.85 standard deviations below the population mean (AUC = 0.97; 95% CI [0.97–0.98]). Conclusion: We provide regression-based norms for the A-IADL-Q and a diagnostic cutoff for dementia, which help improve the clinical assessment of IADL performance across European countries. [ABSTRACT FROM AUTHOR]
- Published
- 2024
28. BETA GENERATED SLASH DISTRIBUTION: DERIVATION, PROPERTIES AND APPLICATION TO LIFETIME DATA.
- Author
- BHATTACHARJEE, Sahana and BORAH, Nandita
- Subjects
- BETA distribution, LORENZ curve, HAZARD function (Statistics), MAXIMUM likelihood statistics, ORDER statistics
- Abstract
In this paper, we introduce a new distribution called the beta generated slash distribution by applying the slash construction idea to the existing beta distribution of the first kind. The statistical properties of the distribution, such as the moments, skewness, kurtosis, median, moment generating function, mean deviations, Lorenz and Bonferroni curves, order statistics, Mills ratio, and hazard rate function, are discussed. The location-scale form of the beta generated slash distribution is also established. The hazard rate function is seen to assume different shapes depending upon the values of the parameters. The method of maximum likelihood is used to estimate the unknown parameters of the beta generated slash distribution, and a simulation study is conducted to check the performance of these estimates. Finally, the proposed distribution is applied to a real-life data set on failure times, and the goodness-of-fit of the fitted distribution is compared with that of four other competing distributions to show its flexibility and advantage, particularly in modeling heavy-tailed data sets. [ABSTRACT FROM AUTHOR]
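Assuming the paper follows the usual slash construction, a ratio of a beta variate to an independent power of a uniform variate, samples from such a distribution can be drawn as below. This is a plausible reading of the construction, not a formula quoted from the paper.

```python
import numpy as np

def beta_slash_sample(a, b, q, size, rng=None):
    """Draw Y / U**(1/q) with Y ~ Beta(a, b) and U ~ Uniform(0, 1) independent,
    i.e. the usual slash construction applied to a beta baseline."""
    rng = rng or np.random.default_rng()
    y = rng.beta(a, b, size)
    u = rng.uniform(0.0, 1.0, size)
    return y / u ** (1.0 / q)

x = beta_slash_sample(2.0, 3.0, q=2.0, size=100_000, rng=np.random.default_rng(5))
print("median:", round(float(np.median(x)), 3),
      " 99th percentile (heavy right tail):", round(float(np.quantile(x, 0.99)), 3))
```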
- Published
- 2024
29. NEUTRON POINT KINETICS MODEL WITH A DISTRIBUTED-ORDER FRACTIONAL DERIVATIVE.
- Author
- GODÍNEZ, F. A., FERNÁNDEZ-ANAYA, G., QUEZADA-GARCÍA, S., QUEZADA-TÉLLEZ, L. A., and POLO-LABARRIOS, M. A.
- Subjects
- CAPUTO fractional derivatives, BETA distribution, NUCLEAR reactors, GAUSSIAN distribution, NEUTRONS
- Abstract
In this paper, the solutions of an extended form of the Fractional-order Neutron Point Kinetics (FNPK) equation in terms of Caputo-time derivatives of the same order are investigated. Instead of using a Caputo derivative, a distributed-order fractional derivative in the Caputo sense was employed in the term of the FNPK equation which is multiplied by the reactivity. This term plays an important role in the description of neutron kinetics during the start-up, shutdown, and steady-state processes in nuclear reactors. The extended (DFNPK) model was solved using the beta, normal, bimodal and Dirac delta distributions to investigate their effect on the transient state solutions of the neutron density. Regardless of the distribution used, the most significant finding is that a destabilizing effect on the neutron density is induced when the mode (or the instant of application of the Dirac delta) of the distribution tends to one while maintaining the orders of the Caputo-time derivatives constant. What defines the destabilizing effect are large magnitude oscillations, a rapid decay, and an oscillation-free steady state with a monotonic increase that is parallel to but somewhat above the trend determined by the FNPK equation. The extended model is anticipated to be effective for modeling neutron density dispersion in a highly heterogeneous medium that may be described using distributed derivatives. [ABSTRACT FROM AUTHOR]
- Published
- 2024
30. Probability density and information entropy of machine learning derived intracranial pressure predictions.
- Author
- Abdul-Rahman, Anmar, Morgan, William, Vukmirovic, Aleksandar, and Yu, Dao-Yi
- Subjects
- INTRACRANIAL pressure, ENTROPY (Information theory), PROBABILITY density function, STATISTICAL decision making, DEFINITE integrals, MACHINE learning, FORECASTING, SKEWNESS (Probability theory), BETA distribution
- Abstract
Even with the powerful statistical parameters derived from the Extreme Gradient Boost (XGB) algorithm, it would be advantageous to define the predictive accuracy at the level of a specific case, particularly when the model output is used to guide clinical decision-making. The probability density function (PDF) of the derived intracranial pressure predictions enables the computation of a definite integral around a point estimate, representing the event's probability within a range of values. Seven hold-out test cases used for the external validation of an XGB model underwent retinal vascular pulse and intracranial pressure measurement using modified photoplethysmography and lumbar puncture, respectively. The definite integral ±1 cm of water from the median (DIICP) demonstrated a negative and highly significant correlation (−0.5213 ± 0.17, p < 0.004) with the absolute difference between the measured and predicted median intracranial pressure (DiffICPmd). The concordance between the arterial and venous probability density functions was estimated using the two-sample Kolmogorov-Smirnov statistic, extending the distribution agreement across all data points. This parameter showed a statistically significant and positive correlation (0.4942 ± 0.18, p < 0.001) with DiffICPmd. Two cautionary subset cases (Case 8 and Case 9), where disagreement was observed between measured and predicted intracranial pressure, were compared to the seven hold-out test cases. Arterial predictions from both cautionary subset cases converged on a uniform distribution, in contrast to all other cases, where distributions converged on either log-normal or closely related skewed distributions (gamma, logistic, beta). The mean ± standard error of the arterial DIICP from Cases 8 and 9 (3.83 ± 0.56%) was lower compared to that of the hold-out test cases (14.14 ± 1.07%); the between-group difference was statistically significant (p < 0.03). Although the sample size in this analysis was limited, these results support a dual and complementary analysis approach based on independently derived retinal arterial and venous non-invasive intracranial pressure predictions. The results suggest that plotting the PDF and calculating the lower-order moments, the arterial DIICP, and the two-sample Kolmogorov-Smirnov statistic may provide individualized predictive accuracy parameters. [ABSTRACT FROM AUTHOR]
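A simplified sketch of the two summary quantities described above, a definite integral of a fitted probability density within ±1 cm of water around the median prediction and a two-sample Kolmogorov–Smirnov statistic between two sets of predictions, using synthetic prediction samples and a gamma fit. The study's XGB-derived predictions and exact distribution-fitting choices are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
arterial = rng.gamma(9.0, 1.5, size=400)    # synthetic ICP predictions (cm H2O)
venous = rng.gamma(8.5, 1.6, size=400)

# Fit a parametric PDF and integrate it over median +/- 1 cm H2O
shape, loc, scale = stats.gamma.fit(arterial)
med = np.median(arterial)
diicp = stats.gamma.cdf(med + 1, shape, loc, scale) - \
        stats.gamma.cdf(med - 1, shape, loc, scale)
print(f"DIICP (probability mass within ±1 cm of the median): {diicp:.3f}")

# Agreement between the arterial and venous prediction distributions
ks = stats.ks_2samp(arterial, venous)
print(f"two-sample KS statistic: {ks.statistic:.3f}")
```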
- Published
- 2024
31. Detailed study of the decay of 21Mg.
- Author
- Jensen, E. A. M., Nielsen, S. T., Andreyev, A., Borge, M. J. G., Cederkäll, J., Fraile, L. M., Fynbo, H. O. U., Harkness-Brennan, L. J., Jonson, B., Judson, D. S., Kirsebom, O. S., Lică, R., Lund, M. V., Madurga, M., Marginean, N., Mihai, C., Page, R. D., Perea, A., Riisager, K., and Tengblad, O.
- Subjects
- EXCITED states, BETA distribution, MIRROR symmetry, PROTONS
- Abstract
Beta-delayed proton and gamma emission in the decay of 21Mg has been measured at ISOLDE, CERN, with the ISOLDE Decay Station (IDS) set-up. The existing decay scheme is updated, in particular concerning proton transitions to excited states in 20Ne. Signatures of interference in several parts of the spectrum are used to settle spin and parity assignments of highly excited states in 21Na. The previously reported βpα branch is confirmed. A half-life of 120.5(4) ms is extracted for 21Mg. The revised decay scheme is employed to test mirror symmetry in the decay and to extract the beta strength distribution of 21Mg, which is compared with theory. [ABSTRACT FROM AUTHOR]
- Published
- 2024
32. Matrix Variate Generalized Inverted Beta Distribution.
- Author
- Nagar, Daya K., Roldán-Correa, Alejandro, and Morán-Vásquez, Raúl Alejandro
- Subjects
- BETA distribution, BETA functions, GAMMA functions
- Abstract
The beta (type 1) and inverted beta distributions are of vital importance in science, engineering and economics. In this article, we introduce a matrix variate generalization of the inverted beta distribution. By using linear and quadratic transformations, we obtain several matrix variate distributions including matrix variate generalized beta type 1 distribution. [ABSTRACT FROM AUTHOR]
- Published
- 2024
33. A statistical primer on classical period-finding techniques in astronomy.
- Author
- Giertych, Naomi, Shaban, Ahmed, Haravu, Pragya, and P Williams, Jonathan
- Subjects
- CHI-square distribution, BETA distribution, EXTREME value theory, PERIODIC functions, NULL hypothesis, CHI-squared test
- Abstract
The aim of our paper is to investigate the properties of the classical phase-dispersion minimization (PDM), analysis of variance (AOV), string-length (SL), and Lomb–Scargle (LS) power statistics from a statistician's perspective. We confirm that when the data are perturbations of a constant function, i.e. under the null hypothesis of no period in the data, a scaled version of the PDM statistic follows a beta distribution, the AOV statistic follows an F distribution, and the LS power follows a chi-squared distribution with two degrees of freedom. However, the SL statistic does not have a closed-form distribution. We further verify these theoretical distributions through simulations and demonstrate that the extreme values of these statistics (over a range of trial periods), often used for period estimation and determination of the false alarm probability (FAP), follow different distributions than those derived for a single period. We emphasize that multiple-testing considerations are needed to correctly derive FAP bounds. Though, in fact, multiple-testing controls are built into the FAP bound for these extreme-value statistics, e.g. the FAP bound derived specifically for the maximum LS power statistic over a range of trial periods. Additionally, we find that all of these methods are robust to heteroscedastic noise aimed to mimic the degradation or miscalibration of an instrument over time. Finally, we examine the ability of these statistics to detect a non-constant periodic function via simulating data that mimics a well-detached binary system, and we find that the AOV statistic has the most power to detect the correct period, which agrees with what has been observed in practice. [ABSTRACT FROM AUTHOR]
- Published
- 2024
34. Beta Distribution Weighted Fuzzy C-Ordered-Means Clustering.
- Author
- Wang Hengda, Mohamad Mohsin, Mohamad Farhan, and Mohd Pozi, Muhammad Syafiq
- Subjects
- FUZZY algorithms, ALGORITHMS, NOISE
- Abstract
The fuzzy C-ordered-means clustering (FCOM) is a fuzzy clustering algorithm that enhances robustness and clustering accuracy through an ordered mechanism built on fuzzy C-means (FCM). However, despite these improvements, the FCOM algorithm's effectiveness remains unsatisfactory due to the significant time cost incurred by its ordered operation. To address this problem, the ordered weighted model of the FCOM algorithm was investigated, leading to a proposed enhancement: the beta distribution weighted fuzzy C-ordered-means clustering (BDFCOM). The BDFCOM algorithm utilises the properties of the Beta distribution to weight sample features, thereby not only circumventing the time cost problem of the traditional ordered mechanism but also reducing the influence of noise. Experiments were conducted on six UCI datasets to validate the effectiveness of BDFCOM, comparing its performance against seven other clustering algorithms using six evaluation indices. The results show that, compared to the average of the other seven algorithms, BDFCOM improves the F1-score by about 15 percent, the Rand Index by 11 percent, the Adjusted Rand Index by 13 percent, the Fowlkes-Mallows Index by 3 percent, and the Jaccard Index by 16 percent. Compared with the other two ordered-mechanism FCM algorithms, the time consumption is also reduced by 90.15 percent on average. The proposed algorithm, which designs a new way of feature weighting for ordered mechanisms, advances the field of ordered mechanisms. This paper also provides a new method for applications where the dataset contains a large amount of noise. [ABSTRACT FROM AUTHOR]
- Published
- 2024
35. Generalized mixed spatiotemporal modeling with a continuous response and random effect via factor analysis.
- Author
- de Oliveira, Natália Caroline Costa and Mayrink, Vinícius Diniz
- Subjects
- MONTE Carlo method, FACTOR analysis, BETA distribution, UNIVERSITY & college admission
- Abstract
This work focuses on Generalized Linear Mixed Models that incorporate spatiotemporal random effects structured via Factor Model (FM) with nonlinear interaction between latent factors. A central aspect is to model continuous responses from Normal, Gamma, and Beta distributions. Discrete cases (Bernoulli and Poisson) have been previously explored in the literature. Spatial dependence is established through Conditional Autoregressive (CAR) modeling for the columns of the loadings matrix. Temporal dependence is defined through an Autoregressive AR(1) process for the rows of the factor scores matrix. By incorporating the nonlinear interaction, we can capture more detailed associations between regions and factors. Regions are grouped based on the impact of the main factors or their interaction. It is important to address identification issues arising in the FM, and this study discusses strategies to handle this obstacle. To evaluate the performance of the models, a comprehensive simulation study, including a Monte Carlo scheme, is conducted. Lastly, a real application is presented using the Beta model to analyze a nationwide high school exam called ENEM, administered between 2015 and 2021 to students in Brazil. ENEM scores are accepted by many Brazilian universities for admission purposes. The real analysis aims to estimate and interpret the behavior of the factors and identify groups of municipalities that share similar associations with them. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. Inference for the Parameters of a Zero-Inflated Poisson Predictive Model.
- Author
-
Deng, Min, Aminzadeh, Mostafa S., and So, Banghee
- Subjects
BETA distribution ,BAYES' estimation ,INSURANCE companies ,GAMMA distributions ,MAXIMUM likelihood statistics - Abstract
In the insurance sector, Zero-Inflated models are commonly used due to the unique nature of insurance data, which often contain both genuine zeros (meaning no claims made) and potential claims. Although active developments in modeling excess zero data have occurred, the use of Bayesian techniques for parameter estimation in Zero-Inflated Poisson models has not been widely explored. This research aims to introduce a new Bayesian approach for estimating the parameters of the Zero-Inflated Poisson model. The method involves employing Gamma and Beta prior distributions to derive closed formulas for Bayes estimators and predictive density. Additionally, we propose a data-driven approach for selecting hyper-parameter values that produce highly accurate Bayes estimates. Simulation studies confirm that, for small and moderate sample sizes, the Bayesian method outperforms the maximum likelihood (ML) method in terms of accuracy. To illustrate the ML and Bayesian methods proposed in the article, a real dataset is analyzed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
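The closed-form Bayes estimators derived in the entry above are not reproduced here. As a hedged illustration of the same ingredients (a Gamma prior on the Poisson rate and a Beta prior on the zero-inflation probability), the sketch below computes posterior means for a Zero-Inflated Poisson sample by simple numerical integration on a grid; the hyper-parameter values are illustrative, not the paper's data-driven choices.

```python
import numpy as np
from scipy.stats import gamma, beta

rng = np.random.default_rng(1)

# simulate Zero-Inflated Poisson data: with probability pi an observation is a
# structural zero, otherwise it is Poisson(lam)
pi_true, lam_true, n = 0.3, 2.0, 200
is_zero = rng.random(n) < pi_true
y = np.where(is_zero, 0, rng.poisson(lam_true, n))

# priors (illustrative hyper-parameters)
a_l, b_l = 2.0, 1.0     # Gamma(shape=a_l, rate=b_l) prior on lambda
a_p, b_p = 1.0, 1.0     # Beta(a_p, b_p) prior on pi

# grid approximation of the joint posterior (the paper derives closed forms instead)
lam_grid = np.linspace(0.01, 8.0, 300)
pi_grid = np.linspace(0.001, 0.999, 300)
L, P = np.meshgrid(lam_grid, pi_grid, indexing="ij")

n0 = np.sum(y == 0)
ypos = y[y > 0]
# ZIP log-likelihood up to a constant: zeros contribute log(pi + (1-pi)e^{-lam});
# positive counts contribute log(1-pi) + y*log(lam) - lam (constant -log(y!) dropped)
log_lik = (n0 * np.log(P + (1 - P) * np.exp(-L))
           + len(ypos) * np.log(1 - P) + ypos.sum() * np.log(L) - len(ypos) * L)

log_post = log_lik + gamma.logpdf(L, a_l, scale=1.0 / b_l) + beta.logpdf(P, a_p, b_p)
post = np.exp(log_post - log_post.max())
post /= post.sum()

print("posterior mean of lambda:", float((post * L).sum()))
print("posterior mean of pi    :", float((post * P).sum()))
```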
37. Improved Breitung and Roling estimator for mixed-frequency models with application to forecasting inflation rates.
- Author
-
Omer, Talha, Månsson, Kristofer, Sjölander, Pär, and Kibria, B. M. Golam
- Subjects
INFLATION forecasting ,PRICE inflation ,BETA distribution ,BASE oils ,DEPENDENT variables - Abstract
Instead of applying the commonly used parametric Almon or Beta lag distribution of MIDAS, Breitung and Roling (J Forecast 34:588–603, 2015) suggested a nonparametric smoothed least-squares shrinkage estimator (henceforth SLS1) for estimating mixed-frequency models. This SLS1 approach ensures a flexible smooth trending lag distribution. However, even if the biasing parameter in SLS1 solves the overparameterization problem, the cost is a decreased goodness-of-fit. Therefore, we suggest a modification of this shrinkage regression into a two-parameter smoothed least-squares estimator (SLS2). This estimator solves the overparameterization problem, and it has superior properties since it ensures that the orthogonality assumption between residuals and the predicted dependent variable holds, which leads to an increased goodness-of-fit. Our theoretical comparisons, supported by simulations, demonstrate that the increase in goodness-of-fit of the proposed two-parameter estimator also leads to a decrease in the mean square error of SLS2, compared to that of SLS1. Empirical results, where the inflation rate is forecasted based on the oil returns, demonstrate that our proposed SLS2 estimator for mixed-frequency models provides better estimates in terms of decreased MSE and improved R2, which in turn leads to better forecasts. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
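The authors' SLS2 estimator is not reproduced here. As a hedged sketch of the general smoothed least-squares idea in the entry above (a ridge-type penalty on the roughness of the lag coefficients), the code below solves min_b ||y - Xb||^2 + lambda ||Db||^2 with D a second-difference matrix; the design, true lag weights, and penalty value are assumed for illustration.

```python
import numpy as np

def smoothed_ls(X, y, lam=10.0):
    """Generic smoothed least squares: penalise squared second differences of the
    coefficient vector so the estimated lag distribution is a smooth curve.
    This is a sketch of the general idea, not the paper's SLS2 estimator."""
    n, k = X.shape
    D = np.zeros((k - 2, k))
    for i in range(k - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]        # second-difference operator
    A = X.T @ X + lam * (D.T @ D)
    return np.linalg.solve(A, X.T @ y)

# toy mixed-frequency style example: 24 high-frequency lags with smooth true weights
rng = np.random.default_rng(2)
k, n = 24, 120
true_b = np.exp(-0.5 * ((np.arange(k) - 6.0) / 3.0) ** 2)   # smooth hump
X = rng.normal(size=(n, k))
y = X @ true_b + rng.normal(scale=0.5, size=n)

b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
b_sls = smoothed_ls(X, y, lam=25.0)
print("OLS roughness of lag coefficients:", np.sum(np.diff(b_ols, 2) ** 2))
print("SLS roughness of lag coefficients:", np.sum(np.diff(b_sls, 2) ** 2))
```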
38. Plug-in bandwidth selection rules for the kernel quantile estimator.
- Author
-
Savchuk, Olga
- Subjects
- *
BANDWIDTHS , *BETA distribution , *QUANTILE regression , *GAUSSIAN distribution , *QUANTILES - Abstract
The large sample theory of the kernel quantile estimator is extended by separately treating the cases where the underlying density has critical points. Our attempts at improving the quality of quantile estimation resulted in a proposed beta-distribution bandwidth selection method that is quite successful in the case of a normal distribution. The performance of this beta reference rule is compared to that of two other plug-in type bandwidth selectors. Based on the theoretical and numerical results, we provide recommendations regarding the use of different types of estimators in practice. In particular, the beta reference rule and the truncated normal rule are recommended for estimating the quantiles of distributions close to normal. The sample quantile estimator worked well in all simulation settings and is thus recommended as a default quantile estimation method in the non-normal case. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
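A minimal sketch of the kernel quantile estimator referenced in the entry above, using a Gaussian kernel: the estimate is a weighted average of the order statistics, with weights given by the kernel integrated over ((i-1)/n, i/n]. The beta reference bandwidth rule itself is not reproduced, so a fixed bandwidth is passed in as an assumption.

```python
import numpy as np
from scipy.stats import norm

def kernel_quantile(x, p, h):
    """Kernel quantile estimator with a Gaussian kernel and bandwidth h."""
    xs = np.sort(x)
    n = len(xs)
    grid = np.arange(n + 1) / n
    # weight of the i-th order statistic: integral of K_h(t - p) over ((i-1)/n, i/n]
    w = norm.cdf((grid[1:] - p) / h) - norm.cdf((grid[:-1] - p) / h)
    return np.sum(w * xs) / np.sum(w)     # renormalise for boundary truncation

rng = np.random.default_rng(3)
x = rng.normal(size=500)
for p in (0.1, 0.5, 0.9):
    print(p, "kernel:", round(kernel_quantile(x, p, h=0.05), 3),
          "sample:", round(np.quantile(x, p), 3))
```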
39. Optimal reconfiguration, renewable DGs, and energy storage units’ integration in distribution systems considering power generation uncertainty using hybrid GWO-SCA algorithms.
- Author
-
Pujari, Harish Kumar, Rudramoorthy, Mageshvaran, Gopi R, Reshma, Mishra, Soumya, Alluraiah, N. Chinna, and N. B., Vaishali
- Subjects
- *
MICROGRIDS , *GREY Wolf Optimizer algorithm , *ENERGY storage , *METAHEURISTIC algorithms , *DISTRIBUTION (Probability theory) , *RENEWABLE energy sources , *BETA distribution - Abstract
To optimize radial distribution systems, this study suggests a hybrid metaheuristic that combines the Grey Wolf Optimizer (GWO) with the Sine Cosine Algorithm (SCA). The primary objective of this work is to enhance the distribution system by determining the most efficient network reconfiguration and the sizing and placement of various distributed energy sources in the distribution system. The energy sources considered include capacitors, solar cells, wind turbines, biomass-based distributed generation units, and battery storage units. To achieve this goal, the proposed strategy incorporates the power loss sensitivity technique, which assists in identifying suitable candidate buses and accelerates the resolution process. Moreover, the model considers fluctuations in solar irradiance and wind speed using Weibull and Beta probability distribution functions, compensating for the intermittent nature of renewable energy sources and the variability in demand. To address power fluctuations, voltage surges, significant losses, and inadequate voltage stability, battery energy storage, diesel generators, and dispatchable biomass DGs are employed to mitigate variability and enhance supply continuity. The proposed approach is evaluated and validated by comparing it to existing optimization strategies using IEEE 69-bus and 84-bus RDSs. The results demonstrate that the suggested technique achieves faster convergence to near-optimal solutions. The proposed methodology yields a significant reduction of up to 80% in power losses in the 69-bus system and a 35% reduction in the 84-bus system, signifying higher performance than existing methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
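A hedged sketch of the uncertainty-modeling step mentioned in the entry above: drawing solar-irradiance and wind-speed scenarios from Beta and Weibull distributions and converting them to power with simple, assumed conversion curves. The shape parameters, rated values, and power-curve shapes are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Beta-distributed solar irradiance (normalised to [0, 1] of a clear-sky maximum)
a_s, b_s = 2.5, 3.5                          # illustrative Beta shape parameters
irradiance = rng.beta(a_s, b_s, size=1000)
pv_rated_kw = 500.0
pv_power = pv_rated_kw * irradiance          # assumed linear irradiance-to-power curve

# Weibull-distributed wind speed (m/s): scale c_w times a standard Weibull(k_w) draw
k_w, c_w = 2.0, 7.5
wind = c_w * rng.weibull(k_w, size=1000)
v_in, v_rated, v_out, wt_rated_kw = 3.0, 12.0, 25.0, 800.0

def wt_power(v):
    """Piecewise-linear wind-turbine power curve (illustrative)."""
    ramp = wt_rated_kw * (v - v_in) / (v_rated - v_in)
    p = np.where((v >= v_in) & (v < v_rated), ramp, 0.0)
    return np.where((v >= v_rated) & (v <= v_out), wt_rated_kw, p)

print("expected PV output (kW):", pv_power.mean())
print("expected WT output (kW):", wt_power(wind).mean())
```

Scenario sets like these are what a reconfiguration/placement optimizer would evaluate candidate solutions against; the optimizer itself is not sketched here.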
40. A Practical and Precise Technique for Determination of Beta Emitter Source in Thickness Gauging of Thin Film.
- Author
-
Islami rad, S. Z. and Peyvandi, R. Gholipour
- Subjects
- *
THIN films , *BETA distribution , *MOTION picture distribution , *STANDARD deviations , *DATA quality - Abstract
Nuclear thickness gauging systems play an important role in industry for non-invasive, online, and continuous measurements. The goal of a Beta thickness gauge is to obtain a precise measurement of thin films, where the performance of the gauging system and the quality of its output data are evaluated with parameters including resolution and contrast. The choice of a suitable emitted energy distribution of the Beta source is one of the factors affecting system performance and the precise measurement of thin films. In this research, a Beta thickness gauge with 147Pm and 85Kr sources was simulated and evaluated for biaxially oriented polypropylene sheet production lines in order to calculate the system performance obtained with Beta emitter sources of different energy distributions and to select a suitable Beta emitter source. The relative error percentage, standard deviation, resolution, and contrast for the 147Pm energy distribution were calculated as 1.413, 0.113, 0.007, and 0.008, respectively; for the 85Kr energy distribution they were 2.750, 0.220, 0.014, and 0.001, respectively. The results reveal that the 147Pm energy distribution is superior to the 85Kr energy distribution for measuring thin films or sheets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
41. Clustered and Unclustered Group Testing for Biosecurity.
- Author
-
Clark, Robert Graham, Barnes, Belinda, and Parsa, Mahdi
- Subjects
- *
BIOSECURITY , *BETA distribution , *DISTRIBUTION (Probability theory) , *FARM produce , *JUDGMENT (Psychology) , *CUCURBITACEAE , *SENSITIVITY analysis - Abstract
Group testing is an important element of biosecurity operations, designed to efficiently reduce the risk of introducing exotic pests and pathogens with imported agricultural products. Groups of units, such as seeds, are selected from a consignment and tested for contamination, with a positive or negative test returned for each group. These schemes are usually designed such that the probability of detecting contamination is high assuming random mixing and a somewhat arbitrary design prevalence. We propose supplementing this approach with an assessment of the distribution of the number of contaminated units conditional on testing results. We develop beta-binomial models that allow for between-consignment variability in contamination levels, as well as including beta random effects to allow for possible clustering within the groups for testing. The latent beta distributions can be considered as priors and chosen based on expert judgement, or estimated from historical test results. We show that the parameter representing within-group clustering is, unsurprisingly, effectively non-identifiable. Sensitivity analysis can be conducted by investigating the consequences of assuming different values of this parameter. We also demonstrate theoretically and empirically that the estimated probability of a consignment containing contamination and evading detection is almost perfectly robust to mis-specification of the clustering parameter. We apply the new models to large cucurbit seed lots imported into Australia where they provide important new insights on the level of undetected contamination. Supplementary materials accompanying this paper appear on-line. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
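A minimal Monte Carlo sketch of the beta-binomial reasoning described in the entry above: contamination prevalence varies between consignments according to a Beta prior, a fixed number of seeds is drawn for group testing, and we estimate the probability that a contaminated consignment returns all-negative tests. Group sensitivity is treated as perfect and within-group clustering is ignored, which are simplifying assumptions; all numerical settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# illustrative settings (not from the paper)
a, b = 0.5, 50.0                 # Beta prior on the per-seed contamination prevalence
N = 10_000                       # seeds in a consignment
n_groups, group_size = 20, 100   # 20 groups of 100 seeds are tested
n_tested = n_groups * group_size
n_sims = 200_000

# beta-binomial consignments: prevalence varies between consignments
p = rng.beta(a, b, size=n_sims)
n_contam = rng.binomial(N, p)                                 # contaminated seeds per consignment
# contaminated seeds among the tested ones under random mixing; with perfect group
# sensitivity, detection depends only on whether any tested seed is contaminated
n_contam_tested = rng.hypergeometric(n_contam, N - n_contam, n_tested)

contaminated = n_contam > 0
detected = n_contam_tested > 0

print("P(consignment contaminated)          :", contaminated.mean())
print("P(contaminated and evading detection):", np.mean(contaminated & ~detected))
print("P(evades detection | contaminated)   :", np.mean(~detected[contaminated]))
```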
42. Beta regression for double‐bounded response with correlated high‐dimensional covariates.
- Author
-
Liu, Jianxuan
- Subjects
- *
BETA distribution , *STATISTICAL models - Abstract
Continuous responses measured on a standard unit interval (0,1) are ubiquitous in many scientific disciplines. Statistical models built upon a normal error structure do not generally work because they can produce biased estimates or result in predictions outside either bound. In real‐life applications, data are often high‐dimensional, correlated and consist of a mixture of various data types. Little literature is available to address the unique data challenge. We propose a semiparametric approach to analyse the association between a double‐bounded response and high‐dimensional correlated covariates of mixed types. The proposed method makes full use of all available data through one or several linear combinations of the covariates without losing information from the data. The only assumption we make is that the response variable follows a Beta distribution; no additional assumption is required. The resulting estimators are consistent and efficient. We illustrate the proposed method in simulation studies and demonstrate it in a real‐life data application. The semiparametric approach contributes to the sufficient dimension reduction literature for its novelty in investigating double‐bounded response which is absent in the current literature. This work also provides a new tool for data practitioners to analyse the association between a popular unit interval response and mixed types of high‐dimensional correlated covariates. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
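As a hedged, generic illustration of the Beta-distribution assumption used in the entry above (not the authors' semiparametric, dimension-reduction estimator), the sketch below fits a simple beta regression with a logit link and constant precision by maximum likelihood on simulated data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit, gammaln

def beta_reg_negloglik(params, X, y):
    """Negative log-likelihood of a beta regression with logit link:
    y ~ Beta(mu*phi, (1-mu)*phi), mu = expit(X @ coef), constant precision phi."""
    coef, log_phi = params[:-1], params[-1]
    mu = expit(X @ coef)
    phi = np.exp(log_phi)
    a, b = mu * phi, (1.0 - mu) * phi
    ll = (gammaln(phi) - gammaln(a) - gammaln(b)
          + (a - 1.0) * np.log(y) + (b - 1.0) * np.log1p(-y))
    return -np.sum(ll)

rng = np.random.default_rng(6)
n = 400
X = np.column_stack([np.ones(n), rng.normal(size=n)])
mu_true = expit(X @ np.array([-0.5, 1.0]))
phi_true = 20.0
y = rng.beta(mu_true * phi_true, (1.0 - mu_true) * phi_true)

res = minimize(beta_reg_negloglik, x0=np.zeros(3), args=(X, y), method="BFGS")
print("estimated coefficients:", res.x[:-1])
print("estimated precision   :", np.exp(res.x[-1]))
```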
43. Testing Multivariate Normality Based on Beta-Representative Points.
- Author
-
Cao, Yiwen, Liang, Jiajuan, Xu, Longhao, and Kang, Jiangrui
- Subjects
- *
DISTRIBUTION (Probability theory) , *BETA distribution , *FALSE positive error , *GAUSSIAN distribution , *GOODNESS-of-fit tests - Abstract
Testing multivariate normality in high-dimensional data analysis has been a long-lasting topic in the area of goodness of fit. Numerous methods for this purpose can be found in the literature. Reviews on different methods given by influential researchers show that new methods keep emerging in the literature from different perspectives. The theory of statistical representative points provides a new perspective to construct tests for multivariate normality. To avoid the difficulty and huge computational load in finding the statistical representative points from a high-dimensional probability distribution, we develop an approach to constructing a test for high-dimensional normal distribution based on the representative points of the simple univariate beta distribution. The representative-points-based approach is extended to the case that the sample size may be smaller than the dimension. A Monte Carlo study shows that the new test is able to control type I error rates fairly well for both large and small sample sizes when faced with a high dimension. The power of the new test against some non-normal distributions is generally or substantially improved for a set of selected alternative distributions. A real-data example is given for a simple application illustration. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
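The beta representative points used in the entry above are not derived here. As a rough illustrative approximation, the sketch below computes m mean-squared-error representative (principal) points of a Beta(a, b) by running a one-dimensional Lloyd-type iteration on a large Monte Carlo sample; this is a heuristic stand-in under stated assumptions, not the paper's construction.

```python
import numpy as np

def beta_representative_points(a, b, m, n_sample=200_000, n_iter=200, seed=7):
    """Approximate m MSE-representative points of Beta(a, b) with a
    one-dimensional Lloyd iteration on a Monte Carlo sample (heuristic sketch)."""
    rng = np.random.default_rng(seed)
    x = np.sort(rng.beta(a, b, n_sample))
    pts = np.quantile(x, (np.arange(m) + 0.5) / m)      # start at equally spaced quantiles
    for _ in range(n_iter):
        edges = (pts[:-1] + pts[1:]) / 2.0              # midpoints partition the line
        idx = np.searchsorted(edges, x)                 # nearest-point assignment in 1-D
        new_pts = np.array([x[idx == j].mean() if np.any(idx == j) else pts[j]
                            for j in range(m)])
        if np.allclose(new_pts, pts, atol=1e-10):
            break
        pts = new_pts
    return pts

# five approximate representative points of Beta(2, 3), for illustration
print(beta_representative_points(2.0, 3.0, m=5))
```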
44. Attribute Sampling Plan for Submitted Lots Based on Prior Information and Bayesian Approach.
- Author
-
Zhao, Jing, Zhang, Fengyun, Zhang, Xuan, Hu, Yuping, and Ding, Wenxing
- Subjects
- *
ACCEPTANCE sampling , *PROCESS capability , *SAMPLE size (Statistics) , *PRODUCT acceptance , *MANUFACTURING processes - Abstract
An acceptance sampling plan is a method used to make a decision about acceptance or rejection of a product based on adherence to a standard. Meanwhile, prior information, such as the process capability index (PCI), has been applied in different manufacturing industries to improve the quality of manufacturing processes and the quality inspection of products. In this paper, an attribute sampling plan is developed for submitted lots based on prior information and Bayesian approach. The new attribute sampling plans adjust sample sizes to prior information based on the status of the inspection target. To be specific, the sampling plans in this paper are indexed by the parameter trust with levels of low, medium, and high, where increasing trust level reduces sample size or risk. PCIs are an important basis for the choice of the trust level. In addition, multiple comparisons have been performed, including producer's risk and consumer's risk under different prior information parameters and different sample sizes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
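A hedged numerical sketch of the Bayesian ingredient described in the entry above: with a Beta prior on the lot fraction defective standing in for the prior process information, the acceptance probability of a single attribute plan "accept if at most c defects in a sample of n" follows a beta-binomial predictive distribution, and the prior updates in closed form after inspection. The hyper-parameters and plan parameters below are illustrative, not the paper's tables.

```python
from scipy.stats import betabinom

# illustrative "high trust" prior on the fraction defective p: mean defect rate 0.01
a, b = 1.0, 99.0

def acceptance_probability(n, c, a, b):
    """P(accept) = P(defects in a sample of n is <= c) under the beta-binomial
    predictive distribution induced by the Beta(a, b) prior."""
    return betabinom.cdf(c, n, a, b)

for n in (20, 50, 125):
    print(f"n={n:4d}, c=1 -> P(accept) = {acceptance_probability(n, 1, a, b):.3f}")

# posterior after observing d defects in a sample of n: Beta(a + d, b + n - d)
n_obs, d_obs = 50, 1
post_mean = (a + d_obs) / (a + b + n_obs)
print("posterior mean defect rate:", round(post_mean, 4))
```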
45. Analysis of Traffic Injury Crash Proportions Using Geographically Weighted Beta Regression.
- Author
-
da Silva, Alan Ricardo and Buffone, Roberto de Souza Marques
- Subjects
BETA distribution ,SKEWNESS (Probability theory) ,GAUSSIAN distribution ,REGRESSION analysis ,TRAFFIC accidents - Abstract
The classical linear regression model allows for a continuous quantitative variable to be modeled simply from other variables. However, this model assumes independence between observations, which, if ignored, can lead to methodological issues. Additionally, not all data follow a normal distribution, prompting the need for alternative modeling methods. In this context, geographically weighted beta regression (GWBR) incorporates spatial dependence into the modeling process and analyzes rates or proportions using the beta distribution. In this study, GWBR was applied to the traffic injury (fatal and non-fatal) crash proportions in Fortaleza, Ceará, Brazil, from 2009 to 2011. The results demonstrated that the local approach using the beta distribution is a viable model for explaining the traffic injury crash proportions, due to its flexibility in handling both symmetric and skewed distributions. Therefore, when analyzing rates or proportions, the use of the GWBR model is recommended. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
46. Economical Group Chain Sampling Plans for Weibull Distribution Using Bayesian Approach.
- Author
-
Tanveer, Kinza, Mughal, Abdur Razzaque, Shahzadi, Ishmal, Albassam, Mohammed, and Aslam, Muhammad
- Subjects
WEIBULL distribution ,BAYESIAN analysis ,COST estimates ,BETA distribution ,PERFORMANCE evaluation - Abstract
The paper focuses on the economic design of group chain sampling plans (GChSP) for the Weibull distribution using Bayesian methodology. The GChSP is a technique to accept or reject a product based on a sample from a lot. The study addresses situations where destructive testing is costly and utilizes the Bayesian approach to make informed decisions. The research outlines the methodology of developing the GChSP, including the stages of construction, performance evaluation, and cost estimation. The study compares the proposed plans with an existing one and demonstrates that the Bayesian approach generally yields lower costs. Tables, figures, and calculations related to various aspects of the proposed plans and their comparison with existing methods are provided. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
47. Employing zero-inflated beta distribution in an exposure-response analysis of TYK2/JAK1 inhibitor brepocitinib in patients with plaque psoriasis.
- Author
-
Tsamandouras, Nikolaos, Qiu, Ruolun, Hughes, Jim H., Sweeney, Kevin, Prybylski, John P., Banfield, Christopher, and Nicholas, Timothy
- Abstract
Brepocitinib is an oral selective dual TYK2/JAK1 inhibitor and based on its cytokine inhibition profile is expected to provide therapeutic benefit in the treatment of plaque psoriasis. Efficacy data from a completed Phase 2a study in patients with moderate-to-severe plaque psoriasis were utilized to develop a population exposure-response model that can be employed to inform dose selection decisions for further clinical development. A modeling approach that employs the zero-inflated beta distribution was used to account for the bounded nature and distributional characteristics of the Psoriasis Area and Severity Index (PASI) score data. The developed exposure-response model provided an adequate description of the observed PASI scores across all the treatment arms tested and across both the induction and maintenance dosing periods of the study. In addition, the developed model exhibited a good predictive capacity with regard to the derived responder metrics (e.g., 75%/90%/100% improvement in PASI score [PASI75/90/100]). Clinical trial simulations indicated that the induction/maintenance dosing paradigm explored in this study does not offer any advantages from an efficacy perspective and that doses of 10, 30, and 60 mg once-daily may be suitable candidates for clinical evaluation in subsequent Phase 2b studies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
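A minimal sketch of the zero-inflated beta likelihood underlying the exposure-response model described in the entry above (the actual PK/PD structure, covariates, and inflation at one are not reproduced): a point mass at zero mixed with a Beta density on (0, 1), as might be used for a severity score rescaled to its maximum. Parameter values are assumptions for illustration.

```python
import numpy as np
from scipy.special import gammaln

def zib_loglik(y, pi0, mu, phi):
    """Zero-inflated beta log-likelihood.
    pi0 : probability of an exact zero (e.g. complete response, score = 0)
    mu, phi : mean and precision of the Beta component on (0, 1)."""
    a, b = mu * phi, (1.0 - mu) * phi
    y_safe = np.where(y > 0, y, 0.5)            # dummy value where y == 0, avoids log(0)
    ll = np.where(
        y == 0.0,
        np.log(pi0),
        np.log1p(-pi0)
        + gammaln(phi) - gammaln(a) - gammaln(b)
        + (a - 1.0) * np.log(y_safe)
        + (b - 1.0) * np.log1p(-y_safe),
    )
    return ll.sum()

rng = np.random.default_rng(8)
n, pi0_true, mu_true, phi_true = 300, 0.2, 0.35, 8.0
zeros = rng.random(n) < pi0_true
y = np.where(zeros, 0.0, rng.beta(mu_true * phi_true, (1 - mu_true) * phi_true, n))
print("log-likelihood at the true parameters:", zib_loglik(y, 0.2, 0.35, 8.0))
```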
48. Microbial Inactivation Kinetics Models, Survival Curves Shapes, and the Temporal Distributions of the Individual Germs Deactivation.
- Author
-
Peleg, Micha
- Abstract
Regardless of the targeted microbe type, a thermal or nonthermal food preservation or disinfection method's efficacy is primarily assessed based on its kinetics. Yet, there is growing realization that inactivation kinetics and the individual microbes' spectrum of vulnerabilities or resistances to a lethal agent are two sides of the same coin. This creates the possibility to convert traditional survival data plotted on linear or semilogarithmic coordinates to temporal distributions of the individual microbes' deactivation, or vice versa. Such conversions are demonstrated with simulated microbial survival patterns generated with different kinds of survival models: the two-parameter Weibull distribution, of which the single-parameter loglinear model is a special case; the normal, lognormal, and Fermi distribution functions, which imply that complete microbial inactivation is theoretically impossible; the three-parameter Gompertz survival model, which allows for definite residual survival; and the three-parameter version of the beta distribution function, allowing for a definite thermal death time beyond which no survivors will ever be found. Also provided are simulated examples of the survival patterns of mixed microbial populations, and they all demonstrate that the common shapes of microbial survival curves do not contain enough information to infer whether the targeted microbial population is genetically or physiologically uniform or a mixture of subpopulations. The presented analysis lends support to the notion that any proposed microbial survival kinetic model's validity should be tested by its ability to predict survival patterns not used in its formulation and not by statistical fit criteria. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
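A hedged sketch of the conversion described in the entry above for the Weibullian case: a survival curve S(t) = exp(-(t/b)^n) corresponds to a temporal distribution of individual deactivation events with density f(t) = -dS/dt. The parameter values are illustrative.

```python
import numpy as np

# illustrative Weibullian survival parameters (scale b, shape n)
b_scale, n_shape = 4.0, 1.5

t = np.linspace(0.0, 15.0, 1501)
S = np.exp(-(t / b_scale) ** n_shape)     # survival ratio N(t)/N0 (log10 of this is the
                                          # usual semilogarithmic survival curve)

# temporal distribution of individual deactivation events: f(t) = -dS/dt
f = (n_shape / b_scale) * (t / b_scale) ** (n_shape - 1) * S

dt = t[1] - t[0]
print("f(t) integrates to ~1          :", round(float(np.sum(f) * dt), 3))
print("most probable deactivation time:", round(float(t[np.argmax(f)]), 2))
```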
49. Maximizing renewable energy integration with battery storage in distribution systems using a modified Bald Eagle Search Optimization Algorithm.
- Author
-
Khasanov, Mansur, Kamel, Salah, Hassan, Mohamed H., and Domínguez-García, Jose Luis
- Subjects
- *
METAHEURISTIC algorithms , *BATTERY storage plants , *RENEWABLE energy sources , *DISTRIBUTION (Probability theory) , *SOLAR oscillations , *BETA distribution , *ELECTRIC vehicle batteries - Abstract
Due to environmental concerns associated with conventional energy production, the use of renewable energy sources (RES) has rapidly increased in power systems worldwide, with photovoltaic (PV) and wind turbine (WT) technologies being the most frequently integrated. This study proposes a modified Bald Eagle Search Optimization Algorithm (LBES) to enhance the performance of the conventional BES optimizer and optimize the size and location of RES-based Distribution Generation (DG) and Battery Energy Storage Systems (BESS) in distribution systems (DS) to minimize power and energy losses. The modified BES algorithm enhances the exploration phase by utilizing both crossover and mutation techniques with the top three leaders. Moreover, a loss sensitivity factor (LSF) is applied to expedite the solution process by identifying appropriate candidate buses. The variability of solar irradiation and wind speed is modeled using Weibull and Beta probability distribution functions (PDF). To address issues related to high penetration of renewables and demand fluctuations, BESS is used to improve power supply continuity and mitigate fluctuations. The suggested approach is tested on typical 33- and 118-bus systems and compared to alternative methods. The results show significant reduction in energy losses (49.32%, 67.82%, and 64.89% for the 33-bus system and 41.9157%, 60.3766%, and 54.8317% for the 118-bus system) when integrating PV, WT-based DG, and PV + BESS units into the DS. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
50. Crowdsourcing with the drift diffusion model of decision making.
- Author
-
Lalvani, Shamal and Katsaggelos, Aggelos
- Subjects
- *
DRIFT diffusion models , *DECISION making , *CROWDSOURCING , *BETA distribution , *GAUSSIAN processes , *DICAMBA - Abstract
Crowdsourcing involves the use of annotated labels with unknown reliability to estimate ground truth labels in datasets. A common task in crowdsourcing involves estimating reliabilities of annotators (such as through the sensitivities and specificities of annotators in the binary label setting). In the literature, Beta or Dirichlet distributions are typically imposed as priors on annotator reliability. In this study, we investigated the use of a neuroscientifically validated model of decision making, known as the drift-diffusion model, as a prior on the annotator labeling process. Two experiments were conducted on synthetically generated data with non-linear (sinusoidal) decision boundaries. Variational inference was used to predict ground truth labels and annotator-related parameters. Our method performed similarly to a state-of-the-art technique (SVGPCR) in prediction of crowdsourced data labels and prediction through a crowdsourced-generated Gaussian process classifier. By relying on a neuroscientifically validated model of decision making to model annotator behavior, our technique opens the avenue of predicting neuroscientific biomarkers of annotators, expanding the scope of what may be learnt about annotators in crowdsourcing tasks. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
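A minimal Euler-Maruyama sketch of the drift-diffusion decision model used in the entry above as an annotator prior (the paper's variational-inference machinery is not reproduced): evidence accumulates with drift v and Gaussian noise until it hits one of two boundaries, yielding a binary label and a response time. All parameter values are illustrative.

```python
import numpy as np

def simulate_ddm(v, a, z=0.5, sigma=1.0, dt=1e-3, t_max=5.0, n_trials=1000, seed=9):
    """Simulate drift-diffusion decisions.
    v: drift rate, a: boundary separation, z: relative start point in (0, 1)."""
    rng = np.random.default_rng(seed)
    choices = np.empty(n_trials, dtype=int)
    rts = np.full(n_trials, np.nan)
    n_steps = int(t_max / dt)
    for i in range(n_trials):
        x = z * a                                   # start between the two boundaries
        for step in range(n_steps):
            x += v * dt + sigma * np.sqrt(dt) * rng.normal()
            if x >= a:                              # upper boundary -> label 1
                choices[i], rts[i] = 1, (step + 1) * dt
                break
            if x <= 0.0:                            # lower boundary -> label 0
                choices[i], rts[i] = 0, (step + 1) * dt
                break
        else:                                       # no boundary hit within t_max
            choices[i] = int(x >= z * a)
    return choices, rts

choices, rts = simulate_ddm(v=1.0, a=1.5, n_trials=500)
print("P(upper-boundary choice):", choices.mean())
print("mean RT (s):", np.nanmean(rts))
```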