6,736 results on '"Lognormal distribution"'
Search Results
2. Competition among recombination pathways in single FAPbBr3 nanocrystals.
- Author
- Singha, Prajit Kumar, Mukhopadhyay, Tamoghna, Tarif, Ejaj, Ali, Fariyad, and Datta, Anindya
- Subjects
- *LOGNORMAL distribution, *DISTRIBUTION (Probability theory), *POWER density, *NANOCRYSTALS, *PHOTOLUMINESCENCE
- Abstract
Single particle level microscopy of immobilized FAPbBr3 nanocrystals (NCs) has elucidated the involvement of different processes in their photoluminescence (PL) intermittency. Four different blinking patterns are observed in the data from more than 100 NCs. The dependence of PL decays on PL intensities brought out in fluorescence lifetime intensity distribution (FLID) plots is rationalized by the interplay of exciton- and trion-mediated recombinations along with hot carrier (HC) trapping. The high intensity-long lifetime component is attributed to neutral exciton recombination, the low intensity-short lifetime component to trion-assisted recombination, and the low intensity-long lifetime component to hot carrier recombination. Change-point analysis (CPA) of the PL blinking data reveals the involvement of multiple intermediate states. A truncated power law distribution is found to be more appropriate than power law and lognormal distributions for on and off events. Probability distributions of PL trajectories of single NCs are obtained for two different excitation fluences and wavelengths (λex = 400, 440 nm). The trapping rate (kT) prevails at higher power densities for both excitation wavelengths. From a careful analysis of the FLID and probability distributions, it is concluded that there is competition between the HC- and trion-assisted blinking pathways and that the contribution of these mechanisms varies with excitation wavelength as well as fluence. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Improved flood quantile estimation for South Africa
- Author
- van der Spuy, D and Plessis, JA du
- Published
- 2024
4. An approach for parametric survival ANOVA with application to Weibull distribution.
- Author
- Weerahandi, Samaradasa, Ananda, Malwane M. A., and Dag, Osman
- Abstract
Motivated by the serious Type-I error issues of the widely used Cox-PH method in survival analysis, this article introduces an approach one can take in deriving superior parametric tests based on distributions such as the gamma, lognormal, and Weibull. Some of the data available from such distributions are allowed to be censored, as is usually the case in dealing with lifetime distributions. Since the classical approach fails to provide reasonable procedures for testing the equality of means of a number of Weibull populations beyond the two-population case, we show how to apply the generalized inference approach to do so in a novel manner. The approach should work for other scale-invariant distributions, such as the gamma and lognormal. The approach is illustrated with the Weibull distributions, taking advantage of the fact that the original data from continuous distributions can be transformed into normally distributed data, leading to exact generalized test variables. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
5. The band structure and carrier recombination mechanism of α/β-phase tellurium homojunction investigated by infrared photoluminescence.
- Author
- Ye, Xiaoyun, Zhu, Liangqing, Shao, Jun, Hu, Rui, Shang, Liyan, Chen, Xiren, Li, Yawei, Zhang, Jinzhong, Jiang, Kai, Chu, Junhao, and Hu, Zhigao
- Subjects
- *PHYSICAL vapor deposition, *CRYSTAL growth, *LOGNORMAL distribution, *COMPETITION (Psychology), *RAMAN spectroscopy
- Abstract
During the synthesis of tellurium (Te) crystals, the coexistence of multiple crystalline phases (α-Te, β-Te, and γ-Te) with diverse structures commonly occurs, leading to instability and complexity in the performance of Te-based optoelectronic devices. This study employs physical vapor deposition to synthesize Te crystals of various sizes and morphologies, followed by spatially resolved and temperature-dependent evaluation using Raman mapping and infrared photoluminescence (PL) spectroscopy. Spatially resolved results reveal that the size and morphology of Te crystals significantly influence the energy and peak profiles of the Raman and PL spectra. Statistical analysis of spatially random sampling indicates that the PL peak energies of Te crystals follow a lognormal distribution in terms of their occurrence frequencies, reflecting the complex interplay of multiple factors during crystal growth. This results in the coexistence of α-Te and β-Te phases, forming an α/β-Te heterophase homojunction (HPHJ). Meanwhile, temperature-dependent PL results, obtained over the range of 3–290 K, reveal multi-peak competitive behavior in the PL spectra, accompanied by S-shaped shifts in peak energy. These features can be rationally explained by an interface transition-recombination mechanism based on the I-type α/β-Te HPHJ model. It also confirms that infrared PL spectroscopy is an effective method for identifying the crystalline phase composition of Te crystals. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
6. Simulation of zero field cooled/field cooled magnetization and isothermal magnetization curves of distributed nanoparticle assembly.
- Author
- Yasin, Sk Mohammad
- Subjects
- *DISTRIBUTION (Probability theory), *LOGNORMAL distribution, *PARTICLE size distribution, *NANOPARTICLES, *TEST systems
- Abstract
Simulation is often required to assess a new model, identify problems with an existing design, and test a system under conditions that are hard to reproduce in an actual system. In this article, simulations have been carried out on superparamagnetic nanoparticles over a wide range of particle sizes and distribution widths, purposefully to obtain the underlying magnetic features which cannot be envisioned easily by experimental studies. The present study demonstrates that the particle sizes and distribution widths of a nanoparticle assembly have a significant effect on controlling its ZFC/FC magnetization and isothermal M-H curves. The log-normal distribution function helps us to simulate the magnetic features in a reasonable fashion for distributed nanoparticles. The algorithm used for the computer simulation of the present work is quite efficient as far as time is concerned. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
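The simulation approach described in the abstract above is straightforward to sketch. The fragment below is a minimal illustration, not the author's code: it computes an isothermal M-H curve for a superparamagnetic assembly by averaging the Langevin function over a lognormal size distribution. The material parameters (magnetite-like Ms, 8 nm median diameter, distribution width 0.3, room temperature) are assumed for illustration only.

```python
import numpy as np
from scipy.stats import lognorm
from scipy.integrate import trapezoid

# Physical constants and assumed (illustrative) material parameters
kB = 1.380649e-23            # Boltzmann constant, J/K
mu0 = 4e-7 * np.pi           # vacuum permeability, T*m/A
Ms = 4.8e5                   # saturation magnetization, A/m (magnetite-like)
T = 300.0                    # temperature, K
d_median, sigma = 8e-9, 0.3  # median diameter (m) and lognormal width

# Discretize the lognormal particle-size distribution
d = np.linspace(2e-9, 30e-9, 400)
f = lognorm.pdf(d, s=sigma, scale=d_median)   # number-weighted size pdf
V = np.pi * d**3 / 6                          # particle volumes

def magnetization(H):
    """Ensemble M(H): volume-weighted Langevin average over the size pdf."""
    x = mu0 * Ms * V[None, :] * np.asarray(H)[:, None] / (kB * T)
    langevin = 1.0 / np.tanh(x) - 1.0 / x     # L(x) = coth(x) - 1/x
    num = trapezoid(langevin * V * f, d, axis=1)   # volume weighting
    return Ms * num / trapezoid(V * f, d)

H = np.linspace(1e3, 2e6, 200)   # applied field, A/m (avoid the H = 0 singularity)
M = magnetization(H)
print(f"M/Ms at H = {H[-1]:.1e} A/m: {M[-1]/Ms:.3f}")
```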
7. Simultaneous Emergent Phenomena: Leadership and Team Synchrony.
- Author
- Guastello, Stephen J., Peters, Nicholas R., and Peressini, Anthony F.
- Abstract
Emergent phenomena exhibit interesting dynamics when considered individually. The present article examines two emergent processes that could be occurring simultaneously in an intense team interaction: the emergence of leaders and the emergence of autonomic synchrony within teams making dynamic decisions. In the framework of panarchy theory and related studies on complex systems, autonomic synchrony would be a fast dynamic that is shaped or controlled by leadership emergence, which is a slower dynamic. The present study outlines three distinct statistical distributions – the swallowtail catastrophe model for phase shifts, inverse power laws that indicate fractal processes, and lognormal distributions – that are known to characterize emergent processes of different types. The objective was to determine the extent to which the two emergent processes reflected the same dynamics. Research participants were 136 undergraduates who were organized into teams of three to five members playing the computer game Counter-Strike while wearing GSR sensors to measure autonomic arousal levels in a steady stream. After approximately two hours of interaction, team members rated each other on leadership behaviors. Autonomic synchrony was analyzed as a driver-empath process that produced individual driver scores (the total influence of one person on the rest of the group) and empath scores (the total influence of the group on one person). Results showed that leadership emergence displayed the swallowtail configuration that was consistent with prior studies. Autonomic synchrony started as a simpler process and unfolded into a swallowtail catastrophe toward the end of the experimental session. Lognormal distributions were second-best representations of all variables. Inverse power laws were least descriptive of any of the research variables. The implications of the temporal dynamics of the co-emerging processes for training and team development are discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2025
8. Domain adaptation via gamma, Weibull, and lognormal distributions for fault detection in chemical and energy processes.
- Author
- Yang, Lingkai, Cheng, Jian, Luo, Yi, Zhou, Tianbai, Zhang, Xiaoyu, Shi, Linsong, and Xu, Yuan
- Subjects
- MACHINE learning, LOGNORMAL distribution, CHEMICAL processes, GAMMA distributions, COAL mining
- Abstract
The burgeoning development of supervised machine learning (ML) has led to its widespread applications in chemical and energy processes, such as fault detection. However, in some scenarios, collecting labelled data can be costly, hazardous, or impossible. Moreover, data of the same process can follow varying distributions due to changes in, for example, devices and environment, causing ML models to be ineffective. These challenges pose a domain adaptation task, necessitating the refinement of existing ML models to tackle issues from related applications. This study proposes a domain adaptation approach to address label scarcity and data distribution variation. The method has three stages: data distribution modelling (knowledge discovery), adaptation of target domain samples to source domains (knowledge transformation), and classifier ensemble for fault detection (knowledge fusion). Gamma, Weibull, and lognormal distributions are applied for data modelling and domain adaptation. The effectiveness of the method is validated on synthetic datasets and then applied to identify anomalies in coal mine pressure data and detect faults in the Tennessee Eastman (TE) process. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
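The first stage of the approach described above, modelling data with gamma, Weibull, and lognormal distributions, can be sketched in a few lines. This is a generic illustration on synthetic data, not the paper's pipeline: each family is fitted by maximum likelihood with scipy and compared by AIC.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.gamma(shape=3.0, scale=2.0, size=500)   # stand-in process variable

# Fit the three candidate families; floc=0 keeps the support on (0, inf)
candidates = {
    "gamma":     (stats.gamma,       stats.gamma.fit(x, floc=0)),
    "weibull":   (stats.weibull_min, stats.weibull_min.fit(x, floc=0)),
    "lognormal": (stats.lognorm,     stats.lognorm.fit(x, floc=0)),
}

for name, (dist, params) in candidates.items():
    loglik = np.sum(dist.logpdf(x, *params))
    aic = 2 * len(params) - 2 * loglik   # 'loc' is fixed but counted conservatively
    print(f"{name:9s} params={np.round(params, 3)} AIC={aic:.1f}")
```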
9. Refining within-subject biological variation estimation using routine laboratory data: practical applications of the refineR algorithm.
- Author
- Røys, Eirik Åsen, Viste, Kristin, Farrell, Christopher-John, Kellmann, Ralf, Guldhaug, Nora Alicia, Theodorsson, Elvar, Jones, Graham Ross Dallas, and Aakre, Kristin Moberg
- Subjects
- *LANGUAGE models, *MONTE Carlo method, *MEDICAL sciences, *LOGNORMAL distribution, *MEDICAL research ethics, *BIOLOGICAL variation
- Abstract
The article discusses the refineR algorithm for estimating within-subject biological variation (CVI) using routine laboratory data, focusing on the practical applications of this method. By characterizing a central Gaussian peak from result ratios, refineR can estimate CVI values and reference change values (RCV). The study validates the refineR algorithm through Monte Carlo simulations and comparisons with a biological variation study, highlighting its accuracy in estimating CVI from ratio distributions. Despite some limitations, refineR offers a cost-effective tool for laboratories to estimate CVI, especially for subgroups with limited direct studies. [Extracted from the article]
- Published
- 2024
- Full Text
- View/download PDF
10. The Win Ratio Approach in Bayesian Monitoring for Two‐Arm Phase II Clinical Trial Designs With Multiple Time‐To‐Event Endpoints.
- Author
- Huang, Xinran, Wang, Jian, and Ning, Jing
- Subjects
- *TREATMENT effectiveness, *GIBBS sampling, *LOGNORMAL distribution, *CANCER relapse, *FRUSTRATION
- Abstract
To assess the preliminary therapeutic impact of a novel treatment, futility monitoring is commonly employed in Phase II clinical trials to facilitate informed decisions regarding the early termination of trials. Given the rapid evolution in cancer treatment development, particularly with new agents like immunotherapeutic agents, the focus has often shifted from objective response to time‐to‐event endpoints. In trials involving multiple time‐to‐event endpoints, existing monitoring designs typically select one as the primary endpoint or employ a composite endpoint as the time to the first occurrence of any event. However, relying on a single efficacy endpoint may not adequately evaluate an experimental treatment. Additionally, the time‐to‐first‐event endpoint treats all events equally, ignoring their differences in clinical priorities. To tackle these issues, we propose a Bayesian futility monitoring design for a two‐arm randomized Phase II trial, which incorporates the win ratio approach to account for the clinical priority of multiple time‐to‐event endpoints. A joint lognormal distribution was assumed to model the time‐to‐event variables for the estimation. We conducted simulation studies to assess the operating characteristics of the proposed monitoring design and compared them to those of conventional methods. The proposed design allows for early termination for futility if the endpoint with higher clinical priority (e.g., death) deteriorates in the treatment arm, compared to the time‐to‐first‐event approach. Meanwhile, it prevents an aggressive early termination if the endpoint with lower clinical priority (e.g., cancer recurrence) shows deterioration in the treatment arm, offering a more tailored approach to decision‐making in clinical trials with multiple time‐to‐event endpoints. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
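The win ratio logic this abstract relies on can be made concrete with a small sketch. The following is a generic unmatched win ratio calculation (death prioritized over cancer recurrence) on synthetic joint-lognormal event times; it illustrates the general approach only, not the authors' Bayesian monitoring design, and inference details (posterior sampling, stopping rules) are omitted.

```python
import numpy as np

def pair_compare(a, b, tau):
    """+1 if a beats b on this endpoint within shared follow-up tau,
    -1 if b beats a, 0 if the comparison is undecidable (both censored)."""
    if min(a, b) <= tau and a != b:
        return 1 if b < a else -1   # the later event time wins
    return 0

def win_ratio(treat, control):
    """Unmatched pairwise win ratio with death prioritized over recurrence."""
    wins = losses = 0
    for d1, r1, f1 in treat:
        for d2, r2, f2 in control:
            tau = min(f1, f2)                     # shared follow-up window
            res = pair_compare(d1, d2, tau)       # priority 1: death
            if res == 0:
                res = pair_compare(r1, r2, tau)   # priority 2: recurrence
            wins += res == 1
            losses += res == -1
    return wins / losses

rng = np.random.default_rng(1)
def arm(mu, n=80):
    # joint lognormal (death, recurrence) times plus uniform follow-up
    t = np.exp(rng.multivariate_normal(mu, [[0.3, 0.1], [0.1, 0.3]], n))
    f = rng.uniform(5.0, 15.0, n)
    return np.column_stack([t, f])

print(f"win ratio: {win_ratio(arm([2.3, 1.9]), arm([2.1, 1.8])):.2f}")
```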
11. The introduction of coinage in Europe did not change pre-existing monetary patterns.
- Author
- Ialongo, Nicola
- Subjects
- LOGNORMAL distribution, ROMAN Republic, 510-30 B.C., COINAGE, ACADEMIC debating, BRONZE Age
- Abstract
Introduction: This paper investigates whether the introduction of coinage in Europe fundamentally changed pre-existing monetary circulation patterns. By analysing the statistical properties of bronze money before and after the advent of coinage (c. 1500–27 BCE), it challenges the prevailing assumption that coinage revolutionized the use and exchange of money. The research engages with longstanding academic debates between competing theories, which posit that money is either market-driven or state-imposed. Methods: Using a combination of archaeological data and quantitative analysis, the study examines large datasets of pre-coinage money and early coinage, focusing on weight-based regulation and the log-normal distribution of mass values as key indicators of monetary behaviour. Results: The findings reveal that pre-coinage bronze money, consisting of weighed metal fragments, circulated in a manner similar to early coinage. Both forms of money complied with weight-based systems and exhibited log-normal distribution patterns, reflecting structured economic behaviours. The analysis suggests that the introduction of coinage did not lead to a fundamental transformation in how money circulated but rather continued pre-existing patterns. Discussion: These results challenge the assumption that state-issued coinage marked a watershed moment in the history of monetary economies. The paper proposes that the beginning of coinage introduced a minor technological improvement rather than a revolutionary change in monetary circulation, offering a new perspective on the continuity between pre-coinage and coinage-based economies in ancient Europe. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. The formation of multiples in small-N subclusters.
- Author
- Ambrose, Hannah E and Whitworth, A P
- Subjects
- *CELESTIAL mechanics, *LOGNORMAL distribution, *STAR formation, *KINETIC energy, *STANDARD deviations
- Abstract
We explore the relative percentages of binary systems and higher-order multiples that are formed by pure stellar dynamics, within a small subcluster of N stars. The subcluster is intended to represent the fragmentation products of a single isolated core, after most of the residual gas of the natal core has dispersed. Initially, the stars have random positions, and masses drawn from a lognormal distribution. For low-mass cores spawning multiple systems with Sun-like primaries, the best fit to the observed percentages of singles, binaries, triples, and higher-order systems is obtained if a typical core spawns on average between N = 4.3 and 5.2 stars, specifically a distribution of N with mean μ_N ∼ 4.8 and standard deviation σ_N ∼ 2.4. This fit is obtained when ∼50 per cent of the subcluster's internal kinetic energy is invested in ordered rotation and ∼50 per cent in isotropic Maxwellian velocities. There is little dependence on other factors, for example mass segregation or the rotation law. While such high values of N are at variance with the lower values often quoted (i.e. N = 1 or 2), very similar values (N = 4.3 ± 0.4 and N = 4.5 ± 1.9) have been derived previously by completely independent routes, and seem inescapable when the observed distribution of multiplicities is taken into account. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. Variability in River Width Reveals Climatic Influence on Channel Geometry.
- Author
- Phillips, C. B., Masteller, C. C., Blaylock, J., Van Iwaarden, F., and Johnson, J. P. L.
- Subjects
- *FLOOD routing, *FLUVIAL geomorphology, *CHANNELS (Hydraulic engineering), *SEDIMENT transport, *LOGNORMAL distribution, *WATERSHEDS
- Abstract
During a flood, the geometry of a river channel constrains the flows of water and sediment; however, over many floods, bankfull channel geometry evolves to reflect the longer‐term fluxes of water and sediment supplied by the catchment. Physics‐based models predict the average relationship between bankfull geometry and discharge to within an order of magnitude; however, observed variability about the prediction remains unaccounted for. We used high‐resolution topography to extract continuous measurements of bankfull width from 67 sites spanning the continental United States, yielding a reach‐scale probabilistic description of river width for each site. Within an individual reach, bankfull river width is well‐described by a lognormal distribution. Rivers that spend a greater proportion of time above bankfull are wider for the same bankfull discharge, revealing an unrecognized pathway through which climatic or engineered changes in flow frequency could alter river geometry and therefore impact aquatic habitat and flooding risk. Plain Language Summary: Rivers adjust their shape to transport the water and sediment supplied from their watersheds. Predicting flooding throughout river networks depends on understanding how rivers grow downstream as they accumulate water from larger areas. This relationship between river size and the water discharge that just fills the channel (bankfull) underpins flood routing and hazard models and is remarkably consistent for creeks in the headwaters and throughout a watershed as rivers reach their terminus in lakes and oceans. However, for shorter sections of a river we find that the size can be highly variable, and it remains unclear how to incorporate this reach‐scale variability into watershed and basin scale models. Here we demonstrate that reach‐scale variation has a well‐defined average value and predictable variability. By accounting for the reach‐scale variability, we find that for the same amount of discharge, rivers which experience more flooding are wider while those where floods are rarer are narrower. These findings may hold particular relevance for understanding how snowmelt systems which produce long duration floods may adjust to diminishing snowpacks, and how to manage river systems below dams where flow durations and magnitudes may be closely regulated. Key Points: Reach‐scale bankfull river width is lognormally distributed with a constant geometric standard deviation. Variation in bankfull river width across sites depends on the flow duration, where rivers that spend more time above flood stage are wider. Bankfull flow intermittency, runoff, and basin aridity are correlated, representing an imprint of climate on river geometry. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. A Probabilistic Model for Predicting the Performance of a Stormwater Overflow Structure as Part of a Stormwater Treatment Plant.
- Author
- Górski, Jarosław, Szeląg, Bartosz, Bąk, Łukasz, and Świercz, Anna
- Subjects
- MONTE Carlo method, LOGNORMAL distribution, WATER management, STOCHASTIC models, WATER use
- Abstract
The purpose of this study was to develop a stochastic model that describes the operation of a stormwater overflow located in a stormwater sewerage system. The model built for this study makes it possible to simulate the annual volume of the stormwater discharge, the maximum volume of the overflow discharge in a precipitation event, and the share of the latter in the total amount of stormwater conveyed directly, without pre-treatment, to the receiver. A dependence obtained with the linear regression method was employed to identify the occurrence of stormwater discharge. The prediction of the synthetic annual rainfall series was made using the Monte Carlo method, based on a fitted log-normal distribution whose parameters were specified using a 13-year rainfall series. Additionally, simulation of the stormwater overflow operation was performed with the use of a calibrated hydrodynamic model of the catchment, developed using the Storm Water Management Model (SWMM). The results of the hydrodynamic simulations of the volume and number of discharges were within the scope of the probabilistic solution, which confirms the applicative character of the method presented in this study, intended to assess the operation of stormwater overflows. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
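The Monte Carlo rainfall step described above is easy to sketch. Below, a log-normal distribution is fitted to a stand-in 13-value annual rainfall series (the values are invented for illustration; the study's own rainfall data are not reproduced here) and then used to generate a long synthetic series.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for the 13-year observed annual rainfall series (mm); illustrative values
annual_rain = np.array([520., 610., 485., 700., 560., 640., 590.,
                        455., 530., 675., 605., 500., 580.])

# Fit a lognormal by matching the moments of log-rainfall (threshold fixed at 0)
mu, sigma = np.log(annual_rain).mean(), np.log(annual_rain).std(ddof=1)

# Monte Carlo: generate a long synthetic series of annual totals
synthetic = rng.lognormal(mu, sigma, size=10_000)

print(f"observed mean {annual_rain.mean():.0f} mm, "
      f"synthetic mean {synthetic.mean():.0f} mm, "
      f"synthetic 99th percentile {np.percentile(synthetic, 99):.0f} mm")
```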
15. Contact pressure explains half of the abdominal aortic aneurysms wall thickness inter-study variability.
- Author
- Kracík, Jan, Kubíček, Luboš, Staffa, Robert, and Polzer, Stanislav
- Subjects
- *ABDOMINAL aortic aneurysms, *STOCHASTIC analysis, *STOCHASTIC models, *RISK assessment, *LOGNORMAL distribution
- Abstract
The stochastic rupture risk assessment of an abdominal aortic aneurysm (AAA) critically depends on sufficient data set size that would allow for the proper distribution estimate. However, in most published cases, the data sets comprise no more than 100 samples, which is deemed insufficient to describe the tails of AAA wall thickness distribution correctly. In this study, we propose a stochastic Bayesian model to merge thickness data from various groups. The thickness data adapted from the literature were supplemented by additional data from 81 patients. The wall thickness was measured at two different contact pressures for 34 cases, which allowed us to estimate the radial stiffness. Herein, the proposed stochastic model is formulated to predict the undeformed wall thickness. Furthermore, the model is able to handle data published solely as summary statistics. After accounting for the different contact pressures, the differences in the medians reported by individual groups decreased by 45%. Combined data can be fitted with a lognormal distribution with parameters μ = 0.85 and σ = 0.32 which can be further used in stochastic analyses. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
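For orientation, the reported lognormal parameters translate directly into summary statistics of the wall thickness (presumably on a millimetre scale, though the abstract does not state the units):

```latex
\text{median} = e^{\mu} = e^{0.85} \approx 2.34, \qquad
\text{mean} = e^{\mu + \sigma^{2}/2} = e^{0.85 + 0.32^{2}/2} \approx 2.46, \qquad
\text{mode} = e^{\mu - \sigma^{2}} \approx 2.11 .
```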
16. Conditional earthquake probabilities along the North Anatolian fault zone based on inverse Gaussian against lognormal distribution.
- Author
- Nas, Murat, Bayrak, Yusuf, Mpapka, Eleni, and Tsapanos, Theodoros M.
- Subjects
- *EARTHQUAKE hazard analysis, *INVERSE Gaussian distribution, *LOGNORMAL distribution, *EARTH sciences, *FAULT zones
- Abstract
This study offers a comprehensive forecast of conditional earthquake recurrence probabilities in the North Anatolian Fault Zone (NAFZ), utilizing advanced statistical models and temporal analyses, aiming to discern the likelihood of future earthquakes. We sought to contribute insights into seismic hazard assessment by analyzing earthquakes (MW ≥ 4.0) from 1900–2022, employing Inverse Gaussian (aka Brownian Passage Time) and Lognormal distribution models, categorizing the NAFZ into ten seismic zones. Rigorous model fitness assessments were conducted, including Akaike and Bayesian information criteria, Kolmogorov–Smirnov, and Anderson–Darling tests. Conditional probabilities were calculated across eleven temporal intervals (0–50 years) and eleven residual periods (1–50 years), starting on January 1, 2023, and extending into the future. Results reveal nuanced earthquake probabilities, highlighting a heterogeneous seismic hazard landscape. Probability forecasts surge within the initial five years and continue to rise for another five years, underscoring the spatiotemporal sensitivity and widespread earthquake hazard. The findings enhance the understanding of seismic hazard assessment, extending the future applicability potential to global seismic regions. Acknowledging uncertainties and relying on instrumental data, future research could explore more extensive areas and refined data sources, along with new modeling techniques, to enhance forecasting accuracy. The findings stress the need for earthquake preparedness throughout the study area, not only for the anticipated large earthquakes but especially for medium-magnitude earthquakes. This remark manifestly underscores the necessity to develop strategies to reduce possible damage and loss of life stemming from the collapse of non-engineered and rural building stock unevenly scattered along the NAFZ that remain vulnerable to moderate-magnitude earthquakes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
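The conditional probabilities at the core of this kind of forecast follow one formula, P(t < T ≤ t + Δt | T > t) = [F(t + Δt) − F(t)] / [1 − F(t)], applied to a fitted renewal distribution. The sketch below evaluates it for both candidate models with assumed parameters (a 60-year mean recurrence interval and aperiodicity 0.5; these are illustrative values, not values from the paper):

```python
import numpy as np
from scipy import stats

# Illustrative renewal-model parameters for one fault segment (assumed values)
mean_T, alpha = 60.0, 0.5   # mean inter-event time (yr), aperiodicity

# Inverse Gaussian (Brownian Passage Time): mean m, shape lam = m / alpha^2
bpt = stats.invgauss(mu=alpha**2, scale=mean_T / alpha**2)
# Lognormal with the same mean and coefficient of variation
sigma = np.sqrt(np.log(1 + alpha**2))
logn = stats.lognorm(s=sigma, scale=mean_T * np.exp(-sigma**2 / 2))

def conditional_prob(dist, elapsed, horizon):
    """P(event within `horizon` years | quiescent for `elapsed` years)."""
    return (dist.cdf(elapsed + horizon) - dist.cdf(elapsed)) / dist.sf(elapsed)

for t in (20, 40, 60):
    print(f"elapsed {t:2d} yr: BPT {conditional_prob(bpt, t, 10):.3f}  "
          f"lognormal {conditional_prob(logn, t, 10):.3f}")
```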
17. The Lognormal Distribution Is Characterized by Its Integer Moments.
- Author
- Novi Inverardi, Pier Luigi and Tagliani, Aldo
- Subjects
- *DIFFERENTIAL entropy, *ENTROPY, *INTEGERS, *PROBABILITY theory
- Abstract
The lognormal moment sequence is considered. Using the fractional moments technique, it is first proved that the lognormal has the largest differential entropy among the infinitely many positively supported probability densities sharing the same lognormal moment sequence. Then, relying on previous theoretical results on entropy convergence obtained by the authors concerning the indeterminate Stieltjes moment problem, the lognormal distribution is accurately reconstructed by the maximum entropy technique using only its integer moment sequence, although it is not uniquely determined by its moments. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
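For context, the integer moments in question have a simple closed form, and the moment indeterminacy the abstract mentions can be seen from Heyde's classical (1963) construction for the standard lognormal density f (the case μ = 0, σ = 1):

```latex
m_k = \mathbb{E}\,[X^k] = \exp\!\left(k\mu + \tfrac{1}{2}k^{2}\sigma^{2}\right),
\quad k = 1, 2, \dots ;
\qquad
f_{\varepsilon}(x) = f(x)\left[1 + \varepsilon \sin(2\pi \ln x)\right],
\quad |\varepsilon| \le 1 .
```

Every perturbed density f_ε has exactly the same integer moments m_k as f, which is why the reconstruction must go through maximum entropy rather than moment matching alone.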
18. A Bivariate Extension of Type-II Generalized Crack Distribution for Modeling Heavy-Tailed Losses.
- Author
- Bae, Taehan and Quarshie, Hanson
- Subjects
- *LOGNORMAL distribution, *DENSITY
- Abstract
As an extension of the (univariate) Birnbaum–Saunders distribution, the Type-II generalized crack (GCR2) distribution, built on an appropriate base density, provides a sufficient level of flexibility to fit various distributional shapes, including heavy-tailed ones. In this paper, we develop a bivariate extension of the Type-II generalized crack distribution and study its dependency structure. For practical applications, three specific distributions, GCR2-Generalized Gaussian, GCR2-Student's t, and GCR2-Logistic, are considered for marginals. The expectation-maximization algorithm is implemented to estimate the parameters in the bivariate GCR2 models. The model fitting results on a catastrophic loss dataset show that the bivariate GCR2 distribution based on the generalized Gaussian density fits the data significantly better than other alternative models, such as the bivariate lognormal distribution and some Archimedean copula models with lognormal or Pareto marginals. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. On the derivation of an analytical expression for wind power probability distribution function and capacity factor of turbine.
- Author
- Gardashov, Emin and Gardashov, Rauf
- Subjects
- *DISTRIBUTION (Probability theory), *LOGNORMAL distribution, *WEIBULL distribution, *WIND power, *WIND speed, *RAYLEIGH model
- Abstract
It is shown here that if the PDF (Probability Distribution Function) of wind speed is a Rayleigh distribution with parameter s, then the PDF of wind power is a Weibull distribution with parameters a = √2 ρs³ and b = 2/3, where ρ is the air density; if the wind speed PDF is a Weibull distribution with parameters a, b, then the wind power PDF is also a Weibull distribution with parameters a³ρ/2 and b/3; and if the PDF of wind speed is a log-normal distribution with parameters μ, σ, then the PDF of wind power is also a log-normal distribution with parameters ln(ρ/2) + 3μ and 3σ. The derived relationships allow us to quickly estimate parameters that indicate the wind power potential of the considered site, and the amount of wind energy generated by the turbine. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
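The Rayleigh-to-Weibull statement is easy to verify numerically. The sketch below assumes the standard wind power density p = ρv³/2 per unit swept area (consistent with the quoted parameters), samples Rayleigh wind speeds, and compares the implied power distribution against the claimed Weibull:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
rho, s = 1.225, 8.0          # air density (kg/m^3) and Rayleigh scale (m/s)

# Sample wind speeds and convert to power density p = (rho/2) * v^3 (W/m^2)
v = stats.rayleigh(scale=s).rvs(200_000, random_state=rng)
p = 0.5 * rho * v**3

# The abstract's claim: p ~ Weibull(shape b = 2/3, scale a = sqrt(2) rho s^3)
claimed = stats.weibull_min(c=2/3, scale=np.sqrt(2) * rho * s**3)

ks = stats.kstest(p, claimed.cdf)
print(f"KS statistic {ks.statistic:.4f} (p-value {ks.pvalue:.2f})")
print(f"sample mean {p.mean():.0f} W/m^2 vs claimed mean {claimed.mean():.0f} W/m^2")
```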
20. Two sunspot group populations and Gnevyshev-Waldmeier rule.
- Author
- Nagovitsyn, Yury A., Osipova, Aleksandra A., and Fedoseeva, Sofia N.
- Subjects
- *SOLAR cycle, *SOLAR activity, *LOGNORMAL distribution, *LOGARITHMS, *PARAMETERS (Statistics), *SUNSPOTS
- Abstract
Based on the materials of the Royal Greenwich Observatory catalog, the study of two different sunspot group populations is continued: LLG (large long-lived groups) and SSG (small short-lived groups). The task of determining the population separation parameter with a higher accuracy than before (Nagovitsyn & Pevtsov 2016, 2021) is addressed. A procedure for randomizing the lifetimes of sunspot groups observed once a day is proposed, which allows statistical studies to achieve a higher time resolution. A form of the Gnevyshev-Waldmeier rule is adopted that linearly connects the logarithm of the group area, log S, and the lifetime LT of the sunspot group (over limited time intervals). It is shown to have coefficients that differ significantly between the SSG and LLG populations. The range of values of the group lifetime parameter separating the populations was found to be LT∗ = 4.75 ± 0.53 days, in agreement with the threshold values obtained earlier for the number of days of sunspot group observation: m_SSG ⩽ 5 days and m_LLG > 5 days. It is shown that the parameters of the bilognormal distribution of sunspot group areas, obtained statistically from their common grouped sample by the Levenberg–Marquardt method and with a preliminary division into lognormal distributions by lifetime, correspond to each other. It was clarified that, to within a tenth of a day, the SSG population corresponds to group lifetimes ⩽ 4.6 days and the LLG population to lifetimes ⩾ 4.7 days. The results obtained make it possible to study various physical properties of the SSG and LLG populations independently of each other in order to compare them and study their nature. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
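The bilognormal decomposition described here amounts to a two-component Gaussian mixture in log-area. A generic sketch on synthetic data follows (the component parameters are invented for illustration, and scikit-learn's EM fitter stands in for the Levenberg–Marquardt fit used in the paper):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)

# Synthetic sunspot-group areas (illustrative): a bilognormal mixture,
# i.e. two Gaussian components in log10(area)
log_area = np.concatenate([
    rng.normal(1.2, 0.35, 1500),   # SSG-like component
    rng.normal(2.3, 0.45, 700),    # LLG-like component
])

gm = GaussianMixture(n_components=2, random_state=0).fit(log_area[:, None])
order = np.argsort(gm.means_.ravel())          # smaller mean first (SSG)
for label, i in zip(("SSG", "LLG"), order):
    print(f"{label}: weight {gm.weights_[i]:.2f}, "
          f"mean log10 S {gm.means_[i, 0]:.2f}, "
          f"sigma {np.sqrt(gm.covariances_[i].ravel()[0]):.2f}")
```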
21. Distribution of radon in large workplaces: an analysis performed on radon levels measured in UK schools.
- Author
- Kouroukla, E and Gooding, T D
- Subjects
- *LOGNORMAL distribution, *RADIOACTIVE decay, *RADON, *RADIATION sources, *RADIATION exposure
- Abstract
Radon is a radioactive, carcinogenic gas formed by the radioactive decay of the uranium and radium that occur naturally in small amounts in all rocks and soils. It is the largest single source of radiation exposure to the UK population, contributing to more than 1,100 lung cancer deaths each year according to an analysis conducted in 2005. Regulations exist to protect employees (and other persons) where radon concentrations exceed the reference level of 300 Bq m⁻³. Once the reference level is exceeded, annual doses of more than the public dose limit of 1 mSv a⁻¹ are considered to be excessive. A radon measurement campaign for schools, which started in 2009, generated a large dataset, including schools with high numbers of simultaneous radon measurements. Radon data between buildings (e.g. homes) have been shown to correspond broadly to the lognormal distribution, after the additive contribution of outside air has been removed. However, there are fewer studies of the distribution of radon levels within a single, large property. Radon data collected from 533 UK schools with at least 20 valid, simultaneous results were analysed against several statistical models. In approximately 50% of schools the radon levels could be represented by the lognormal distribution and in 60% by the loglogistic distribution, the latter being a better fit probably owing to its lower sensitivity to the tails of the distribution. Qualitatively, the lognormal and the loglogistic probability plots appeared to be indistinguishable. These findings indicate that the lognormal and loglogistic might be appropriate models to characterise the distribution of radon in most large workplaces. For each statistical model, the two distribution parameters can be used to provide a better estimate of the average dose to the occupants. However, caution is required when assessing doses, since the average estimator of the radon concentration does not predict the highest value and may significantly underestimate or overestimate the dose in specific areas. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
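The modelling step the abstract describes (removing the additive outdoor-air contribution, then fitting a lognormal) looks roughly like this on synthetic data; the outdoor level of 4 Bq/m³ and the indoor parameters are assumed for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Synthetic indoor radon readings (Bq/m^3): a lognormal indoor component
# plus an additive outdoor-air contribution, as described in the abstract
outdoor = 4.0
readings = outdoor + rng.lognormal(mean=3.2, sigma=0.8, size=400)

# Remove the additive outdoor term, then fit a two-parameter lognormal
indoor = readings - outdoor
sigma, loc, scale = stats.lognorm.fit(indoor, floc=0)
mu = np.log(scale)

gm = np.exp(mu)                       # geometric mean
am = np.exp(mu + sigma**2 / 2)        # arithmetic mean (dose-relevant average)
print(f"GM {gm:.1f} Bq/m^3, GSD {np.exp(sigma):.2f}, AM {am:.1f} Bq/m^3")
```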
22. Science or scientism? On the momentum illusion.
- Author
- Grobys, Klaus
- Subjects
- LOGNORMAL distribution, SHARPE ratio, FINANCIAL risk, PRICES, SCIENTISM
- Abstract
This study explores the risk of the traditional momentum strategy in terms of its realized variance using various data frequencies. It is shown that momentum risk is infinite regardless of the data frequency, implying that (a) t-statistics for this strategy do not exist, (b) correlation-based metrics such as Sharpe ratios do not exist either, and (c) the momentum premium is not observable in reality. It is further shown that the time-honored lognormal distribution is unable to accurately model extreme events observed at various variance data frequencies. Finally, it is shown that the well-known effect of time aggregation does not work for this investment vehicle. Hence, the study is forced to conclude that momentum stories have no valid foundation for their claims. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
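The claim that an infinite-variance return process defeats t-statistics can be illustrated in a few lines: for a distribution with tail index below 2, the running sample variance never settles, no matter how much data arrives. Student's t with 1.8 degrees of freedom stands in here for heavy-tailed strategy returns; this is an illustration of the statistical point, not a calibration to momentum data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Student-t with df = 1.8: finite mean, infinite variance
x = stats.t(df=1.8).rvs(1_000_000, random_state=rng)

# The running sample variance keeps jumping instead of converging
for n in (10**3, 10**4, 10**5, 10**6):
    print(f"n = {n:>7}: running sample variance {x[:n].var():.1f}")
```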
23. Analysis of Floods Encounter of the Mainstream and Tributaries of Huaihe River Based on Vine Copula Function.
- Author
- XU Peng-cheng, ZHANG Zhi-lang, LIU Chun-ming, FANG Hong-yuan, and WANG Lei-zhi
- Subjects
- LOGNORMAL distribution, MARGINAL distributions, EXTREME weather, FLOOD control, COPULA functions
- Abstract
As the severity of extreme weather escalates, the frequency of flood disasters is on the rise, and flood coincidence events may intensify the adverse effects on downstream regions. This study leverages 58 years (1959–2016) of observed daily runoff data from four stations: Wangjiaba Station and Bengbu Station on the Huai River, Jiangjiaji Station on the Shi River, and Fuyang Station on the Ying River. The flood occurrence time series and peak flow series during the flood season are extracted using the annual maximum flow method. The marginal distribution of flood occurrence time for the annual maximum floods is constructed using the Von Mises distribution, while the marginal distributions of peaks are constructed using the log-normal distribution, Pearson type III distribution, and generalized extreme value model, respectively. The parameters of each distribution are estimated using maximum likelihood estimation, the optimal model is selected according to the Akaike Information Criterion, and the Kolmogorov-Smirnov test is used to determine whether the marginal distributions are qualified. Additionally, the Vine Copula is used to construct the joint distribution of flood occurrence time and peak flow for multiple stations. Encounter risks are quantitatively analysed under various conditions, such as encounters between two rivers, encounters between three stations, and the conditional probability of various combinations of flooding events occurring upstream when flooding occurs downstream. The results show that the Von Mises distribution, especially the unimodal Von Mises distribution, can fit the flood occurrence times in the Huaihe River Basin well. For encounters between tributaries of the Huaihe River, the most probable date for pairwise encounters and multiple-station encounters is around July 15th. Among pairwise encounters of tributaries, the highest encounter probability is between Wangjiaba Station and Jiangjiaji Station, and the lowest is between Fuyang Station and Jiangjiaji Station. The study also reveals that, when the downstream Bengbu Station experiences a flood with a 50-year return period, the probability of all three upstream stations experiencing floods exceeding the 10-year return period is 0.076. This study further extends the computational methods for flood encounter risk and is of great significance for flood control and disaster reduction in the Huaihe River basin. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
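The first modelling step, fitting a Von Mises (circular normal) distribution to flood occurrence dates, can be sketched as follows; the synthetic dates clustered near mid-July are invented, and the copula construction is omitted:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Synthetic annual-maximum flood dates (day of year), clustered near mid-July
days = rng.normal(196, 12, size=58).round() % 365

# Map day-of-year onto the circle, then fit a von Mises distribution
theta = 2 * np.pi * days / 365
kappa, loc, _ = stats.vonmises.fit(theta, fscale=1)

peak_day = (loc % (2 * np.pi)) * 365 / (2 * np.pi)
print(f"concentration kappa = {kappa:.2f}, most probable date ~ day {peak_day:.0f}")
```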
24. Fatigue life prediction method based on data fusion.
- Author
- ZHANG Xu, YAO Jianyao, LIU Xuyang, WANG Changyin, and GAO Youzhi
- Subjects
- LOGNORMAL distribution, WEIBULL distribution, MULTISENSOR data fusion, DATA augmentation, FATIGUE testing machines
- Abstract
Fatigue tests are time-consuming and their data are scattered; as a result, P-S-N curves derived from small samples at high survival rates lack sufficient accuracy, leading to unreliable predictions of fatigue life. A data fusion method based on the performance-life probability mapping principle is used to fuse small-sample fatigue data from different stress levels, and the feasibility of obtaining accurate P-S-N curves by this method is analyzed and evaluated. The results demonstrate that P-S-N curves obtained after fusion are closer to the P-S-N curve derived from larger sample datasets. This approach effectively enhances both the reliability and accuracy of fatigue life prediction while reducing the number of required fatigue tests. A comparative evaluation of the predictive capabilities for fatigue life before and after fusion is conducted using different models; notably, the three-parameter power function model demonstrates superior predictive ability, whereas when ample fatigue data are available, the prediction capabilities of the four models (the Basquin S-N model, the exponential S-N model, the three-parameter power function S-N model based on the lognormal distribution, and the three-parameter power function S-N model based on the three-parameter Weibull distribution) are quite similar. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
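A lognormal-based P-S-N construction of the kind referenced above can be sketched generically: fit a median S-N line in log-log space, then shift it by normal quantiles of the log-life residuals to get curves at chosen survival probabilities. All data below are synthetic, and this is not the paper's fusion method, which additionally maps small-sample data across stress levels.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Synthetic S-N data: a few specimens at each of four stress levels (MPa),
# Basquin-type behaviour with lognormal life scatter (illustrative numbers)
stress = np.repeat([300., 350., 400., 450.], 4)
logN = 12.0 - 3.0 * np.log10(stress) + rng.normal(0, 0.12, stress.size)

# Fit the median S-N curve: log N = A + B log S (least squares)
B, A = np.polyfit(np.log10(stress), logN, 1)
resid_sd = np.std(logN - (A + B * np.log10(stress)), ddof=2)

# P-S-N curves: shift the median curve by normal quantiles of log-life
for p in (0.50, 0.90, 0.99):          # survival probability
    shift = norm.ppf(1 - p) * resid_sd
    N400 = 10 ** (A + B * np.log10(400.) + shift)
    print(f"P = {p:.0%}: predicted life at 400 MPa ~ {N400:,.0f} cycles")
```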
25. Direct Evidence of Bitcoin Wash Trading.
- Author
- Aloosh, Arash and Li, Jiasun
- Subjects
- BENFORD'S law (Statistics), LOGNORMAL distribution, CRYPTOCURRENCY exchanges, CRIMINAL investigation, TRANSACTION records
- Abstract
We use the internal trading records of a major Bitcoin exchange leaked by hackers to detect and characterize wash trading—a type of market manipulation in which a single trader clears the trader's own limit orders to "cook" transaction records. Our finding provides direct evidence for the widely suspected "fake volume" allegation against cryptocurrency exchanges, which has so far only been backed by indirect estimation. We then use our direct evidence to evaluate various indirect techniques for detecting the presence of wash trades and find measures based on Benford's law, trade size clustering, lognormal distributions, and structural breaks to be useful, whereas ones based on power law tail distributions give opposite conclusions. We also provide suggestions for effectively applying various indirect estimation techniques. This paper was accepted by Professor Bruno Biais, finance. Funding: J. Li acknowledges support by the U.S. Department of Homeland Security [Grant 205187] through the Criminal Investigations and Network Analysis Center. Supplemental Material: The data files are available at https://doi.org/10.1287/mnsc.2021.01448. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
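One of the screens named in the abstract, Benford's law on trade sizes, is simple to sketch. The helper below compares observed first-digit frequencies with Benford's logarithmic law via a chi-squared distance; the data and the interpretation threshold are illustrative only, not the paper's calibration.

```python
import numpy as np

def benford_stat(volumes):
    """Chi-squared distance between observed first-digit frequencies
    and Benford's law, a screen used in wash-trading detection."""
    first = np.array([int(str(v).lstrip('0.')[0]) for v in volumes])
    obs = np.bincount(first, minlength=10)[1:] / len(first)
    exp = np.log10(1 + 1 / np.arange(1, 10))      # P(digit d) = log10(1 + 1/d)
    return np.sum((obs - exp) ** 2 / exp)

rng = np.random.default_rng(8)
genuine = rng.lognormal(0, 2, 50_000)   # heavy-tailed sizes, roughly Benford-like
fake = rng.uniform(1, 10, 50_000)       # uniform "cooked" sizes, far from Benford
print(f"genuine chi2 {benford_stat(genuine):.4f}, fake chi2 {benford_stat(fake):.4f}")
```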
26. Experimental Study on the Characteristics of Corrosion-Induced Cracks and Steel Corrosion Depth of Carbonated Recycled Aggregate Concrete Beams.
- Author
- Gao, Pengfei, Wang, Jian, Chen, Bo, Bai, Mingxin, and Song, Yuanyuan
- Subjects
- RECYCLED concrete aggregates, DISTRIBUTION (Probability theory), MINERAL aggregates, LOGNORMAL distribution, STEEL bars
- Abstract
The durability of carbonated recycled aggregate concrete (C-RAC) beams is still unclear at present. In this paper, the characteristics of corrosion-induced cracks and the steel corrosion depth of C-RAC beams were investigated through an accelerated corrosion test. The results showed that after 40 days of accelerated corrosion, compared to the non-carbonated recycled aggregate concrete (NC-RAC) beam, the corrosion-induced cracking area of the C-RAC beam with a 100% carbonated recycled coarse aggregate (C-RCA) replacement ratio decreased by 40.00%, while the total length of the corrosion-induced cracks (CCs) increased by 51.82%. The probability distribution of the width of the CCs on the tension side of the C-RAC beams was a lognormal distribution. Compared with the NC-RAC beam, the mean width of the CCs of the C-RAC beam with a 100% C-RCA replacement ratio decreased by 66.67%, the crack width distribution was more concentrated, and the quartiles and median were all reduced. With an increase in the C-RCA replacement ratio, the fractal dimension and the scale coefficient of CCs on the tension side of the beams showed an approximate trend of first increasing and then decreasing. The distribution of the corrosion depth of longitudinal tensile steel bars in the C-RAC beams was mainly a normal distribution. When the C-RCA replacement ratio increased from 30% to 100%, the mean corrosion depth of the longitudinal tensile steel bars decreased by 33.46%, and the trend of changes in the quartiles and medians was basically the same as that of the mean. The research results can provide some reference for promoting the engineering application of C-RAC beams. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Seismic Analysis of Non-Regular Structures Based on Fragility Curves: A Study on Reinforced Concrete Frames †.
- Author
- Smiroldo, Giovanni, Fasan, Marco, and Bedon, Chiara
- Subjects
- GROUND motion, LOGNORMAL distribution, REINFORCED concrete, NONLINEAR analysis, CONCRETE analysis
- Abstract
The seismic performance and expected structural damage of reinforced concrete (RC) frames, as of many other structures, is a critical aspect for design. In this study, a set of RC frames characterized by increasing in-plan and in-height non-regularity is specifically investigated. Four code-conforming three-dimensional (3D) buildings with varying regularity levels are numerically analyzed. Their seismic assessment is conducted by using unscaled real ground motion records (61 in total) and employing non-linear dynamic simulations within the Cloud Analysis framework. Three distinct intensity measures (IMs) are used to evaluate the impact of structural non-regularity on their seismic performance. Furthermore, fragility curves are preliminarily derived based on conventional linear regression models and the lognormal distribution. In contrast with the initial expectations and the typical results of non-linear dynamic analyses, the presented comparative results of the fragility curves show that increasing the non-regularity level of the examined RC frames does not lead to progressively increasing fragility. Based on these considerations of the initial numerical findings, a re-evaluation of the methodology is performed using a reduced subset of ground motion records, in order to account for potential biases in their selection. Moreover, to uncover deeper insights into the unexpected outcomes, a logistic regression based on a maximum likelihood estimate is also employed to develop fragility curves. Comparative results are thus critically discussed, showing that the fragility development methods considered here may lead to seismic assessment outcomes for code-conforming non-regular buildings that are in contrast with those of raw structural analyses. In fact, the considered building code design provisions seem to compensate for non-regularity-induced torsional effects. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
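A minimal version of the regression-based (Cloud Analysis) fragility derivation reads as follows. All numbers are invented stand-ins: 61 unscaled records give one (IM, demand) pair each, a power law is fitted in log space, and the lognormal assumption turns the residual dispersion into exceedance probabilities.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)

# Cloud Analysis sketch: one (IM, EDP) pair per unscaled record
n = 61
im = rng.lognormal(-1.0, 0.6, n)                    # intensity measure, e.g. Sa(T1) in g
edp = 0.02 * im**0.9 * rng.lognormal(0, 0.35, n)    # demand, e.g. drift ratio

# Linear regression in log space: ln EDP = a + b ln IM + eps
b, a = np.polyfit(np.log(im), np.log(edp), 1)
beta = np.std(np.log(edp) - (a + b * np.log(im)), ddof=2)  # residual dispersion

def fragility(im_value, capacity):
    """P(EDP >= capacity | IM), lognormal demand given IM."""
    return norm.sf((np.log(capacity) - (a + b * np.log(im_value))) / beta)

for c, label in [(0.005, "minor"), (0.015, "moderate")]:
    print(f"{label}: P(exceed | IM = 0.5 g) = {fragility(0.5, c):.2f}")
```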
28. Pricing vulnerable reset options under stochastic volatility jump diffusion model using 3-D FFT.
- Author
- Wang, Libin and Liu, Lixia
- Subjects
- *INTEREST rates, *PRICES, *LOGNORMAL distribution, *CHARACTERISTIC functions, *JUMP processes
- Abstract
Under a comprehensive asset model that jointly satisfies the triple conditions of stochastic jumps, stochastic volatility, and stochastic interest rates, the pricing problem of reset options with default risk is investigated. First, a two-dimensional log-normal distribution and square-root processes with a mean-reverting property are applied to describe sudden jumps of assets and the time-varying characteristics of volatility and interest rates, respectively. Further, through the principle of risk-neutral pricing with the characteristic function method, we establish the joint characteristic function related to the options. Second, according to measure transforms and a payoff function decomposition, analytical pricing and hedging formulas for vulnerable reset options are given. Third, a 3-D fast Fourier transform is constructed to obtain a fast, asymptotic solution for option prices. Finally, the accuracy and stability of the 3-D fast Fourier transform are analyzed through numerical examples. The experimental results show that the proposed method can solve the complex pricing problem of vulnerable reset options more efficiently. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
29. Dynamic Data-Driven Deterioration Model for Sugarcane Shredder Hammers Oriented to Lifetime Extension.
- Author
- Rodriguez-Obando, Diego, Rosero-García, Javier, and Rosero, Esteban
- Subjects
- *SPARE parts, *JOB satisfaction, *LOGNORMAL distribution, *WASTE products as fuel, *ELECTRICAL load, *HAMMERS
- Abstract
Several sugar mills operate as waste-to-energy plants. The shredder is the initial high-energy machine in the production chain and prepares the sugarcane. Its hammers, essential spare parts, require continuous replacement, so the search for intelligent strategies to extend the lifetime of these hammers is fundamental. This paper presents (a) a dynamic data-driven model for estimating the deterioration and predicting the remaining life of sugarcane shredder hammers during operation, for which real data on the entering sugarcane flow and the power required to prepare the sugarcane are analyzed, and (b) a management architecture intended for online decision-making assistance to extend the hammers' life by making a trade-off between the desired lifetime and a nominal shredder work satisfaction criterion. The deterioration model is validated with real data, achieving an accuracy of 84.41%. The remaining life prognostic lies within a confidence zone calculated from the historical sugarcane flow, with a probability close to 99%, fitting a lognormal probability distribution. A numerical example is also provided to illustrate a closed-loop control, where the proposed architecture is used to extend the useful life of the hammers during operation, adjusting the incoming sugarcane flow while maintaining the nominal work satisfaction of the shredder. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Statistical Analysis of AC Breakdown Performance of Epoxy/Al₂O₃ Micro-Composites for High-Voltage Applications.
- Author
- Cheon, Changyeong, Seo, Dongmin, and Kim, Myungchin
- Subjects
- WEIBULL distribution, DISTRIBUTION (Probability theory), LOGNORMAL distribution, GAUSSIAN distribution, ELECTRIC insulators & insulation, PERMITTIVITY measurement
- Abstract
Thanks to the performance improvement introduced by micro-sized functional fillers, the application of epoxy composites for electrical insulation purposes has become popular. This paper investigates the dielectric properties of epoxy micro-composites filled with alumina (Al₂O₃). In particular, measurements of relative permittivity, dissipation factor, and electrical breakdown are performed, and a comprehensive statistical analysis of the dielectric properties is conducted. AC breakdown strength (AC-BDS) was analyzed for normality using four methods (Anderson–Darling, Shapiro–Wilk, Ryan–Joiner, and Kolmogorov–Smirnov). In addition, the AC-BDS was analyzed at risk probabilities of 1%, 5%, 10%, and 50% using Weibull distribution functions. Both the normal and Weibull distributions were evaluated using the Anderson–Darling (A-D) statistic and p-value. Additionally, the log-normal, gamma, and exponential distributions of AC-BDS were examined by the A-D goodness-of-fit test. The AC-BDS data were fit by normal and Weibull distributions, with compliance evaluated by the p-value and each method's statistic, and by log-normal and gamma distributions, with goodness-of-fit evaluated by the p-value and A-D testing. The exponential distribution, on the other hand, was not suitable according to the p-value and A-D testing. The results showed that the distribution of AC-BDS was best described by the log-normal distribution. Meanwhile, statistical analysis verified the apparent effect of temperature on the dielectric properties using a paired t-test. The analysis results of this paper not only contribute to a better characterization of epoxy/Al₂O₃ micro-composites but also introduce a comprehensive approach to statistical analysis for electrical insulation materials. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
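Parts of the statistical battery described above map directly onto scipy. A sketch on synthetic breakdown strengths follows (values and sample size are invented; Ryan–Joiner has no scipy equivalent and is omitted):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Synthetic AC breakdown strengths (kV/mm), lognormal by construction
bds = rng.lognormal(np.log(35), 0.12, 60)

# Normality checks named in the abstract (two of the four are in scipy)
print("Shapiro-Wilk p-value (raw):      ", stats.shapiro(bds).pvalue)
print("Anderson-Darling stat (normal):  ", stats.anderson(bds, dist='norm').statistic)

# A-D test for lognormality == A-D normality test on the log-transformed data
print("Anderson-Darling stat (lognorm): ",
      stats.anderson(np.log(bds), dist='norm').statistic)

# Weibull risk probabilities: breakdown strength at 1/5/10/50 % failure risk
c, loc, scale = stats.weibull_min.fit(bds, floc=0)
for q in (0.01, 0.05, 0.10, 0.50):
    print(f"Weibull {q:4.0%} risk: {stats.weibull_min.ppf(q, c, loc, scale):.1f} kV/mm")
```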
31. Robust estimation of the incubation period and the time of exposure using γ-divergence.
- Author
- Yoneoka, Daisuke, Kawashima, Takayuki, Tanoue, Yuta, Nomura, Shuhei, and Eguchi, Akifumi
- Subjects
- *TIME management, *COMMUNICABLE diseases, *SYSTEMS design, *DATA analysis, *PUBLIC health, *LOGNORMAL distribution, *CONFIDENCE intervals
- Abstract
Estimating the exposure time to single infectious pathogens and the associated incubation period, based on symptom onset data, is crucial for identifying infection sources and implementing public health interventions. However, data from rapid surveillance systems designed for early outbreak warning often come with outliers originating from individuals who were not directly exposed to the initial source of infection (i.e. tertiary and subsequent infection cases), making the estimation of exposure time challenging. To address this issue, this study uses a three-parameter lognormal distribution and proposes a new γ-divergence-based robust approach for estimating the parameter corresponding to exposure time, with a tailored optimization procedure using the majorization-minimization algorithm, which ensures the monotonic decreasing property of the objective function. Comprehensive numerical experiments and real data analyses suggest that our method is superior to conventional methods in terms of bias, mean squared error, and coverage probability of 95% confidence intervals. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
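The three-parameter lognormal at the heart of this method treats the exposure time as the shift (threshold) parameter. The sketch below fits it by plain maximum likelihood on synthetic onset times; the authors' γ-divergence-based robust estimator and its majorization-minimization updates are beyond this illustration.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(10)

# Synthetic onset times: exposure at t0 = 5.0 days plus a lognormal incubation
t0_true = 5.0
onsets = t0_true + rng.lognormal(1.6, 0.4, 120)

def nll(params):
    """Negative log-likelihood of a shifted (three-parameter) lognormal."""
    t0, mu, log_sigma = params
    if t0 >= onsets.min():            # the shift must lie below all observations
        return np.inf
    return -np.sum(stats.lognorm.logpdf(onsets, s=np.exp(log_sigma),
                                        loc=t0, scale=np.exp(mu)))

res = optimize.minimize(nll, x0=[onsets.min() - 1.0, 1.0, np.log(0.5)],
                        method="Nelder-Mead")
print(f"estimated exposure time t0 = {res.x[0]:.2f} (true {t0_true})")
```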
32. Fragility of Highway Embankments Exposed to Permanent Deformations Due to Underlying Fault Rupture.
- Author
- Petala, Eleni and Klimis, Nikolaos
- Subjects
- *DISTRIBUTION (Probability theory), *LOGNORMAL distribution, *SANDY soils, *EMBANKMENTS, *PARAMETRIC modeling
- Abstract
Seismic risk expresses the expected degree of damage and loss following a catastrophic event. An efficient tool for assessing the seismic risk of embankments is fragility curves. This research investigates the influence of the embankment's geometry, the depth at which rupture occurs, and the conditions of the underlying sandy soil on the embankment's fragility. To achieve this, the response of three highway embankments resting on sandy soil was examined through quasi-static parametric numerical analyses. For the establishment of fragility curves, a cumulative lognormal probability distribution function was used. The maximum vertical displacement of the embankment's external surface and the fault displacement were considered as the damage indicator and the intensity measure, respectively. Damage levels were categorized into three qualitative thresholds: minor, moderate, and extensive. All fragility curves were generated for normal and reverse faults, as well as the combination of those fault types (dip-slip faults). Finally, the proposed curves were verified via comparison with those provided by HAZUS. It was concluded that embankment geometry and the depth of fault rupture do not significantly affect fragility, as exceedance probabilities show minimal differences (<4%). However, an embankment founded on dense sandy soil reveals slightly higher fragility compared to one founded on loose sand. Differences in the probability of exceedance of a certain damage level are limited to a maximum of 7%. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
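The fragility model named in the abstract, a cumulative lognormal in the intensity measure, has the standard form below, where the fault displacement d is the intensity measure, θ_ds is the median displacement causing damage state ds, and β_ds is the lognormal dispersion; the symbols are generic, not the paper's notation:

```latex
P\left[\,\mathrm{DS} \ge ds \mid d\,\right]
  = \Phi\!\left(\frac{\ln d - \ln \theta_{ds}}{\beta_{ds}}\right)
```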
33. Fluid flow in three-dimensional porous systems shows power law scaling with Minkowski functionals.
- Author
- Haque, R. A. I., Mitra, A. J., and Dutta, T.
- Subjects
- *THREE-dimensional flow, *FLUID flow, *PARTICLE size distribution, *EULER characteristic, *LOGNORMAL distribution
- Abstract
Integral geometry uses four geometric invariants—the Minkowski functionals—to characterize certain subsets of three-dimensional (3D) space. The question was, how is the fluid flow in a 3D porous system related to these invariants? In this work, we systematically study the dependency of permeability on the geometrical characteristics of two categories of generated 3D porous systems: (i) stochastic and (ii) deterministic. For the stochastic systems, we investigated both normal and lognormal size distributions of grains. For the deterministic porous systems, we checked a cubic arrangement and a hexagonal arrangement of grains of equal size. Our studies reveal that for any three-dimensional porous system, ordered or disordered, permeability k follows a unique scaling relation with the Minkowski functionals: (a) volume of the pore space, (b) integral mean curvature, (c) Euler characteristic, and (d) critical cross-sectional area of the pore space. The cubic and the hexagonal symmetrical systems formed the upper and lower bounds of the scaling relations, respectively. The disordered systems lay between these bounds. Moreover, we propose a combinatoric F that weaves together the four Minkowski functionals and follows a power-law scaling with permeability. The scaling exponent is independent of particle size and distribution and has a universal value of 0.428 for 3D porous systems built of spherical grains. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
34. Statistics of drops generated from ensembles of randomly corrugated ligaments.
- Author
- Pal, Sagar, Pairetti, César, Crialesi-Esposito, Marco, Fuster, Daniel, and Zaleski, Stéphane
- Subjects
- *DROP size distribution, *LOGNORMAL distribution, *GAMMA distributions, *DISTRIBUTION (Probability theory), *COMPLEX fluids
- Abstract
The size of drops generated by the capillary-driven disintegration of liquid ligaments plays a fundamental role in several important natural phenomena, ranging from heat and mass transfer at the ocean-atmosphere interface to pathogen transmission. The inherent nonlinearity of the equations governing ligament destabilization leads to significant differences in the resulting drop sizes, owing to small fluctuations in the myriad initial conditions. Previous experiments and simulations reveal a variety of drop size distributions, corresponding to competing underlying physical interpretations. Here, we perform numerical simulations of individual ligaments, the deterministic breakup of which is triggered by random initial surface corrugations. The simulations are grouped in a large ensemble, each corresponding to a random initial configuration. The resulting probability distributions reveal three stable drop sizes, generated via a sequence of two distinct stages of breakup. Four different distributions are tested: volume-based Poisson, Gaussian, Gamma, and Log-Normal. Depending on the time, the range of droplet sizes, and the criteria for success, each distribution has successes and failures. However, the Log-Normal distribution roughly describes the data when fitting both the primary peak and the tail of the distribution, where the number of droplets generated is the highest, while the Gamma and Log-Normal distributions perform equally well when fitting the tail alone. The study demonstrates a precisely controllable and reproducible framework, which can be employed to investigate the mechanisms responsible for the polydispersity of drop sizes found in complex fluid fragmentation scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
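Entry 34 ranks volume-based Poisson, Gaussian, Gamma, and Log-Normal candidates against ensemble drop-size data. A hedged sketch of that kind of model comparison, fitting three of the candidates to a synthetic drop-diameter sample and ranking them by AIC (the sample is generated here, not taken from the simulations):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Stand-in for measured drop diameters (illustrative lognormal sample).
d = rng.lognormal(mean=0.0, sigma=0.4, size=500)

fits = {
    "gaussian":  (stats.norm,    stats.norm.fit(d)),
    "gamma":     (stats.gamma,   stats.gamma.fit(d, floc=0)),
    "lognormal": (stats.lognorm, stats.lognorm.fit(d, floc=0)),
}
for name, (dist, params) in fits.items():
    loglik = np.sum(dist.logpdf(d, *params))
    aic = 2 * 2 - 2 * loglik   # each model has 2 free parameters here
    print(f"{name:10s} AIC = {aic:8.1f}")
```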
35. From Gaussian to lognormal: improving material property modeling for precise structural predictions.
- Author
-
Kumar, Rakesh
- Subjects
- *
MONTE Carlo method , *ARTIFICIAL neural networks , *LOGNORMAL distribution , *ENGINEERING design , *GAUSSIAN processes - Abstract
Accurate prediction of material properties is essential in structural engineering design to ensure reliability and safety. Traditional approaches often rely on Gaussian distributions to model material variability. However, our research reveals limitations with Gaussian assumptions, particularly when covariance parameters exceed certain thresholds, leading to physically unrealistic negative values for material properties. To overcome these limitations, we investigate an alternative approach using lognormal distributions for material property modeling. Through Monte Carlo simulations employing Cholesky decomposition, we compare the performance of lognormal distributions with Gaussian counterparts. Our findings demonstrate that lognormal distributions offer a viable alternative, providing more accurate representations of material variability while maintaining computational efficiency. Furthermore, we utilize finite element method (FEM) data from Monte Carlo simulations to predict beam deflection using deep neural networks (DNNs). Leveraging inverse modeling techniques, we showcase the ability to predict elastic modulus from beam deflection data under both normal and lognormal distribution assumptions. By integrating lognormal modeling and inverse modeling techniques into structural analysis, we enhance the realism and accuracy of predictions, thereby improving reliability in engineering design. This paper discusses the implications of our findings and emphasizes the importance of considering alternative probability distributions for material property modeling in structural engineering applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
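The Monte Carlo step in entry 35, sampling correlated lognormal material properties via Cholesky decomposition, reduces to three moves: factor the covariance of an underlying Gaussian field, draw correlated normals, exponentiate. A minimal sketch assuming an exponential covariance kernel and illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50                       # nodes along the beam
x = np.linspace(0.0, 1.0, n)

# Covariance of the underlying Gaussian field (assumed exponential kernel).
sigma_ln, corr_len = 0.1, 0.2
C = sigma_ln**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

L = np.linalg.cholesky(C + 1e-12 * np.eye(n))   # jitter for stability
mu_E = np.log(210e9)                            # median elastic modulus (Pa)

z = L @ rng.standard_normal(n)
E = np.exp(mu_E + z)         # lognormal field: strictly positive by design
print(E.min(), E.max())      # no unphysical negative moduli, unlike Gaussian
```

Exponentiating guarantees positivity, which is exactly the failure mode of the Gaussian model that the abstract highlights.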
36. Optimal experimental planning for constant-stress accelerated life-testing experiments based on coherent systems.
- Author
-
Yu, Yang and Ng, Hon Keung Tony
- Subjects
- *
DISTRIBUTION (Probability theory) , *MONTE Carlo method , *MAXIMUM likelihood statistics , *FISHER information , *LOGNORMAL distribution , *ACCELERATED life testing - Abstract
In this paper, we discuss the optimal experimental planning for multi-level stress testing with Type-II censoring when the test units can be put into coherent systems for the experiment. Based on the notion of system signatures of coherent systems and assuming the lifetimes of the test units follow a distribution in a general log-location-scale family of distributions, the maximum likelihood estimators of the model parameters and the Fisher information matrix are derived. For optimal experimental planning, in addition to some commonly used optimality criteria, such as D-optimality, A-optimality, and V-optimality, we also consider the total time of the experiment and the total time on test. Then, motivated by a real-life application in the reliability study of furniture joints, we focus on using series systems in multi-level stress experiments. The methodology is illustrated by considering lognormal and Weibull distributed test units. Numerical and Monte Carlo simulation studies are used to demonstrate the advantages and disadvantages of using series systems in life-testing experiments. A numerical example based on furniture joints with sensitivity analysis is used to elucidate how the proposed methods can be used in planning a life-testing experiment. Finally, some concluding remarks with some future research directions are provided. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
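Entry 36 compares test plans through D-, A-, and V-optimality, all scalar functionals of the Fisher information matrix. The criteria themselves are one-liners once an information matrix is in hand; the 2×2 matrix below is a made-up stand-in for the lognormal or Weibull information derived in the paper, and V-optimality is represented here as the variance of a linear function of the estimates:

```python
import numpy as np

# Stand-in Fisher information matrix for one candidate plan (illustrative).
I = np.array([[40.0,  5.0],
              [ 5.0, 12.0]])
V = np.linalg.inv(I)                 # asymptotic covariance of the MLEs

d_opt = np.linalg.det(V)             # D-optimality: minimize generalized variance
a_opt = np.trace(V)                  # A-optimality: minimize total variance
c = np.array([1.0, 0.5])             # coefficients of a quantity of interest
v_opt = c @ V @ c                    # variance of c' theta_hat (V-style criterion)
print(d_opt, a_opt, v_opt)
```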
37. Statistical Modeling of Within-Laboratory Precision Using a Hierarchical Bayesian Approach.
- Author
-
Miyake, Daisuke, Kanaya, Shigehiko, and Ono, Naoaki
- Subjects
- *
MONTE Carlo method , *CHI-square distribution , *STATISTICS , *STATISTICAL models , *LOGNORMAL distribution - Abstract
Background: Reproducibility has been well studied in the field of food analysis; the RSD is said to follow a Horwitz curve with certain exceptions. However, little systematic research has been done on predicting repeatability or intermediate precision. Objective: We developed a regression method to estimate within-laboratory SDs using hierarchical Bayesian modeling and analyzing duplicate measurement data obtained from actual laboratory tests. Methods: The Hamiltonian Monte Carlo method was employed and implemented using R with Stan. The basic structure of the statistical model was assumed to be a Chi-squared distribution, the fixed effect of the predictor was assumed to be a nonlinear function with a constant term and a concentration-dependent term, and the random effects were assumed to follow a lognormal distribution as a hierarchical prior. Results: By analyzing over 300 instances, we obtained regression results that fit well with the assumed model, except for moisture, which was a method-defined analyte. The developed method applies to a wide variety of analytes measured using general principles, including spectroscopy, GC, and HPLC. Although the estimated precisions were within the Horwitz ratio criteria for repeatability, some cases using high-sensitivity detectors, such as mass spectrometers, showed SDs below that range. Conclusion: We propose utilizing the within-laboratory precision predicted by the model established in this study for internal QC and measurement uncertainty estimation without considering sample matrices. Highlights: Performing statistical modeling on data from duplicate analysis, which is conducted as a part of internal QC, will simplify the estimation of the precision that fits each analytical system in a laboratory. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
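The raw material in entry 37 is duplicate measurements, for which each pair's squared difference divided by two is a one-degree-of-freedom estimate of the within-laboratory variance, matching the model's chi-squared structure. A sketch of that non-Bayesian core with invented duplicate results:

```python
import numpy as np

# Invented duplicate results (mg/kg) from routine internal QC.
pairs = np.array([
    [10.2, 10.5],
    [ 9.8, 10.1],
    [10.4, 10.3],
    [ 9.9, 10.6],
])

d = pairs[:, 0] - pairs[:, 1]
# Each squared difference / 2 is a 1-d.f. estimate of the within-lab
# variance; pooled over k pairs it follows sigma^2 * chi2_k / k.
s_within = np.sqrt(np.mean(d**2 / 2.0))
rsd = 100 * s_within / pairs.mean()
print(f"within-lab SD = {s_within:.3f}, RSD = {rsd:.2f}%")
```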
38. Gait rhythm analysis as a new continuous scale for cerebellar ataxia: Power law and lognormal components represent the ataxic gait quantity.
- Author
-
Goto, Ryoji, Oba, Koichiro, Bando, Kyota, Todoroki, Kyoko, Yoshida, Junichiro, Nishida, Daisuke, Mizuno, Katsuhiro, Mizusawa, Hidehiro, and Takahashi, Yuji
- Subjects
- *
CEREBELLAR ataxia , *FRIEDREICH'S ataxia , *LOGNORMAL distribution , *GAIT in humans , *RHYTHM - Abstract
We estimated the severity of cerebellar ataxia by analyzing gait rhythm. We measured the step times in patients with pure cerebellar ataxia and healthy controls and then analyzed the distribution of the ratios of adjacent times. Gait rhythm was best fitted when expressed as the sum of power law and lognormal distributions in both groups, and the groups could be distinguished by the exponent of the power law distribution, reflecting the fractal property of gait rhythm. Gait rhythm might reflect different features of impairment in patients with cerebellar ataxia, making it a useful continuous scale for cerebellar ataxia. • We identified a novel pattern of gait rhythm distribution caused by cerebellar ataxia. • A linear combination of power law and lognormal components optimized the fitting of the cumulative gait rhythm distribution. • The power law was indicative of the fractal property of gait rhythm. • The two components correlated with the SARA kinetic and gait/posture subscores, respectively. • Gait rhythm analysis could provide a useful continuous scale for cerebellar ataxia. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
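Entry 38 separates patients from controls by the exponent of the power-law component of the gait-rhythm distribution. The authors fit a combined power-law plus lognormal model; as a simpler stand-in, the snippet below applies the standard maximum-likelihood estimator for a continuous power-law tail above a cutoff, which is not the paper's exact procedure:

```python
import numpy as np

def powerlaw_alpha(x, xmin):
    """MLE of alpha for p(x) ~ x^(-alpha), x >= xmin (continuous case)."""
    tail = x[x >= xmin]
    return 1.0 + len(tail) / np.sum(np.log(tail / xmin))

rng = np.random.default_rng(3)
# Synthetic step-time ratios with a power-law tail of exponent alpha = 3.
alpha_true = 3.0
x = rng.pareto(alpha_true - 1.0, size=2000) + 1.0
print(f"alpha_hat = {powerlaw_alpha(x, xmin=1.0):.2f} (true {alpha_true})")
```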
39. An economic-statistical design of synthetic Tukey's control chart with Taguchi's asymmetric loss functions under log-normal distribution.
- Author
-
Lee, Pei-Hsi and Chou, Chao-Yu
- Subjects
- *
DISTRIBUTION (Probability theory) , *ECONOMIC lot size , *LOGNORMAL distribution , *INTEGRATED circuit design , *SENSITIVITY analysis , *QUALITY control charts - Abstract
The synthetic Tukey's control chart (denoted by Syn-TCC) applies a single observation in process monitoring and is relatively robust to outliers. For a complex process, the probability distribution of the quality characteristic must be carefully identified, and the loss of the quality characteristic, defined as the deviation from the target value, must be accurately estimated so that the quality state of the process may be described more comprehensively. However, the literature has rarely studied the economic design of the Syn-TCC with Taguchi's loss function under non-normality. In the present study, it is assumed that the quality characteristic follows a log-normal distribution, and Taguchi's asymmetric linear and quadratic loss functions are used to develop the economic-statistical design of the Syn-TCC for multiple assignable causes. A case study is presented to illustrate the application of this economic-statistical design of the Syn-TCC in the IC package industry. The sensitivity analysis of this case study reveals that a larger production lot size and larger coefficients of the two loss functions generally lead to a higher total expected cost. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
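The cost model in entry 39 needs the expected Taguchi loss of a lognormally distributed quality characteristic. With an asymmetric quadratic loss (penalty k1 below target, k2 above), the expectation is a one-dimensional numerical integral; all parameter values below are illustrative:

```python
import numpy as np
from scipy.stats import lognorm
from scipy.integrate import quad

mu, sigma, target = 1.0, 0.25, np.exp(1.0)   # illustrative values
k1, k2 = 4.0, 1.0                            # asymmetric penalty coefficients
X = lognorm(s=sigma, scale=np.exp(mu))       # quality characteristic

def loss(x):
    k = k1 if x < target else k2             # Taguchi asymmetric quadratic loss
    return k * (x - target) ** 2

expected_loss, _ = quad(lambda x: loss(x) * X.pdf(x), 0, np.inf)
print(f"E[L(X)] = {expected_loss:.4f}")
```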
40. Dynamic Optimization of Exclusive Bus Lane Location Considering Reliability: A Case Study of Beijing.
- Author
-
Kou, Weibin, Zhang, Shijie, Liu, Fei, and Pang, Lan
- Subjects
TRAVEL time (Traffic engineering) ,LOGNORMAL distribution ,GAUSSIAN distribution ,PUBLIC transit ,CITIES & towns - Abstract
For metropolises like Beijing, heavy congestion causes unreliable travel times for transit passengers, including in-vehicle time and waiting time. Compared with other managerial measures, designating a lane for bus use only is an effective method to improve travel reliability, as it eliminates interference with bus-driving conditions. This paper proposes a reliable and practical method to determine exclusive bus lane (EBL) locations. A reliability-based optimization model is established, in which the tradeoff among bus and private car passengers' travel time, reliability, and EBL construction cost is considered. Based on the actual network, a user equilibrium demand assignment model is applied to estimate the dynamic bus flow distribution. Since the model is nonlinear, a two-step method is proposed in which tangent lines are introduced to constitute an envelope curve that linearizes the model. This work conducts statistical modeling and fitting analysis with actual bus trajectory data collected on EBLs in Beijing during peak hours. Passenger travel time distributions are fitted to estimate the statistical passenger travel time; the lognormal and Gaussian distributions fit best. The optimization results indicate that passenger travel time reliability can be improved by 5.5% by the optimized EBL location scheme. This study provides a theoretical basis and methodological support for improving the service level of public transportation systems in large cities through the scientific planning of exclusive bus lanes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
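The statistical step in entry 40, fitting passenger travel times and finding lognormal and Gaussian the best candidates, can be sketched as follows on invented peak-hour travel times, comparing the two fits by their Kolmogorov-Smirnov statistics:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Stand-in for bus passenger travel times during peak hours (minutes).
t = rng.lognormal(mean=3.4, sigma=0.3, size=800)

ln_params = stats.lognorm.fit(t, floc=0)
n_params = stats.norm.fit(t)

# One-sample KS statistic: smaller = better agreement with the fit.
print("lognormal KS:", stats.kstest(t, "lognorm", args=ln_params).statistic)
print("gaussian  KS:", stats.kstest(t, "norm", args=n_params).statistic)
```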
41. An Analytical Study for Explosive Grain Initiation.
- Author
-
Gao, Feng and Fan, Zhongyun
- Subjects
HETEROGENOUS nucleation ,LOGNORMAL distribution ,GRAIN refinement ,GRAIN size ,SOLIDIFICATION - Abstract
The most common form of solidification of metals is heterogeneous nucleation, in which particles, whether endogenous or exogenous, nucleate the primary crystal phase, becoming solid crystal particles and subsequently initiating grains during solidification. Explosive grain initiation has recently been proposed for particles with significant nucleation undercooling: once nucleation happens, a certain number of solid particles can initiate grains simultaneously, resulting in recalescence. This is a different form of grain initiation and has high potential for more significant grain refinement in casting alloys. In this work, an analytical model is developed to describe explosive grain initiation, from which the criteria for the three different grain initiation forms, explosive grain initiation (EGI), hybrid grain initiation (HGI), and progressive grain initiation (PGI), are derived. These criteria are employed to develop a grain initiation map for the Mg-Al alloy system inoculated with nucleant particles having a log-normal size distribution. This work can not only help us understand the effect of each condition, such as the cooling rate and the solute concentration, on grain initiation behavior, but also predict the grain size for alloy systems with relatively impotent nucleant particles during solidification. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
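Entry 41's nucleant particles carry a log-normal size distribution, and in this literature grain initiation is commonly tied to particle diameter through a free-growth undercooling of the form ΔT_fg = 4γ/(ΔS_v d) (a Greer-type criterion; its use here and every numerical value are assumptions for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(5)

# Log-normal nucleant particle diameters (m); parameters are illustrative.
d = rng.lognormal(mean=np.log(1e-6), sigma=0.5, size=100_000)

gamma_sl = 0.09      # solid-liquid interfacial energy, J/m^2 (assumed)
dS_v = 1.0e6         # entropy of fusion per unit volume, J/(K m^3) (assumed)

# Free-growth criterion: a particle of diameter d can initiate a grain
# once the melt undercooling exceeds dT_fg = 4*gamma/(dS_v * d).
dT_fg = 4.0 * gamma_sl / (dS_v * d)

dT_melt = 0.5        # instantaneous melt undercooling, K (assumed)
frac_active = np.mean(dT_fg <= dT_melt)
print(f"fraction of particles able to initiate grains: {frac_active:.3f}")
```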
42. Smoke and NOx emission characteristics of in-use construction machinery based on substantial field measurements: A case study in Beijing, China.
- Author
-
Yi, Huawei, Cui, Yangyang, Zhu, Lijun, Shen, Yan, Li, Han, Huang, Guanghan, Qu, Linzhen, Guo, Dongdong, Nie, Lei, and Xue, Yifeng
- Subjects
- *
LOGNORMAL distribution , *CONSTRUCTION equipment , *EMISSION standards , *BUILDING design & construction , *BUILDING sites - Abstract
• Smoke and NOx exhaust emissions were measured from a total of 905 construction machines of 4 different types. • Substantial field measurements were carried out on construction sites in Beijing, giving an indication of real-world emissions. • Measured smoke results indicate that the implementation of strict regulatory policies contributes to reducing emissions. • Measured NOx emissions indicate that a stricter emission standard stage alone cannot control the NOx level. To understand the smoke levels and NOx emission characteristics of in-use construction machinery in Beijing, we selected 905 construction machines in Beijing from August 2022 to April 2023 and monitored their smoke and NOx emission levels. The exhaust smoke levels and instances of excessive emission for different machinery types were identified, and NOx emission levels were monitored according to the free acceleration method. We investigated the correlation between NOx and smoke emissions and propose suggestions for controlling pollution from construction machinery in the future. The results show that the exhaust smoke level was 0–2.62 m⁻¹ and followed a log-normal distribution (μ = -1.73, δ = 1.09, R² = 0.99), with a 5.64% exceedance rate. Differences were observed among machinery types, with low-power engine forklifts showing higher smoke levels. The NOx emission range was 71–1516 ppm and followed a normal distribution (μ = 565.54, δ = 309.51, R² = 0.83). Differences among machinery types were relatively small. Engine rated net power had the most significant impact on NOx emissions. Thus, NOx emissions from construction machinery need further attention. Furthermore, we found a weak negative correlation (p < 0.05) between the emission levels of smoke and NOx; that is, the synergistic emission reduction effect is poor, emphasizing the need for NOx emission limits. In the future, oversight in Beijing should prioritize phasing out China Ⅰ and China Ⅱ machinery and monitor emissions from high-power engine China Ⅲ machinery. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
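Entry 42 reports smoke opacity following a log-normal distribution with μ = -1.73 and δ = 1.09 and a 5.64% exceedance rate. Taking those fitted parameters at face value, the opacity limit implied by the exceedance rate, and the exceedance probability for any assumed limit, are single distribution-function calls:

```python
import numpy as np
from scipy.stats import lognorm

mu, sigma = -1.73, 1.09                     # parameters reported in the abstract
smoke = lognorm(s=sigma, scale=np.exp(mu))  # opacity, m^-1

# Opacity limit implied by the reported 5.64% exceedance rate:
implied_limit = smoke.ppf(1.0 - 0.0564)
print(f"implied limit = {implied_limit:.2f} m^-1")

# Exceedance probability for an assumed regulatory limit (illustrative value):
limit = 1.2
print(f"P(opacity > {limit}) = {smoke.sf(limit):.4f}")
```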
43. Triaxial Compression Behavior and Damage Model of EICP-Cemented Calcareous Sand.
- Author
-
Jiang, Xueliang, Wang, Haodong, Yang, Hui, Wei, Zhenzhen, Bao, Shufeng, Fan, Wenchen, and Wang, Yixian
- Subjects
DAMAGE models ,LOGNORMAL distribution ,CALCIUM carbonate ,GROUTING ,REEFS - Abstract
The enzyme-induced calcium carbonate precipitation (EICP) technique was utilized to cement calcareous sand. The mechanical properties of EICP-cemented calcareous sand at various cementation degrees were investigated using consolidated drained triaxial compression tests. A statistical damage constitutive model tailored for EICP-cemented calcareous sand was also developed based on damage mechanics theory. The findings are as follows: (1) The EICP technique significantly enhances the cementation of calcareous sand. As the number of grouting operations increases, the peak deviator stress of the cemented material gradually increases, with the maximum enhancement approaching 2.5 times. Moreover, during the stress decay phase following the peak stress, the decay rate of the cemented sand accelerates, displaying a more pronounced brittle characteristic. (2) With increased calcium carbonate content, the peak deviator stress of the cemented body increases significantly, with an obvious nonlinear exponential correlation between the two. (3) The statistical damage constitutive model, formulated based on Lemaitre's strain equivalence principle combined with a log-normal distribution and the Drucker–Prager strength criterion, accurately predicts the stress–strain curves, effectively simulating the complete stress–strain evolution of EICP-cemented sand under different numbers of grouting operations and varied confining pressure conditions. (4) At higher cementation levels or lower confining pressures, the internal damage process of the EICP-cemented calcareous sand specimens intensifies, indicated by the rapid increase of the damage variable D with axial strain. The research findings provide a theoretical foundation for the application of EICP technology in the treatment of island reef or roadbed foundations, aiding in the analysis and prediction of the mechanical properties of EICP-cemented calcareous sands. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
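Entry 43 builds its damage variable from Lemaitre's strain-equivalence principle with a log-normal strength distribution. A common textbook form of that model class takes σ = Eε(1 - D) with D equal to a log-normal CDF of strain; the sketch below uses that simplified form with invented parameters, not the paper's calibrated Drucker-Prager version:

```python
import numpy as np
from scipy.stats import lognorm

E = 250.0                        # initial stiffness, MPa (illustrative)
mu, sigma = np.log(0.01), 0.4    # lognormal micro-strength parameters (assumed)

eps = np.linspace(1e-5, 0.04, 400)                 # axial strain
D = lognorm.cdf(eps, s=sigma, scale=np.exp(mu))    # damage variable in [0, 1]
stress = E * eps * (1.0 - D)                       # Lemaitre strain equivalence

i_peak = int(np.argmax(stress))
print(f"peak stress {stress[i_peak]:.1f} MPa at strain {eps[i_peak]:.4f}")
```

The model rises elastically while D is near zero, then softens as the log-normal CDF saturates, mirroring the post-peak decay described in the abstract.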
44. Accelerated Life Testing of Biodegradable Starch Films with Nanoclay Using the Elongation Level as a Stressor.
- Author
-
Frangopoulos, Theofilos, Ketesidis, Apostolos, Marinopoulou, Anna, Goulas, Athanasios, Petridis, Dimitrios, and Karageorgiou, Vassilis
- Subjects
ACCELERATED life testing ,PACKAGING film ,FOOD packaging ,DATA distribution ,UNITS of time ,LOGNORMAL distribution - Abstract
An attempt was made to evaluate the elongation level as a stressor on biodegradable starch films reinforced with nanoclay using a simple linear model. A total of 120 film units were subjected to increasing elongation levels, and the exact break time of the failed units was monitored. Nine candidate distributions were fitted to the data, and the lognormal distribution was chosen as the most suitable because it resulted in the lowest values of the fit indices −2LL, AICc, and BIC. Following the selection of the best fit, it was generally observed that an increase in the elongation level resulted in a decreasing break time of the films. Among several models, the best fit was provided by the simple linear model. Based on this model, the acceleration factor was estimated and shown to increase exponentially with the elongation level. Finally, the probability of failure and the hazard rate of the film units as a function of the elongation level were estimated, demonstrating the applicability of this method as a tool for food packaging film failure prediction. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
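Entry 44 pairs a lognormal life distribution with a simple linear life-stress model, under which the acceleration factor between two elongation levels is the ratio of fitted median break times. A sketch on fabricated break-time samples at two stress levels:

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(6)

# Fabricated break times (hours) at a low and a high elongation level.
t_low = rng.lognormal(mean=np.log(300.0), sigma=0.5, size=30)
t_high = rng.lognormal(mean=np.log(60.0), sigma=0.5, size=30)

def median_life(t):
    s, loc, scale = lognorm.fit(t, floc=0)
    return scale             # with floc=0, scale = exp(mu) = median life

af = median_life(t_low) / median_life(t_high)
print(f"acceleration factor (low -> high stress): {af:.2f}")
```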
45. Identifying risk factors for recurrent multidrug resistant tuberculosis based on patient's record data from 2016 to 2021: retrospective study.
- Author
-
Wotale, Teramaj Wongel, Lelisho, Mesfin Esayas, Negasa, Bikiltu Wakuma, Tareke, Seid Ali, Gobena, Woldemariam Erkalo, and Amesa, Ebsa Gelan
- Subjects
- *
MULTIDRUG-resistant tuberculosis , *PROGNOSIS , *PUBLIC health , *LUNG diseases , *LOGNORMAL distribution - Abstract
Globally, the prevalence of multidrug-resistant tuberculosis (MDR-TB) has been increasing recently. This is a major public health concern, as MDR-TB is more difficult to treat and has poorer outcomes than drug-sensitive tuberculosis. The main objective of the study was to identify risk factors for recurrent multidrug-resistant tuberculosis at Alert Specialized Hospital, Addis Ababa, by using different parametric shared frailty models. From January 2016 to December 2021, a retrospective study was conducted on MDR-TB patients at Alert Specialized Hospital in Addis Ababa. The data for the study were collected from the medical records of MDR-TB patients at the hospital during this period. Gamma and inverse-Gaussian shared frailty models were used to analyze the dataset, with the exponential, Weibull, and lognormal distributions included as baseline hazard functions. The data were analyzed using R statistical software. The median recurrence time of the patients was 12 months, and 149 (34.3%) had recurrences. The clustering effect was statistically significant for the recurrence of multidrug-resistant tuberculosis. According to the Weibull-Inverse-Gaussian model, factors that shortened the time to MDR-TB recurrence included lower weight (ɸ = 0.944), smoking (ɸ = 0.045), alcohol use (ɸ = 0.631), hemoptysis (ɸ = 0.041), pneumonia (ɸ = 0.564), previous anti-TB treatment (ɸ = 0.106), rural residence (ɸ = 0.163), and chronic diseases such as diabetes (ɸ = 0.442), all of which were associated with faster recurrence, while higher education (ɸ = 3.525) and older age (ɸ = 1.021) extended the time to recurrence. Weight, smoking and alcohol use, the clinical complications hemoptysis and pneumonia, a history of previous anti-TB treatment in patients with pulmonary disease, and rural residence are therefore prognostic factors. There was a significant clustering effect at the Alert Specialized Hospital in Addis Ababa, Ethiopia. The Weibull-Inverse-Gaussian shared frailty model was chosen as the best model for predicting the time to recurrence of MDR-TB. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
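The clustering effect entry 45 reports is exactly what a shared frailty term encodes: all recurrence times in a cluster share one multiplicative random effect on the hazard. A minimal simulation of gamma shared frailty with an exponential baseline hazard (all values illustrative) shows how the shared draw induces within-cluster correlation:

```python
import numpy as np

rng = np.random.default_rng(7)
n_clusters, per_cluster = 200, 2
theta = 0.8                     # frailty variance (illustrative)
lam0 = 1.0 / 12.0               # baseline hazard, events per month (assumed)

# Shared gamma frailty: mean 1, variance theta, one draw per cluster.
z = rng.gamma(shape=1.0 / theta, scale=theta, size=n_clusters)

# Exponential event times with cluster-specific hazard lam0 * z.
t = rng.exponential(1.0 / (lam0 * z[:, None]), size=(n_clusters, per_cluster))

# Within-cluster correlation appears only because z is shared.
print(f"corr between cluster-mates: {np.corrcoef(t[:, 0], t[:, 1])[0, 1]:.2f}")
```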
46. Prognostic factors for premature cardiovascular disease mortality in Malaysia: a modelling approach using semi-parametric and parametric survival analysis with national health and morbidity survey linked mortality data.
- Author
-
Hasani, Wan Shakira Rodzlan, Musa, Kamarul Imran, Omar, Mohd Azahadi, Hanis, Tengku Muhammad, Kueh, Yee Cheng, Ganapathy, Shubash Shander, Yusoff, Muhammad Fadhli Mohd, and Ahmad, Noor Ani
- Subjects
- *
PROPORTIONAL hazards models , *EARLY death , *SURVIVAL analysis (Biometry) , *LOGNORMAL distribution ,CARDIOVASCULAR disease related mortality - Abstract
Background: Cardiovascular disease (CVD) is the leading cause of premature mortality worldwide. Despite existing research on CVD risk factors, the study of premature CVD mortality in Malaysia remains limited. This study employs survival analysis to model modifiable risk factors associated with premature CVD mortality among Malaysian adults. Method: We utilised data from Malaysia's National Health and Morbidity Survey (NHMS) conducted in 2006, 2011, and 2015, linked with mortality records. The cohort comprised individuals aged 18 to 70 during the NHMS interview. Follow-up extended to 2021, focusing on CVD-related premature mortality between ages 30 and 70. We employed six survival models: a semi-parametric Cox proportional hazard (PH) and five parametric survival models, which were Exponential, Weibull, Gompertz, log-normal and log-logistic distributions using R software. The age standardized incidence rate (ASIR) of premature CVD mortality was calculated per 1000 person-years. Results: Among 63,722 participants, 886 (1.4%) experienced premature CVD mortality, with an ASIR of 1.80 per 1000 person-years. The best-fit models (based on AIC value) were the stratified Cox model by age (semi-parametric) and the log-normal accelerated failure time (AFT) model (parametric). Males had higher risk (Hazard Ratio, HR = 2.68) and experienced 49% shorter survival time (Event Time Ratio, ETR = 0.51) compared to females. Compared to Chinese ethnicity, Indians, Malays, and other Bumiputera had higher HR and lower survival times. Rural residents and those with lower education also faced increased HRs and reduced survival times. Diabetes (diagnosed: HR = 3.26, ETR = 0.37; undiagnosed: HR = 1.63, ETR = 0.63), hypertension (diagnosed: HR = 1.84, ETR = 0.53; undiagnosed: HR = 1.46, ETR = 0.68), and undiagnosed hypercholesterolemia (HR = 1.31, ETR = 0.80) increased risk and decreased survival times. Additionally, current smoking and abdominal obesity elevated risk (HR = 1.38, 1.60) and shortened survival (ETR = 0.81, 0.71). Conclusion: The semi-parametric and parametric survival models both highlight the considerable impact of socioeconomic status and modifiable risk factors on premature CVD mortality, underscoring the imperative for targeted interventions to effectively mitigate these effects. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
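Entry 46's best parametric model is a log-normal accelerated failure time model, whose event time ratios (ETRs) multiply survival time directly. The right-censored log-normal likelihood behind such a fit is compact enough to write out; the sketch below uses one binary covariate and synthetic data, not the NHMS cohort:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(8)
n = 1000
x = rng.integers(0, 2, n)                   # binary covariate, e.g. sex
beta_true, mu0, sigma = -0.7, 4.0, 0.8      # log-time scale (illustrative)
t = np.exp(mu0 + beta_true * x + sigma * rng.standard_normal(n))
c = rng.exponential(120.0, n)               # censoring times
obs = np.minimum(t, c)
event = (t <= c).astype(float)

def negloglik(p):
    mu_, beta_, log_sig = p
    z = (np.log(obs) - mu_ - beta_ * x) / np.exp(log_sig)
    # events contribute log f(t); censored subjects contribute log S(t)
    ll = (event * (norm.logpdf(z) - log_sig - np.log(obs))
          + (1 - event) * norm.logsf(z))
    return -np.sum(ll)

fit = minimize(negloglik, x0=[3.0, 0.0, 0.0], method="Nelder-Mead")
beta_hat = fit.x[1]
print(f"ETR for x=1 vs x=0: {np.exp(beta_hat):.2f} (true {np.exp(beta_true):.2f})")
```

An ETR below 1 shortens survival time multiplicatively, which is how the abstract's ETR = 0.51 for males should be read.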
47. An Ecological Survey of Chiggers (Acariformes: Trombiculidae) Associated with Small Mammals in an Epidemic Focus of Scrub Typhus on the China–Myanmar Border in Southwest China.
- Author
-
Liu, Ru-Jin, Guo, Xian-Guo, Zhao, Cheng-Fu, Zhao, Ya-Fei, Peng, Pei-Ying, and Jin, Dao-Chao
- Subjects
- *
TSUTSUGAMUSHI disease , *ENDANGERED species , *ZOONOSES , *LOGNORMAL distribution , *NUMBERS of species - Abstract
Simple Summary: Chiggers are common ectoparasites on rodents and other small mammals, and they transmit scrub typhus, a zoonotic disease. Dehong in Yunnan Province of southwest China is located on the China–Myanmar border, and it is a focus of scrub typhus. The present paper reports the infestation and distribution of chiggers on small mammals in Dehong for the first time. From 1760 rodents and other sympatric small mammals, a total of 9309 chiggers were identified, representing 117 species. Most chigger species had low host specificity. Leptotrombidium deliense, a major vector of scrub typhus in China, was the dominant chigger species in Dehong, and it was mainly distributed in flatland areas and indoors. The infestation and community indexes of chiggers in mountainous areas and outdoors were higher than those in flatland areas and indoors. The species abundance distribution of the chigger community conformed to log-normal distribution, and the total number of chigger species was roughly estimated to be 147. The species diversity of the chigger community is high in Dehong, with an obvious environmental heterogeneity. The low host specificity of chiggers and the occurrence of a large number of L. deliense in Dehong would increase the transmission risk of scrub typhus on the China–Myanmar border. Chiggers (chigger mites) are a group of tiny arthropods, and they are the exclusive vector of Orientia tsutsugamushi (Ot), the causative agent of scrub typhus (tsutsugamushi disease). Dehong Prefecture in Yunnan Province of southwest China is located on the China–Myanmar border and is an important focus of scrub typhus. Based on the field surveys in Dehong between 2008 and 2022, the present paper reports the infestation and ecological distribution of chiggers on the body surface of rodents and other sympatric small mammals (shrews, tree shrews, etc.) in the region for the first time. The constituent ratio (Cr), prevalence (PM), mean abundance (MA), and mean intensity (MI) were routinely calculated to reflect the infestation of small-mammal hosts with chiggers. Additionally, the species richness (S), Shannon–Wiener diversity index (H), Simpson dominance index (D), and Pielou's evenness index (E) were calculated to illustrate the chigger community structure. Preston's log-normal model was used to fit the theoretical curve of species abundance distribution, and the Chao 1 formula was used to roughly estimate the expected total species. The "corrplot" package in R software (Version 4.3.1) was used to analyze interspecific relationships, and the online drawing software was used to create a chord diagram to visualize the host–chigger associations. From 1760 small-mammal hosts, a total of 9309 chiggers were identified as belonging to 1 family, 16 genera, and 117 species, with high species diversity. The dominant chigger species were Leptotrombidium deliense, Walchia ewingi, and Gahrliepia longipedalis, with a total Cr = 47.65% (4436/9309), among which L. deliense is the most important vector of Ot in China. The overall infestation indexes (PM, MA, and MI) and community parameters (S, H, and E) of chiggers in the mountainous areas and outdoors were higher than those in the flatland areas and indoors, with an obvious environmental heterogeneity. Leptotrombidium deliense was the dominant species in the flatland and indoors, while G. longipedalis was the prevalent species in the mountainous and outdoor areas. 
The species abundance distribution of the chigger community conformed to a log-normal distribution with the theoretical curve equation S(R) = 28·exp(−[0.23(R − 0)]²), indicating the existence of many rare species and only a few dominant species in the community. The expected total number of chigger species was roughly estimated to be 147 species, 30 more than the 117 species actually collected, suggesting that some uncommon species may have been missed in the sampling survey. The host–parasite association analysis revealed that one host species can harbor different chigger species, and one chigger species can parasitize different host species with low host specificity. A positive or negative correlation existed among different chigger species, indicating a cooperative or competitive interspecific relationship. The species diversity of chiggers is high in Dehong on the China–Myanmar border, and a large host sample is recommended to find more uncommon species. There is an obvious environmental heterogeneity of the chigger community, with different species diversity and dominant species in different environments. The low host specificity of chiggers and the occurrence of a large number of L. deliense in Dehong, especially in flatland areas and indoors, would increase the risk of persistent transmission of scrub typhus in the region. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
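Two quantitative pieces of entry 47 can be reproduced directly: Preston's log-normal species-abundance curve with the reported constants (28 species in the modal octave, coefficient 0.23), and the Chao 1 richness estimator S_obs + F1²/(2·F2). The singleton and doubleton counts below are invented, chosen only so the estimate lands near the reported 147 species:

```python
import numpy as np

# Preston's log-normal species-abundance curve with the fitted constants
# reported in the abstract (S0 = 28 species in the modal octave, a = 0.23).
def preston(R, S0=28.0, a=0.23, R0=0.0):
    return S0 * np.exp(-((a * (R - R0)) ** 2))

octaves = np.arange(-8, 9)
print(np.round(preston(octaves), 1))

# Chao 1 estimator: S_obs + F1^2 / (2*F2), where F1 and F2 are the numbers
# of species seen exactly once and twice (counts here are invented).
S_obs, F1, F2 = 117, 38, 24
S_chao1 = S_obs + F1**2 / (2 * F2)
print(f"Chao 1 richness estimate: {S_chao1:.0f} species")
```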
48. Bivariate Log-Symmetric Regression Models Applied to Newborn Data.
- Author
-
Saulo, Helton, Vila, Roberto, and Souza, Rubens
- Subjects
- *
REGRESSION analysis , *PUBLIC health , *NEWBORN infants , *DATA modeling , *LOGNORMAL distribution , *DISPERSION (Chemistry) - Abstract
This paper introduces bivariate log-symmetric models for analyzing the relationship between two variables, assuming a family of log-symmetric distributions. These models offer greater flexibility than the bivariate lognormal distribution, allowing for better representation of diverse distribution shapes and behaviors in the data. The log-symmetric distribution family is widely used in various scientific fields and includes distributions such as log-normal, log-Student-t, and log-Laplace, among others, providing several options for modeling different data types. However, there are few approaches that jointly model continuous positive response variables and explanatory variables in regression analysis. Therefore, we propose a class of generalized linear model (GLM) regression models based on bivariate log-symmetric distributions, aiming to fill this gap. Furthermore, in the proposed model, covariates are used to describe its dispersion and correlation parameters. This study uses a dataset of anthropometric measurements of newborns to correlate them with various biological factors, proposing bivariate regression models to account for the relationships observed in the data. Such models are crucial for preventing and controlling public health issues. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
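The baseline that entry 48 generalizes is the bivariate lognormal: exponentiate a correlated bivariate normal. The sketch below samples such a pair and checks the sample correlation against the closed-form lognormal correlation; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(9)
mu = np.array([1.2, 0.5])            # log-scale means (illustrative)
s1, s2, rho = 0.3, 0.4, 0.6          # log-scale SDs and correlation
cov = np.array([[s1**2, rho * s1 * s2],
                [rho * s1 * s2, s2**2]])

z = rng.multivariate_normal(mu, cov, size=100_000)
x = np.exp(z)                        # bivariate lognormal pair

sample_corr = np.corrcoef(x[:, 0], x[:, 1])[0, 1]
# Closed-form correlation of a bivariate lognormal:
theory = (np.exp(rho * s1 * s2) - 1) / np.sqrt(
    (np.exp(s1**2) - 1) * (np.exp(s2**2) - 1))
print(f"sample corr = {sample_corr:.3f}, theory = {theory:.3f}")
```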
49. Optical Modeling of Sea Salt Aerosols Using in situ Measured Size Distributions and the Impact of Larger Size Particles.
- Author
-
Lin, Wushao and Bi, Lei
- Subjects
- *
SEA salt aerosols , *PARTICLE size distribution , *SEA salt , *DELIQUESCENCE , *LOGNORMAL distribution - Abstract
Sea salt aerosols play a critical role in regulating the global climate through their interactions with solar radiation. The size distribution of these particles is crucial in determining their bulk optical properties. In this study, we analyzed in situ measured size distributions of sea salt aerosols from four field campaigns and used multi-mode lognormal size distributions to fit the data. We employed super-spheroids and coated super-spheroids to account for the particles' non-sphericity, inhomogeneity, and hysteresis effect during the deliquescence and crystallization processes. To compute the single-scattering properties of sea salt aerosols, we used the state-of-the-art invariant imbedding T-matrix method, which allows us to obtain accurate optical properties for sea salt aerosols with a maximum volume-equivalent diameter of 12 µm at a wavelength of 532 nm. Our results demonstrated that the particle models developed in this study were successful in replicating both the measured depolarization and lidar ratios at various relative humidity (RH) levels. Importantly, we observed that large-size particles with diameters larger than 4 µm had a substantial impact on the optical properties of sea salt aerosols, which has not been accounted for in previous studies. Specifically, excluding particles with diameters larger than 4 µm led to underestimating the scattering and backscattering coefficients by 27%–38% and 43%–60%, respectively, for the ACE-Asia field campaign. Additionally, the depolarization ratios were underestimated by 0.15 within the 50%–70% RH range. These findings emphasize the necessity of considering large particle sizes for optical modeling of sea salt aerosols. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
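The size-distribution fits in entry 49 are multi-mode lognormals, the standard aerosol form in which each mode contributes N_i/(√(2π) ln σ_i) · exp(−ln²(D/D_i)/(2 ln²σ_i)) to dN/dlnD. A sketch evaluating a two-mode example with invented mode parameters, not the campaign fits:

```python
import numpy as np

def dN_dlnD(D, modes):
    """Multi-mode lognormal number size distribution.

    modes: list of (N_i, D_gi, sigma_gi) with total number N_i (cm^-3),
    geometric median diameter D_gi (um), geometric SD sigma_gi (> 1).
    """
    out = np.zeros_like(D, dtype=float)
    for N, Dg, sg in modes:
        out += (N / (np.sqrt(2 * np.pi) * np.log(sg))
                * np.exp(-np.log(D / Dg) ** 2 / (2 * np.log(sg) ** 2)))
    return out

# Two invented sea-salt modes: a fine mode and a coarse mode.
modes = [(150.0, 0.2, 1.9), (5.0, 2.5, 2.2)]
D = np.logspace(-2, 1.2, 200)         # 0.01 to 16 um
n = dN_dlnD(D, modes)
print(f"peak of distribution near D = {D[np.argmax(n)]:.2f} um")
```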
50. A Theoretical Investigation of Coal Fracture Evolution with Hydrostatic Pressure and its Validation by CT.
- Author
-
Zhao, Changxin, Cheng, Yuanping, Li, Wei, Wang, Liang, Lu, Zhuang, and Wang, Hao
- Subjects
BULK modulus ,GAS well drilling ,HYDROSTATIC pressure ,LOGNORMAL distribution ,IN situ processing (Mining) - Abstract
The stress-induced evolution of coal fractures significantly affects permeability and, consequently, gas extraction efficiency. This study introduces a novel coal fracture evolution model based on assumptions of fracture morphology and a log-normal distribution of the fracture aspect ratio. This model offers a theoretical framework for understanding the fracture closure process, ultimately depicting fracture evolution as a combined result of elastic compression and closure. It predicts the decay curve of fracture porosity under hydrostatic pressure loading. We conducted uniaxial compression experiments to determine the mechanical parameters of the model and in situ CT experiments with confining pressure ranging from 0 to 25 MPa to validate the model. The findings indicate the following: (1) Initially, the decline in fracture porosity with stress is predominantly due to elastic compression, followed by a rapid transition to closure. (2) Sensitivity analysis reveals that an increase in two physical quantities, the cube root of the product of the peak aspect ratio and the square of the mean aspect ratio (x_c) and the bulk modulus of the coal matrix (K_m), results in a decrease in the rate of fracture porosity decay with stress. (3) Tectonic action has the dual effect of augmenting x_c and diminishing K_m. We define the magnification of x_c and the divisor of K_m under a common term, the scaling factor. When the scaling factor of x_c is less than that of K_m, tectonic action promotes the decay of porosity with stress. Conversely, when the scaling factor of x_c is greater than that of K_m, the effect is reversed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
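Entry 50's model closes fractures whose aspect ratio is small relative to the applied stress; for a thin crack the closure stress is often approximated as proportional to the aspect ratio times a matrix modulus. Under that assumed proportionality and a log-normal aspect-ratio distribution, the open-fracture fraction versus hydrostatic pressure is a survival-function evaluation (all values illustrative):

```python
import numpy as np
from scipy.stats import lognorm

E_m = 5.0e9                              # coal matrix modulus, Pa (assumed)
alpha = lognorm(s=0.8, scale=1e-3)       # lognormal fracture aspect ratios

# Approximate closure: a crack of aspect ratio a closes once the applied
# stress exceeds roughly a * E_m (proportionality assumed for illustration).
for p_mpa in (0, 5, 10, 15, 20, 25):     # confining pressures as in the CT runs
    frac_open = alpha.sf(p_mpa * 1e6 / E_m)
    print(f"{p_mpa:2d} MPa: fraction of fractures still open = {frac_open:.2f}")
```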