69,907 results for "Stochastic process"
Search Results
2. Research on passenger flow control at metro transfer stations based on real-time flow calculation of streamlines
- Author
-
Lei, Bin, Hou, Zhuoxing, Suo, Yifei, Liu, Wei, Luo, Linlin, and Lei, Dongbo
- Published
- 2024
- Full Text
- View/download PDF
3. Seasonal dynamics of soil microbiome in response to dry–wet alternation along the Jinsha River Dry-hot Valley.
- Author
-
Jiang, Hao, Chen, Xiaoqing, Li, Yongping, Chen, Jiangang, Wei, Li, and Zhang, Yuanbin
- Abstract
Background: Soil microorganisms play a key role in nutrient cycling, carbon sequestration, and other important ecosystem processes, yet their response to seasonal dry–wet alternation remains poorly understood. Here, we collected 120 soil samples from dry-hot valleys (DHVs, ~ 1100 m a.s.l.), transition (~ 2000 m a.s.l.) and alpine zones (~ 3000 m a.s.l.) along the Jinsha River in southwest China during both wet and dry seasons. Our aims were to investigate the bacterial microbiome across these zones, with a specific focus on the difference between wet and dry seasons. Results: Despite seasonal variations, bacterial communities in DHVs exhibit resilience, maintaining consistent community richness, diversity, and coverage. This suggests that the microbes inhabiting DHVs have evolved adaptive mechanisms to withstand the extreme dry and hot conditions. In addition, we observed season-specific microbial clades in all sampling areas, highlighting their resilience to environmental fluctuations. Notably, we found similarities in microbial clades between soils from DHVs and the transition zones, including the phyla Actinomycetota, Chloroflexota, and Pseudomonadota. The neutral community model respectively explained a substantial proportion of the community variation in DHVs (87.7%), transition (81.4%) and alpine zones (81%), indicating that those were predominantly driven by stochastic processes. Our results showed that migration rates were higher in the dry season than in the wet season in both DHVs and the alpine zones, suggesting fewer diffusion constraints. However, this trend was reversed in the transition zones. Conclusions: Our findings contribute to a better understanding of how the soil microbiome responds to seasonal dry–wet alternation in the Jinsha River valley. These insights can be valuable for optimizing soil health and enhancing ecosystem resilience, particularly in dry-hot valleys, in the context of climate change. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. On damping a control system with global aftereffect on quantum graphs: Stochastic interpretation.
- Author
-
Buterin, Sergey
- Subjects
- *QUANTUM graph theory, *BOUNDARY value problems, *STOCHASTIC processes, *TIME-varying networks, *TREE graphs
- Abstract
Quantum graphs model processes in complex systems represented as spatial networks in various fields of natural science and technology. An example is the oscillations of elastic string networks, the nodes of which, besides the continuity conditions, also obey the Kirchhoff conditions, expressing the balance of tensions. In this paper, we propose a new look at quantum graphs as temporal networks, which means that the variable parametrizing the edges of a graph is interpreted as time, while each internal vertex is a branching point giving several different scenarios for the further trajectory of a process. Then Kirchhoff‐type conditions may also arise. Namely, they will be satisfied by such a trajectory of the process that is optimal with account of all the scenarios simultaneously. By employing the recent concept of global delay, we extend the problem of damping a first‐order control system with aftereffect, considered earlier only on an interval, to an arbitrary tree graph. The first means that the delay, imposed starting from the initial moment of time, associated with the root of the tree, propagates through all internal vertices. Bringing the system into the equilibrium and minimizing the energy functional with account of the anticipated probability of each scenario, we come to a variational problem. Then, we establish its equivalence to a self‐adjoint boundary value problem on the tree for some second‐order equations involving both the global delay and the global advance. The unique solvability of both problems is proved. We also illustrate that the interval case when the coefficients of the equation are discrete stochastic processes in discrete time can be viewed as the extension to a tree. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Steady state distributions of moving particles in one dimension: with an eye towards axonal transport.
- Author
-
Dallon, J. C., Evans, Emily, Grant, Christopher P., and Portet, Stephanie
- Abstract
Axonal transport, propelled by motor proteins, plays a crucial role in maintaining the homeostasis of functional and structural components over time. To establish a steady-state distribution of moving particles, what conditions are necessary for axonal transport? This question is pertinent, for instance, to both neurofilaments and mitochondria, which are structural and functional cargoes of axonal transport. In this paper we prove four theorems regarding steady state distributions of moving particles in one dimension on a finite domain. Three of the theorems consider cases where particles approach a uniform distribution at large time. Two consider periodic boundary conditions and one considers reflecting boundary conditions. The other theorem considers reflecting boundary conditions where the velocity is space dependent. If the theoretical results hold in the complex setting of the cell, they would imply that the uniform distribution of neurofilaments observed under healthy conditions appears to require a continuous distribution of neurofilament velocities. Similarly, the spatial distribution of axonal mitochondria may be linked to spatially dependent transport velocities that remain invariant over time. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. Periodic evolution of nonlinear economic cycle systems under exogenous harmonic and stochastic drivers.
- Author
-
Zhao, Jun and Zhang, Xinxin
- Abstract
This study analyzes the periodic behavior of nonlinear economic cycle systems with an exogenous harmonic driver in a random environment. The exogenous driver takes the form of a harmonic cosine function acting on income, and the random environment is modeled as Gaussian white noise. This setting can represent the periodic adjustment of an authority in a random market, which is closer to practice. The probability density function (PDF) of the systems is used to describe the behaviors. Two different nonlinear economic cycle models are further studied to evaluate the reaction of economic cycle systems in different cases. The first is a van der Pol-polynomial economic cycle model; the second is a van der Pol-type economic cycle model. To obtain the PDF evolution of the two economic cycle models, a path integration technique is adopted. The analysis reveals that the PDFs of the nonlinear models exhibit stable periodic behaviors after several cycles. Furthermore, the van der Pol-polynomial model has a unimodal PDF, with the PDFs of income and income growth rate exhibiting similar evolutionary patterns. By contrast, the van der Pol-type economic cycle model alternates repeatedly between unimodal and bimodal patterns in its PDF. The shape of these distributions is significantly affected by different magnitudes of driver frequencies. Therefore, the authority should focus on the effect of the adjustment frequency on the random market besides the adjustment amplitude. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
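Result 6 above obtains the PDF evolution of harmonically and stochastically driven cycle models via a path integration technique. The minimal sketch below is only a Monte Carlo stand-in for that idea, not the paper's path-integration scheme: it propagates an ensemble of a generic van der Pol-type oscillator under an assumed cosine driver and Gaussian white noise with Euler–Maruyama steps and histograms the final state. All coefficients and sizes are illustrative assumptions.

```python
import numpy as np

# Monte Carlo stand-in for the PDF evolution of a noisy, harmonically driven
# van der Pol-type oscillator: x'' = eps*(1 - x**2)*x' - x + A*cos(w*t) + white noise.
# All coefficients below are illustrative assumptions, not taken from the paper.
rng = np.random.default_rng(0)
eps, A, w, sigma = 0.5, 0.3, 1.0, 0.2      # damping, drive amplitude/frequency, noise level
dt, n_steps, n_paths = 0.01, 5_000, 5_000

x = rng.normal(0.0, 0.1, n_paths)          # "income"-like state
v = np.zeros(n_paths)                      # its growth rate
for k in range(n_steps):
    t = k * dt
    acc = eps * (1.0 - x**2) * v - x + A * np.cos(w * t)
    x = x + v * dt
    v = v + acc * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)

# Histogram estimate of the ensemble PDF of x at the final driver phase.
pdf, edges = np.histogram(x, bins=60, density=True)
centers = 0.5 * (edges[1:] + edges[:-1])
print("PDF mode located near x =", round(centers[np.argmax(pdf)], 3))
```

Repeating the histogram at a fixed phase of the driver over successive periods would indicate whether the ensemble PDF has settled into the stable periodic behavior the abstract describes.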
7. Optimal subsampling for semi‐parametric accelerated failure time models with massive survival data using a rank‐based approach.
- Author
-
Yang, Zehan, Wang, HaiYing, and Yan, Jun
- Subjects
- *STOCHASTIC processes, *PARAMETRIC modeling, *SURVIVAL analysis (Biometry), *PROBABILITY theory, *SAMPLE size (Statistics)
- Abstract
Subsampling is a practical strategy for analyzing vast survival data, which are progressively encountered across diverse research domains. While the optimal subsampling method has been applied to inferences for Cox models and parametric accelerated failure time (AFT) models, its application to semi‐parametric AFT models with rank‐based estimation have received limited attention. The challenges arise from the non‐smooth estimating function for regression coefficients and the seemingly zero contribution from censored observations in estimating functions in the commonly seen form. To address these challenges, we develop optimal subsampling probabilities for both event and censored observations by expressing the estimating functions through a well‐defined stochastic process. Meanwhile, we apply an induced smoothing procedure to the non‐smooth estimating functions. As the optimal subsampling probabilities depend on the unknown regression coefficients, we employ a two‐step procedure to obtain a feasible estimation method. An additional benefit of the method is its ability to resolve the issue of underestimation of the variance when the subsample size approaches the full sample size. We validate the performance of our estimators through a simulation study and apply the methods to analyze the survival time of lymphoma patients in the surveillance, epidemiology, and end results program. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
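The two-step optimal subsampling scheme summarized in result 7 above (pilot estimate, data-dependent subsampling probabilities, then a weighted fit) can be illustrated on a much simpler problem. The sketch below applies the same pattern to ordinary least squares rather than the rank-based AFT estimator of the paper; the probability rule (norm of each observation's score contribution) and all sizes are illustrative assumptions.

```python
import numpy as np

# Two-step subsampling analogue on plain least squares (illustrative only):
# Step 1: uniform pilot subsample -> rough estimate.
# Step 2: probabilities proportional to each observation's score norm -> weighted fit.
rng = np.random.default_rng(1)
n, p = 100_000, 5
X = rng.standard_normal((n, p))
beta_true = np.arange(1.0, p + 1.0)
y = X @ beta_true + rng.standard_normal(n)

def weighted_ls(Xs, ys, w):
    # inverse-probability-weighted least squares
    return np.linalg.solve(Xs.T @ (w[:, None] * Xs), Xs.T @ (w * ys))

r0 = 1_000
idx0 = rng.choice(n, r0, replace=False)
beta_pilot = weighted_ls(X[idx0], y[idx0], np.ones(r0))

resid = y - X @ beta_pilot
score_norm = np.abs(resid) * np.linalg.norm(X, axis=1)   # proxy for score contribution size
probs = score_norm / score_norm.sum()

r = 5_000
idx = rng.choice(n, r, replace=True, p=probs)
beta_opt = weighted_ls(X[idx], y[idx], 1.0 / probs[idx])
print(np.round(beta_opt, 2))
```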
8. Reliability and Sensitivity Analysis of Wireless Sensor Network Using a Continuous-Time Markov Process.
- Author
-
Kumar, Amit, Jadhav, Sujata, and Alsalami, Omar Mutab
- Subjects
- *EMERGENCY power supply, *WIRELESS sensor networks, *STOCHASTIC analysis, *MARKOV processes, *STOCHASTIC processes
- Abstract
A remarkably high growth has been observed in the uses of wireless sensor networks (WSNs), due to their momentous potential in various applications, namely the health sector, smart agriculture, safety systems, environmental monitoring, military operations, and many more. It is quite important that a WSN must have high reliability along with the least MTTF. This paper introduces a continuous-time Markov process, which is a special case of stochastic process, based on modeling of a wireless sensor network for analyzing the various reliability indices of the same. The modeling has been conducted by considering the different components, including the sensing unit, transceiver, microcontroller, power supply, standby power supply unit, and their failures/repairs, which may occur during their functioning. The study uncovered different important assessment parameters like reliability, components-wise reliability, MTTF, and sensitivity analysis. The critical components of a WSN are identified by incorporating the concept of sensitivity analysis. The outcomes emphasize that the proposed model will be ideal for understanding different reliability indices of WSNs and guiding researchers and potential users in developing a more robust wireless sensor network system. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
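Result 8 above models a WSN with a continuous-time Markov process and reports reliability, MTTF, and sensitivities. A minimal sketch of that style of calculation is given below, assuming a toy three-state chain (working, on standby power, failed) with invented rates rather than the component-level model of the paper; reliability comes from the matrix exponential of the generator and MTTF from the transient sub-generator.

```python
import numpy as np
from scipy.linalg import expm

# Toy 3-state continuous-time Markov chain for a sensor node:
# state 0 = fully working, 1 = running on standby power, 2 = failed (absorbing).
# The rates (per hour) are invented for illustration.
lam_main, lam_standby, mu_repair = 1e-3, 5e-3, 1e-2
Q = np.array([
    [-lam_main,                          lam_main,         0.0],
    [ mu_repair, -(mu_repair + lam_standby), lam_standby],
    [       0.0,                               0.0,         0.0],
])
p0 = np.array([1.0, 0.0, 0.0])          # start fully working

def reliability(t):
    # probability of not yet having reached the absorbing failed state at time t
    return (p0 @ expm(Q * t))[:2].sum()

print("R(100 h) =", round(reliability(100.0), 4))

# MTTF = expected time to absorption, from the transient sub-generator.
Q_transient = Q[:2, :2]
mttf = -(p0[:2] @ np.linalg.inv(Q_transient)).sum()
print("MTTF =", round(mttf, 1), "h")
```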
9. A bivariate dependent degradation model based on artificial neural network supported stochastic process and Copula function.
- Author
-
Liu, Di, Duan, Xiaochuan, Wang, Shaoping, Shi, Jian, and Shang, Yaoxing
- Subjects
- *ARTIFICIAL neural networks, *COPULA functions, *STOCHASTIC processes, *FATIGUE cracks, *MOMENTS method (Statistics)
- Abstract
In order to use the high ability of the artificial neural network (ANN) in data fitting, this paper introduces an ANN in stochastic process to describe the mean function for degradation modeling. Due to the fact that the existing method cannot handle the bivariate dependent degradation conditions, a bivariate dependent degradation model based on Copula function and ANN‐supported stochastic processes is proposed. Considering the random effects caused by individual difference, it is assumed that the unknown parameters in the stochastic processes and Copula functions are randomly distributed. Based on the maximum likelihood and moment estimation methods, a related statistical inference method for ANN training and parameter estimation is developed to use the bivariate dependent degradation model. An actual fatigue crack dataset is used to demonstrate the validity of the proposed method. The obtained results show that the dependent relationship between two degradation indicators should not be neglected, and it can be efficiently handled by the proposed method. Furthermore, the proposed degradation model can provide reliability and degradation intervals with enough precision due to the fact that it considers the random effects caused by individual difference. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Seasonal dynamics of soil microbiome in response to dry–wet alternation along the Jinsha River Dry-hot Valley
- Author
-
Hao Jiang, Xiaoqing Chen, Yongping Li, Jiangang Chen, Li Wei, and Yuanbin Zhang
- Subjects
Altitudinal gradient, Dry-hot valley, Mountain-valley breeze circulation, Seasonal dry–wet cycle, Stochastic process, Microbiology, QR1-502
- Abstract
Background: Soil microorganisms play a key role in nutrient cycling, carbon sequestration, and other important ecosystem processes, yet their response to seasonal dry–wet alternation remains poorly understood. Here, we collected 120 soil samples from dry-hot valleys (DHVs, ~ 1100 m a.s.l.), transition (~ 2000 m a.s.l.) and alpine zones (~ 3000 m a.s.l.) along the Jinsha River in southwest China during both wet and dry seasons. Our aims were to investigate the bacterial microbiome across these zones, with a specific focus on the difference between wet and dry seasons. Results: Despite seasonal variations, bacterial communities in DHVs exhibit resilience, maintaining consistent community richness, diversity, and coverage. This suggests that the microbes inhabiting DHVs have evolved adaptive mechanisms to withstand the extreme dry and hot conditions. In addition, we observed season-specific microbial clades in all sampling areas, highlighting their resilience to environmental fluctuations. Notably, we found similarities in microbial clades between soils from DHVs and the transition zones, including the phyla Actinomycetota, Chloroflexota, and Pseudomonadota. The neutral community model respectively explained a substantial proportion of the community variation in DHVs (87.7%), transition (81.4%) and alpine zones (81%), indicating that those were predominantly driven by stochastic processes. Our results showed that migration rates were higher in the dry season than in the wet season in both DHVs and the alpine zones, suggesting fewer diffusion constraints. However, this trend was reversed in the transition zones. Conclusions: Our findings contribute to a better understanding of how the soil microbiome responds to seasonal dry–wet alternation in the Jinsha River valley. These insights can be valuable for optimizing soil health and enhancing ecosystem resilience, particularly in dry-hot valleys, in the context of climate change.
- Published
- 2024
- Full Text
- View/download PDF
11. Research on passenger flow control at metro transfer stations based on real-time flow calculation of streamlines
- Author
-
Bin Lei, Zhuoxing Hou, Yifei Suo, Wei Liu, Linlin Luo, and Dongbo Lei
- Subjects
Metro transfer station, Passenger flow control, Flow streamline, Stochastic process, User equilibrium, Transportation engineering, TA1001-1280, Railroad engineering and operation, TF1-1620
- Abstract
Purpose – The volume of passenger traffic at metro transfer stations serves as a pivotal metric for the orchestration of crowd flow management. Given the intricacies of crowd dynamics within these stations and the recurrent instances of substantial passenger influxes, a methodology predicated on stochastic processes and the principle of user equilibrium is introduced to facilitate real-time traffic flow estimation within transfer station streamlines. Design/methodology/approach – The synthesis of stochastic process theory with streamline analysis engenders a probabilistic model of intra-station pedestrian traffic dynamics. Leveraging real-time passenger flow data procured from monitoring systems within the transfer station, a gradient descent optimization technique is employed to minimize the cost function, thereby deducing the dynamic distribution of categorized passenger flows. Subsequently, adhering to the tenets of user equilibrium, the Frank–Wolfe algorithm is implemented to allocate the intra-station categorized passenger flows across various streamlines, ascertaining the traffic volume for each. Findings – Utilizing the Xiaozhai Station of the Xi’an Metro as a case study, the Anylogic simulation software is engaged to emulate the intra-station crowd dynamics, thereby substantiating the efficacy of the proposed passenger flow estimation model. The derived solutions are instrumental in formulating a crowd control strategy for Xiaozhai Station during the peak interval from 17:30 to 18:00 on a designated day, yielding crowd management interventions that offer insights for the orchestration of passenger flow and operational governance within metro stations. Originality/value – The construction of an estimation methodology for the real-time streamline traffic flow augments the model’s dataset, supplanting estimated values derived from surveys or historical datasets with real-time computed traffic data, thereby enhancing the precision and immediacy of crowd flow management within metro stations.
- Published
- 2024
- Full Text
- View/download PDF
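Result 11 above assigns categorized passenger flows to streamlines with the Frank–Wolfe algorithm under user equilibrium. The sketch below shows only the bare Frank–Wolfe loop on a toy two-streamline assignment with assumed BPR-style cost functions and a fixed diminishing step size instead of an exact line search; demand, free-flow times, and capacities are illustrative, not the station model itself.

```python
import numpy as np

# Toy user-equilibrium assignment of a fixed demand over two parallel streamlines
# with BPR-style costs t_i(x) = t0_i * (1 + 0.15 * (x / c_i)**4).
# Free-flow times, capacities, and demand are illustrative assumptions.
t0 = np.array([2.0, 3.0])        # minutes
cap = np.array([800.0, 1200.0])  # passengers per interval
demand = 1500.0

def cost(x):
    return t0 * (1.0 + 0.15 * (x / cap) ** 4)

x = np.array([demand, 0.0])                 # initial all-or-nothing assignment
for k in range(1, 200):
    y = np.zeros(2)
    y[np.argmin(cost(x))] = demand          # auxiliary all-or-nothing flows
    step = 2.0 / (k + 2.0)                  # fixed diminishing step instead of line search
    x = x + step * (y - x)

print("flows:", np.round(x, 1), " costs:", np.round(cost(x), 3))
```

At convergence the two streamline costs are (approximately) equal, which is the user-equilibrium condition the abstract invokes.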
12. Modeling uncertainty: the impact of noise in T cell differentiation.
- Author
-
Martínez-Méndez, David, Villarreal, Carlos, and Huerta, Leonor
- Subjects
- *T cell differentiation, *T cells, *STOCHASTIC differential equations, *NOISE
- Abstract
Background: The regulatory mechanisms guiding CD4 T cell differentiation are complex and are further influenced by intrinsic cell variability along with that of microenvironmental cues, such as cytokine and nutrient availability. Objective: This study aims to expand our understanding of CD4 T cell differentiation by examining the influence of intrinsic noise on cell fate. Methodology: A model based on a complex regulatory network of early signaling events involved in CD4 T cell activation and differentiation was described in terms of a set of stochastic differential equations to assess the effect of noise intensity on differentiation efficiency to the Th1, Th2, Th17, Treg, and TFH effector phenotypes under defined cytokine and nutrient conditions. Results: The increase of noise intensity decreases differentiation efficiencies. In a microenvironment of Th1-inducing cytokines and optimal nutrient conditions, noise levels of 3%, 5% and 10% render Th1 differentiation efficiencies of 0.87, 0.76 and 0.62, respectively, underscoring the sensitivity of the network to random variations. Further increments of noise reveal that the network is relatively stable until noise levels of 20%, where the resulting cell phenotypes become heterogeneous. Notably, Treg differentiation showed the highest robustness to noise perturbations. A combined Th1-Th2 cytokine environment with optimal nutrient levels induces a dominant Th1 phenotype; however, removal of glutamine shifts the balance towards the Th2 phenotype at all noise levels, with an efficiency similar to that obtained under Th2-only cytokine conditions. Similarly, combinations of Th1/Treg and Treg/Th17-inducing cytokines along with the removal of either tryptophan or oxygen shift the dominant Th1 and Treg phenotypes towards Treg and Th17 respectively. Model results are consistent with differentiation efficiency patterns obtained under well-controlled experimental settings reported in the literature. Conclusion: The stochastic CD4 T cell mathematical model presented here demonstrates a noise-dependent modulation of T cell differentiation induced by cytokines and nutrient availability. Modeling results can be explained by the network topology, which assures that the system will arrive at stable states of cell functionality despite variable levels of biological intrinsic noise. Moreover, the model provides insights into the robustness of the T cell differentiation process. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. A New Approach on the Approximate Controllability Results for Hilfer Fractional Stochastic Hemivariational Inequalities of Order 1<μ<2.
- Author
-
Pradeesh, J. and Vijayakumar, V.
- Abstract
In this paper, we investigate the approximate controllability for Hilfer fractional stochastic hemivariational inequalities of order 1 < μ < 2 in Hilbert spaces. Initially, we define the concept of a mild solution for our problem in terms of fractional calculus, cosine families, stochastic analysis, and generalized Clarke subdifferential. Then, the existence and approximate controllability for Hilfer fractional stochastic evolution hemivariational inequalities are formulated and proven under appropriate conditions using fixed point theorems for multivalued maps. Finally, an example is presented to illustrate the theory. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. On the Asymptotic Stability of Hilfer Fractional Neutral Stochastic Differential Systems with Infinite Delay.
- Author
-
Pradeesh, J. and Vijayakumar, V.
- Abstract
This article explores the existence and asymptotic stability in the p-th moment of mild solutions to a class of Hilfer fractional neutral stochastic differential equations with infinite delay in Hilbert spaces. To prove our main results, we use fractional calculus, stochastic analysis, semigroup theory, and the Krasnoselskii-Schaefer type fixed point theorem. Moreover, a set of novel sufficient conditions is derived for achieving the required result. Following that, we extend the given system to the Sobolev type and provided the existence results of the considered system. After that, we provided an example to illustrate the validity of our results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. On some stable linear functional regression estimators based on random projections.
- Author
-
Ben Saber, Asma and Karoui, Abderrazek
- Subjects
STOCHASTIC processes, INVERSE problems, PROBLEM solving, SAMPLING (Process), COMPUTER simulation
- Abstract
In this work, we develop two stable estimators for solving linear functional regression problems. It is well known that such a problem is an ill-posed stochastic inverse problem. Hence, a special interest has to be devoted to the stability issue in the design of an estimator for solving such a problem. Our proposed estimators are based on combining a stable least-squares technique and a random projection of the slope function β₀(·) ∈ L²(J), where J is a compact interval. Moreover, these estimators have the advantage of having a fairly good convergence rate with reasonable computational load, since the involved random projections are generally performed over a fairly small dimensional subspace of L²(J). More precisely, the first estimator is given as a least-squares solution of a regularized minimization problem over a finite dimensional subspace of L²(J). In particular, we give an upper bound for the empirical risk error as well as the convergence rate of this estimator. The second proposed stable LFR estimator is based on combining the least-squares technique with a dyadic decomposition of the i.i.d. samples of the stochastic process, associated with the LFR model. In particular, we provide an L²-risk error of this second LFR estimator. Finally, we provide some numerical simulations on synthetic as well as on real data that illustrate the results of this work. These results indicate that our proposed estimators are competitive with some existing and popular LFR estimators. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. A Jordan Curve Theorem on a 3D Ball Through Brownian Motion.
- Author
-
Srinivasa Rao, Arni S. R. and Krantz, Steven G.
- Abstract
The Jordan curve theorem states that any simple closed curve in 3D space divides the space into two regions, an interior and an exterior. In this article, we prove the Jordan curve theorem on the boundary of a 3D ball that is inserted in a complex plane bundle. To do so, we make use of the Brownian motion principle, which is a continuous-time and continuous-state stochastic process. We begin by selecting a random point on an arbitrarily chosen complex plane within a bundle G and on the boundary of the 3D ball considered. Using the two-step random process developed on complex planes earlier by Srinivasa Rao (Multilevel contours on bundles of complex planes, 2022), we draw a contour from the initial point to the next point on this plane. We then continue this process until we finish the Jordan curve that connects points on the boundary of a ball within G. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. Best Couple Algorithm: A New Metaheuristic with Two Types of Equal Size Swarm Splits.
- Author
-
Kusuma, Purba Daru
- Subjects
- *OPTIMIZATION algorithms, *SWARM intelligence, *RELATIVE motion, *STOCHASTIC processes, *WALRUS
- Abstract
As stated in the no-free-lunch (NFL) theory, there is not any optimizer suitable for all problems. This circumstance becomes the motivation of introducing a new swarm-based metaheuristic called best couple algorithm (BCA). BCA is constructed as a swarm-based metaheuristic where the swarm is split into two sub-swarms. There are two types of splitting. The first split is dividing the swarm into the first half and second half of swarms. The second split is dividing the swarm into the odd indexed swarm members and even indexed swarm members. There is a sub swarm leader representing the highest quality swarm member in every sub swarm. There are two sequential searches for every split: the motion toward the middle between two sub swarm leaders and the motion relative to the middle between two randomly picked sub swarm members. In the benchmark assessment, BCA is compared with total interaction algorithm (TIA), coati optimization algorithm (COA), language education algorithm (LEO), osprey optimization algorithm (OOA), and walrus optimization algorithm (WaOA). The result shows that BCA is superior to these five contenders as it is better than TIA, COA, LEO, OOA, and WaOA in 18, 18, 16, 18, and 18 functions respectively out of 23 functions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
18. Electronic health record data is unable to effectively characterize measurement error from pulse oximetry: a simulation study.
- Author
-
Sarraf, Elie
- Abstract
Large data sets from electronic health records (EHR) have been used in journal articles to demonstrate race-based imprecision in pulse oximetry (SpO2) measurements. These articles do not appear to recognize the impact of the variability of the SpO2 values with respect to time ("deviation time"). This manuscript seeks to demonstrate that due to this variability, EHR data should not be used to quantify SpO2 error. Using the MIMIC-IV Waveform dataset, SpO2 values are sampled from 198 patients admitted to an intensive care unit and used as reference samples. The error derived from the EHR data is simulated using a set of deviation times. The laboratory oxygen saturation measurements are also simulated such that the performance of three simulated pulse oximeter devices will produce an average root mean squared (ARMS) error of 2%. An analysis is then undertaken to reproduce a medical device submission to a regulatory body by quantifying the mean error, the standard deviation of the error, and the ARMS error. Bland-Altman plots were also generated with their Limits of Agreement. Each analysis was repeated to evaluate whether the measurement errors were affected by increasing the deviation time. All error values increased linearly with respect to the logarithm of the time deviation. At 10 min, the ARMS error increased from a baseline of 2% to over 4%. EHR data cannot be reliably used to quantify SpO2 error. Caution should be used in interpreting prior manuscripts that rely on EHR data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. Probabilistic generation of hazard‐consistent suites of fully non‐stationary seismic records.
- Author
-
Yanni, Hera, Fragiadakis, Michalis, and Mitseas, Ioannis P.
- Subjects
GROUND motion, ACCELEROGRAMS, EARTHQUAKE engineering, HAZARD mitigation
- Abstract
A novel, practical, and computationally efficient probabilistic methodology for the stochastic generation of suites of fully non‐stationary artificial accelerograms is presented. The proposed methodology ensures that the produced ground motion suites match a given target spectral mean and target variability for the whole period range of interest. This is achieved by first producing an ensemble of random target spectra with the given mean and variability and then using them to generate artificial, target spectrum‐compatible, acceleration time‐histories with spectral representation techniques. Spectral correlation can also be assumed for the generated ground motion spectra. Based on the same backbone, two different formulations are proposed for generating spectrum‐compatible acceleration time‐histories of the non‐stationary kind. The distinction between these two variants lies in the techniques employed for modeling the temporal and spectral modulation, focusing on the site‐compatibility of the produced records. The first approach uses past‐recorded seismic accelerograms as seed records, and the second proposes and uses a new, probabilistic time‐frequency modulating function. The outcome of the proposed methodology is suites containing site‐compatible ground motion time‐histories whose spectral mean and variability match those obtained from any of the usually employed target spectra used in the earthquake engineering practice. An online tool implementing the proposed methodology is also freely provided. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. A Probabilistic Approach to Characterizing Drought Using Satellite Gravimetry.
- Author
-
Saemian, Peyman, Tourian, Mohammad J., Elmi, Omid, Sneeuw, Nico, and AghaKouchak, Amir
- Subjects
WATER storage, STOCHASTIC processes, SOIL moisture, ORBITS (Astronomy), TIME series analysis, SUBGLACIAL lakes
- Abstract
In the recent past, the Gravity Recovery and Climate Experiment (GRACE) satellite mission and its successor GRACE Follow‐On (GRACE‐FO), have become invaluable tools for characterizing drought through measurements of Total Water Storage Anomaly (TWSA). However, the existing approaches have often overlooked the uncertainties in TWSA that stem from GRACE orbit configuration, background models, and intrinsic data errors. Here we introduce a fresh view on this problem which incorporates the uncertainties in the data: the Probabilistic Storage‐based Drought Index (PSDI). Our method leverages Monte Carlo simulations to yield realistic realizations for the stochastic process of the TWSA time series. These realizations depict a range of plausible drought scenarios that later on are used to characterize drought. This approach provides probability for each drought category instead of selecting a single final category at each epoch. We have compared PSDI with the deterministic approach (Storage‐based Drought Index, SDI) over major global basins. Our results show that the deterministic approach often leans toward an overestimation of storage‐based drought severity. Furthermore, we scrutinize the performance of PSDI across diverse hydrologic events, spanning continents from the United States to Europe, the Middle East, Southern Africa, South America, and Australia. In each case, PSDI emerges as a reliable indicator for characterizing drought conditions, providing a more comprehensive perspective than conventional deterministic indices. In contrast to the common deterministic view, our probabilistic approach provides a more realistic characterization of the TWS drought, making it more suited for adaptive strategies and realistic risk management. Plain Language Summary: Total Water Storage (TWS) is defined as the sum of water stored as surface water (e.g., lakes and rivers), groundwater, soil moisture, snow, ice, and vegetation biomass. Since its launch in 2002, the Gravity Recovery and Climate Experiment (GRACE) satellite mission has provided unique TWS change measurements with many applications in hydrology, including characterizing drought events. Scientists have been using satellites like GRACE and its successor, GRACE‐FO, to understand drought by measuring the Total Water Storage Anomaly (TWSA). However, previous methods didn't consider uncertainties from satellite orbits, models, and data errors. This study offers a novel probabilistic approach for characterizing drought, Probabilistic Storage‐based Drought Index (PSDI), which acknowledges the uncertainties in the GRACE TWS change. We use simulations to create different drought scenarios, offering probabilities for each category instead of one fixed category. Comparing PSDI to traditional methods, we found that traditional methods tend to overestimate drought severity. We tested PSDI across different regions, and it consistently proved to be a reliable way to understand drought conditions, offering a more comprehensive perspective. Our probabilistic approach offers a more realistic view of TWS drought, making it suitable for adaptive strategies and risk management. 
Key Points: A novel probabilistic framework is introduced to characterize drought using Gravity Recovery and Climate Experiment (GRACE) and GRACE Follow‐On observations and propagating their stochastics. Our study suggests a tendency of deterministic approaches to overestimate storage‐based drought severity. The probabilistic approach captures global hydrological droughts while delivering more realistic results suited for risk management. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
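Result 20 above (PSDI) propagates TWSA uncertainty with Monte Carlo realizations and reports a probability for each drought category instead of a single class. The sketch below reproduces that idea on synthetic data with assumed standardized-anomaly thresholds; it is not the published PSDI implementation, and the GRACE error structure is reduced to an independent Gaussian per epoch.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly TWSA means and 1-sigma uncertainties (cm equivalent water height).
months = 120
twsa_mean = 5.0 * np.sin(2 * np.pi * np.arange(months) / 12) + rng.normal(0, 1.5, months)
twsa_sigma = np.full(months, 2.0)

# Assumed drought categories on the standardized anomaly (illustrative thresholds).
edges = np.array([-np.inf, -2.0, -1.5, -1.0, -0.5, np.inf])
labels = ["D4 exceptional", "D3 extreme", "D2 severe", "D1 moderate", "no drought"]

# Monte Carlo realizations of the whole TWSA series, then a per-category probability
# at the last epoch instead of a single deterministic class.
n_real = 2_000
real = rng.normal(twsa_mean, twsa_sigma, size=(n_real, months))
z = (real - real.mean(axis=1, keepdims=True)) / real.std(axis=1, keepdims=True)

cat = np.digitize(z[:, -1], edges) - 1
probs = np.bincount(cat, minlength=len(labels)) / n_real
for lab, p in zip(labels, probs):
    print(f"{lab}: {p:.2f}")
```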
21. Markov chain quasi-Monte Carlo method for forecasting fire hotspots in Sarawak, Malaysia.
- Author
-
Zakaria, Nurul Nnadiah, Daud, Hanita, Sokkalingam, Rajalingam, Othman, Mahmod, Abdul Kadir, Evizal, Mohd Aris, Muhammad Naeim, Muhammad, Noryanti, and Maharani, Warih
- Subjects
MARKOV chain Monte Carlo, MARKOV processes, STOCHASTIC processes, FORECASTING methodology, FOREST fires
- Abstract
Stochastic modeling approaches have attracted many researchers to the field. However, fire hotspot detection suffers from not using a Markov chain quasi-Monte Carlo (MCQMC) as a forecasting methodology. This paper proposes improvements to the computational time by combining the strengths of the Markov chain Monte Carlo (MCMC) and quasi-Monte Carlo (QMC) methods. The proposed method can lead to more precise and stable results, particularly in problems with high-dimensional integration or complex probability distributions. The proposed method is applied to a case study of fire hotspot detection in Sarawak, Malaysia. The outcome of this study reveals that the MCQMC method is more computationally efficient, taking only 0.0746 seconds compared to MCMC's 0.0914 seconds and QMC's 0.0994 seconds. It is shown that the best option derived by the proposed method is effective in predicting fire hotspots and providing quick solutions to protect the environment and communities from forest fires. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
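Result 21 above compares MCMC, QMC, and a combined MCQMC scheme by accuracy and run time. The Monte Carlo versus quasi-Monte Carlo ingredient of that comparison can be shown with a small example: the same expectation is estimated with pseudo-random draws and with a scrambled Sobol sequence from scipy.stats.qmc. The integrand and dimension are illustrative; the Markov-chain part and the fire-hotspot model are not reproduced.

```python
import numpy as np
from scipy.stats import norm, qmc

# Estimate E[f(Z)] for Z ~ N(0, I_d) with f(z) = exp(-||z||^2 / 2), once with
# pseudo-random Monte Carlo and once with a scrambled Sobol (quasi-random) sequence.
d, n = 4, 2**12
f = lambda z: np.exp(-0.5 * np.sum(z**2, axis=1))
exact = (1.0 / np.sqrt(2.0)) ** d                  # closed form for this particular f

rng = np.random.default_rng(3)
mc_est = f(rng.standard_normal((n, d))).mean()

u = qmc.Sobol(d, scramble=True, seed=3).random(n)  # low-discrepancy points in (0, 1)^d
z = norm.ppf(np.clip(u, 1e-12, 1 - 1e-12))         # map to N(0, 1) via the inverse CDF
qmc_est = f(z).mean()

print(f"exact {exact:.5f}   MC {mc_est:.5f}   QMC {qmc_est:.5f}")
```

For a fixed sample size the Sobol estimate is typically closer to the exact value, which is the accuracy-per-sample advantage that motivates combining QMC with Markov-chain sampling.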
22. Expected Values in Complex Networks Constructed Using a Compression Algorithm to Time Series.
- Author
-
Carareto, Rodrigo and El Hage, Fabio S.
- Subjects
- *ALGORITHMS, *LOSSLESS data compression, *STOCHASTIC processes, *TIME management
- Abstract
This paper introduces a methodology for computing expected values associated with compression networks resulting from the application of compression algorithms to independent and identically distributed random time series. Our analysis establishes a robust correspondence between the calculated expected values and empirically derived results obtained from constructing networks using nondeterministic time series. Notably, the ratio of the average indegree of a network to the computed expected indegree for stochastic time series serves as a versatile metric. It enables the assessment of inherent randomness in time series and facilitates the distinction between nondeterministic and chaotic systems. The metric demonstrates high sensitivity to nondeterminism in both synthetic and real-world datasets, highlighting its capacity to detect subtle disturbances and high-frequency noise, even in series characterized by a deficient sample rate. Our results extend and confirm previous findings in the field. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. A non-fungible token (NFT) chain model and performance study.
- Author
-
Seol, J., Ke, Z., Kancharla, A., Joshi, S., and Park, N.
- Subjects
- *NON-fungible tokens, *GENERAL stores, *CHAIN stores, *STOCHASTIC processes, *BLOCKCHAINS
- Abstract
This paper presents a quantitative model to assess the performance of a non-fungible token (NFT)-centered chain, referred to as an NFT Chain in this paper. The model was introduced in Seol et al. (in: 2022 Fourth international conference on blockchain computing and applications (BCCA), pp. 159–166, IEEE, 2022), and more extensive simulations are conducted and the results are presented in this work. An NFT chain in general stores its data distributed across on chain (e.g., NFT registration ledger data and an address pointing at the data located off chain, such as the meta data table and the ultimate digital asset's data) due to the high cost to store the potentially high volume of data for digital assets. Therefore, it is expected that the overall performance of an NFT chain is primarily dominated and bound by the off-chain performance. The proposed performance model, employing an embedded Markovian queueing process model, tracks a bivariate state (i, j) of the NFT chain, where i stochastically tracks the number of slots of the transactions executed on chain and j stochastically tracks the number of transactions off chain as well, and the states transition as determined by λ_on, λ_off, μ, and the number of slots in the current block. Extensive numerical simulations are performed to validate the efficacy of the model. The primary set of variables used in the simulations consists of λ_on, λ_off, and μ; the average number of slots of the transactions during a block posting, L, is simulated based on both L_on and L_off, and the average waiting time W based on both W_on and W_off, in an intermingled manner in order to take into account the nature of NFT transactions executed across on- and off-chain without loss of generality. The simulation results in Seol et al. (2022) have demonstrated a good agreement with the expected and intuitive trends. The results of more extensive simulations are presented in this paper to further demonstrate the efficacy and versatility of the proposed model. Ultimately, the proposed NFT chain model will serve as a sound theoretical foundation for the design of NFT chains from a performance perspective. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Comparative Analysis of Stochastic and Uncertain Process Degradation Modeling Based on RQRL.
- Author
-
Kai Liu, Tianji Zou, and Mincheng Xin
- Subjects
EPISTEMIC uncertainty, WIENER processes, STOCHASTIC processes, TIME measurements, COMPARATIVE method
- Abstract
Small sample sizes cause epistemic uncertainties in reliability estimation and even result in potential risks in maintenance strategies. To explore the difference between stochastic- and uncertain-process-based degradation modeling in reliability estimation for small samples, this study proposes a comparative analysis methodology based on the range of quantile reliable lifetime (RQRL). First, considering both unit-to-unit variability and epistemic uncertainty, we proposed the Wiener and Liu process degradation models. Second, based on the RQRL, a comparative analysis method of different degradation models for reliability estimation under various sample sizes and measurement times was proposed. Third, based on a case study, the sensitivities of the Wiener and Liu process degradation models for various sample sizes and measurement times were compared and analyzed based on the RQRL. The results demonstrated that using the uncertain process degradation model improved the uniformity and stability of reliability estimation under small-sample conditions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
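Result 24 above compares Wiener-process and uncertain (Liu) process degradation models through the range of quantile reliable lifetime. The sketch below simulates only the Wiener side with assumed drift, diffusion, threshold, and unit-to-unit variability, then reads quantile reliable lifetimes from simulated first-passage times; the Liu process and the RQRL comparison itself are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(4)

# Wiener degradation: X(t) = mu * t + sigma * B(t); a unit fails when X(t) >= threshold.
# Drift, diffusion, threshold, and unit-to-unit variability are illustrative assumptions.
n_units, n_steps, dt = 1_000, 3_000, 0.1
mu = rng.normal(0.8, 0.1, n_units)        # random drift models unit-to-unit variability
sigma, threshold = 0.5, 50.0

incr = mu[:, None] * dt + sigma * np.sqrt(dt) * rng.standard_normal((n_units, n_steps))
paths = np.cumsum(incr, axis=1)

# First passage time of each unit over the failure threshold (units that never cross
# within the horizon are censored at the end of the simulation).
crossed = paths >= threshold
first = np.where(crossed.any(axis=1), crossed.argmax(axis=1), n_steps) * dt

# Quantile reliable lifetime: time by which only a fraction q of units has failed.
for q in (0.05, 0.10, 0.50):
    print(f"reliable lifetime at reliability {1 - q:.2f}: {np.quantile(first, q):.1f}")
```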
25. Musical composition based on skewed statistical distributions of stochastic processes
- Author
-
Igor Lugo and Martha G. Alatriste-Contreras
- Subjects
Nomusa Makhubu, University of Cape Town, South Africa, Composition & Orchestration, Elements of Music, Music Technology, Theory of Music, Stochastic process, Fine Arts, Arts in general, NX1-820, General Works, History of scholarship and learning. The humanities, AZ20-999
- Abstract
The inclusion of skewed statistical distributions in the stochastic process for composing music helps to clarify the effect of some pitch patterns for generating new pieces of music. The aim of this study was to compare and measure the effect of using skewed statistical distributions instead of the common uniform distribution applied in Markov chains. We applied an explorative data analysis related to the Shannon entropy and the Monte Carlo approach, to generate and compare different stochastic realizations associated with particular exponential and uniform distributions and two types of note selection based on intervals. Findings suggested that the presence of an exponential statistical distribution may generate a wide range of entropy values that could be associated with the diversity concept in complex systems. On the other hand, the presence of the uniform distribution may generate a narrow range of entropy values possibly associated with less diverse behaviors in simple systems. Therefore, the use of skewed statistical distributions, in particular the exponential, in the stochastic process for musical composition sets ground for the emergence of articulated musical patterns.
- Published
- 2024
- Full Text
- View/download PDF
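Result 25 above contrasts exponential and uniform distributions inside a stochastic (Markov-chain-style) composition process and compares the Shannon entropy of the outcomes. The minimal sketch below generates note sequences by sampling pitch intervals from each distribution and compares the entropies of the resulting pitch histograms; the scale size, interval ranges, and sequence length are illustrative assumptions rather than the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(5)
n_notes, pitch_range = 2_000, 24            # two octaves of semitones, illustrative

def compose(interval_sampler):
    # Random-walk composition: each note is the previous pitch plus a sampled interval.
    pitch, seq = pitch_range // 2, []
    for _ in range(n_notes):
        pitch = int(np.clip(pitch + interval_sampler(), 0, pitch_range - 1))
        seq.append(pitch)
    return np.array(seq)

def shannon_entropy(seq):
    counts = np.bincount(seq, minlength=pitch_range)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

uniform_piece = compose(lambda: rng.integers(-6, 7))                              # uniform intervals
skewed_piece = compose(lambda: int(rng.choice([-1, 1]) * rng.exponential(2.0)))   # exponential magnitudes

print("uniform-interval entropy:    ", round(shannon_entropy(uniform_piece), 3))
print("exponential-interval entropy:", round(shannon_entropy(skewed_piece), 3))
```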
26. Investigating day-to-day route choices based on multi-scenario laboratory experiments, Part II: Route-dependent attraction-based stochastic process model
- Author
-
Hang Qi, Ning Jia, Xiaobo Qu, and Zhengbing He
- Subjects
Day-to-day dynamics, Route choice behavior, Markov process, Stochastic process, Transportation engineering, TA1001-1280
- Abstract
Laboratory experiments are one of the important means used to investigate travel choice behavior under strategic uncertainty. Many experiment-based studies have shown that the Nash equilibrium can predict aggregated route choices, while the fluctuations, whose mechanisms are still unclear, continue to exist until the end. To understand the fluctuations, this paper proposes a route-dependent attraction-based stochastic process model, which shares exactly the same behavioral foundation introduced in Part I of the study (Qi et al., 2023), i.e., route-dependent inertia and route-dependent preference. The model predictions are carefully compared with the experimental observations obtained from the congestible parallel-route laboratory experiments containing 312 subjects and eight decision-making scenarios (Qi et al., 2023). The results show that the proposed stochastic process model can precisely reproduce the random oscillations both in terms of flow switching and route flow evolution. Subsequently, an approximated model is developed to enhance the efficiency in evaluating the equilibrium distribution, providing a practical tool to evaluate the impacts of transportation policies in both long- and short-term runs. To the best of our knowledge, this paper is the first attempt to model and explain experimental phenomena by introducing stochastic process theories, as well as a successful example of applying experimental economics methodology to improve our understanding of human travel choice behavior.
- Published
- 2024
- Full Text
- View/download PDF
27. Identification of Network Traffic Using Neural Networks
- Author
-
Salimzyanova, Daria, Lisovskaya, Ekaterina, Filipe, Joaquim, Editorial Board Member, Ghosh, Ashish, Editorial Board Member, Zhou, Lizhu, Editorial Board Member, Dudin, Alexander, editor, Nazarov, Anatoly, editor, and Moiseev, Alexander, editor
- Published
- 2024
- Full Text
- View/download PDF
28. A Stochastic Framework of Risk Assessment of Flooding on Stability and Serviceability of Bridges Under a Changing Climate
- Author
-
Habeeb, Bassel, Liberge, Erwan, Bastidas-Arteaga, Emilio, di Prisco, Marco, Series Editor, Chen, Sheng-Hong, Series Editor, Vayas, Ioannis, Series Editor, Kumar Shukla, Sanjay, Series Editor, Sharma, Anuj, Series Editor, Kumar, Nagesh, Series Editor, Wang, Chien Ming, Series Editor, Cui, Zhen-Dong, Series Editor, Matos, José C., editor, Lourenço, Paulo B., editor, Oliveira, Daniel V., editor, Branco, Jorge, editor, Proske, Dirk, editor, Silva, Rui A., editor, and Sousa, Hélder S., editor
- Published
- 2024
- Full Text
- View/download PDF
29. Area and Perimeter Full Distribution Functions for Planar Poisson Line Processes and Voronoi Diagrams
- Author
-
Kanel-Belov, Alexei, Golafshan, Mehdi, Malev, Sergey, Yavich, Roman, and Slavova, Angela, editor
- Published
- 2024
- Full Text
- View/download PDF
30. The Anatomy of Accident as a Deviation from Random Walk
- Author
-
Hacıoğlu, Volkan and Hacıoğlu, Volkan
- Published
- 2024
- Full Text
- View/download PDF
31. Linear Systems Under Gaussian White Noise Excitation: Exact Closed-Form Solutions
- Author
-
Kougioumtzoglou, Ioannis A., Psaros, Apostolos F., Spanos, Pol D., Kougioumtzoglou, Ioannis A., Psaros, Apostolos F., and Spanos, Pol D.
- Published
- 2024
- Full Text
- View/download PDF
32. Introduction
- Author
-
Kougioumtzoglou, Ioannis A., Psaros, Apostolos F., Spanos, Pol D., Kougioumtzoglou, Ioannis A., Psaros, Apostolos F., and Spanos, Pol D.
- Published
- 2024
- Full Text
- View/download PDF
33. Two-Terminal Reliability of the K4-Ladder—Revisited
- Author
-
Poulin, Philippe, Cowell, Simon R., Beiu, Valeriu, and Vlachos, Dimitrios, editor
- Published
- 2024
- Full Text
- View/download PDF
34. Existence in Pseudo Almost Automorphic Mild Solutions for Mean Field Stochastic Evolution Equations
- Author
-
Mbaye, Mamadou Moustapha, Diop, Amadou, Dieye, Moustapha, Seck, Diaraf, editor, Kangni, Kinvi, editor, Sambou, Marie Salomon, editor, Nang, Philibert, editor, and Fall, Mouhamed Moustapha, editor
- Published
- 2024
- Full Text
- View/download PDF
35. Considering Multiplicative Noise in a Software Reliability Growth Model Using Stochastic Differential Equation Approach
- Author
-
Chaudhary, Kuldeep, Kumar, Vijay, Kumar, Deepansha, Kumar, Pradeep, Pham, Hoang, Series Editor, Kapur, P. K., editor, Singh, Gurinder, editor, and Kumar, Vivek, editor
- Published
- 2024
- Full Text
- View/download PDF
36. Optimizing Stock Option Forecasting with the Assembly of Machine Learning Models and Improved Trading Strategies
- Author
-
Cao, Zheng, Guo, Raymond, Du, Wenyu, Gao, Jiayi, Golubnichiy, Kirill V., Kacprzyk, Janusz, Series Editor, Gomide, Fernando, Advisory Editor, Kaynak, Okyay, Advisory Editor, Liu, Derong, Advisory Editor, Pedrycz, Witold, Advisory Editor, Polycarpou, Marios M., Advisory Editor, Rudas, Imre J., Advisory Editor, Wang, Jun, Advisory Editor, and Arai, Kohei, editor
- Published
- 2024
- Full Text
- View/download PDF
37. Heterogeneous Queueing Model with Intermittently Obtainable Server with Feedback
- Author
-
Divya, K., Indhira, K., Seenivasan, M., Kamalov, Firuz, editor, Sivaraj, R., editor, and Leung, Ho-Hon, editor
- Published
- 2024
- Full Text
- View/download PDF
38. Bayesian earthquake forecasting approach based on the epidemic type aftershock sequence model
- Author
-
Giuseppe Petrillo and Jiancang Zhuang
- Subjects
Statistical seismology, Time series analysis, Probabilistic forecast, Stochastic process, Geography. Anthropology. Recreation, Geodesy, QB275-343, Geology, QE1-996.5
- Abstract
The epidemic type aftershock sequence (ETAS) model is used as a baseline model both for earthquake clustering and earthquake prediction. In most forecast experiments, the ETAS parameters are estimated based on a short and local catalog; therefore, the model parameter optimization carried out by means of a maximum likelihood estimation may not be as robust as expected. We use Bayesian forecast techniques to solve this problem, where non-informative flat prior distributions of the parameters are adopted to perform forecast experiments on 3 mainshocks that occurred in Southern California. A Metropolis–Hastings algorithm is employed to sample the model parameters and earthquake events. We also show, through forecast experiments, how the Bayesian inference allows one to obtain a probabilistic forecast, differently from one obtained via MLE.
- Published
- 2024
- Full Text
- View/download PDF
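Result 38 above samples ETAS parameters with a Metropolis–Hastings algorithm under non-informative flat priors to turn a point forecast into a probabilistic one. The sketch below shows only the Metropolis–Hastings scaffold on a deliberately simple stand-in likelihood (a single Poisson rate for event counts), not the ETAS likelihood; the flat prior is truncated to positive rates and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# Stand-in data: observed event counts per interval, assumed Poisson(rate).
counts = rng.poisson(3.2, size=200)

def log_post(rate):
    # Flat (improper) prior on rate > 0; Poisson log-likelihood up to an additive constant.
    if rate <= 0:
        return -np.inf
    return np.sum(counts * np.log(rate) - rate)

# Random-walk Metropolis-Hastings.
n_iter, step = 20_000, 0.1
chain = np.empty(n_iter)
rate, lp = 1.0, log_post(1.0)
for i in range(n_iter):
    prop = rate + step * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        rate, lp = prop, lp_prop
    chain[i] = rate

posterior = chain[5_000:]                           # drop burn-in
print("posterior mean:", round(posterior.mean(), 3),
      " 90% interval:", np.round(np.quantile(posterior, [0.05, 0.95]), 3))

# A probabilistic forecast of the next interval averages over the posterior draws,
# instead of plugging in a single maximum likelihood estimate.
forecast = rng.poisson(rng.choice(posterior, size=10_000))
print("forecast mean count:", round(forecast.mean(), 2))
```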
39. Data-driven optimization for microgrid control under distributed energy resource variability
- Author
-
Akhilesh Mathur, Ruchi Kumari, V. P. Meena, V. P. Singh, Ahmad Taher Azar, and Ibrahim A. Hameed
- Subjects
Microgrids, Stochastic process, Optimal scheduling, Monte Carlo simulation, K-mean clustering, Probability distribution function, Medicine, Science
- Abstract
The integration of renewable energy resources into the smart grids improves the system resilience, provides sustainable demand-generation balance, and produces clean electricity with minimal leakage currents. However, the renewable sources are intermittent in nature. Therefore, it is necessary to develop a scheduling strategy to optimise hybrid PV-wind-controllable distributed generator based Microgrids in grid-connected and stand-alone modes of operation. In this manuscript, a priority-based cost optimization function is developed to show the relative significance of one cost component over another for the optimal operation of the Microgrid. The uncertainties associated with various intermittent parameters in Microgrid have also been introduced in the proposed scheduling methodology. The objective function includes the operating cost of CDGs, the emission cost associated with CDGs, the battery cost, the cost of grid energy exchange, and the cost associated with load shedding. A penalty function is also incorporated in the cost function for violations of any constraints. Multiple scenarios are generated using Monte Carlo simulation to model uncertain parameters of Microgrid (MG). These scenarios consist of the worst as well as the best possible cases, reflecting the microgrid's real-time operation. Furthermore, these scenarios are reduced by using a k-means clustering algorithm. The reduced procedures for uncertain parameters will be used to obtain the minimum cost of MG with the help of an optimisation algorithm. In this work, a meta-heuristic approach, grey wolf optimisation (GWO), is used to minimize the developed cost optimisation function of MG. The standard LV Microgrid CIGRE test network is used to validate the proposed methodology. Results are obtained for different cases by considering different priorities to the sub-objectives using GWO algorithm. The obtained results are compared with the results of Jaya and PSO (particle swarm optimization) algorithms to validate the efficacy of the GWO method for the proposed optimization problem.
- Published
- 2024
- Full Text
- View/download PDF
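Result 39 above generates scenarios for the uncertain Microgrid inputs with Monte Carlo simulation and reduces them with k-means clustering before the cost minimization. The sketch below reproduces just that generate-and-reduce step with assumed PV, wind, and load distributions and scikit-learn's KMeans; the cost function, constraints, and the grey wolf optimizer are not included.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)

# Monte Carlo scenarios for uncertain 24-hour Microgrid inputs (illustrative distributions):
# PV ~ clipped normal around a daylight profile, wind ~ Weibull, load ~ normal.
n_scen, hours = 5_000, 24
daylight = np.clip(np.sin(np.pi * (np.arange(hours) - 6) / 12), 0, None)
pv = np.clip(rng.normal(50 * daylight, 10, (n_scen, hours)), 0, None)   # kW
wind = 20 * rng.weibull(2.0, (n_scen, hours))                           # kW
load = rng.normal(120, 15, (n_scen, hours))                             # kW
scenarios = np.hstack([pv, wind, load])        # one row = one joint scenario

# Scenario reduction: k-means centroids become representative scenarios,
# weighted by the fraction of raw scenarios assigned to each cluster.
k = 10
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(scenarios)
weights = np.bincount(km.labels_, minlength=k) / n_scen
reduced = km.cluster_centers_                  # shape (k, 3 * hours)

print(np.round(weights, 3))
# `reduced` and `weights` would then feed the downstream cost minimization
# (handled by a grey wolf optimizer in the paper above).
```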
40. Topology, convergence, and reconstruction of predictive states
- Author
-
Loomis, Samuel P and Crutchfield, James P
- Subjects
Applied Mathematics, Mathematical Physics, Numerical and Computational Mathematics, Mathematical Sciences, Stochastic process, Symbolic dynamics, Dynamical systems, Measure theory, Weak topology, Fluids & Plasmas, Applied mathematics, Mathematical physics, Numerical and computational mathematics
- Abstract
Predictive equivalence in discrete stochastic processes has been applied with great success to identify randomness and structure in statistical physics and chaotic dynamical systems and to inferring hidden Markov models. We examine the conditions under which predictive states can be reliably reconstructed from time-series data, showing that convergence of predictive states can be achieved from empirical samples in the weak topology of measures. Moreover, predictive states may be represented in Hilbert spaces that replicate the weak topology. We mathematically explain how these representations are particularly beneficial when reconstructing high-memory processes and connect them to reproducing kernel Hilbert spaces.
- Published
- 2023
41. Model-based statistical depth with applications to functional data.
- Author
-
Zhao, Weilong, Xu, Zishen, Mu, Yue, Yang, Yun, and Wu, Wei
- Subjects
- *STATISTICS, *NONPARAMETRIC statistics, *HILBERT space, *PROBABILISTIC generative models, *QUANTILES, *STOCHASTIC processes, *OUTLIER detection
- Abstract
Statistical depth, a commonly used analytic tool in nonparametric statistics, has been extensively studied for multivariate and functional observations over the past few decades. Although various forms of depth were introduced, they are mainly procedure based whose definitions are independent of the generative model for observations. To address this problem, we introduce a generative model-based approach to define statistical depth for both multivariate and functional data. The proposed model-based depth framework permits simple computation via a bootstrap sampling and improves the depth estimation accuracy. When applied to functional data, the proposed depth can capture important features such as continuity, smoothness or phase variability, depending on the defining criteria. We propose efficient algorithms to compute the proposed depths and establish estimation consistency. Through simulations and real data, we demonstrate that the proposed functional depths reveal important statistical information such as those captured by the median and quantiles, and detect outliers. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
42. Bayesian earthquake forecasting approach based on the epidemic type aftershock sequence model.
- Author
-
Petrillo, Giuseppe and Zhuang, Jiancang
- Subjects
- *EARTHQUAKE aftershocks, *MAXIMUM likelihood statistics, *EARTHQUAKES, *EARTHQUAKE prediction, *EPIDEMICS, *BAYESIAN field theory
- Abstract
The epidemic type aftershock sequence (ETAS) model is used as a baseline model both for earthquake clustering and earthquake prediction. In most forecast experiments, the ETAS parameters are estimated based on a short and local catalog; therefore, the model parameter optimization carried out by means of a maximum likelihood estimation may not be as robust as expected. We use Bayesian forecast techniques to solve this problem, where non-informative flat prior distributions of the parameters are adopted to perform forecast experiments on 3 mainshocks that occurred in Southern California. A Metropolis–Hastings algorithm is employed to sample the model parameters and earthquake events. We also show, through forecast experiments, how the Bayesian inference allows one to obtain a probabilistic forecast, differently from one obtained via MLE. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
43. Data-driven optimization for microgrid control under distributed energy resource variability.
- Author
-
Mathur, Akhilesh, Kumari, Ruchi, Meena, V. P., Singh, V. P., Azar, Ahmad Taher, and Hameed, Ibrahim A.
- Subjects
- *MICROGRIDS, *POWER resources, *SMART power grids, *RENEWABLE energy sources, *CLEAN energy, *COST functions, *OPTIMIZATION algorithms
- Abstract
The integration of renewable energy resources into the smart grids improves the system resilience, provide sustainable demand-generation balance, and produces clean electricity with minimal leakage currents. However, the renewable sources are intermittent in nature. Therefore, it is necessary to develop scheduling strategy to optimise hybrid PV-wind-controllable distributed generator based Microgrids in grid-connected and stand-alone modes of operation. In this manuscript, a priority-based cost optimization function is developed to show the relative significance of one cost component over another for the optimal operation of the Microgrid. The uncertainties associated with various intermittent parameters in Microgrid have also been introduced in the proposed scheduling methodology. The objective function includes the operating cost of CDGs, the emission cost associated with CDGs, the battery cost, the cost of grid energy exchange, and the cost associated with load shedding. A penalty function is also incorporated in the cost function for violations of any constraints. Multiple scenarios are generated using Monte Carlo simulation to model uncertain parameters of Microgrid (MG). These scenarios consist of the worst as well as the best possible cases, reflecting the microgrid's real-time operation. Furthermore, these scenarios are reduced by using a k-means clustering algorithm. The reduced procedures for uncertain parameters will be used to obtain the minimum cost of MG with the help of an optimisation algorithm. In this work, a meta-heuristic approach, grey wolf optimisation (GWO), is used to minimize the developed cost optimisation function of MG. The standard LV Microgrid CIGRE test network is used to validate the proposed methodology. Results are obtained for different cases by considering different priorities to the sub-objectives using GWO algorithm. The obtained results are compared with the results of Jaya and PSO (particle swarm optimization) algorithms to validate the efficacy of the GWO method for the proposed optimization problem. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
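The scenario-handling step described in the abstract above (Monte Carlo sampling of uncertain PV, wind, and load profiles followed by k-means reduction) can be sketched as below. The distributions, horizon, and cluster count are illustrative assumptions; the GWO cost minimization itself is not shown.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n_scenarios, horizon = 1000, 24  # hourly values over one day (assumed)

# Uncertain inputs stacked as [PV | wind | load] per scenario (normalised units, assumed distributions).
pv   = np.clip(rng.normal(0.5, 0.15, (n_scenarios, horizon)), 0, 1)
wind = np.clip(rng.weibull(2.0, (n_scenarios, horizon)) * 0.4, 0, 1)
load = np.clip(rng.normal(0.7, 0.10, (n_scenarios, horizon)), 0, 1)
scenarios = np.hstack([pv, wind, load])

# Reduce to a handful of representative scenarios; the cluster weights are the
# probabilities used when evaluating the expected operating cost.
k = 10
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(scenarios)
representative = km.cluster_centers_
weights = np.bincount(km.labels_, minlength=k) / n_scenarios

Each representative scenario would then be evaluated in the priority-weighted cost function, and the probability-weighted sum is what the optimizer minimizes.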
44. Reconstruction of the Amplitude of Signals of Acoustic Emission Based on Mathematically Modeling as a Stochastic Process.
- Author
-
Berkovich, V. N., Builo, S. I., and Builo, B. I.
- Subjects
- *
ACOUSTIC emission , *DISTRIBUTION (Probability theory) , *STOCHASTIC processes , *INTEGRAL transforms , *NONDESTRUCTIVE testing - Abstract
The problem of random oscillations generated by an internal defect in the neighborhood of the boundary of an elastic massive body at the prefailure stage is considered. The study is based on the results of the invariant method in the theory of acoustic emission (AE), according to which the statistical distribution of the values of the parameters of AE signals due to a defect obeys the stability condition while the body remains at the same prefailure stage. A mathematical model of a nonstationary wave field of displacements in an elastic massive body is constructed, and the correctness of its application is studied. The problem is reduced to the study of a boundary integral equation in special classes of stochastic processes. We pose the problem of reconstructing and describing the nature of the random process of defect emission on the free boundary of the body based on AE signals. Numerical results are presented. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
45. Sensorless Speed Estimation of Induction Motors through Signal Analysis Based on Chaos Using Density of Maxima.
- Author
-
Silva, Marlio Antonio, Lucena-Junior, Jose Anselmo, da Silva, Julio Cesar, Belo, Francisco Antonio, Lima-Filho, Abel Cavalcante, Ramos, Jorge Gabriel Gomes de Souza, Camara, Romulo, and Brito, Alisson
- Subjects
- *
INDUCTION motors , *SPEED , *ELECTRICAL energy , *POWER resources , *FOURIER transforms , *DENSITY - Abstract
Three-phase induction motors are widely used across industrial sectors and account for a significant portion of total electrical energy consumption. To ensure their efficient operation, control systems need algorithms able to estimate rotation speed accurately and with an adequate response time. However, the angular speed sensors used with induction motors are generally expensive and unreliable, and they may be unsuitable for hostile environments. This paper presents an algorithm for speed estimation in three-phase induction motors using a chaos-based signal feature, the density of maxima (a counting sketch follows this entry). The technique analyzes the current signals from the motor power supply, without invasive sensors on the motor structure. The results show that speed estimation is achieved with a response time lower than that obtained by classical techniques based on the Fourier transform, and that the method provides motor shaft speed values even when the motor operates under variable load. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
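The density-of-maxima feature mentioned above can be illustrated by counting the local maxima of the stator current in each time window; the mapping from that count to shaft speed is motor-specific and is not reproduced here. The sampling rate, window length, and synthetic signal below are assumptions for the sketch.

import numpy as np

def density_of_maxima(signal, fs, window_s=0.5):
    """Return the number of local maxima per second in each window (sketch)."""
    samples_per_win = int(window_s * fs)
    rates = []
    for start in range(0, len(signal) - samples_per_win, samples_per_win):
        w = signal[start:start + samples_per_win]
        # A sample is a local maximum if it exceeds both of its neighbours.
        n_max = np.sum((w[1:-1] > w[:-2]) & (w[1:-1] > w[2:]))
        rates.append(n_max / window_s)
    return np.array(rates)

# Toy usage with a synthetic 60 Hz current plus noise (illustrative only).
fs = 10_000
t = np.arange(0, 2.0, 1 / fs)
current = np.sin(2 * np.pi * 60 * t) + 0.05 * np.random.default_rng(1).standard_normal(t.size)
print(density_of_maxima(current, fs))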
46. Stochastic processes shape the functional and phylogenetic structure of bird assemblages at the mine area in southwest China.
- Author
-
Liu, Shilong, Zhou, Tianlong, Tan, Xiaocai, Mtemi, Wambura M, and Jiang, Aiwu
- Subjects
- *
STOCHASTIC processes , *BIRD communities , *DETERMINISTIC processes , *RESOURCE allocation , *ECOSYSTEMS - Abstract
Understanding the mechanisms of community assembly is a key question in ecology. Metal pollution may cause significant changes in bird community structure and diversity, with implications for ecosystem processes and function. However, the relative importance of these assembly processes in shaping bird communities in polluted areas is still not clear. Here, we explored bird species richness, functional and phylogenetic diversity, and the community assembly processes in the mine region of southwest China. Our results showed that all three dimensions of diversity at the mine area were lower than at the reference sites. For community assembly, the result 0 < NRI/NFRI < 1.96 indicated that deterministic processes (environmental filtering) might drive community clustering (a generic null-model sketch of such an index follows this entry). The results of the neutral community model and the normalized stochasticity ratio, however, showed the dominant role of stochastic processes in shaping bird community assembly. We further quantified the community-level habitat niche breadth (Bcom) and found no difference in Bcom between the mine area and the reference sites. This indicates that the bird communities at the mine area and the three reference sites were not subjected to extreme environmental selection (same or different resource allocation) that would force a highly specialized niche. These findings provide insights into the distribution patterns and dominant ecological processes of bird communities under metal exposure, and extend our knowledge of the assembly mechanisms of bird communities living in mine areas. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
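The NRI/NFRI-type indices cited above are standardized effect sizes measured against a null model. Below is a generic sketch for a relatedness-style index: the observed mean pairwise (phylogenetic or functional) distance of a community is compared with random draws from the regional species pool. The distance matrix and community are placeholders, not the study's data.

import numpy as np

def nri(distance_matrix, community_idx, n_null=999, seed=0):
    """Standardized effect size of mean pairwise distance (NRI-style sketch)."""
    rng = np.random.default_rng(seed)
    def mpd(idx):
        sub = distance_matrix[np.ix_(idx, idx)]
        return sub[np.triu_indices(len(idx), k=1)].mean()
    observed = mpd(np.asarray(community_idx))
    n_pool, n_sp = distance_matrix.shape[0], len(community_idx)
    null = np.array([mpd(rng.choice(n_pool, n_sp, replace=False))
                     for _ in range(n_null)])
    # Positive values indicate clustering (observed distance below the null mean);
    # values between 0 and 1.96 suggest clustering that is not significant.
    return (null.mean() - observed) / null.std()

# Toy usage with a random symmetric distance matrix (illustrative only).
rng = np.random.default_rng(0)
d = rng.random((30, 30)); d = (d + d.T) / 2; np.fill_diagonal(d, 0)
print(nri(d, np.arange(8)))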
47. Pseudo-dynamic simulations applied to ball mill grinding circuit using population balance model and Monte Carlo Method.
- Author
-
de Abreu Valadares, Jose Guilherme, Batista Mazzinghy, Douglas, and Galéry, Roberto
- Subjects
- *
SIMULATION methods & models , *ELECTRIC circuits , *STOCHASTIC processes , *OPERATING costs , *STOCHASTIC models , *MONTE Carlo method , *BALL mills , *MODEL validation - Abstract
Process simulations can be used to improve grinding circuit performance, which in turn reduces operating costs. The population balance model (PBM) is widely accepted for grinding modeling because it can reproduce breakage events in tumbling mills, as described by Austin et al. (1984). In this study, a pseudo-dynamic model is introduced, integrating the PBM with the Monte Carlo method to stochastically simulate variables in an industrial grinding circuit (a minimal batch PBM sketch follows this entry). This integrated approach enabled circuit simulations over a period of 2 hours, representing the operational variables as seen in historical data. Model validation showed a correlation of 0.74 for the product size distribution when comparing the simulated outcomes with the original population. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
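A minimal batch population-balance step in the spirit of the Austin et al. formulation referenced above is sketched below, with the feed composition drawn by Monte Carlo to mimic a stochastic input variable. The selection and breakage values, the Euler integration, and the feed draw are illustrative assumptions, not the calibrated industrial circuit model.

import numpy as np

def batch_pbm(m0, S, B, t_end, dt=0.01):
    """Batch grinding PBM, dm_i/dt = -S_i*m_i + sum_{j<i} b_ij*S_j*m_j, Euler-integrated."""
    m = np.asarray(m0, dtype=float).copy()
    for _ in range(int(t_end / dt)):
        m = m + dt * (B @ (S * m) - S * m)   # mass broken into each class minus mass broken out
    return m

n = 5
S = np.array([0.8, 0.5, 0.3, 0.15, 0.0])     # selection function, 1/min (assumed values)
B = np.zeros((n, n))
for j in range(n - 1):
    B[j + 1:, j] = 1.0 / (n - 1 - j)         # uniform breakage distribution (assumed)

rng = np.random.default_rng(0)
feed = rng.dirichlet(np.ones(n)) * 100.0     # Monte Carlo draw of feed composition (%)
print(batch_pbm(feed, S, B, t_end=2.0))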
48. Effective vaccination strategies for human papillomavirus (HPV) infection and cervical cancer based on the mathematical model with a stochastic process.
- Author
-
Kim, Minsoo and Kim, Eunjung
- Subjects
HUMAN papillomavirus vaccines ,CERVICAL cancer ,HUMAN papillomavirus ,STOCHASTIC processes ,STOCHASTIC models - Abstract
Human papillomavirus (HPV) infection poses a significant risk to women's health by causing cervical cancer. In addition to HPV, cervical cancer incidence rates can be influenced by various factors, including human immunodeficiency virus and herpes, as well as screening policy. In this study, a mathematical model with stochastic processes was developed to analyze HPV transmission between genders and its subsequent impact on cervical cancer incidence. The model simulations suggest that both-gender vaccination is far more effective than female-only vaccination in preventing an increase in cervical cancer incidence. With increasing stochasticity, the difference between the number of patients in the vaccinated group and the number in the nonvaccinated group diminishes. To distinguish the patient population distribution of the vaccinated group from that of the nonvaccinated group, we calculated the effect size (Cohen's distance) in addition to Student's t-test (a toy effect-size computation follows this entry). The model analysis suggests a threshold vaccination rate for both genders for a clear reduction of cancer incidence when significant stochastic factors are present. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
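The effect-size comparison mentioned above can be illustrated with a pooled-standard-deviation Cohen's d alongside Student's t-test. The two samples below are synthetic placeholders, not output of the authors' stochastic HPV transmission model.

import numpy as np
from scipy import stats

def cohens_d(a, b):
    """Cohen's d with pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1))
                        / (na + nb - 2))
    return (np.mean(a) - np.mean(b)) / pooled_sd

rng = np.random.default_rng(7)
vaccinated    = rng.normal(40, 8, 200)   # synthetic cumulative case counts per run
nonvaccinated = rng.normal(55, 8, 200)
print(cohens_d(nonvaccinated, vaccinated), stats.ttest_ind(nonvaccinated, vaccinated))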
49. Nitrogen deposition mediates more stochastic processes in structuring plant community than soil microbial community in the Eurasian steppe.
- Author
-
Yang, Wei, Zhang, Shuhan, Li, Ang, Yang, Junjie, Pang, Shuang, Hu, Zonghao, Wang, Zhiping, Han, Xingguo, and Zhang, Ximei
- Abstract
Anthropogenic environmental changes may affect community assembly by mediating both deterministic processes (e.g., competitive exclusion and environmental filtering) and stochastic processes (e.g., birth/death and dispersal/colonization). It is traditionally thought that environmental changes have a larger mediating effect on stochastic processes in structuring the soil microbial community than the aboveground plant community; however, this hypothesis remains largely untested. Here we report an unexpected pattern: nitrogen (N) deposition has a larger mediating effect on stochastic processes in structuring the plant community than the soil microbial community (organisms <2 mm in diameter, including archaea, bacteria, fungi, and protists) in the Eurasian steppe. We performed a ten-year nitrogen deposition experiment in a semiarid grassland ecosystem in Inner Mongolia, manipulating nine rates (0–50 g N m⁻² per year) at two frequencies (nitrogen added twice or 12 times per year) under two grassland management strategies (fencing or mowing). We separated the compositional variation of the plant and soil microbial communities caused by each treatment into deterministic and stochastic components with a recently developed method (a generic variance-partition sketch follows this entry). As the nitrogen addition rate increased, the relative importance of the stochastic component of the plant community first increased and then decreased, while that of the soil microbial community first decreased and then increased. On the whole, the relative importance of the stochastic component was significantly larger in the plant community (0.552±0.035; mean±standard error) than in the microbial community (0.427±0.035). Consistently, the proportion of compositional variation explained by the deterministic soil and community indices was smaller for the plant community (0.172–0.186) than for the microbial community (0.240–0.767). Meanwhile, as the nitrogen addition rate increased, the linkage between plant and microbial community composition first became weaker and then stronger. The larger stochasticity in plant community assembly relative to microbial community assembly suggests that more stochastic strategies (e.g., seed addition) should be adopted to maintain above- rather than below-ground biodiversity under the pressure of nitrogen deposition. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
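The abstract above partitions compositional variation into deterministic and stochastic components with a recently developed method that is not reproduced here. As a stand-in illustration only, the sketch below computes a PERMANOVA-style partition on Bray–Curtis dissimilarities, reading the fraction explained by treatment groups as "deterministic" and the residual as "stochastic"; all names and modelling choices are assumptions.

import numpy as np
from scipy.spatial.distance import pdist, squareform

def stochastic_fraction(abundance, groups):
    """Residual (unexplained) fraction of compositional variation, in [0, 1]."""
    d = squareform(pdist(abundance, metric="braycurtis"))
    groups = np.asarray(groups)
    n = len(groups)
    ss_total = np.sum(d[np.triu_indices(n, 1)] ** 2) / n
    ss_within = 0.0
    for g in np.unique(groups):
        idx = np.where(groups == g)[0]
        sub = d[np.ix_(idx, idx)]
        ss_within += np.sum(sub[np.triu_indices(len(idx), 1)] ** 2) / len(idx)
    deterministic = 1.0 - ss_within / ss_total   # PERMANOVA-style R^2 for the grouping
    return 1.0 - deterministic

# Toy usage: 12 samples, 6 taxa, three treatment groups (illustrative only).
rng = np.random.default_rng(0)
print(stochastic_fraction(rng.random((12, 6)), np.repeat([0, 1, 2], 4)))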
50. Discrete stopping times in the lattice of continuous functions.
- Author
-
Polavarapu, Achintya Raya
- Abstract
A functional calculus for an order complete vector lattice E was developed by Grobler (Indag Math (NS) 25(2):275–295, 2014) using the Daniell integral. We show that if one represents the universal completion of E as C∞(K), where K is an extremally disconnected compact Hausdorff topological space, then the Daniell functional calculus for continuous functions is exactly the pointwise composition of functions in C∞(K) (the defining identity is sketched after this entry). This representation allows an easy deduction of the various properties of the functional calculus. Afterwards, we study discrete stopping times and stopped processes in C∞(K), obtaining a representation analogous to what is expected in probability theory. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
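The pointwise-composition statement in the abstract above can be written out as a hedged sketch; the notation is illustrative and suppresses the domain details handled in the paper. For x_1, …, x_n in C∞(K) and a continuous φ on R^n, the Daniell functional calculus acts by

\[
  \varphi(x_1,\dots,x_n)(k) \;=\; \varphi\bigl(x_1(k),\dots,x_n(k)\bigr)
\]

for every k in the dense open subset of K on which all the x_i are finite.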