3,108 results for "stochastic modeling"
Search Results
2. Stochastic modeling and availability optimization of reverse osmosis water purification system using metaheuristic algorithms
- Author
- Saini, Monika, Kumar, Naveen, Sinwar, Deepak, and Kumar, Ashish
- Published
- 2024
- Full Text
- View/download PDF
3. Deep neural networks for probability of default modelling.
- Author
- Georgiou, Kyriakos and Yannacopoulos, Athanasios N.
- Abstract
In this paper we develop Deep Neural Networks for the approximation of the solution to Partial Integro-Differential Equations (PIDE) that arise in the calculation of Probability of Default functions. We consider a modelling framework in compliance with the spirit and regulations of the International Financial Reporting Standard 9 and use the resulting Deep Learning models to estimate default probabilities that can be used to solve credit risk problems. Detailed comparisons with standard numerical analysis schemes for the solutions to these PIDEs are also reported, enhancing the understanding and adding to the discussion regarding the applicability of the related Machine Learning methodologies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. Stochastic Multiscale Modeling of Electrical Conductivity of Carbon Nanotube Polymer Nanocomposites: An Interpretable Machine Learning Approach.
- Author
- Elaskalany, Mostafa and Behdinan, Kamran
- Abstract
This study introduces an interpretable machine learning (ML) framework for efficiently predicting the electrical conductivity of carbon nanotube (CNT)/polymer nanocomposites. A stochastic multiscale numerical model based on representative volume element (RVE) is employed to generate a representative dataset. This dataset is used to train three ML models: random forest, XGBoost, and artificial neural networks (ANN). The dataset includes six input features: CNT length, aspect ratio, intrinsic CNT conductivity, number of CNT conduction channels, energy barrier height, and volume fraction, with the electrical conductivity of the nanocomposites as the output feature. The findings highlight the exceptional accuracy of the ANN model in predicting electrical conductivity at significantly lower computational costs. Furthermore, the use of Shapley additive explanations (SHAP) enhances the interpretability of these ML models, identifying the volume fraction, energy barrier height, and intrinsic CNT conductivity as the most influential factors affecting conductivity. This approach sets the stage for rapid and efficient modeling of CNT/polymer nanocomposites, facilitating the design of materials with tailored electrical properties for diverse applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Mathematical Optimization of Wind Turbine Maintenance Using Repair Rate Thresholds.
- Author
- Kontrec, Nataša, Panić, Stefan, Vujaković, Jelena, Stošović, Dejan, and Khotnenok, Sergei
- Subjects
- WIND turbine maintenance & repair, WIND turbine efficiency, PROBABILITY density function, WIND turbines, ENGINEERING reliability theory, CUMULATIVE distribution function
- Abstract
As reliance on wind energy intensifies globally, optimizing the efficiency and reliability of wind turbines is becoming vital. This paper explores sophisticated maintenance strategies, crucial for enhancing the operational sustainability of wind turbines. It introduces an innovative approach to maintenance scheduling that utilizes a mathematical model incorporating an alternating renewal process for accurately determining repair rate thresholds. These thresholds are important for identifying optimal maintenance timings, thereby averting failures and minimizing downtime. Central to this study are the obtained generalized analytical expressions that can be used to predict the total repair time for an observed entity. Four key lemmas are developed to establish formal proofs for the probability density function (PDF) and cumulative distribution function (CDF) of repair rates, both above and below critical repair rate thresholds. The core innovation of this study lies in the methodological application of PDFs and CDFs to set repair time thresholds that refine maintenance schedules. The model's effectiveness is illustrated using simulated data based on typical wind turbine components such as gearboxes, generators, and converters, validating its potential for improving system availability and operational readiness. By establishing measurable repair rate thresholds, the model effectively prioritizes maintenance tasks, extending the life of crucial turbine components and ensuring consistent energy output. Beyond enhancing theoretical understanding, this research provides practical insights that could inform broader maintenance strategies across various renewable energy systems, marking a significant advancement in the field of maintenance engineering. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
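The threshold construction described in this abstract can be sketched numerically. Assuming exponentially distributed repair times (an illustrative assumption; the paper derives general PDF/CDF expressions via four lemmas), a repair-time threshold is simply a quantile of the repair distribution:

```python
import math

def repair_time_threshold(mu, q):
    """Threshold t* with CDF(t*) = q for exponentially distributed
    repair times with rate mu (mean repair time 1/mu)."""
    return -math.log(1.0 - q) / mu

def repair_cdf(t, mu):
    """CDF of the exponential repair-time distribution."""
    return 1.0 - math.exp(-mu * t)

# Example: a component whose repairs average 50 h; flag any repair projected
# to exceed the 90th-percentile repair time for escalated maintenance priority.
mu = 1.0 / 50.0
t_star = repair_time_threshold(mu, 0.90)
print(round(t_star, 1))                  # -> 115.1 (hours)
print(round(repair_cdf(t_star, mu), 3))  # -> 0.9 by construction
```

Repairs exceeding `t_star` fall in the upper tail of the repair-time distribution and would be prioritized under a threshold-based schedule.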
6. An Online Learning Approach to Dynamic Pricing and Capacity Sizing in Service Systems.
- Author
- Chen, Xinyun, Liu, Yunan, and Hong, Guiyu
- Abstract
Online Learning in Queueing Systems: Most queueing models have no analytic solutions, so previous research often resorts to heavy-traffic analysis for performance analysis and optimization, which requires the system scale (e.g., arrival and service rate) to grow to infinity. In "An Online Learning Approach to Dynamic Pricing and Capacity Sizing in Service Systems," X. Chen, Y. Liu, and G. Hong develop a new "scale-free" online learning framework designed for optimizing a queueing system, called gradient-based online learning in queue (GOLiQ). GOLiQ prescribes an efficient procedure to obtain improved decisions in successive cycles using newly collected queueing data (e.g., arrival counts, waiting times, and busy times). Besides its robustness to the system scale, GOLiQ is advantageous when focusing on performance optimization in the long run because its data-driven nature enables it to constantly produce improved solutions that will eventually reach optimality. The effectiveness of GOLiQ is substantiated by theoretical regret analysis (with a logarithmic regret bound) and simulation experiments. We study a dynamic pricing and capacity sizing problem in a GI/GI/1 queue, in which the service provider's objective is to obtain the optimal service fee p and service capacity μ so as to maximize the cumulative expected profit (the service revenue minus the staffing cost and delay penalty). Because of the complex nature of the queueing dynamics, such a problem has no analytic solution, so previous research often resorts to heavy-traffic analysis in which both the arrival and service rates are sent to infinity. In this work, we propose an online learning framework designed for solving this problem that does not require the system's scale to increase. Our framework is dubbed gradient-based online learning in queue (GOLiQ).
GOLiQ organizes the time horizon into successive operational cycles and prescribes an efficient procedure to obtain improved pricing and staffing policies in each cycle using data collected in previous cycles. Data here include the number of customer arrivals, waiting times, and the server's busy times. The ingenuity of this approach lies in its online nature, which allows the service provider to do better by interacting with the environment. Effectiveness of GOLiQ is substantiated by (i) theoretical results, including the algorithm convergence and regret analysis (with a logarithmic regret bound), and (ii) engineering confirmation via simulation experiments of a variety of representative GI/GI/1 queues. Funding: X. Chen acknowledges support [Grants NSFC72171205, NSFC11901493, and RCYX20210609103124047]. Supplemental Material: The e-companion is available at https://doi.org/10.1287/opre.2020.0612. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
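The cycle-by-cycle gradient idea behind GOLiQ can be illustrated on a toy M/M/1 version of the pricing/staffing problem. This is a hedged sketch, not the authors' algorithm: the linear demand curve `lam = a - b*p`, the cost constants, and the finite-difference gradient estimator are all illustrative assumptions standing in for the paper's queueing-data-driven gradient estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_profit(p, mu, a=10.0, b=1.0, c=1.0, h=2.0):
    """Noisy per-cycle profit estimate for an M/M/1 queue with
    price-dependent arrival rate lam = a - b*p: service revenue minus
    staffing cost minus a delay penalty h * E[number in system]."""
    lam = max(a - b * p, 1e-6)
    if mu <= lam:                           # unstable regime: large penalty
        return -1e3
    profit = p * lam - c * mu - h * lam / (mu - lam)
    return profit + rng.normal(0.0, 0.05)   # noise mimics finite-cycle data

p, mu, delta = 2.0, 12.0, 0.1               # initial price, capacity, FD step
for k in range(1, 2001):                    # successive operational cycles
    eta = 0.5 / k                           # decaying step size
    gp = (noisy_profit(p + delta, mu) - noisy_profit(p - delta, mu)) / (2 * delta)
    gm = (noisy_profit(p, mu + delta) - noisy_profit(p, mu - delta)) / (2 * delta)
    p += eta * np.clip(gp, -50, 50)         # clip guards against penalty spikes
    mu += eta * np.clip(gm, -50, 50)

lam = 10.0 - p
print(round(p * lam - mu - 2 * lam / (mu - lam), 1))  # learned profit (starts near 0)
```

Starting from a deterministic profit of roughly zero, the iterates climb toward the stationary optimum while keeping the queue stable (mu above lam).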
7. A Stochastic-Geometrical Framework for Object Pose Estimation Based on Mixture Models Avoiding the Correspondence Problem.
- Author
- Hoegele, Wolfgang
- Abstract
Pose estimation of rigid objects is a practical challenge in optical metrology and computer vision. This paper presents a novel stochastic-geometrical modeling framework for object pose estimation based on observing multiple feature points. This framework utilizes mixture models for feature point densities in object space and for interpreting real measurements. Its advantages are that it avoids resolving individual feature correspondences and that it incorporates correct stochastic dependencies in multi-view applications. First, the general modeling framework is presented; second, a general algorithm for pose estimation is derived; and third, two example models (camera and lateration setup) are presented. Numerical experiments show the effectiveness of this modeling and the general algorithm by presenting four simulation scenarios for three observation systems, including the dependence on measurement resolution, object deformations, and measurement noise. Probabilistic modeling utilizing mixture models shows the potential for accurate and robust pose estimation while avoiding the correspondence problem. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
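A minimal illustration of correspondence-free estimation with mixture models, in the spirit of this abstract: a one-dimensional "pose" (a pure translation) is recovered by maximizing a Gaussian-mixture likelihood over shuffled feature measurements, with no assignment step. All values below are illustrative assumptions, not the paper's camera or lateration models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Object model: feature-point locations in object space (1D for brevity).
model_pts = np.array([0.0, 1.0, 2.5, 4.0])
true_shift = 1.3
sigma = 0.05

# Measurements: shifted model points plus noise, in *unknown order*
# (so no feature correspondences are available).
meas = rng.permutation(model_pts + true_shift + rng.normal(0, sigma, model_pts.size))

def mixture_loglik(shift):
    """Log-likelihood of the measurements under an equal-weight Gaussian
    mixture centred at the shifted model points; every measurement is
    explained by the whole mixture, so no assignment is needed."""
    d = meas[:, None] - (model_pts + shift)[None, :]   # all pairwise residuals
    comp = np.exp(-0.5 * (d / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return np.sum(np.log(comp.mean(axis=1) + 1e-300))  # floor avoids log(0)

shifts = np.linspace(0.0, 3.0, 3001)
best = shifts[np.argmax([mixture_loglik(s) for s in shifts])]
print(round(best, 2))   # close to the true shift of 1.3
```

The grid search stands in for the paper's general estimation algorithm; in higher dimensions one would optimize over rotation and translation jointly.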
8. Species interactions drive continuous assembly of freshwater communities in stochastic environments
- Author
- Andrea Tabi, Tadeu Siqueira, and Jonathan D. Tonkin
- Subjects
- Biodiversity maintenance, Causal inference, Body size scaling, Stochastic modeling, Community assembly, Medicine, Science
- Abstract
Understanding the factors driving the maintenance of long-term biodiversity in changing environments is essential for improving restoration and sustainability strategies in the face of global environmental change. Biodiversity is shaped by both niche and stochastic processes; however, the strength of deterministic processes in unpredictable environmental regimes is highly debated. Since communities continuously change over time and space—species persist, disappear or (re)appear—understanding the drivers of species gains and losses from communities should inform us about whether niche or stochastic processes dominate community dynamics. Applying a nonparametric causal discovery approach to a 30-year time series containing annual abundances of benthic invertebrates across 66 locations in New Zealand rivers, we found a strong negative causal relationship between species gains and losses directly driven by predation, indicating that niche processes dominate community dynamics. Despite the unpredictable nature of these systems, environmental noise was only indirectly related to species gains and losses through altering life history trait distribution. Using a stochastic birth-death framework, we demonstrate that the negative relationship between species gains and losses cannot emerge without strong niche processes. Our results showed that even in systems that are dominated by unpredictable environmental variability, species interactions drive continuous community assembly.
- Published
- 2024
- Full Text
- View/download PDF
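The stochastic birth-death framework the authors use as a baseline can be sketched as a simple immigration-death (gains/losses) process. The rates below are illustrative assumptions, not values fitted to the New Zealand data; the point is that a purely stochastic null model produces richness fluctuating around `gain_rate / loss_rate` with no gain-loss coupling beyond that.

```python
import numpy as np

rng = np.random.default_rng(2)

gain_rate = 4.0   # species gains (immigration) per unit time
loss_rate = 0.5   # per-species loss rate

# Gillespie-style simulation of species richness S: gains arrive at a
# constant rate, losses at a rate proportional to current richness.
S, t, t_end, area = 0, 0.0, 2000.0, 0.0
while t < t_end:
    total = gain_rate + loss_rate * S
    dt = rng.exponential(1.0 / total)   # time to next event
    area += S * dt                      # accumulate time-weighted richness
    t += dt
    if rng.random() < gain_rate / total:
        S += 1                          # species gain
    else:
        S -= 1                          # species loss

mean_S = area / t
print(round(mean_S, 1))  # fluctuates near gain_rate / loss_rate = 8
```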
10. A Combined OCBA–AIC Method for Stochastic Variable Selection in Data Envelopment Analysis.
- Author
- Deng, Qiang
- Subjects
- DATA envelopment analysis, MONTE Carlo method, RANDOM variables, AKAIKE information criterion, NUMERICAL analysis
- Abstract
This study introduces a novel approach to enhance variable selection in Data Envelopment Analysis (DEA), especially in stochastic environments where efficiency estimation is inherently complex. To address these challenges, we propose a game cross-DEA model to refine efficiency estimation. Additionally, we integrate the Akaike Information Criterion (AIC) with the Optimal Computing Budget Allocation (OCBA) technique, creating a hybrid method named OCBA–AIC. This innovative method efficiently allocates computational resources for stochastic variable selection. Our numerical analysis indicates that OCBA–AIC surpasses existing methods, achieving a lower AIC value. We also present two real-world case studies that demonstrate the effectiveness of our approach in ranking suppliers and tourism companies under uncertainty by selecting the most suitable partners. This research enriches the understanding of efficiency measurement in DEA and makes a substantial contribution to the field of performance management and decision-making in stochastic contexts. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
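The AIC half of the OCBA–AIC method can be illustrated in isolation (the OCBA budget-allocation step is omitted here): score candidate variable subsets by AIC = 2k − 2 ln L and prefer lower values. The synthetic regression below is an illustrative assumption, not a DEA model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data: y depends on x1 and x2 only; x3 is irrelevant.
n = 200
X = rng.normal(size=(n, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0, 0.5, n)

def aic(X_sub, y):
    """AIC = 2k - 2 ln L for OLS with Gaussian errors
    (k counts the regression coefficients plus the error variance)."""
    m = len(y)
    beta, *_ = np.linalg.lstsq(X_sub, y, rcond=None)
    resid = y - X_sub @ beta
    sigma2 = resid @ resid / m                       # ML variance estimate
    loglik = -0.5 * m * (np.log(2 * np.pi * sigma2) + 1.0)
    k = X_sub.shape[1] + 1
    return 2 * k - 2 * loglik

candidates = {"x1": [0], "x1+x2": [0, 1], "x1+x2+x3": [0, 1, 2]}
scores = {name: aic(X[:, cols], y) for name, cols in candidates.items()}
best = min(scores, key=scores.get)
print(best, {k: round(v, 1) for k, v in scores.items()})
```

In the paper, each candidate's AIC is itself a noisy simulation output, which is where the OCBA allocation of simulation replications comes in.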
11. Modeling kinematic variability reveals displacement and velocity based dual control of saccadic eye movements.
- Author
- Vasudevan, Varsha, Murthy, Aditya, and Padhi, Radhakant
- Subjects
- SACCADIC eye movements, EYE movements, HUMAN mechanics, STOCHASTIC models, NOISE control
- Abstract
Noise is a ubiquitous component of motor systems that leads to behavioral variability of all types of movements. Nonetheless, systems-based models investigating human movements are generally deterministic and explain only the central tendencies like mean trajectories. In this paper, a novel approach to modeling kinematic variability of movements is presented and tested on the oculomotor system. This approach reconciles the two prominent philosophies of saccade control: displacement-based control versus velocity-based control. This was achieved by quantifying the variability in saccadic eye movements and developing a stochastic model of its control. The proposed stochastic dual model generated significantly better fits of inter-trial variances of the saccade trajectories compared to existing models. These results suggest that the saccadic system can flexibly use the information of both desired displacement and velocity for its control. This study presents a potential framework for investigating computational principles of motor control in the presence of noise utilizing stochastic modeling of kinematic variability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. The Daniel H. Wagner Prize for Excellence in the Practice of Advanced Analytics and Operations Research: Introduction to the 2023 Competition and Some History on the Prize's 25th Anniversary.
- Author
- Cochran, James J., Browning, William J., Discenza, Joseph H., and Greenland, Arnie
- Subjects
- MONTE Carlo method, OPERATIONS research, DATA analytics, LINEAR programming, GENETIC algorithms
- Abstract
The judges for the 2023 competition selected the three finalist papers featured in this special issue of the INFORMS Journal on Applied Analytics (IJAA). The prestigious Wagner Prize—awarded for achievement in implemented operations research, management science, and advanced analytics—emphasizes the quality and originality of mathematical models along with clarity of written and oral exposition. This year's winning submission describes the design and deployment of an approach to accommodate the assembly of semiconductor products using multidie packages, a powerful and innovative approach to manufacturing a wide range of strategically designed semiconductors. The authors refer to the problem as the sort-assemble-blend routing problem (SABR-P), and their approach to solving the SABR-P is a combination of genetic algorithms, Monte Carlo simulations, linear programming, and machine learning. The remaining papers describe an integrated approach to system-wide deployment of nurses that facilitates the movement of nurses between hospitals, and a multiple-stage process that consolidates parcels ordered by the same consumer from one or more merchants during the fulfillment process. In addition, we provide some historical perspective on the Daniel H. Wagner Prize for Excellence in the Practice of Advanced Analytics and Operations Research. We show the titles, abstracts, and names and affiliations for its recipients for each year in which the prize has been awarded. Finally, we provide links to the winning papers, which have been published by INFORMS Journal on Applied Analytics (formerly Interfaces). Supplemental Material: Full presentation videos with slides are available in the INFORMS Video Library at https://www.informs.org/Resource-Center/Video-Library and as electronic companions to the INFORMS Journal on Applied Analytics articles. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. Generating Chaos in Dynamical Systems: Applications, Symmetry Results, and Stimulating Examples.
- Author
- Kyurkchiev, Nikolay, Zaevski, Tsvetelin, Iliev, Anton, Kyurkchiev, Vesselin, and Rahnev, Asen
- Subjects
- DYNAMICAL systems, STOCHASTIC models, MATHEMATICAL models, BEHAVIORAL research, MODEL theory
- Abstract
In this paper, we present a new class of extended oscillators in light of chaos theory. It is based on dynamical complex systems built on the concept of a self-describing process with a stopping criterion. We offer an effective studying approach with a specific focus on learning, provoking students' thinking through the triad of enigmatics–creativity–acmeology. Dynamic processes are the basis of mathematical modeling; thus, we can reach the goal of the above-mentioned triad through the proposed differential systems. The results we derive strongly confirm the presence of symmetry in the outcomes of the proposed models. We suggest a stochastic approach to structuring the proposed dynamical systems by modeling the coefficients that drive them with a discrete probability distribution that exhibits symmetry or asymmetry. We propose specific tools for researching the behavior of these systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
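A minimal sketch of "driving a dynamical system's coefficients by a discrete probability distribution", as the abstract proposes, using a logistic map instead of the paper's extended oscillators (an illustrative substitution): the control coefficient is redrawn each step from a discrete support in the chaotic regime.

```python
import numpy as np

rng = np.random.default_rng(4)

# Logistic map x_{n+1} = r_n * x_n * (1 - x_n), with the coefficient r_n
# drawn each step from a discrete distribution on {3.6, 3.8, 4.0}
# (all in the chaotic regime of the deterministic map).
r_support = np.array([3.6, 3.8, 4.0])
x = 0.3
traj = []
for _ in range(5000):
    r = rng.choice(r_support)       # stochastic coefficient
    x = r * x * (1.0 - x)
    traj.append(x)

traj = np.array(traj)
# Since r <= 4, the unit interval remains invariant under the random map.
print(bool(traj.min() >= 0.0 and traj.max() <= 1.0))
```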
14. Estimating Post‐Fire Flood Infrastructure Clogging and Overtopping Hazards.
- Author
- Jong‐Levinger, Ariane, Houston, Douglas, and Sanders, Brett F.
- Subjects
- BODIES of water, FLOOD control, FLOOD risk, RAINFALL periodicity, DESIGN protection, FLOOD warning systems, WATERSHEDS
- Abstract
Cycles of wildfire and rainfall produce sediment‐laden floods that pose a hazard to development and may clog or overtop protective infrastructure, including debris basins and flood channels. The compound, post‐fire flood hazards associated with infrastructure overtopping and clogging are challenging to estimate due to the need to account for interactions between sequences of wildfire and storm events and their impact on flood control infrastructure over time. Here we present data sources and calibration methods to estimate infrastructure clogging and channel overtopping hazards on a catchment‐by‐catchment basis using the Post‐Fire Flood Hazard Model (PF2HazMo), a stochastic modeling approach that utilizes continuous simulation to resolve the effects of antecedent conditions and system memory. Publicly available data sources provide parameter ranges needed for stochastic modeling, and several performance measures are considered for model calibration. With application to three catchments in southern California, we show that PF2HazMo predicts the median of the simulated distribution of peak bulked flows within the 95% confidence interval of observed flows, with an order of magnitude range in bulked flow estimates depending on the performance measure used for calibration. Using infrastructure overtopping data from a post‐fire wet season, we show that PF2HazMo accurately predicts the number of flood channel exceedances. Model applications to individual watersheds reveal where infrastructure is undersized to contain present‐day and future overtopping hazards based on current design standards. Model limitations and sources of uncertainty are also discussed. Plain Language Summary: Communities at the foot of the mountains face an especially dangerous type of flooding called "sediment‐laden floods." Many such communities in the southwestern U.S. 
are protected from water floods by flood infrastructure designed to trap sediment at the mouth of mountain canyons and convey only water flows safely past developed areas to a downstream water body. Sediment‐laden floods, which are more forceful and typically larger than water floods, are more likely to happen during storms over burned mountain canyons soon after a wildfire occurs. However, estimating the likelihood that sediment‐laden floods fill and overtop flood infrastructure is challenging since existing sediment‐laden flood models do not explicitly consider the role of flood infrastructure. Here we present the Post‐Fire Flood Hazard Model (PF2HazMo), a model that can estimate the likelihood of post‐fire floods on a canyon‐by‐canyon basis accounting for flood infrastructure. Environmental data collected following a major wildfire are used to apply PF2HazMo to three mountain canyons in southern California, and we find that it predicts the number of floods accurately relative to observed post‐fire flood channel overtopping events. Further, the model is used to predict the frequency of floods due to infrastructure overtopping under both present‐day and future wildfire scenarios. Key Points:
- Flood risks are heightened by clogging of infrastructure with sediment, which can occur from sequences of storms, especially after wildfires
- A framework for calibration and validation of a post‐fire infrastructure clogging and flood hazard model is presented
- Model applications reveal whether infrastructure is adequately sized to meet design levels of protection [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. Microstructurally-informed stochastic inhomogeneity of material properties and material symmetries in 3D-printed 316 L stainless steel.
- Author
- Chu, Shanshan, Iliopoulos, Athanasios, Michopoulos, John, Birnbaum, Andrew, Steuben, John, Stewart, Colin, Callahan, Patrick, Rowenhorst, David, and Guilleminot, Johann
- Subjects
- STAINLESS steel, MONTE Carlo method, MAXIMUM likelihood statistics, RANDOM fields, ELASTICITY (Economics)
- Abstract
Stochastic mesoscale inhomogeneity of material properties and material symmetries are investigated in a 3D-printed material. The analysis involves a spatially-dependent characterization of the microstructure in 316 L stainless steel, obtained through electron backscatter diffraction imaging. These data are subsequently fed into a Voigt–Reuss–Hill homogenization approximation to produce maps of elasticity tensor coefficients along the path of experimental probing. Information-theoretic stochastic models corresponding to this stiffness random field are then introduced. The case of orthotropic fields is first defined as a high-fidelity model, the realizations of which are consistent with the elasticity maps. To investigate the role of material symmetries, an isotropic approximation is next introduced through ad-hoc projections (using various metrics). Both stochastic representations are identified using the dataset. In particular, the correlation length along the characterization path is identified using a maximum likelihood estimator. Uncertainty propagation is finally performed on a complex geometry, using a Monte Carlo analysis. It is shown that mechanical predictions in the linear elastic regime are mostly sensitive to material symmetry but weakly depend on the spatial correlation length in the considered propagation scenario. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
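The Monte Carlo uncertainty-propagation step can be sketched on a much simpler mechanical model than the paper's: propagate a random (lognormal) elastic modulus, standing in for stochastic material-property inhomogeneity, through a cantilever tip-deflection formula. All numbers are illustrative assumptions, not the 316 L stainless steel data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Cantilever tip deflection d = F L^3 / (3 E I); E is random (lognormal),
# reflecting stochastic inhomogeneity of the elastic modulus.
F, L, I = 100.0, 0.5, 1e-8          # N, m, m^4 (illustrative values)
E_mean, E_cov = 190e9, 0.05         # Pa; 5% coefficient of variation

# Lognormal parameters of the underlying normal, matched to (mean, cov).
mu = np.log(E_mean / np.sqrt(1 + E_cov**2))
s = np.sqrt(np.log(1 + E_cov**2))
E = rng.lognormal(mu, s, size=100_000)

d = F * L**3 / (3 * E * I)          # Monte Carlo sample of tip deflections
print(d.mean(), d.std())            # mean deflection (m) and its spread
```

The same pattern (sample the random property, push each sample through the mechanical model, summarize the output distribution) is what the paper performs on a complex geometry with spatially correlated random fields.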
16. On the application of genetic algorithm for predicting the strength of CNT/ABS filaments using multi-scale modeling.
- Author
- Rafiee, Roham and Amohaji, Hirad
- Subjects
- MULTISCALE modeling, GENETIC algorithms, FIBERS, FUSED deposition modeling, ACRYLONITRILE butadiene styrene resins, CARBON nanotubes, STOCHASTIC models, ACRYLONITRILE
- Abstract
This article investigates the strength of nanocomposite filaments produced through extrusion. It involves incorporating carbon nanotubes (CNTs) into Acrylonitrile Butadiene Styrene (ABS) to produce printable nanocomposite filaments for the fused filament fabrication technique and measuring their strengths. A multi-scale modeling procedure is developed to predict the strength of nanocomposite filaments through computational modeling. This model analyzes three sequential scales: micro, meso, and macro. After determining the effective parameter(s) of each scale, a proper representative volume element (RVE) is defined for each scale separately. CNT-polymer interaction, CNT length, and CNT orientation are taken into account at the micro scale, while CNT agglomeration is captured at the meso scale. The strength of the nanocomposite filament is finally estimated at the uppermost macro scale. CNT length, orientation, and agglomeration are all treated as random parameters, so stochastic modeling is conducted in connection with a genetic algorithm (GA) to reduce the required runtime of the analysis. The outputs of this modeling procedure align closely with experimental observations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. ARIMAX Modelling: Response of Hass Avocado Respiration Rate to Environmental Factors.
- Author
- Morales-Solis, Anabel, Pérez-López, Artemio, Ramírez-Guzmán, Martha Elva, Espinosa-Solares, Teodoro, and Alia-Tejacal, Irán
- Subjects
- BOX-Jenkins forecasting, DATA acquisition systems, DYNAMIC loads, HUMIDITY, TIME series analysis, AVOCADO
- Abstract
This research explores how random events influence the respiration rate in Hass avocado beyond deterministic models in order to develop better strategies for extending its shelf life. Understanding these factors can enhance the accuracy of postharvest management strategies. The Autoregressive Integrated Moving Average (ARIMA) model with exogenous variables (ARIMAX) is an alternative stochastic probability model capable of modeling complex, externally influenced phenomena such as respiration. This study aimed to elucidate the effect of three exogenous variables, namely temperature, relative humidity, and ambient illumination, on the respiration rate of Hass avocado fruits. Data on the respiration rate and exogenous variables were obtained using sensors coupled to a data acquisition system in a prototype of continuous airflow. The Box–Jenkins methodology was employed to construct the ARIMA models. The temperature, relative humidity, ambient illumination, and respiration rate variables were fitted to ARIMA(3,1,2), ARIMA(1,1,2), ARIMA(1,1,2), and ARIMA(1,1,3) models, respectively. The ARIMAX(1,1,3) models were obtained from the pre-whitened respiration rate series. The impact detected in the transfer functions indicates increases in the respiration rate of 0.34%, 1.52%, and 0.99% for each unit increase in the temperature, relative humidity, and ambient illumination variables, respectively. In this regard, ARIMAX modeling is reliable for explaining the physiological response of Hass avocado fruits to external factors. In future research, it is intended to extrapolate this stochastic modeling procedure to measure the effect of dynamic loads on the respiratory metabolism of fruits during transportation, where there is considerable loss in the quality of fresh products. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
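A pared-down stand-in for ARIMAX fitting: an ARX(1) model (one autoregressive lag plus one exogenous input, with no differencing or moving-average terms) estimated by least squares. This is an illustrative simplification, not the paper's Box–Jenkins pipeline, but it shows the core idea of a time series driven jointly by its own past and an external variable.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulate an ARX(1) series: y_t = phi * y_{t-1} + beta * x_t + eps_t,
# where x could stand for an exogenous driver such as temperature.
n, phi, beta = 500, 0.6, 0.8
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + beta * x[t] + rng.normal(0, 0.2)

# Fit by least squares on the lagged design matrix [y_{t-1}, x_t].
A = np.column_stack([y[:-1], x[1:]])
coef, *_ = np.linalg.lstsq(A, y[1:], rcond=None)
print(coef.round(2))   # close to the true values [0.6, 0.8]
```

In a full ARIMAX treatment, one would difference the series and add MA terms; here the exogenous coefficient `beta` plays the role of the transfer-function impacts reported in the abstract.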
18. Cost-Effective Planning of Hybrid Energy Systems Using Improved Horse Herd Optimizer and Cloud Theory under Uncertainty.
- Author
- Alghamdi, Ali S.
- Subjects
- ANIMAL herds, METAHEURISTIC algorithms, HYBRID systems, HYBRID power systems, HEAT storage, ENERGY consumption
- Abstract
In this paper, an intelligent stochastic model is recommended for the optimization of a hybrid system that encompasses wind energy sources, battery storage, combined heat and power generation, and thermal energy storage (Wind/Battery/CHP/TES), with the inclusion of electric and thermal storages through the cloud theory model. The framework aims to minimize the costs of planning, such as construction, maintenance, operation, and environmental pollution costs, to determine the best configuration of the resources and storage units to ensure efficient electricity and heat supply simultaneously. A novel meta-heuristic optimization algorithm named improved horse herd optimizer (IHHO) is applied to find the decision variables. Rosenbrock's direct rotational technique is applied to the conventional horse herd optimizer (HHO) to improve the algorithm's performance against premature convergence in the optimization due to the complexity of the problem, and its capability is evaluated with particle swarm optimization (PSO) and manta ray foraging optimization (MRFO) methods. Also, the cloud theory-based stochastic model is recommended for solving problems with uncertainties of system generation and demand. The obtained results are evaluated in three simulation scenarios including (1) Wind/Battery, (2) Wind/Battery/CHP, and (3) Wind/Battery/CHP/TES systems to implement the proposed methodology and evaluate its effectiveness. The results show that scenario 3 is the best configuration to meet electrical and thermal loads, with the lowest planning cost (12.98% less than scenario 1). Also, the superiority of the IHHO is proven with more accurate answers and higher convergence rates in contrast to the conventional HHO, PSO, and MRFO. Moreover, the results show that when considering the cloud theory-based stochastic model, the costs of annual planning are increased for scenarios 1 to 3 by 4.00%, 4.20%, and 3.96%, respectively, compared to the deterministic model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. Stochastic modeling of multiple-server charging stations for electric vehicle networks using feedback strategies: A queueing-theoretic approach
- Author
- Shreekant Varshney, Bhasuru Abhinaya Srinivas, Mayank Gupta, Manthan Shah, and Aksh Bavisi
- Subjects
- Stochastic modeling, Battery management system, Charging station management, Efficient energy management, Markovian mechanisms, Individual feedback strategies, Heat, QC251-338.5
- Abstract
Nowadays, electric vehicles (EVs) significantly affect transportation as they provide a more environmentally friendly alternative to traditional fossil-fueled automobiles. Electric vehicles, which depend on energy stored in batteries, significantly contribute to environmental preservation and comply with worldwide efforts to tackle climate change. However, the growing demand for electric vehicles puts traditional power grids under pressure, emphasizing the necessity of establishing a suitable infrastructure for charging electric vehicles. Charging stations are becoming increasingly critical since they allow for the recharging of electric vehicles and play a significant role in stabilizing the power system. In order to optimize charging station infrastructure with multiple servers, the current research incorporates a Markovian queueing modeling approach. The primary objective of the study is to address queue management concerns and boost overall productivity. Considering real-world challenges, a queue-based stochastic model for multi-server EV systems with individual feedback strategies is developed. Subsequently, a transition state diagram is provided by balancing the input-output rates between adjacent states. Next, the system of Chapman-Kolmogorov differential-difference equations is formulated to support the mathematical modeling. The matrix method is employed to obtain the state probability distribution in equilibrium. Infographics are incorporated for better visualization of the research findings. Numerous managerial insights are provided from an individual's point of view. Lastly, several concluding remarks and future perspectives are offered that can help decision-makers and practitioners construct and analyze economic strategies for EV management systems.
- Published
- 2024
- Full Text
- View/download PDF
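The balance-equation machinery described in the abstract above (Chapman-Kolmogorov equations solved for the equilibrium state distribution) can be sketched for a plain M/M/c/K birth-death chain; the arrival rate, service rate, server count, and capacity below are hypothetical, and the paper's feedback strategies are omitted:

```python
import numpy as np

def mmck_stationary(lam, mu, c, K):
    """Equilibrium probabilities of an M/M/c/K birth-death chain
    (states n = 0..K vehicles at the station; service rate min(n, c)*mu).
    Solves the balance equations via the detailed-balance recursion
    p_n = p_{n-1} * lam / (min(n, c) * mu), then normalizes."""
    p = np.ones(K + 1)
    for n in range(1, K + 1):
        p[n] = p[n - 1] * lam / (min(n, c) * mu)
    return p / p.sum()

# Hypothetical station: 4 chargers, capacity 10, 3 arrivals/h, 1 service/h each
p = mmck_stationary(lam=3.0, mu=1.0, c=4, K=10)
mean_in_system = float(sum(n * pn for n, pn in enumerate(p)))
```

From the stationary vector one can read off blocking probability (`p[K]`) and the mean number of vehicles in the system, the kinds of managerial quantities the abstract alludes to.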
20. Hybrid Geoid Modeling for the Kingdom of Saudi Arabia
- Author
-
Grebenitcharsky, Rossen S., Vergos, Georgios S., Al-Shahrani, Sultan, Al-Qahtani, Abdullah, Iuri, Golubinka, Othman, Alrubayyi, Aljebreen, Suliman, Freymueller, Jeffrey T., Series Editor, and Sánchez, Laura, Assistant Editor
- Published
- 2024
- Full Text
- View/download PDF
21. A Stochastic Model for Cryptocurrencies in Illiquid Markets with Extreme Conditions and Structural Changes
- Author
-
El-Khatib, Youssef, Hatemi-J, Abdulnasser, Kacprzyk, Janusz, Series Editor, Hamdan, Allam, editor, and Aldhaen, Esra Saleh, editor
- Published
- 2024
- Full Text
- View/download PDF
22. Simulation modeling of a stochastic process on the example of a computer system
- Author
-
Y.B. Brodskyi, O.V. Maievskyi, and M.O. Khokhlov
- Subjects
computing system ,simulation model ,stochastic modeling ,complex system ,Engineering (General). Civil engineering (General) ,TA1-2040 - Abstract
The purpose of the article is to develop a methodology for evaluating stochastic processes in complex systems, using the example of a three-processor computer system, which will make it possible to increase the effectiveness of monitoring the functioning and management of such objects. During the research, methods of the systems approach, analysis and synthesis, and stochastic and simulation modeling were applied, on the basis of which the behavior of a three-processor computing system was reproduced under different conditions; simulations made it possible to reveal the mechanisms of the system and verify the stochastic model, while Markov chains made it possible to simulate the transitions of the computer system from state to state and to estimate the probabilities of system states as functions of time using numerical methods. The simulation model proposed in the article makes it possible to determine the states of the processors, calculate their load periods as a function of total operating time, and determine the moment at which transient processes are completed and the computer system reaches its balanced mode of operation. The theoretical and practical significance of the research lies in the further study of existing, and the development of new, theoretical and methodological provisions for increasing the efficiency of the analysis and evaluation of stochastic processes in complex systems.
- Published
- 2024
- Full Text
- View/download PDF
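The state-probabilities-as-functions-of-time computation mentioned in the abstract can be illustrated by numerically integrating the Kolmogorov forward equations of a continuous-time Markov chain; the two-state busy/idle generator below is a hypothetical stand-in for the paper's three-processor model:

```python
import numpy as np

def state_probabilities(Q, p0, t, dt=1e-3):
    """Integrate the Kolmogorov forward equations dp/dt = p Q for a
    continuous-time Markov chain with generator Q (rows sum to zero),
    using an explicit Euler sketch."""
    p = np.asarray(p0, dtype=float)
    for _ in range(int(t / dt)):
        p = p + dt * (p @ Q)
    return p

# Hypothetical two-state processor: idle -> busy at rate lam, busy -> idle at mu
lam, mu = 1.0, 2.0
Q = np.array([[-lam, lam],
              [mu, -mu]])
p = state_probabilities(Q, [1.0, 0.0], t=20.0)
# The transient has died out by t = 20; p approaches (mu, lam) / (lam + mu)
```

The moment at which `p` stops changing (to within tolerance) is exactly the "balanced mode of operation" the abstract refers to.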
23. STOCHASTIC MODELING OF STRATEGIC SUPPLY CHAIN DESIGN
- Author
-
Marcel ILIE and Augustin SEMENESCU
- Subjects
supply chain ,stochastic modeling ,numerical modeling ,dynamical systems ,Engineering (General). Civil engineering (General) ,TA1-2040 - Abstract
Supply chain risk management plays a critical role in any business or industrial environment, as it enables good coordination of the input and output parameters that may affect the smooth development of processes such as manufacturing. However, supply chain risk management is often prone to the impact of various uncertainties associated with supply chain disruptions caused by meteorological events, pandemics, resource shortages, etc. Therefore, one way to quantify these uncertainties is through stochastic modeling approaches to supply chain management. Stochastic modeling is a powerful tool that can predict, with a certain probability, the events that may occur within the supply chain, such as those associated with manufacturing processes. In the present research, a stochastic model based on probability theory is developed and proposed for the analysis of supply chain risk management for manufacturing processes. Studies are performed to investigate the impact of the number of manufacturing processes on the proper evolution of the supply chain. The current study shows that an increase in the number of manufacturing processes results in an increase in uncertainty in supply chain management and thus increases the probability of supply chain disruptions. Therefore, it is recommended that a supply chain contain a minimum number of manufacturing processes, if the delivery time and final product allow.
- Published
- 2024
- Full Text
- View/download PDF
24. How drug onset rate and duration of action affect drug forgiveness.
- Author
-
Clark, Elias D. and Lawley, Sean D.
- Abstract
Medication nonadherence is one of the largest problems in healthcare today, particularly for patients undergoing long-term pharmacotherapy. To combat nonadherence, it is often recommended to prescribe so-called "forgiving" drugs, which maintain their effect despite lapses in patient adherence. Nevertheless, drug forgiveness is difficult to quantify and compare between different drugs. In this paper, we construct and analyze a stochastic pharmacokinetic/pharmacodynamic (PK/PD) model to quantify and understand drug forgiveness. The model parameterizes a medication merely by an effective rate of onset of effect when the medication is taken (on-rate) and an effective rate of loss of effect when a dose is missed (off-rate). Patient dosing is modeled by a stochastic process that allows for correlations in missed doses. We analyze this "on/off" model and derive explicit formulas that show how treatment efficacy depends on drug parameters and patient adherence. As a case study, we compare the effects of nonadherence on the efficacy of various antihypertensive medications. Our analysis shows how different drugs can have identical efficacies under perfect adherence, but vastly different efficacies for adherence patterns typical of actual patients. We further demonstrate that complex PK/PD models can indeed be parameterized in terms of effective on-rates and off-rates. Finally, we have created an online app to allow pharmacometricians to explore the implications of our model and analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
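The on-rate/off-rate parameterization described in the abstract above lends itself to a compact simulation; this sketch uses i.i.d. Bernoulli dosing (the paper additionally allows correlated missed doses), and all rates and adherence values are hypothetical:

```python
import math
import random

def simulate_onoff(k_on, k_off, p_take, n_doses, tau=1.0, seed=0):
    """Hypothetical on/off PK/PD sketch: the treatment effect e in [0, 1]
    relaxes toward 1 at rate k_on over a dosing interval tau when a dose
    is taken, and decays toward 0 at rate k_off when it is missed.
    Doses are i.i.d. Bernoulli(p_take); returns the time-averaged effect."""
    rng = random.Random(seed)
    e, total = 0.0, 0.0
    for _ in range(n_doses):
        if rng.random() < p_take:
            e = 1.0 - (1.0 - e) * math.exp(-k_on * tau)  # dose taken: effect builds
        else:
            e = e * math.exp(-k_off * tau)               # dose missed: effect decays
        total += e
    return total / n_doses

# Same on-rate, very different off-rates: a "forgiving" drug keeps its effect
forgiving = simulate_onoff(k_on=2.0, k_off=0.05, p_take=0.7, n_doses=5000)
unforgiving = simulate_onoff(k_on=2.0, k_off=2.0, p_take=0.7, n_doses=5000)
```

Under perfect adherence the two drugs would perform nearly identically; the gap opens only at realistic adherence, which is the paper's central point.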
25. Developing a Fluvial and Pluvial Stochastic Flood Model of Southeast Asia.
- Author
-
Olcese, Gaia, Bates, Paul D., Neal, Jeffrey C., Sampson, Christopher C., Wing, Oliver E. J., Quinn, Niall, Murphy‐Barltrop, Callum J. R., and Probyn, Izzy
- Subjects
FLOOD warning systems ,STOCHASTIC models ,CATASTROPHE modeling ,RAINFALL ,FLOOD risk ,HYDROLOGIC models - Abstract
Flood event set generation, as employed in catastrophe risk models, relies on gauge information that is not available in data-scarce regions. To overcome this limitation, we develop a stochastic fluvial and pluvial flood model of Southeast Asia, using freely and globally available discharge data from the global hydrological model GloFAS and rainfall from the ERA5 reanalysis. We use a conditional multivariate statistical model to produce a synthetic catalog of 10,000 years of flood events. We calculate the flood population exposure associated with each flood event using freely available population data from WorldPop and generate exposure probability exceedance curves. We validate the population exposure curves against observed flood disaster data from EM-DAT, showing that our methodology provides exposure estimates that are in line with historical observations. We find that there is a 1% probability that more than 30 million people will be exposed to flooding in a given year according to our event set. This number is roughly half the population living in the 100-year return period flood zone of Fathom's hazard maps, suggesting most studies based on static flood maps overestimate exposure. This analysis provides significant progress over previous non-stochastic studies, which are only able to compute total or average exposure within a given floodplain area, and demonstrates that a reanalysis-based stochastic flood model can be designed to generate reliable estimates of population exposure probability exceedance. This study is a step toward a fully global catastrophe model for floods capable of providing exposure and loss estimates worldwide. Key Points: Global hydrological models can be used to drive a large-scale stochastic flood inundation model in Southeast Asia; a reanalysis-based stochastic flood model generates realistic flood events; the computed flood exposure exceedance curve for Southeast Asia compares well to the EM-DAT database. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
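The exposure probability exceedance curves central to the abstract above can be computed empirically from a synthetic event catalog; the lognormal exposure catalog below is a hypothetical stand-in for the paper's GloFAS/ERA5-driven event set:

```python
import numpy as np

def exceedance_curve(annual_exposures):
    """Empirical exceedance probabilities from a synthetic catalog of annual
    exposure totals, using Weibull plotting positions i / (n + 1)."""
    x = np.sort(np.asarray(annual_exposures, dtype=float))[::-1]
    prob = np.arange(1, x.size + 1) / (x.size + 1.0)
    return x, prob

# Hypothetical 10,000-year catalog of annually exposed population
rng = np.random.default_rng(0)
catalog = rng.lognormal(mean=13.0, sigma=1.2, size=10_000)
exposure, prob = exceedance_curve(catalog)
# Exposure exceeded with 1% annual probability (cf. the abstract's headline figure):
x_1pct = exposure[np.searchsorted(prob, 0.01)]
```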
26. Measuring Turbulent Flows: Analyzing a Stochastic Process with Stochastic Tools.
- Author
-
Rozos, Evangelos, Wieland, Jörg, and Leandro, Jorge
- Subjects
TURBULENCE ,TURBULENT flow ,STOCHASTIC processes ,REYNOLDS stress ,HYDRAULIC jump ,EXTREME value theory - Abstract
Assessing drag force and Reynolds stresses in turbulent flows is crucial for evaluating the stability and longevity of hydraulic structures. Yet, this task is challenging due to the complex nature of turbulent flows. To address this, physical models are often employed. Nonetheless, this practice is associated with difficulties, especially in the case of high sampling frequency where the inherent randomness of velocity fluctuations becomes mixed with the measurement noise. This study introduces a stochastic approach, which aims to mitigate bias from measurement errors and provide a probabilistic estimate of extreme stress values. To accomplish this, a simple experimental setup with a hydraulic jump was employed to acquire long-duration velocity measurements. Subsequently, a modified first-order autoregressive model was applied through ensemble simulations, demonstrating the benefits of the stochastic approach. The analysis highlights its effectiveness in estimating the uncertainty of extreme events frequency and minimizing the bias induced by the noise in the high-magnitude velocity measurements and by the limited length of observations. These findings contribute to advancing our understanding of turbulent flow analysis and have implications for the design and assessment of hydraulic structures. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
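An ensemble of first-order autoregressive simulations, as used in the abstract above to estimate extreme-event frequency, can be sketched as follows; the plain AR(1) with additive measurement noise and all parameter values are illustrative, not the paper's modified model:

```python
import numpy as np

def ar1_ensemble(phi, sigma_proc, sigma_noise, n_steps, n_runs, seed=0):
    """Ensemble of first-order autoregressive velocity fluctuations observed
    through additive measurement noise; returns each run's maximum observation
    (a crude probabilistic estimate of extreme values)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_runs)
    maxima = np.full(n_runs, -np.inf)
    for _ in range(n_steps):
        x = phi * x + rng.normal(0.0, sigma_proc, n_runs)   # true fluctuation
        obs = x + rng.normal(0.0, sigma_noise, n_runs)      # noisy measurement
        maxima = np.maximum(maxima, obs)
    return maxima

maxima = ar1_ensemble(phi=0.9, sigma_proc=0.1, sigma_noise=0.05,
                      n_steps=2000, n_runs=200)
```

The spread of `maxima` across the ensemble quantifies the uncertainty of extreme values, separating process variability from the measurement-noise contribution.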
27. IMPROVEMENT OF STOCHASTIC MODELLING SKILLS FOR A SUSTAINABLE EDUCATION IN ENGINEERING.
- Author
-
D., SIPOS, C., BENDEA, and I., KOCSIS
- Subjects
SUSTAINABLE engineering ,STOCHASTIC models ,ENGINEERING students ,ENGINEERING education ,SUSTAINABILITY - Abstract
Deterministic and stochastic models play an important role in engineering, economics, and the natural sciences. Despite this, the development of stochastic modelling skills in engineering students is less emphasized, and this part of modelling knowledge is less well founded in secondary education. The concept of sustainable education is based on the idea that education is driven by the involvement and energy of students and teachers, and that their natural energy for learning is continuously renewed. In this article, we describe an approach that uses simple tools to highlight the role and importance of the stochastic approach in engineering, and that also serves as a model for developing application skills that equip learners with the knowledge and values needed to build a more sustainable and resilient future for all. [ABSTRACT FROM AUTHOR]
- Published
- 2024
28. A stochastic particle extended SEIRS model with repeated vaccination: Application to real data of COVID-19 in Italy.
- Author
-
Papageorgiou, Vasileios E. and Tsaklidis, George
- Subjects
- *
VACCINATION , *COMMUNICABLE diseases , *COVID-19 pandemic , *EPIDEMIOLOGICAL models , *PARAMETER estimation , *H7N9 Influenza - Abstract
The prediction of the evolution of epidemics plays an important role in limiting the transmissibility and the burdensome consequences of infectious diseases, which leads to the employment of mathematical modeling. In this paper, we propose a stochastic particle filtering extended SEIRS model with repeated vaccination and time-dependent parameters, aiming to efficiently describe the demanding dynamics of time-varying epidemics. The validity of our model is examined using daily records of COVID-19 in Italy for a period of 525 days, revealing a notable capacity to uncover the hidden dynamics of the pandemic. The main findings include the estimation of asymptomatic cases, which is a well-known feature of the current pandemic. Unlike other proposed models that employ extra compartments for asymptomatic cases, which force the estimation of this proportion and significantly increase the model's complexity, our approach leads to the evaluation of the hidden dynamics of COVID-19 without additional computational burden. Other findings that confirm the model's appropriateness and robustness are its parameter evolution and the estimation of more ICU-admitted cases compared to the official records during the most prevalent infection wave of January 2022, attributed to the intensified increase in admissions that may have led to full occupancy in ICUs. As the vast majority of datasets contain time series of total recovered and vaccinated cases, we propose a statistical algorithm to estimate the currently recovered and protected through vaccination cases. This necessity arises from the attenuation of antibodies after vaccination/infection and is necessary for long-time interval predictions. Finally, we not only present a novel stochastic epidemiological model and test its efficiency but also investigate its mathematical properties, such as the existence and stability of epidemic equilibria, giving new insights to the literature. 
The latter provides additional details concerning the system's long-term behavior, while the conclusions drawn from the R0 index provide perspectives on the severity and future of the COVID-19 pandemic. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
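A binomial-chain discretization is one common way to realize the stochastic SEIRS dynamics summarized above; this sketch includes waning immunity and vaccination but omits the paper's particle filtering and time-dependent parameters, and all rates and population sizes are hypothetical:

```python
import numpy as np

def seirs_step(state, beta, sigma, gamma, omega, nu, N, rng):
    """One day of a stochastic SEIRS chain with waning immunity (omega) and
    vaccination (nu); transitions are binomial draws with exponential hazards."""
    S, E, I, R = state
    new_E = rng.binomial(S, 1.0 - np.exp(-beta * I / N))   # infection   S -> E
    new_I = rng.binomial(E, 1.0 - np.exp(-sigma))          # incubation  E -> I
    new_R = rng.binomial(I, 1.0 - np.exp(-gamma))          # recovery    I -> R
    waned = rng.binomial(R, 1.0 - np.exp(-omega))          # waning      R -> S
    vacc = rng.binomial(S - new_E, 1.0 - np.exp(-nu))      # vaccination S -> R
    return (S - new_E - vacc + waned,
            E + new_E - new_I,
            I + new_I - new_R,
            R + new_R + vacc - waned)

rng = np.random.default_rng(1)
state = (9990, 0, 10, 0)                                   # hypothetical population
for _ in range(200):
    state = seirs_step(state, beta=0.3, sigma=0.2, gamma=0.1,
                       omega=0.01, nu=0.005, N=10_000, rng=rng)
```

Because each outflow is drawn from the corresponding compartment, population size is conserved exactly and no compartment can go negative, which is the property a filtering layer would rely on.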
29. Generation Rescheduling Based Contingency Constrained Optimal Power Flow Considering Uncertainties Through Stochastic Modeling.
- Author
-
Nasir, Mohammad, Sadollah, Ali, Barati, Hassan, Khodabakhshi, Mona, and Kim, Joong Hoon
- Subjects
- *
METAHEURISTIC algorithms , *PLUG-in hybrid electric vehicles , *RENEWABLE energy sources , *ELECTRICAL load , *HYDROLOGIC cycle - Abstract
Generation rescheduling is described as shifting power generation from one or more generators to one or more other generators as a preventive action to improve and maintain the security of the power system. Since there is a direct link between security improvement and line overloads under contingencies, rescheduling generation makes the transmission lines more flexible and thus relieves the overload. In this paper, the contingency constrained optimal power flow (CCOPF) problem based on generation rescheduling, considering the uncertainty of photovoltaic (PV), wind turbine (WT), and plug-in hybrid electric vehicle (PHEV) sources, is addressed. The water cycle algorithm (WCA), with its potential for finding optimal solutions, has been used to reschedule the generators and optimize the total fuel cost, power losses under contingency scenarios, and system security. Moreover, a stochastic approach has been proposed to take into account the uncertainty of PV, WT, and PHEV. An overall performance index, including power and voltage severity indices, is provided for determining transmission lines overloaded due to line outages and, consequently, for eliminating overloaded lines. The efficiency of the proposed algorithm has been evaluated on the IEEE 30-bus and IEEE 118-bus systems. The results are compared with those of other classical and metaheuristic optimization algorithms. The simulations reveal that the WCA outperforms the other reported optimizers and is more efficient and effective in improving security for power systems. In addition, the obtained numerical results show that renewable energy sources can significantly reduce fuel costs. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Minimal Mechanisms of Microtubule Length Regulation in Living Cells.
- Author
-
Nelson, Anna C., Rolls, Melissa M., Ciocanel, Maria-Veronica, and McKinley, Scott A.
- Abstract
The microtubule cytoskeleton is responsible for sustained, long-range intracellular transport of mRNAs, proteins, and organelles in neurons. Neuronal microtubules must be stable enough to ensure reliable transport, but they also undergo dynamic instability, as their plus and minus ends continuously switch between growth and shrinking. This process allows for continuous rebuilding of the cytoskeleton and for flexibility in injury settings. Motivated by in vivo experimental data on microtubule behavior in Drosophila neurons, we propose a mathematical model of dendritic microtubule dynamics, with a focus on understanding microtubule length, velocity, and state-duration distributions. We find that limitations on microtubule growth phases are needed for realistic dynamics, but the type of limiting mechanism leads to qualitatively different responses to plausible experimental perturbations. We therefore propose and investigate two minimally-complex length-limiting factors: limitation due to resource (tubulin) constraints and limitation due to catastrophe of large-length microtubules. We combine simulations of a detailed stochastic model with steady-state analysis of a mean-field ordinary differential equations model to map out qualitatively distinct parameter regimes. This provides a basis for predicting changes in microtubule dynamics, tubulin allocation, and the turnover rate of tubulin within microtubules in different experimental environments. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. Generalized flutter reliability analysis with adjoint and direct approaches for aeroelastic eigen-pair derivatives computation.
- Author
-
Kumar, Sandeep
- Abstract
The article presents a physics-based, time-invariant, generalized flutter reliability approach for a wing in detail. For carrying out flutter reliability analysis, a generalized first-order reliability method (FORM) algorithm and a generalized second-order reliability method (SORM) algorithm are developed. The FORM algorithm requires the first derivative, and the SORM algorithm requires both the first and second derivatives, of a limit state function; for these derivatives, adjoint and direct approaches for computing eigen-pair derivatives are proposed that ensure uniqueness of the eigenvector and its derivative. The stability parameter, the damping ratio (real part of an eigenvalue), is considered as an implicit limit state function. To capture the occurrence of the flutter phenomenon, the limit state function is defined in a conditional sense by imposing a condition on flow velocity. The aerodynamic parameter, the slope of the lift coefficient curve (C_L), and the structural parameters, bending rigidity (EI) and torsional rigidity (GJ), of the aeroelastic system are considered as independent Gaussian random variables, and the structural parameters are additionally modeled as second-order, constant-mean, stationary Gaussian random fields with exponential covariance structures. To represent the random fields in finite dimensions, the fields are discretized using the Karhunen–Loeve expansion. The analysis shows that the derivatives of an eigenvalue obtained from the adjoint and direct approaches are the same, so the cumulative distribution functions (CDFs) of flutter velocity are the same irrespective of the approach chosen; this is also reflected in the CDFs obtained using various reliability methods based on the adjoint and direct approaches: the first-order second-moment method, generalized FORM, and generalized SORM. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Stochastic Assessment of Dissolution at Fluid‐Mineral Interfaces.
- Author
-
Recalcati, Chiara, Siena, Martina, Riva, Monica, Bollani, Monica, and Guadagnini, Alberto
- Subjects
- *
CHEMICAL weathering , *ATOMIC force microscopy , *CALCITE crystals , *LIQUID-liquid interfaces , *CRYSTAL surfaces , *UNDERGROUND storage - Abstract
Chemical weathering associated with dissolution/precipitation at interfaces between minerals and flowing fluids is key for the evolution of geologic systems, including groundwater contamination and storage capacity. Relying on Atomic Force Microscopy (AFM) yields reaction rates at nanoscale resolutions. Challenges limiting our ability to quantify heterogeneity associated with these processes include establishing reliable platforms allowing AFM imaging of real‐time and in situ absolute material fluxes across mineral surfaces under continuous flow conditions to complement typically acquired surface topography images. We provide an experimental workflow and heterogeneous absolute rates at the nanoscale across the surface of a calcite crystal under dissolution. These high‐quality experimental observations are then interpreted through a stochastic approach. The latter is geared to embed diverse kinetic modes driving the degree of spatial heterogeneity of the reaction and corresponding to different mechanistic processes documented across the crystal surface. Plain Language Summary: Quantification of basic processes underpinning precipitation/dissolution at mineral/fluid interfaces is key for realistic assessment of chemical weathering rates driving rock morphology, subsurface storage capacity and contamination. We provide direct observation of the complex mechanistic processes acting at nanoscales through an original experimental platform relying on Atomic Force Microscopy imaging to evaluate absolute material fluxes associated with dissolution of a mineral subject to reaction under continuous flow conditions. Dissolution is characterized at very high spatial resolutions (∼10 nm). This enables observing in real‐time and in situ mechanistic processes driving system evolution. The ensuing rich data set of absolute reaction rates displays a marked degree of spatial heterogeneity. 
The latter is then interpreted within a stochastic framework to yield a detailed mechanistic appraisal of mineral dissolution. Key Points: A platform to evaluate absolute nanoscale topographic measurements of a crystal sample subject to dissolution/precipitation is designed; the associated spatially heterogeneous fields of absolute material fluxes across the surface are evaluated; reaction rates are described through a stochastic framework encapsulating behaviors of surface features driving dissolution processes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. On the NP-VSS-NLMS Algorithm: Model, Design Guidelines, and Numerical Results.
- Author
-
Becker, Augusto Cesar, Kuhn, Eduardo Vinicius, Matsuo, Marcos Vinicius, and Benesty, Jacob
- Subjects
- *
SYSTEM identification , *ALGORITHMS , *STOCHASTIC models , *ADAPTIVE filters - Abstract
In this paper, a stochastic model is presented for the nonparametric variable step-size normalized least-mean-square (NP-VSS-NLMS) algorithm. This algorithm has demonstrated potential in practical applications and hence a deeper understanding of its behavior becomes crucial. In this context, model expressions are obtained for characterizing the algorithm behavior in the transient phase as well as in the steady state, considering a system identification problem and Gaussian input data. Such expressions reveal interesting algorithm characteristics that are useful for establishing design guidelines and for the advancement of more refined algorithms. Simulation results for various operating scenarios ratified both the model's accuracy and the algorithm's superior performance relative to other recent and relevant algorithms from the literature. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
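For orientation, the fixed-step NLMS recursion that the NP-VSS-NLMS algorithm builds on can be sketched in the system identification setting the abstract mentions; the filter length, step size, and noise level are hypothetical, and the paper's nonparametric variable step-size rule is not reproduced here:

```python
import numpy as np

def nlms_identify(x, d, M, mu=0.5, eps=1e-6):
    """Baseline NLMS system identification with a fixed step size mu.
    (The NP-VSS-NLMS of the paper adapts mu nonparametrically; this is
    only the common starting point, shown for orientation.)"""
    w = np.zeros(M)
    mse = []
    for n in range(M - 1, len(x)):
        u = x[n - M + 1:n + 1][::-1]          # regressor, most recent sample first
        e = d[n] - w @ u                      # a priori error
        w = w + mu * e * u / (eps + u @ u)    # normalized LMS update
        mse.append(e * e)
    return w, np.array(mse)

rng = np.random.default_rng(0)
h = rng.normal(size=8)                        # unknown FIR system to identify
x = rng.normal(size=5000)                     # white Gaussian input
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.normal(size=len(x))
w, mse = nlms_identify(x, d, M=8)
```

Plotting `mse` exposes exactly the transient-phase and steady-state behavior that the paper's stochastic model characterizes analytically.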
34. A stochastic log-logistic diffusion process: Statistical computational aspects and application to real data.
- Author
-
El Azri, Abdenbi and Nafidi, Ahmed
- Subjects
- *
SIMULATED annealing , *PROBABILITY density function , *MAXIMUM likelihood statistics , *STOCHASTIC processes , *MICROBIAL growth - Abstract
This article introduces a new stochastic diffusion process, based on the theory of diffusion processes, whose mean function is proportional to the log-logistic growth curve. The main characteristics of the process are analyzed, including the transition probability density function, the mean functions, and, in particular, the autocorrelation function between two times of the process. The parameters of the process are estimated by the maximum likelihood method using discrete sampling. The simulated annealing algorithm is applied, after bounding the parametric space by a strategic procedure, to solve the likelihood equations. The behavior of the diffusion process derived here is finally applied to study an example of growth data from a microorganism culture. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. Data‐driven stochastic model for quantifying the interplay between amyloid‐beta and calcium levels in Alzheimer's disease.
- Author
-
Shaheen, Hina, Melnik, Roderick, and Singh, Sundeep
- Subjects
- *
ALZHEIMER'S disease , *HOMEOSTASIS , *STOCHASTIC models , *AMYLOID plaque , *CALCIUM - Abstract
The abnormal aggregation of extracellular amyloid-β (Aβ) in senile plaques, resulting in calcium (Ca2+) dyshomeostasis, is one of the primary symptoms of Alzheimer's disease (AD). Significant research efforts have been devoted in the past to better understand the underlying molecular mechanisms driving Aβ deposition and Ca2+ dysregulation. Importantly, synaptic impairments, neuronal loss, and cognitive failure in AD patients are all related to the buildup of intraneuronal Aβ accumulation. Moreover, increasing evidence shows a feed-forward loop between Aβ and Ca2+ levels; that is, Aβ disrupts neuronal Ca2+ levels, which in turn affects the formation of Aβ. To better understand this interaction, we report a novel stochastic model in which we analyze the positive feedback loop between Aβ and Ca2+ using ADNI data. A good therapeutic treatment plan for AD requires precise predictions. Stochastic models offer an appropriate framework for modeling AD, since AD studies are observational in nature and involve regular patient visits. The etiology of AD may be described as a multi-state disease process using the approximate Bayesian computation method. So, utilizing ADNI data from 2-year visits for AD patients, we employ this method to investigate the interplay between Aβ and Ca2+ levels at various disease development phases. Incorporating the ADNI data in our physics-based Bayesian model, we discovered that a sufficiently large disruption in either Aβ metabolism or intracellular Ca2+ homeostasis causes the relative growth rate in both Ca2+ and Aβ, which corresponds to the development of AD.
The imbalance of Ca2+ ions causes Aβ disorders by directly or indirectly affecting a variety of cellular and subcellular processes, and the altered homeostasis may worsen the abnormalities of Ca2+ ion transportation and deposition. This suggests that altering the Ca2+ balance, or the balance between Aβ and Ca2+ by chelating them, may be able to reduce disorders associated with AD and open up new research possibilities for AD therapy. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
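The approximate Bayesian computation method named in the abstract is, in its simplest rejection form, straightforward to sketch; the toy exponential-growth model, prior, tolerance, and distance below are all hypothetical, not the paper's Aβ/Ca2+ dynamics:

```python
import numpy as np

def abc_rejection(y_obs, simulate, prior_sample, distance, eps, n_accept, rng):
    """Rejection-ABC sketch: draw theta from the prior, simulate data, and keep
    theta whenever the distance to the observation falls below eps."""
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample(rng)
        if distance(simulate(theta, rng), y_obs) < eps:
            accepted.append(theta)
    return np.array(accepted)

# Hypothetical toy problem: infer the growth rate r of y_t = exp(r t) + noise
rng = np.random.default_rng(0)
t = np.arange(10.0)
y_obs = np.exp(0.3 * t) + rng.normal(0, 0.1, t.size)

post = abc_rejection(
    y_obs,
    simulate=lambda r, g: np.exp(r * t) + g.normal(0, 0.1, t.size),
    prior_sample=lambda g: g.uniform(0.0, 1.0),
    distance=lambda a, b: float(np.sqrt(np.mean((a - b) ** 2))),
    eps=0.2, n_accept=100, rng=rng)
```

The accepted draws approximate the posterior; in the paper's setting the "simulate" step would be the multi-state disease process fitted to the ADNI visit data.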
36. A Study of Congestion-Based Information Guidance Policy for Hierarchical Healthcare Systems.
- Author
-
Yu, Miao, Xu, Jie, Li, Xiangling, and Yu, Dandan
- Subjects
INFORMATION policy ,BOTTLENECKS (Manufacturing) ,MEDICAL quality control ,MEDICAL care ,STOCHASTIC analysis ,MARKOV processes - Abstract
This paper develops a queueing system model to analyze the operations of a hierarchical healthcare system consisting of general hospitals (GHs) and community healthcare centers (CHCs). GHs typically provide a higher level of healthcare service than CHCs, and thus are the preferred choice for many patients' healthcare needs. Consequently, GHs are often heavily congested and patients often incur excessive waiting times. In contrast, CHCs are often idle and their resources underutilized. To help balance the utilization of resources in GHs and CHCs, a congestion-based information guidance policy is proposed in this paper to inform patients in the GH service queue about the anticipated delay. Upon being informed of the delay for GH service, patients may balk, remain in the queue for GH service, or switch to receive service at a CHC. This policy is thus expected to relieve the congestion at GHs and promote CHC usage. To study the effects of the proposed policy, the hierarchical healthcare system is modeled as a queueing system with strategic patients. Stationary performance measures of the system are analytically characterized using a Markov chain model. Stochastic and numerical analyses provide insights on how to design an information guidance policy that would help improve overall healthcare service quality under different scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
37. On a regime switching illiquid high volatile prediction model for cryptocurrencies
- Author
-
El-Khatib, Youssef and Hatemi-J, Abdulnasser
- Published
- 2024
- Full Text
- View/download PDF
38. Data-Driven Modeling of Frequency Dynamics Observed in Operating Microgrids: A South African University Campus Case Study
- Author
-
Jacques Maritz, Leonardo Rydin Gorjao, P. Armand Bester, Nicolaas Esterhuysen, Stefaans Erasmus, Stephanus Riekert, Reuben Immelman, Tiaan Geldenhuys, Alexandra Viljoen, and Charl Bodenstein
- Subjects
Operational microgrids ,frequency dynamics ,state classification schemes ,synchronous frequency measurements ,complex systems ,stochastic modeling ,Electrical engineering. Electronics. Nuclear engineering ,TK1-9971 - Abstract
South Africa has been experiencing an energy crisis since 2007. Utility loadshedding became the main control for under-frequency events due to a mismatch between generation and consumption. Rolling blackouts are further driven by failing electrical infrastructure and illegal (non-metered) connections to the distribution network. A common remedy to mandatory South African loadshedding, from the perspective of university campuses, is to deploy hybrid photovoltaic-diesel (PV-diesel) microgrids that allow for an uninterrupted power supply for a few hours. Campus microgrids are typically smaller than national utilities (less inertia) and require sensitive control schemes to remain stable. In this paper, frequency recordings associated with the operating microgrid of the University of the Free State QwaQwa campus are analysed. A simple stochastic mathematical model is presented to describe the observed frequency dynamics, capturing the transition between the utility-grid and microgrid states, the microgrid frequency controller response, and the influence of the PV generators. Moreover, inter-campus synchronous frequency measurements are showcased and their future implications are discussed. The main contributions of this paper are the recording and modelling of the frequency dynamics of fully functioning campus microgrids, and the showcasing of continuous synchronous measurements of frequency at two different campuses.
- Published
- 2024
- Full Text
- View/download PDF
39. Mathematical Optimization of Wind Turbine Maintenance Using Repair Rate Thresholds
- Author
-
Nataša Kontrec, Stefan Panić, Jelena Vujaković, Dejan Stošović, and Sergei Khotnenok
- Subjects
maintenance optimization ,repair rate thresholds ,stochastic modeling ,reliability theory ,wind turbines ,Mathematics ,QA1-939 - Abstract
As reliance on wind energy intensifies globally, optimizing the efficiency and reliability of wind turbines is becoming vital. This paper explores sophisticated maintenance strategies, crucial for enhancing the operational sustainability of wind turbines. It introduces an innovative approach to maintenance scheduling that utilizes a mathematical model incorporating an alternating renewal process for accurately determining repair rate thresholds. These thresholds are important for identifying optimal maintenance timings, thereby averting failures and minimizing downtime. Central to this study are generalized analytical expressions that can be used to predict the total repair time for an observed entity. Four key lemmas are developed to establish formal proofs for the probability density function (PDF) and cumulative distribution function (CDF) of repair rates, both above and below critical repair rate thresholds. The core innovation of this study lies in the methodological application of PDFs and CDFs to set repair time thresholds that refine maintenance schedules. The model's effectiveness is illustrated using simulated data based on typical wind turbine components such as gearboxes, generators, and converters, validating its potential for improving system availability and operational readiness. By establishing measurable repair rate thresholds, the model effectively prioritizes maintenance tasks, extending the life of crucial turbine components and ensuring consistent energy output. Beyond enhancing theoretical understanding, this research provides practical insights that could inform broader maintenance strategies across various renewable energy systems, marking a significant advancement in the field of maintenance engineering.
- Published
- 2024
- Full Text
- View/download PDF
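The alternating renewal process at the core of the wind-turbine model above can be illustrated with a short Monte Carlo sketch of the long-run availability it implies; the exponential up/down distributions and all rates here are illustrative assumptions, not values from the paper:

```python
import random

def simulate_availability(mtbf, mttr, n_cycles=100_000, seed=42):
    """Monte Carlo estimate of long-run availability for an alternating
    renewal process: exponential operating times (mean mtbf) alternating
    with exponential repair times (mean mttr)."""
    rng = random.Random(seed)
    up = sum(rng.expovariate(1.0 / mtbf) for _ in range(n_cycles))
    down = sum(rng.expovariate(1.0 / mttr) for _ in range(n_cycles))
    return up / (up + down)

# By the renewal-reward theorem the estimate converges to MTBF / (MTBF + MTTR).
print(simulate_availability(mtbf=1000.0, mttr=50.0))
```

Repair rate thresholds of the kind the paper derives would enter such a sketch through the repair-time distribution.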
40. Pattern-Based Multiple-point Geostatistics for 3D Automatic Geological Modeling of Borehole Data
- Author
-
Guo, Jiateng, Zheng, Yufei, Liu, Zhibin, Wang, Xulei, Zhang, Jianqiao, and Zhang, Xingzhou
- Published
- 2024
- Full Text
- View/download PDF
41. Numerical Simulation of the Time Series of Bioclimatic Indices in the Russian Arctic Based on a Stochastic Weather Generator.
- Author
-
Akenteva, M. S. and Kargapolova, N. A.
- Subjects
- *
TIME series analysis , *COMPUTER simulation , *METEOROLOGICAL stations , *WEATHER , *STOCHASTIC models - Abstract
The paper proposes an approach to the numerical stochastic modeling of the time series of the wind chill index and equivalent effective temperature at weather stations located in the Arctic zone of the Russian Federation. The approach is based on the use of a specially designed stochastic weather generator. It is shown that the approach allows the development of time series models of bioclimatic indices that accurately reproduce various statistical properties of the real processes, related in particular to their daily variations and to specific features of the study area. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
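As a toy illustration of what a stochastic weather generator of this kind produces, the sketch below drives the wind chill index with a first-order autoregressive temperature series. The AR(1) parameters and the fixed wind speed are invented for illustration; only the wind chill formula (the standard JAG/TI expression in °C and km/h) is real, and none of this reflects the authors' generator:

```python
import random

def wind_chill(t_c, v_kmh):
    """Standard wind chill index (deg C), valid for t_c <= 10 and v_kmh > 4.8."""
    return (13.12 + 0.6215 * t_c
            - 11.37 * v_kmh ** 0.16
            + 0.3965 * t_c * v_kmh ** 0.16)

def generate_series(n_hours, mean_t=-20.0, phi=0.9, sigma=2.0,
                    wind_kmh=25.0, seed=1):
    """AR(1) temperature series around a hypothetical Arctic mean,
    mapped hour by hour to the wind chill index."""
    rng = random.Random(seed)
    t = mean_t
    series = []
    for _ in range(n_hours):
        t = mean_t + phi * (t - mean_t) + rng.gauss(0.0, sigma)
        series.append(wind_chill(t, wind_kmh))
    return series

wci = generate_series(24 * 30)          # one simulated month, hourly
print(min(wci), max(wci))
```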
42. Modulation Transfer between Microwave Beams: Asymptotic Evaluation of Integrals with Pole Singularities near a First-Order Saddle Point.
- Author
-
Cacciari, Ilaria and Ranfagni, Anedio
- Subjects
- *
MICROWAVES , *INTEGRALS , *SADDLERY , *QUANTITATIVE research - Abstract
Experimental results of delay-time measurements in the transfer of modulation between microwave beams, as reported in previous articles, were interpreted in terms of a competition (interference) between two waves, one of which is modulated while the other is a continuous wave (c.w.). The creation of one of these waves was attributed to a saddle-point contribution, while the other was attributed to pole singularities. In this paper, such an assumption is justified by a quantitative field-amplitude analysis in order to make the modeling plausible. In particular, two ways of calculating field amplitudes are considered. These lead to results that are quantitatively markedly different, although qualitatively similar. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
43. UQAM‐TCW: A Global Hybrid Tropical Cyclone Wind Model Based Upon Statistical and Coupled Climate Models.
- Author
-
Carozza, David A., Boudreault, Mathieu, Grenier, Manuel, and Caron, Louis‐Philippe
- Subjects
- *
TROPICAL cyclones , *ATMOSPHERIC models , *CLIMATE change models , *FINANCIAL risk management , *STORMS ,EL Nino - Abstract
Tropical cyclones (TCs) are among the most destructive natural hazards and yet quantifying their financial impacts remains a significant methodological challenge. It is therefore of high societal value to synthetically simulate TC tracks and winds to assess potential impacts, along with their probability distributions, for applications such as land use planning and financial risk management. A common approach to generate TC tracks is to apply storm detection methodologies to climate model output, but such an approach is sensitive to the method and parameterization used and tends to underestimate intense TCs. We present a global TC model (hereafter the UQAM‐TCW model) that melds statistical modeling, to capture historical risk features, with a climate model large ensemble, to generate large samples of physically coherent TC seasons. Integrating statistical and physical methods, the model is probabilistic and consistent with the physics of how TCs develop. The model includes frequency and location of cyclogenesis, full trajectories with maximum sustained winds and the entire wind structure along each track for the six typical cyclogenesis basins from IBTrACS. Since ENSO is an important driver of TCs globally, we also integrate its effects in key components of the model. The global TC model thus belongs to a recent strand of literature that combines probabilistic and physical approaches to TC track generation. As an application of the model, we show global hazard maps for direct and indirect hits expressed in terms of return periods. The global TC model can be of interest to climate and environmental scientists, economists and financial risk managers. Plain Language Summary: Tropical cyclones (TCs) are among the most destructive natural hazards and yet, quantifying their financial impacts remains a difficult task. 
Being able to randomly simulate TCs and their features (such as wind speed) with mathematical models is therefore critical to build scenarios (and their corresponding probability) for land use planning and financial risk management. A common approach is to simulate TCs by tracking them directly in climate model outputs, but this often underestimates the frequency of intense TCs and is computationally costly when generating a large number of events. For these reasons, many authors have looked into alternative approaches that replicate key physical features of TCs using statistical models that are much less computationally demanding. This paper therefore presents a global TC model that leverages the strengths of both statistical and climate models to simulate a large number of TCs whose features are consistent with the physics and observations. Since El Niño is an important phenomenon that affects TCs globally, we also integrate its effects in our model. The paper focuses on the methodology and validation of each model component and concludes with global hazard maps for direct and indirect hits. Key Points: We present a global tropical cyclone (TC) wind model built upon a climate model large ensemble that can be used for risk analysis. We integrate ENSO into our model since it is a strong driver of storm annual frequency, cyclogenesis, trajectories, and intensity. We present global hazard maps consistent with statistical features of TC components and coherent with a global climate model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
44. A NOVEL STOCHASTIC FRAMEWORK FOR AVAILABILITY OPTIMIZATION OF HARVESTING SYSTEMS.
- Author
-
KUMAR, ASHISH, KUMAR, NAVEEN, SAINI, MONIKA, and SINWAR, DEEPAK
- Subjects
MARKOVIAN jump linear systems ,ARITHMETIC ,METAHEURISTIC algorithms ,METHODOLOGY ,MATHEMATICS - Abstract
The prominent objective of the present study is to develop an efficient stochastic framework for availability evaluation of a harvesting system (HS) using the concept of partial failure of subsystems. A harvesting system is a very complex structure configured with four subsystems in a series structure. The failure and repair laws of all subsystems are exponentially distributed. A sufficient repair facility is available with the system, and the harvesting system works as new after repair. Markovian birth-death methodology is adopted for the development of the Chapman-Kolmogorov differential-difference equations of the proposed stochastic framework. The steady-state availability of the HS is derived for a particular case. Later, an effort is made to predict the optimal availability and the respective optimal parameters of subsystems using metaheuristic algorithms. It is revealed that the HS can attain an optimal availability limit of 0.9999967 at population size 5 after 25 iterations. This study adds to the body of knowledge about harvesting systems by providing an all-encompassing viewpoint on availability optimization. The study's findings can be utilized in designing reliable harvesting systems, and the proposed methodology can be applied to other similar mechanical systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
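The Markovian birth-death setup described above reduces, for a single subsystem with exponential failure and repair, to a two-state continuous-time Markov chain whose balance equations can be solved directly; series availability is then the product over subsystems. A minimal sketch (all rates are hypothetical, and the paper's partial-failure states are not represented):

```python
import numpy as np

def unit_availability(lam, mu):
    """Steady-state availability of one repairable subsystem with
    exponential failure rate lam and repair rate mu, obtained by
    solving the Chapman-Kolmogorov balance equations pi @ Q = 0."""
    Q = np.array([[-lam, lam],
                  [mu, -mu]], dtype=float)
    # Append the normalization sum(pi) = 1 and solve the stacked system.
    A = np.vstack([Q.T, np.ones(2)])
    b = np.array([0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi[0]                     # probability of the "up" state

def series_availability(rates):
    """Availability of subsystems in series (all must be up)."""
    av = 1.0
    for lam, mu in rates:
        av *= unit_availability(lam, mu)
    return av

# Four hypothetical subsystems: (failure rate, repair rate) per unit time.
print(series_availability([(0.01, 1.0), (0.02, 2.0), (0.005, 0.5), (0.01, 2.0)]))
```

For the two-state unit the solution matches the closed form mu / (lam + mu), a quick check on the linear-algebra route.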
45. Advancing COVID-19 stochastic modeling: a comprehensive examination integrating vaccination classes through higher-order spectral scheme analysis.
- Author
-
Wang, Laiquan, Khan, Sami Ullah, Khan, Farman U., A. AlQahtani, Salman, and M. Alamri, Atif
- Abstract
This research article presents a comprehensive analysis aimed at enhancing the stochastic modeling of COVID-19 dynamics by incorporating vaccination classes through a higher-order spectral scheme. The ongoing COVID-19 pandemic has underscored the critical need for accurate and adaptable modeling techniques to inform public health interventions. In this study, we introduce a novel approach that integrates various vaccination classes into a stochastic model to provide a more nuanced understanding of disease transmission dynamics. We employ a higher-order spectral scheme to capture complex interactions between different population groups, vaccination statuses, and disease parameters. Our analysis not only enhances the predictive accuracy of COVID-19 modeling but also facilitates the exploration of various vaccination strategies and their impact on disease control. The findings of this study hold significant implications for optimizing vaccination campaigns and guiding policy decisions in the ongoing battle against the COVID-19 pandemic. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
46. A Mechanistic Model of Perceptual Binding Predicts That Binding Mechanism Is Robust against Noise.
- Author
-
Kraikivski, Pavel
- Subjects
- *
STOCHASTIC models , *MODEL theory , *CONSCIOUSNESS , *NEURAL circuitry , *NOISE - Abstract
The concept of the brain's own time and space is central to many models and theories that aim to explain how the brain generates consciousness. For example, the temporo-spatial theory of consciousness postulates that the brain implements its own inner time and space for conscious processing of the outside world. Furthermore, our perception and cognition of time and space can be different from actual time and space. This study presents a mechanistic model of mutually connected processes that encode phenomenal representations of space and time. The model is used to elaborate the binding mechanism between two sets of processes representing internal space and time, respectively. Further, a stochastic version of the model is developed to investigate the interplay between binding strength and noise. Spectral entropy is used to characterize noise effects on the systems of interacting processes when the binding strength between them is varied. The stochastic modeling results reveal that the spectral entropy values for strongly bound systems are similar to those for weakly bound or even decoupled systems. Thus, the analysis performed in this study allows us to conclude that the binding mechanism is noise-resilient. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
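Spectral entropy, the statistic used above to characterize noise effects, is the Shannon entropy of a signal's normalized power spectrum: near 0 for a highly ordered signal, near 1 (when normalized) for white noise. A minimal sketch with hypothetical test signals, not data from the study:

```python
import numpy as np

def spectral_entropy(x, normalize=True):
    """Shannon entropy of the normalized power spectrum of signal x.
    With normalize=True the result is divided by log(n_bins), so white
    noise approaches 1 and a pure tone approaches 0."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    psd = psd / psd.sum()
    n_bins = psd.size
    psd = psd[psd > 0]               # avoid log(0)
    h = -(psd * np.log(psd)).sum()
    return h / np.log(n_bins) if normalize else h

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1024, endpoint=False)
tone = np.sin(2 * np.pi * 50 * t)    # strongly ordered signal
noise = rng.standard_normal(1024)    # fully disordered signal
print(spectral_entropy(tone), spectral_entropy(noise))
```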
47. Correlation Structure of Steady Well‐Type Flows Through Heterogeneous Porous Media: Results and Application.
- Author
-
Severino, Gerardo, Fallico, Carmine, and Brunetti, Guglielmo Federico Antonio
- Subjects
POROUS materials ,HYDRAULIC conductivity ,RANDOM fields ,FUNCTION spaces ,SPATIAL variation ,STOCHASTIC models - Abstract
Steady flow toward a fully penetrating well takes place in a natural porous formation, where the erratic spatial variations, and the resulting uncertainty, of the hydraulic conductivity K are modeled within a stochastic framework which regards the log‐conductivity, ln K, as a Gaussian, stationary, random field. The study provides second order moments of the flow variables by regarding the variance of the log‐conductivity as a perturbation parameter. Unlike similar studies on the topic, moments are expressed in a quite general (valid for any autocorrelation function of ln K) and very simple (from the computational standpoint) form. It is shown that the (cross)variances, unlike the case of mean uniform flows, are no longer stationary due to the dependence of the mean velocity upon the distance from the well. In particular, they vanish at the well because of the condition of given head along the well's axis, whereas away from it they behave like those pertaining to a uniform flow. Then, theoretical results are applied to a pair of pumping tests (one serving for calibration and the other for validation) to illustrate how they can be used to determine the hydraulic properties of the aquifers. In particular, the concept of head‐factor is shown to be the key parameter to identify the statistical moments of the random field K. Plain Language Summary: Flow toward a single well takes place in a porous formation, where the hydraulic conductivity is regarded as a random space function to account for its irregular spatial variability. A simple solution to this difficult problem is achieved by adopting some simplifying assumptions which apply to numerous real settings. Theoretical results are applied to a series of pumping tests in order to demonstrate their utility in the identification of aquifers' hydraulic properties. 
Key Points: A simple, general formulation to compute second‐order moments is presented. The head factor is introduced for robust identification of aquifers' statistical parameters. The application to pumping tests is illustrated and discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
48. Remaining Useful Life Estimation of Hollow Worn Railway Vehicle Wheels via On-Board Random Vibration-Based Wheel Tread Depth Estimation.
- Author
-
Iliopoulos, Ilias A. and Sakellariou, John S.
- Subjects
- *
REMAINING useful life , *RAILROAD trains , *MONTE Carlo method , *TIME series analysis , *WHEELS - Abstract
The problem of remaining useful life estimation (RULE) of hollow worn railway vehicle wheels in terms of remaining mileage via wheel tread depth estimation using on-board vibration signals from a single accelerometer on the bogie frame is presently investigated. This is achieved based on the introduction of a statistical time series method that employs: (i) advanced data-driven stochastic Functionally Pooled models for the modeling of the vehicle dynamics under different wheel tread depths in a range of interest until a critical limit, as well as tread depth estimation through a proper optimization procedure, and (ii) a wheel tread depth evolution function with respect to the vehicle running mileage that interconnects the estimated hollow wear with the remaining useful mileage. The method's RULE performance is investigated via hundreds of Simpack-based Monte Carlo simulations with an Attiko Metro S.A. vehicle and many hollow worn wheels scenarios which are not used for the method's training. The obtained results indicate the accurate estimation of the wheels tread depth with a mean absolute error of ∼0.07 mm that leads to a corresponding small error of ∼3% with respect to the wheels remaining useful mileage. In addition, the comparison with a recently introduced Multiple Model (MM)-based multi-health state classification method for RULE, demonstrates the better performance of the postulated method that achieves 81.17% True Positive Rate (TPR) which is significantly higher than the 45.44% of the MM method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
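The final step described above, mapping an estimated tread depth to remaining mileage through a wear evolution function, can be sketched with a toy linear wear law; the functional form and every number below are illustrative assumptions, not values from the study:

```python
def remaining_useful_mileage(depth_est_mm, critical_depth_mm=3.0,
                             wear_rate_mm_per_km=2e-5):
    """Invert a hypothetical linear wear-evolution function
    depth(m) = wear_rate * m to obtain the mileage remaining until
    the tread depth reaches its critical limit."""
    if depth_est_mm >= critical_depth_mm:
        return 0.0                   # already at or past the limit
    return (critical_depth_mm - depth_est_mm) / wear_rate_mm_per_km

# A wheel estimated at 2.3 mm hollow wear against a 3.0 mm critical limit:
print(remaining_useful_mileage(2.3))     # km of remaining useful life
```

In the paper the depth estimate comes from the vibration-based optimization step, and the wear evolution function is fitted rather than assumed linear.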
49. Combining Karhunen-Loève expansion and stochastic modeling for probabilistic delineation of well capture zones in heterogeneous aquifers.
- Author
-
Gao, Wenfeng, Shao, Guangyu, Zhu, Tengqiao, Jiang, Simin, Yang, Yun, and Zhai, Yuanzheng
- Subjects
STOCHASTIC models ,AQUIFERS ,HYDRAULIC conductivity ,WATER quality ,WATER supply ,HYDROGEOLOGY - Abstract
The delineation of well capture zones (WCZs), particularly for water supply wells, is of utmost importance to ensure water quality. This task requires a comprehensive understanding of the aquifer's hydrogeological parameters for precise delineation. However, the inherent uncertainty associated with these parameters poses a significant challenge. Traditional deterministic methods bear inherent risks, emphasizing the demand for more resilient and probabilistic techniques. This study introduces a novel approach that combines the Karhunen-Loève expansion (KLE) technique with stochastic modeling to probabilistically delineate well capture zones in heterogeneous aquifers. Through numerical examples involving moderate and strong heterogeneity, the effectiveness of KLE dimension reduction and the reliability of stochastic simulations are explored. The results show that increasing the number of KL-terms significantly improves the statistical attributes of the samples. When employing more KL-terms, the statistical properties of the hydraulic conductivity field outperform those of cases with fewer KL-terms. Notably, in scenarios of strong heterogeneity, achieving a convergent probabilistic WCZs map requires a greater number of KL-terms and stochastic simulations compared to cases with moderate heterogeneity. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
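The truncated Karhunen-Loève expansion used above can be sketched for a one-dimensional Gaussian log-conductivity field, where the discrete KLE is simply the eigendecomposition of the covariance matrix; the exponential covariance and all parameters are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def kle_sample(n_pts=200, corr_len=0.2, sigma2=1.0, n_terms=20, seed=0):
    """Draw one realization of a zero-mean Gaussian log-conductivity
    field on [0, 1] from a truncated Karhunen-Loeve expansion with
    exponential covariance C(r) = sigma2 * exp(-r / corr_len)."""
    x = np.linspace(0.0, 1.0, n_pts)
    C = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    # Discrete KLE: eigenpairs of the covariance matrix, largest first.
    vals, vecs = np.linalg.eigh(C)
    vals, vecs = vals[::-1], vecs[:, ::-1]
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(n_terms)          # independent N(0, 1) weights
    field = vecs[:, :n_terms] @ (np.sqrt(vals[:n_terms]) * xi)
    captured = vals[:n_terms].sum() / vals.sum()
    return field, captured

field, captured = kle_sample()
print(f"variance captured by 20 terms: {captured:.3f}")
```

Raising `n_terms` increases the fraction of field variance captured, mirroring the paper's observation that more KL-terms improve the statistical fidelity of the samples.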
50. Overlap times in the infinite server queue.
- Author
-
Palomo, Sergio and Pender, Jamol
- Subjects
- *
UNITS of time , *CONSUMERS , *GROCERY shopping , *QUEUING theory , *GROCERY industry - Abstract
Imagine, you enter a grocery store to buy food. How many people do you overlap with in this store? How much time do you overlap with each person in the store? In this paper, we answer these questions by studying the overlap times between customers in the infinite server queue. We compute in closed form the steady-state distribution of the overlap time between a pair of customers and the distribution of the number of customers that an arriving customer will overlap with. Finally, we define a residual process that counts the number of overlapping customers that overlap in the queue for at least δ time units and compute its distribution. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
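The pairwise overlap time studied above is min(d_i, d_j) − max(a_i, a_j) for arrival times a and departure times d. A Monte Carlo sketch for an M/M/∞ queue, restricted to consecutive arrival pairs; the paper's closed-form steady-state distributions are not reproduced here, and the rates are arbitrary:

```python
import random

def overlap_times(n_customers=10_000, lam=1.0, mu=0.5, seed=7):
    """Simulate an M/M/infinity queue (Poisson arrivals at rate lam,
    exponential service at rate mu, no waiting) and collect the positive
    overlap times between consecutive customers i and i + 1."""
    rng = random.Random(seed)
    t = 0.0
    intervals = []
    for _ in range(n_customers):
        t += rng.expovariate(lam)                       # arrival epoch
        intervals.append((t, t + rng.expovariate(mu)))  # (arrival, departure)
    overlaps = []
    for (a1, d1), (a2, d2) in zip(intervals, intervals[1:]):
        o = min(d1, d2) - max(a1, a2)
        if o > 0:
            overlaps.append(o)
    return overlaps

ov = overlap_times()
print(len(ov), sum(ov) / len(ov))
```

For consecutive pairs the simulation is easy to sanity-check: by memorylessness a positive overlap occurs with probability lam / (lam + mu), and its conditional distribution is exponential with rate 2 mu.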