695 results for "Stochastic Simulation"
Search Results
2. Existence of a Periodic and Seasonal INAR Process.
- Author
- Ispány, Márton, Bondon, Pascal, Reisen, Valdério Anselmo, and Prezotti Filho, Paulo Roberto
- Subjects
- AUTOREGRESSIVE models, MOVING average process, IMMIGRANTS
- Abstract
A spectral criterion involving the model parameters is given for the existence and uniqueness of a periodically correlated and seasonal non‐negative integer‐valued autoregressive process. The structure of the mean and covariance functions of the periodically stationary distribution of the model is derived using its implicit state‐space representation. Two infinite series representations of the process, the moving average representation and the immigrant generation representation, are established. Based on the latter representation, a novel and parallelizable simulation method is proposed to generate the process. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Impact of ground motion uncertainty evolution from post-earthquake data on building damage assessment.
- Author
- Lozano, Jorge-Mario, Tien, Iris, Nichols, Elliot, and Frost, J. David
- Subjects
- GROUND motion, EMERGENCY management, EARTHQUAKES, GEOLOGICAL surveys, ACQUISITION of data
- Abstract
Accurate damage assessment after an earthquake is crucial for effective emergency response. Using ground motion information enables rapid building damage assessment when detailed damage data are unavailable. While uncertainty in earthquake parameters plays a significant role in the accuracy of rapid estimations, it is usually treated as a constant parameter rather than as a dynamic one that reflects the amount of ground motion data collected, which evolves over time. This work investigates the impact of incorporating evolving ground motion uncertainty in ground motion estimations from the US Geological Survey's (USGS) ShakeMap on post-disaster damage assessments from two methodologies: the revised Thiel–Zsutty (TZR) model and the Federal Emergency Management Agency's (FEMA) Hazus. Using data from the 2020 Indios earthquake in Puerto Rico and the 2014 Napa earthquake, we find that changes in uncertainty in estimates of peak ground acceleration reach 65% between early and late versions of the ShakeMap. We propose a process to integrate this evolution with the two damage assessment methodologies through a Monte Carlo simulation-based approach, demonstrating that it is critical to introduce dynamic ground motion uncertainty in the damage assessment process to avoid propagating unreliable measures. Both methodologies show that resulting damage estimates can be characterized by narrower distributions, indicative of reduced uncertainty and increased precision. For the TZR model, an improved estimate of post-disaster loss is achieved with narrower bounds on distributions of expected high-scenario loss. For Hazus, the results show potential changes in the most probable damage state, with an average change of 13%. The described methodology also demonstrates how uncertainty in the resulting damage state distributions can be reduced compared with the use of the current Hazus methodology. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
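The Monte Carlo propagation of evolving ShakeMap uncertainty described in this abstract can be illustrated with a minimal sketch. The lognormal PGA scatter and the capped linear "fragility" mapping PGA to a damage ratio are placeholder assumptions for illustration, not the TZR or Hazus models:

```python
import math
import random
import statistics

def simulate_damage(pga_median, sigma_ln, n=10_000, seed=0):
    """Sample damage ratios under lognormal PGA uncertainty.

    The capped linear fragility below is a toy stand-in for a real
    damage model such as TZR or Hazus."""
    rng = random.Random(seed)
    damage = []
    for _ in range(n):
        pga = pga_median * math.exp(rng.gauss(0.0, sigma_ln))  # lognormal PGA (g)
        damage.append(min(1.0, 0.5 * pga))  # toy fragility: ratio grows with PGA
    return damage

# Early vs. late ShakeMap versions: same median PGA, reduced uncertainty.
early = simulate_damage(pga_median=0.4, sigma_ln=0.6)
late = simulate_damage(pga_median=0.4, sigma_ln=0.6 * 0.35)  # illustrative reduction

# Propagating the smaller late-version uncertainty yields a narrower
# damage distribution, mirroring the paper's qualitative finding.
assert statistics.stdev(late) < statistics.stdev(early)
```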
4. Translation regulation by RNA stem-loops can reduce gene expression noise.
- Author
- Çelik, Candan, Bokes, Pavol, and Singh, Abhyudai
- Subjects
- HAIRPIN (Genetics), GENETIC translation, GENE expression, STOCHASTIC models, PROTEIN expression
- Abstract
Background: Stochastic modelling plays a crucial role in comprehending the dynamics of intracellular events in various biochemical systems, including gene-expression models. Cell-to-cell variability arises from the stochasticity or noise in the levels of gene products such as messenger RNA (mRNA) and protein. The sources of noise can stem from different factors, including structural elements. Recent studies have revealed that the mRNA structure can be more intricate than previously assumed. Results: Here, we focus on the formation of stem-loops and present a reinterpretation of previous data, offering new insights. Our analysis demonstrates that stem-loops that restrict translation have the potential to reduce noise. Conclusions: In conclusion, we investigate a structured/generalised version of a stochastic gene-expression model, wherein mRNA molecules can be found in one of a finite number of distinct states and transition between them. By characterising and deriving non-trivial analytical expressions for the steady-state protein distribution, we provide two specific examples which can be readily obtained from the structured/generalised model, showcasing the model's practical applicability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Can cooperation reduce yield risks associated with infectious diseases in shrimp aquaculture in Vietnam?
- Author
- Lien, Ho Hong, de Mey, Yann, Nhan, Dang Kieu, Bush, Simon, and Meuwissen, Miranda P.M.
- Subjects
- SHRIMP diseases, FARM risks, INFORMATION sharing, SYSTEMIC risk (Finance), COMMUNICABLE diseases
- Abstract
Infectious diseases are a major threat to Asian shrimp aquaculture, as they proliferate at system level rather than only the individual level. We assess the impact of various forms of cooperation among Vietnamese farmers on yield risks caused by white spot disease and acute hepatopancreatic necrosis disease. Using a stochastic simulation model, we simulate shrimp farming yield risks based on input from two expert workshops. The results provide a relative comparison of expected yield losses caused by both diseases comparing a baseline scenario (no cooperation) and three scenarios with varying degrees of synchronization and information sharing across farms. Results show lower expected yield losses in all three cooperation scenarios in comparison with the farm-based scenario, highlighting the value of synchronization and information sharing practices to mitigate yield losses. We discuss the potential this has to reduce systemic risks in aquaculture, thereby potentially incentivizing the reintroduction of risk-sharing mechanisms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
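The scenario comparison run with the stochastic simulation model can be mimicked with a deliberately simple contagion sketch; the farm count, spread probabilities, and the assumption that cooperation halves the farm-to-farm spread probability are all hypothetical illustration values, not the expert-elicited inputs of the study:

```python
import random

def expected_yield_loss(spread_prob, n_farms=20, n_sims=5_000, seed=1):
    """Mean fraction of farms losing their crop when an outbreak starts on
    one farm and spreads independently to each other farm with probability
    `spread_prob` (a deliberately simple contagion model)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        lost = 1  # the index farm is always affected
        for _ in range(n_farms - 1):
            if rng.random() < spread_prob:
                lost += 1
        total += lost / n_farms
    return total / n_sims

baseline = expected_yield_loss(spread_prob=0.30)     # no cooperation
cooperation = expected_yield_loss(spread_prob=0.15)  # synchronization + info sharing
assert cooperation < baseline  # cooperation lowers expected yield losses
```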
6. Physically adjusted ground motion prediction equations for induced seismicity at Preston New Road, UK.
- Author
- Suroyo, Pungky Megasari, Sunny, Jaleena, and Edwards, Benjamin
- Abstract
Predicting ground motions due to induced seismicity is a challenging task owing to the scarcity of data and heterogeneity of the uppermost crust. Dealing with this requires a thorough understanding of the underlying physics and consideration of inter-site variability. The most common ground motion model used in practice is the parametric ground motion prediction equation (GMPE), of which hundreds exist in the literature. However, relatively few are developed with a focus on induced seismicity. Developing GMPEs that are specific to an appropriate magnitude-distance range (R < 30 km; 2 ≤ M ≤ 6) is important for induced seismicity applications. This paper proposes a framework for the development of physically-based GMPEs to provide more accurate and reliable estimates of the potential induced-seismicity ground motion hazard, allowing for better risk assessment and management strategies. To demonstrate this approach, a new set of GMPEs for the 2018-2019 induced seismicity sequence at the Preston New Road (PNR) shale gas site near Blackpool, United Kingdom, is presented. The physically-based GMPE was developed based on a pseudo-finite-fault stochastic ground motion simulation, calibrated with parameters derived from the spectral analysis of weak-motion records from induced seismic events. An optimization-based calibration technique using the area metric (AM) was subsequently performed to calibrate optimal parameters for simulating ground motion at the PNR site. Finally, using a suite of forward simulations for events with 1 ≤ M ≤ 6 recorded at distances up to 30 km, combined with empirical data, a location-specific GMPE was derived through adjustment of an existing model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
7. Assessment of Slope Stability Taking into Account the Uncertainty of Standard Seismic Impacts and Material Parameters of the Site.
- Author
- Konovalov, A. V. and Gensiorovsky, Yu. V.
- Subjects
- SLOPES (Soil mechanics), ENGINEERING geology, ACCOUNTING standards, EARTHQUAKES, LANDSLIDES, SLOPE stability
- Abstract
Assessing the risk of landslides as a result of intense seismic vibrations is an urgent problem in engineering geology. In this article, to assess slope stability, a method is proposed for normalizing the internal deformation of a slope, based on a probabilistic-stochastic approach. The cumulative Newmark displacement is considered as a normalized value, for which empirical relationships have been selected among the accumulated displacement, the level of seismic impact, and the critical acceleration specified by the material parameters of the slope. The method takes into account the uncertainties in the position of possible earthquake sources in the next 50 years in the vicinity of the studied slope, the magnitude of the event(s), and the level of seismic impact. The uncertainties in the physicomechanical parameters of the slope are also taken into account. Normative displacements are estimated using reference exceedance probabilities (10 and 5%). The obtained values are compared with the threshold characteristics, and based on this a decision is made on the stability of the slope under seismic loads. The value of 10 cm was taken as the lower threshold at which the slope can be considered stable. The technique was successfully tested on a well-studied area of the western slope of Mt. Bolshevik (the south of Sakhalin Island). The balanced estimate of the normative displacement for a 5% exceedance probability was slightly less than 10 cm. The study also provides recommendations for further improvement of the methodology. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
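A minimal sketch of the probabilistic Newmark-displacement check follows. The regression coefficients, the lognormal scatter, and the input accelerations are invented for illustration and are not the empirical relationships selected in the paper; only the decision rule (compare the displacement at a reference exceedance probability against the 10 cm threshold) follows the abstract:

```python
import math
import random

def newmark_displacement(pga, a_crit, rng):
    """Toy empirical relation: displacement (cm) grows with the ratio of
    peak ground acceleration to critical acceleration, with lognormal
    scatter. Coefficients are illustrative only."""
    ln_d = 2.0 * math.log(pga / a_crit) + rng.gauss(0.0, 0.5)
    return math.exp(ln_d)

rng = random.Random(42)
samples = sorted(newmark_displacement(pga=0.3, a_crit=0.15, rng=rng)
                 for _ in range(20_000))
d10 = samples[int(0.90 * len(samples))]  # displacement at 10% exceedance probability
d5 = samples[int(0.95 * len(samples))]   # displacement at 5% exceedance probability
stable = d5 < 10.0  # decision against the 10 cm stability threshold
```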
8. Genomic-inferred cross-selection methods for multi-trait improvement in a recurrent selection breeding program.
- Author
- Atanda, Sikiru Adeniyi and Bandillo, Nonoy
- Subjects
- PLANT breeding, LEGUMES, SEXUAL cycle, GENETIC drift, HAPLOIDY
- Abstract
The major drawback to the implementation of genomic selection in a breeding program lies in the long-term decrease in additive genetic variance, a trade-off for rapid genetic improvement in the short term. Balancing increases in genetic gain with retention of additive genetic variance necessitates careful optimization of this trade-off. In this study, we proposed an integrated index selection approach within the genomic inferred cross-selection (GCS) framework to maximize genetic gain across multiple traits. With this method, we identified optimal crosses that simultaneously maximize progeny performance and maintain genetic variance for multiple traits. Using a stochastically simulated recurrent breeding program over a 40-year period, we evaluated different GCS methods along with other factors, such as the number of parents, crosses, and progeny per cross, that influence genetic gain in a pulse crop breeding program. Across all breeding scenarios, the posterior mean variance consistently enhances genetic gain when compared to other methods, such as the usefulness criterion, optimal haploid value, mean genomic estimated breeding value, and mean index selection value of the superior parents. In addition, we provide a detailed strategy to optimize the number of parents, crosses, and progeny per cross that can potentially maximize short- and long-term genetic gain in a public breeding program. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
9. Relaxation and Noise-Driven Oscillations in a Model of Mitotic Spindle Dynamics.
- Author
- Hargreaves, Dionn, Woolner, Sarah, and Jensen, Oliver E.
- Abstract
During cell division, the mitotic spindle moves dynamically through the cell to position the chromosomes and determine the ultimate spatial position of the two daughter cells. These movements have been attributed to the action of cortical force generators which pull on the astral microtubules to position the spindle, as well as pushing events by these same microtubules against the cell cortex and plasma membrane. Attachment and detachment of cortical force generators working antagonistically against centring forces of microtubules have been modelled previously (Grill et al. in Phys Rev Lett 94:108104, 2005) via stochastic simulations and mean-field Fokker–Planck equations (describing random motion of force generators) to predict oscillations of a spindle pole in one spatial dimension. Using systematic asymptotic methods, we reduce the Fokker–Planck system to a set of ordinary differential equations (ODEs), consistent with a set proposed by Grill et al., which can provide accurate predictions of the conditions for the Fokker–Planck system to exhibit oscillations. In the limit of small restoring forces, we derive an algebraic prediction of the amplitude of spindle-pole oscillations and demonstrate the relaxation structure of nonlinear oscillations. We also show how noise-induced oscillations can arise in stochastic simulations for conditions in which the mean-field Fokker–Planck system predicts stability, but for which the period can be estimated directly by the ODE model and the amplitude by a related stochastic differential equation that incorporates random binding kinetics. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Site-specific traffic modelling and simulation for a major Italian highway based on weigh-in-motion systems accounting for gross vehicle weight limitations.
- Author
- Ravichandran, Nagavinothini, Losanno, Daniele, Pecce, Maria Rosaria, and Parisi, Fulvio
- Abstract
The present-day road traffic with the persistent change in the type and volume of vehicles needs to be specifically investigated for effective safety management of aging highway infrastructures. Actual traffic data can be implemented in refined procedures for stochastic simulation of road infrastructure performance, structural health monitoring (SHM), definition of weight limits on highways, and traffic-informed structural safety checks. While weigh-in-motion (WIM) systems have been widely used in many countries, their installation on Italian highways was mostly discussed and carried out only after the catastrophic collapse of the Polcevera bridge in 2018. This study presents a statistical data analysis, probabilistic models, and a simulation procedure for highway traffic, based on measurements of two WIM systems located along European route E45 close to Naples, Italy. Different limitations to maximum gross vehicle weight (GVW) were enforced at the locations of the two WIM systems, according to the Italian road code and the Italian guidelines for risk classification, safety assessment and monitoring of existing bridges, respectively. WIM data sets were filtered to exclude erroneous traffic data, and vehicle classes defined according to the number of axles and axle distance were statistically characterised, allowing the derivation of probabilistic models for all traffic parameters of interest. A simulation methodology to generate random traffic load from the WIM data is also presented for its possible use in probabilistic performance assessment and traffic-informed SHM of road infrastructures such as bridges. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. Optimized Selection Method of Air Combat Course of Action under Stochastic Uncertainty.
- Author
- Zhong, Yun, Zhang, Jieyong, Sun, Peng, Wan, Lujun, and Wang, Kepeng
- Abstract
Aiming at the design problem of an aviation swarm combat course of action (COA), and considering the influence of stochastic parameters in the causal relationship model and the optimization problem model, this paper proposes an optimized COA selection method for aviation swarm combat based on dynamic influence net (DIN) theory, stochastic simulation, feedforward neural network (FNN) function approximation, and the multi-objective artificial fish school algorithm (MOAFSA), framed as multi-objective stochastic chance-constrained optimization. First, on the basis of establishing the overall framework of the model and defining the elements of causal relationship modeling, static and dynamic causal relationship modeling and optimization problem modeling were carried out respectively. Second, the probability propagation mechanism of the DIN was established, covering both the overall process and the specific algorithm. Then, input and output data were generated by stochastic simulation; the FNN was fitted to these data for function approximation, and MOAFSA was adopted for iterative optimization. Finally, the rationality of the model and the effectiveness and superiority of the algorithm were verified through multiple sets of simulation cases. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. Evaluation of urban environmental sustainability based on the integration of multi-improvement demands: a case study of Liaoning Province, China.
- Author
- Zhou, Ying, Yu, Miao, Tian, Shen, and Gong, Chengju
- Subjects
- SUSTAINABLE urban development, CITIES & towns, SUSTAINABILITY, ENVIRONMENTAL indicators, POLLUTION, ECONOMIC systems, ENVIRONMENTAL protection, SOCIAL systems
- Abstract
Evaluation of urban environmental sustainability is an important prerequisite for scientific decision-making on urban sustainability. This paper proposes an evaluation method for urban environmental sustainability based on the integration of multiple improvement demands. Considering the interaction of the economic and social systems with the environmental system, an urban environmental sustainability evaluation indicator system was constructed along three dimensions: environmental foundation, environmental pollution, and environmental protection. A weighting method reflecting the actual improvement demands of each city was then constructed, and the weights of each city's individualized improvement demands were obtained through analysis of the indicator data. Further, to account for the fairness and guidance of the evaluation, a corresponding random simulation algorithm was matched to the individualized weights, and the evaluation value and superiority probability in a stable state after multiple simulations were obtained. On this basis, an empirical evaluation of the urban environmental sustainability of the 14 cities in Liaoning Province, China, was conducted. Through the demand-driven evaluation model, cities can obtain improvement suggestions that meet their demands while understanding their environmental sustainability level, facilitating both top-down policy guidance and bottom-up practical exploration. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. A Method of Integrating Air Conditioning Usage Models to Building Simulations for Predicting Residential Cooling Energy Consumption.
- Author
- Ao, Jingyun, Du, Chenqiu, Jing, Mingyi, Li, Baizhan, and Chen, Zhaoyang
- Subjects
- ENERGY consumption of buildings, HOME energy use, MONTE Carlo method, REGRESSION analysis, AIR conditioning, ENERGY consumption
- Abstract
Great deviations in building energy consumption simulation are attributed to the simplified settings of occupants' air conditioning (AC) usage schedules. This study developed a method to quantify the uncertainty and randomness of AC usage behavior and incorporate the resulting model into simulations, in order to improve the prediction of AC energy consumption. Based on long-term onsite monitoring of household thermal environments and AC usage patterns, two stochastic models were built using unsupervised clustering and statistical methods. Using the Monte Carlo method, AC operation schedules were generated from the AC operating duration, setpoints, and other relevant parameters, and were further incorporated into EnergyPlus. The results show that the ideally deterministic AC operation settings from the standard significantly overestimate cooling energy consumption: the value based on the fixed mode was 6.35 times higher. The distribution of daily AC energy consumption based on the stochastic modeling was highly consistent with the actual situation, thanks to the accurate prediction of the randomness and dynamics of residents' AC usage patterns. The total cooling energy consumption based on the two stochastic models was much closer to the actual values. The work proposes a method of embedding stochastic AC usage models into EnergyPlus 22.1, benefiting building energy consumption simulation and the evaluation of energy efficiency with regard to occupant behavior in the future. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
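Generating a stochastic AC operation schedule as Monte Carlo input, as the study does before handing schedules to EnergyPlus, might look roughly like this. The start-hour, duration, and setpoint distributions and all parameter values are invented placeholders, not the models fitted to the monitored household data:

```python
import random

def generate_ac_schedule(n_days=30, seed=7):
    """Draw a daily AC operation schedule: start hour, operating duration
    (hours), and cooling setpoint (deg C), each sampled from illustrative
    distributions standing in for ones fitted to monitored data."""
    rng = random.Random(seed)
    schedule = []
    for _ in range(n_days):
        start = rng.randint(18, 22)               # evening start hour
        duration = max(0.5, rng.gauss(5.0, 2.0))  # hours of operation
        setpoint = rng.choice([24, 25, 26, 27])   # deg C
        schedule.append({"start": start, "hours": duration, "setpoint": setpoint})
    return schedule

# Each Monte Carlo run would feed one such schedule into the building model.
schedule = generate_ac_schedule()
```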
14. Abstraction-based segmental simulation of reaction networks using adaptive memoization.
- Author
- Helfrich, Martin, Andriushchenko, Roman, Češka, Milan, Křetínský, Jan, Martiček, Štefan, and Šafránek, David
- Abstract
Background: Stochastic models are commonly employed in systems and synthetic biology to study the effects of stochastic fluctuations emanating from reactions involving species with low copy-numbers. Many important models feature complex dynamics, involving state-space explosion, stiffness, and multimodality, that complicate the quantitative analysis needed to understand their stochastic behavior. Direct numerical analysis of such models is typically not feasible, and generating many simulation runs that adequately approximate the model’s dynamics may take a prohibitively long time. Results: We propose a new memoization technique that leverages a population-based abstraction and combines previously generated parts of simulations, called segments, to generate new simulations more efficiently while preserving the original system’s dynamics and its diversity. Our algorithm adapts online to identify the most important abstract states and thus utilizes the available memory efficiently. Conclusion: We demonstrate that, in combination with a novel fully automatic and adaptive hybrid simulation scheme, we can speed up the generation of trajectories significantly and correctly predict the transient behavior of complex stochastic systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. An Approximate Algorithm for Simulating Stationary Discrete Random Processes with Bivariate Distributions of Their Consecutive Components in the Form of Mixtures of Gaussian Distributions.
- Author
- Ogorodnikov, V. A., Akenteva, M. S., and Kargapolova, N. A.
- Abstract
The paper presents an approximate algorithm for modeling a stationary discrete random process with marginal and bivariate distributions of its consecutive components in the form of a mixture of two Gaussian distributions. The algorithm is based on a combination of the conditional distribution method and the rejection method. An example of the application of the proposed algorithm to simulating time series of daily maximum air temperatures is given. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
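A crude sketch combining the two ingredients named in the abstract, mixture sampling and a rejection step, follows. It does not reproduce the paper's conditional-distribution construction, and the simple distance-based rejection rule distorts the marginal slightly; all weights, means, and the dependence parameter are illustrative:

```python
import math
import random

def sample_mixture(rng, w0, means, sds):
    """Draw one value from a two-component Gaussian mixture."""
    i = 0 if rng.random() < w0 else 1
    return rng.gauss(means[i], sds[i])

def simulate_series(n, rho=0.6, seed=3):
    """Stationary-looking series with a two-component Gaussian-mixture
    marginal. Dependence between consecutive values is imposed by
    rejection: a candidate is accepted with probability that increases as
    it gets closer to the previous value (a crude stand-in for the
    paper's conditional-distribution step)."""
    rng = random.Random(seed)
    w0, means, sds = 0.5, (-2.0, 2.0), (1.0, 1.0)
    x = [sample_mixture(rng, w0, means, sds)]
    while len(x) < n:
        cand = sample_mixture(rng, w0, means, sds)
        accept_prob = math.exp(-rho * (cand - x[-1]) ** 2 / 2)
        if rng.random() < accept_prob:
            x.append(cand)
    return x

# E.g. a synthetic stand-in for a daily maximum air temperature series.
series = simulate_series(1000)
```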
16. Population Viability Analysis for Two Species of Imperiled Freshwater Turtles.
- Author
- Gregory, Kaili M., Darst, Cat, Lantz, Samantha M., Powelson, Katherine, Ashton, Don, Fisher, Robert, Halstead, Brian J., Hubbs, Brian, Lovich, Jeffrey E., and McGowan, Conor P.
- Subjects
- POPULATION viability analysis, TURTLE populations, TURTLES, LIFE history theory, BIOLOGICAL extinction, POPULATION dynamics, EMYDIDAE
- Abstract
In the first range-wide population viability model for the northwestern and southwestern pond turtles (Actinemys marmorata and Actinemys pallida, respectively), a stage-based population projection matrix was assembled with 3 life stages: hatchling, juvenile, and adult. Vital rates were defined using biologically appropriate statistical distributions, with additional parametric uncertainty included for the adult survival parameter. A triple-loop stochastic simulation model was built around a population viability analysis to project pond turtle populations into the future. Initial abundance was calculated using available historical presence data and remotely sensed landscape condition metrics. A negative binomial regression was used to predict the relationship between abundance, habitat area, and human modification. Populations of pond turtles are dominated by adult individuals, so we applied a nonstable stage distribution to initial abundance values. Initial abundances of analysis units were variable across the species' ranges, but all populations declined precipitously in the population projections. By the end of the century, the mean range-wide probability of extinction was 44.3% for the northwestern species and 57.8% for the southwestern species. Consistent with other long-lived chelonian species, population growth rate was most sensitive to adult survival, indicating that where possible, conservation efforts focusing on increasing or maintaining adult survival would benefit the species. Elasticity analysis indicated a bet-hedging life history strategy where long-term reproductive output is maximized through longevity, small clutches, and frequent reproductive bouts in the face of highly variable juvenile survival. The population dynamics presented here indicate that efforts to bolster adult survival would be most beneficial in terms of long-term population viability, which can inform targeted research and management. The feasibility of such efforts is an important consideration in conservation management for these long-lived species. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. CorrToolBox: an R package for modeling correlational magnitude transformations in discretization contexts.
- Author
- Gao, R. and Demirtas, H.
- Subjects
- FEASIBILITY studies
- Abstract
This article describes the R package CorrToolBox, which is designed for modeling the correlation transitions under specified distributional assumptions within the realm of discretization in the context of the latency and threshold concepts. The practical utility and functionality of the package are demonstrated by several illustrative examples. In addition, the package's feasibility and performance are evaluated via simulation studies using synthetic mixed data with a range of marginal distributions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. Exploring substitution random functions composed of stationary multi-Gaussian processes.
- Author
- Straubhaar, Julien and Renard, Philippe
- Subjects
- STATIONARY processes, GAUSSIAN processes, RANDOM fields, GIBBS sampling, STOCHASTIC processes, POINT processes, VARIOGRAMS
- Abstract
Simulation of random fields is widely used in Earth sciences for modeling and uncertainty quantification. The spatial features of these fields may have a strong impact on the forecasts made using these fields. For instance, in flow and transport problems the connectivity of the permeability fields is a crucial aspect. Multi-Gaussian random fields are the most common tools to analyze and model continuous fields. Their spatial correlation structure is described by a covariance or variogram model. However, these types of spatial models are unable to represent highly or poorly connected structures even if a broad range of covariance models can be employed. With this type of model, the regions with values close to the mean are always well connected whereas the regions of low or high values are isolated. Substitution random functions (SRFs) belong to another broad class of random functions that are more flexible. SRFs are constructed by composing (Z = Y ∘ T) two stochastic processes: the directing function T (latent field) and the coding process Y (modifying the latent field in a stochastic manner). In this paper, we study the properties of SRFs obtained by combining stationary multi-Gaussian random fields for both T and Y with bounded variograms. The resulting SRFs Z are stationary, but as T has a finite variance, Z is not ergodic for the mean and the covariance. This means that single realizations behave differently from each other. We propose a simple technique to control which values (low, intermediate, or high) are connected. It consists of adding a control point on the process Y to guide every single realization. The conditioning to local values is obtained using a Gibbs sampler. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
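The composition Z = Y ∘ T can be sketched in one dimension. The moving-average fields below are crude stand-ins for proper multi-Gaussian simulations with bounded variograms, the min-max rescaling of T into the index space of Y is a simplification, and no control point or Gibbs-sampler conditioning is attempted:

```python
import math
import random

def gaussian_field_1d(n, scale, rng):
    """Crude stationary field: moving average of white noise (a stand-in
    for a proper multi-Gaussian simulation with a bounded variogram)."""
    noise = [rng.gauss(0.0, 1.0) for _ in range(n + 2 * scale)]
    return [sum(noise[i:i + 2 * scale + 1]) / math.sqrt(2 * scale + 1)
            for i in range(n)]

def substitution_rf(n=200, seed=5):
    """Z = Y o T: the directing function T maps each location into the
    index space of the coding process Y."""
    rng = random.Random(seed)
    t = gaussian_field_1d(n, scale=10, rng=rng)   # directing function T (latent field)
    m = 1000
    y = gaussian_field_1d(m, scale=5, rng=rng)    # coding process Y
    lo, hi = min(t), max(t)
    # Rescale T into [0, m-1] and read Y there: Z(s) = Y(T(s)).
    return [y[int((ti - lo) / (hi - lo) * (m - 1))] for ti in t]

z = substitution_rf()
```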
19. Design wind speed values at typical bridge sites in China's coastal mixed strong-wind regions.
- Author
- 吴思哲, 方根深, 潘放, 胡小浓, 赵林, and 葛耀君
- Abstract
Copyright of Journal of Southeast University / Dongnan Daxue Xuebao is the property of Journal of Southeast University Editorial Office and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
20. Interlinked bi‐stable switches govern the cell fate commitment of embryonic stem cells.
- Author
- Giri, Amitava and Kar, Sandip
- Subjects
- EMBRYONIC stem cells, CELL determination, CELL differentiation, EMBRYOLOGY, WNT signal transduction, ENDODERM
- Abstract
The development of embryonic stem (ES) cells into the extraembryonic trophectoderm and primitive endoderm lineages manifests distinct steady‐state expression patterns of two key transcription factors, Oct4 and Nanog. How such steady‐state expression patterns are dynamically maintained remains elusive. Herein, we demonstrate that steady‐state dynamics involving two bistable switches, interlinked in a stepwise (Oct4) and a mushroom‐like (Nanog) manner, orchestrate the fate specification of ES cells. Our hypothesis qualitatively reconciles various experimental observations and elucidates how different feedback and feedforward motifs orchestrate the extraembryonic development and stemness maintenance of ES cells. Importantly, the model predicts strategies to optimize the dynamics of self‐renewal and differentiation of embryonic stem cells that may have therapeutic relevance in the future. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. Kinetic analysis of p53 gene network with time delays and PIDD.
- Author
- Huo, Ruimin, Liu, Nan, Yang, Hongli, and Yang, Liangui
- Subjects
- P53 antioncogene, GENE regulatory networks, PHARMACOKINETICS, HOPFIELD networks
- Abstract
p53 kinetics plays a key role in regulating cell fate. Based on the p53 gene regulatory network composed of the core regulatory factors ATM, Mdm2, Wip1, and PIDD, the effect of delays in the transcription and translation of Mdm2 and Wip1 on the dynamics of p53 is studied theoretically and numerically. The results show that these two time delays can affect the stability of the positive equilibrium: as the delays increase, the dynamics of p53 becomes oscillatory. Further, we also study the effects of PIDD and the chemotherapeutic drug etoposide on the kinetics of p53. The model indicates that (i) low-level PIDD expression does not significantly affect p53 oscillatory behavior, but high-level expression can induce two-phase kinetics of p53; (ii) both too high and too low concentrations of etoposide are not conducive to p53 oscillation. These results are in good agreement with experimental findings. Finally, we consider the influence of internal noise on the system through the Binomial τ-leap algorithm. Stochastic simulations reveal that high-intensity noise completely destroys the p53 dynamics of the deterministic model, whereas low-intensity noise does not alter them. Interestingly, for the stable focus, internal noise of appropriate intensity can induce quasi-limit-cycle oscillations of the system. Our work may provide useful insights for the development of anticancer therapy. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
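The binomial τ-leap idea mentioned in the abstract can be sketched on the simplest possible reaction set, a birth-death process (illustrative rates, not the p53 network): drawing deaths from a binomial rather than a Poisson guarantees the copy number never goes negative within a leap.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler (adequate for the small means used here)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def binomial(n, p, rng):
    return sum(rng.random() < p for _ in range(n))

def binomial_tau_leap(k_birth=10.0, c_death=0.1, x0=0, tau=0.1,
                      t_end=200.0, seed=1):
    """Birth-death process (0 -> X at rate k_birth, X -> 0 at rate c_death*X)
    advanced in leaps of size tau.  Births are Poisson; deaths are binomial
    over the current copy number, the property that motivates the binomial
    tau-leap over the plain Poisson version."""
    rng = random.Random(seed)
    x, t = x0, 0.0
    p_death = 1.0 - math.exp(-c_death * tau)
    while t < t_end:
        x += poisson(k_birth * tau, rng) - binomial(x, p_death, rng)
        t += tau
    return x

# The stationary mean of this process is k_birth / c_death = 100.
samples = [binomial_tau_leap(seed=s) for s in range(20)]
```

The same leaping scheme, applied per reaction channel, is what lets a stochastic model of a full gene network be simulated far faster than an exact event-by-event method.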
22. How network properties and epidemic parameters influence stochastic SIR dynamics on scale-free random networks.
- Author
-
Sottile, Sara, Kahramanoğulları, Ozan, and Sensi, Mattia
- Abstract
With the premise that social interactions are described by power-law distributions, we study the stochastic dynamics of SIR (Susceptible-Infected-Removed) compartmental models on static scale-free random networks generated via the configuration model. We compare simulations of our model to analytical results, providing a closed formula and a lower bound for the probability of having a minor epidemic of the disease. We explore the variability in disease spread by stochastic simulations. In particular, we demonstrate how important epidemic indices change as a function of the contagiousness of the disease and the connectivity of the network. Our results quantify the role of the starting node's degree in determining these indices, commonly used to describe epidemic spread. Our results and implementation set a baseline for studying epidemic spread on networks, showing how analytical methods can help in the interpretation of stochastic simulations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
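The pipeline the abstract describes, a configuration-model network with a heavy-tailed degree sequence plus stochastic SIR runs seeded at a chosen node, can be sketched with the standard library (all parameters are illustrative, and this discrete-time sketch is not the authors' implementation):

```python
import random

def configuration_model(degrees, rng):
    """Pair half-edge 'stubs' uniformly at random; self-loops and repeated
    edges are simply discarded in this sketch."""
    stubs = [node for node, d in enumerate(degrees) for _ in range(d)]
    rng.shuffle(stubs)
    adj = {i: set() for i in range(len(degrees))}
    for u, v in zip(stubs[::2], stubs[1::2]):
        if u != v:
            adj[u].add(v)
            adj[v].add(u)
    return adj

def sir_final_size(adj, beta, gamma, seed_node, rng):
    """Discrete-time stochastic SIR; returns the final number removed."""
    infected, removed = {seed_node}, set()
    while infected:
        new_inf, newly_removed = set(), set()
        for i in infected:
            for nb in adj[i]:
                if nb not in infected and nb not in removed and rng.random() < beta:
                    new_inf.add(nb)
            if rng.random() < gamma:
                newly_removed.add(i)
        removed |= newly_removed
        infected = (infected | new_inf) - removed
    return len(removed)

rng = random.Random(42)
# Heavy-tailed (truncated Pareto-like) degree sequence: an illustrative
# stand-in for a scale-free network, not a calibrated one.
degrees = [min(30, max(1, int(2.0 / (1.0 - rng.random()) ** 0.5)))
           for _ in range(500)]
if sum(degrees) % 2:          # the stub count must be even
    degrees[0] += 1
net = configuration_model(degrees, rng)
hub = max(net, key=lambda v: len(net[v]))   # seed at the highest-degree node
sizes = [sir_final_size(net, beta=0.4, gamma=0.3, seed_node=hub, rng=rng)
         for _ in range(50)]
```

Re-running with a low-degree `seed_node` instead of `hub` shows the starting node's degree effect the abstract quantifies: low-degree seeds produce far more minor epidemics.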
23. Discrete Event Systems Theory for Fast Stochastic Simulation via Tree Expansion.
- Author
-
Zeigler, Bernard P.
- Subjects
DISCRETE systems ,BINOMIAL coefficients ,STOCHASTIC systems ,SYSTEMS theory ,MARKOV processes ,TREES - Abstract
Paratemporal methods based on tree expansion have proven to be effective in efficiently generating the trajectories of stochastic systems. However, combinatorial explosion of branching arising from multiple choice points presents a major hurdle that must be overcome to implement such techniques. In this paper, we tackle this scalability problem by developing a systems theory-based framework covering both conventional and proposed tree expansion algorithms for speeding up discrete event system stochastic simulations while preserving the desired accuracy. An example is discussed to illustrate the tree expansion framework in which a discrete event system specification (DEVS) Markov stochastic model takes the form of a tree isomorphic to a free monoid over the branching alphabet. We derive the computation times for baseline, non-merging, and merging tree expansion algorithms to compute the distribution of output values at any given depth. The results show the remarkable reduction from exponential to polynomial dependence on depth effectuated by node merging. We relate these results to the similarly reduced computation time of binomial coefficients underlying Pascal's triangle. Finally, we discuss the application of tree expansion to estimating temporal distributions in stochastic simulations involving serial and parallel compositions with potential real-world use cases. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
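The node-merging reduction the abstract relates to Pascal's triangle can be demonstrated on the simplest branching alphabet, a binary choice repeated to a given depth (a minimal sketch, not the DEVS framework itself): merging nodes that share a state keeps each level at depth + 1 entries instead of 2^depth paths, while still producing the exact output distribution.

```python
from fractions import Fraction

def distribution_by_merging(depth, p=Fraction(1, 2)):
    """Output distribution after `depth` binary branch points, computed by
    merging tree nodes that share a state.  Each level then holds at most
    depth + 1 merged nodes instead of 2**level distinct paths, i.e. the
    Pascal's-triangle reduction from exponential to polynomial work."""
    level = {0: Fraction(1)}     # state (number of 'up' branches) -> probability
    for _ in range(depth):
        nxt = {}
        for ups, prob in level.items():
            nxt[ups + 1] = nxt.get(ups + 1, Fraction(0)) + prob * p
            nxt[ups] = nxt.get(ups, Fraction(0)) + prob * (1 - p)
        level = nxt
    return level

dist = distribution_by_merging(20)
```

The merged levels reproduce the binomial distribution exactly (here `dist[k]` equals C(20, k)/2^20) at quadratic rather than exponential cost in depth.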
24. Geospatial Analysis of Transmissivity and Uncertainty in a Semi-Arid Karst Region.
- Author
-
Gonçalves, Thiago dos Santos, Klammler, Harald, and Bastos Leal, Luíz Rogério
- Subjects
ARID regions ,VALLEYS ,AQUIFERS ,KARST ,WATER management ,STANDARD deviations - Abstract
Aquifer properties, such as hydraulic transmissivity T and its spatial variability, are fundamental for sustainable groundwater exploitation in arid regions. Especially in karst aquifers, spatial variability can be considerable, and the application of geostatistical methods allows for spatial interpolation and mapping based on observations combined with the quantification of uncertainties. Moreover, direct measurements of T are typically scarce, while those of specific capacity Sc are more frequent. In this study, we establish the linear regression relationship between the logarithms of T and Sc measured in 51 wells in a semi-arid karst region in Northeastern Brazil. This relationship is used to estimate empirical values logTemp based on measurements of logSc at 269 wells. LogTemp values are found to be normally distributed with an isotropic variogram showing a significant nugget effect (attributed to local-scale karst features) and a range of approximately 10 km (attributed to larger-scale gradual changes in karst feature density). Ordinary kriging cross-validation indicates an optimum number of 25 neighboring wells for interpolation, which is used in a conditional sequential Gaussian simulation (SGSIM) to generate 500 realizations of logTemp with respective maps of standard deviations and probabilities of (not) exceeding threshold values. High-transmissivity areas mostly coincide with karstified river valleys, while low-transmissivity areas occur toward the edges where aquifer thickness decreases. The resulting transmissivity maps are relevant for optimizing regional water management strategies, which includes stochastic approaches where transmissivity realizations can be used to parameterize multiple runs of numerical groundwater models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
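The first step of the workflow above, regressing logT on logSc at co-measured wells and then propagating the fit to wells with only specific-capacity data, can be sketched as follows (the data and coefficients are synthetic stand-ins, and the kriging/SGSIM stage is omitted):

```python
import random

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return my - b * mx, b

# Synthetic paired data standing in for the 51 wells with co-measured
# logT and logSc (the coefficients 0.5 and 0.9 are invented for illustration).
rng = random.Random(0)
log_sc = [rng.uniform(-1.0, 2.0) for _ in range(51)]
log_t = [0.5 + 0.9 * x + rng.gauss(0.0, 0.2) for x in log_sc]

a, b = fit_line(log_sc, log_t)
# Empirical transmissivity at a well where only specific capacity is known:
log_t_emp = a + b * 1.3
```

In the study, such logTemp values at the 269 wells become the conditioning data for the variogram fit and the 500 SGSIM realizations.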
25. Information Geometry Theoretic Measures for Characterizing Neural Information Processing from Simulated EEG Signals.
- Author
-
Hua, Jia-Chen, Kim, Eun-jin, and He, Fei
- Subjects
INFORMATION processing ,ALZHEIMER'S disease ,ELECTROENCEPHALOGRAPHY ,NEURAL codes ,PROBABILITY density function ,NONLINEAR oscillators - Abstract
In this work, we explore information geometry theoretic measures for characterizing neural information processing from EEG signals simulated by stochastic nonlinear coupled oscillator models for both healthy subjects and Alzheimer's disease (AD) patients under both eyes-closed and eyes-open conditions. In particular, we employ information rates to quantify the time evolution of probability density functions of simulated EEG signals, and employ causal information rates to quantify one signal's instantaneous influence on another signal's information rate. These two measures help us find significant and interesting distinctions between healthy subjects and AD patients when they open or close their eyes. These distinctions may be further related to differences in neural information processing activities of the corresponding brain regions, and to differences in connectivities among these brain regions. Our results show that information rate and causal information rate are superior to their more traditional or established information-theoretic counterparts, i.e., differential entropy and transfer entropy, respectively. Since these novel, information geometry theoretic measures can be applied to experimental EEG signals in a model-free manner, and they are capable of quantifying non-stationary time-varying effects, nonlinearity, and non-Gaussian stochasticity present in real-world EEG signals, we believe that they can form an important and powerful tool-set for both understanding neural information processing in the brain and diagnosing neurological disorders, such as Alzheimer's disease as presented in this work. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. A Soluble Model for the Conflict between Lying and Truth-Telling.
- Author
-
Vieira, Eduardo V. M. and Fontanari, José F.
- Subjects
SOCIAL learning ,LEARNING strategies ,SOCIAL systems ,INFORMATION resources ,IMITATIVE behavior - Abstract
Lying and truth-telling are conflicting behavioral strategies that pervade much of the lives of social animals and, as such, have always been topics of interest to both biology and philosophy. This age-old conflict is linked to one of the most serious threats facing society today, viz., the collapse of trustworthy sources of information. Here, we revisit this problem in the context of the two-choice sender–receiver game: the sender tosses a coin and reports the supposed outcome to the receiver, who must guess the true outcome of the toss. For the sender, the options are to lie or tell the truth, while for the receiver, the options are to believe or disbelieve the sender's account. We assume that social learning determines the strategy used by players and, in particular, that players tend to imitate successful individuals and thus change their strategies. Using the replicator equation formulation for infinite populations and stochastic simulations for finite populations, we find that when the sender benefits from the receiver's failure, the outcome of the game dynamics depends strongly on the choice of initial strategies. This sensitivity to the initial conditions may reflect the unpredictability of social systems whose members have antagonistic interests. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
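The infinite-population limit used in the abstract, the replicator equation, is easy to sketch, and even a generic two-strategy game shows the strong dependence on initial strategy frequencies the authors report (the payoff matrix below is illustrative, not the sender–receiver game itself):

```python
def replicator(x0, payoff, dt=0.01, steps=5000):
    """Single-population, two-strategy replicator dynamics:
    x' = x(1 - x)(f0 - f1), where fi is strategy i's expected payoff."""
    x = x0                      # share of the population playing strategy 0
    for _ in range(steps):
        f0 = x * payoff[(0, 0)] + (1 - x) * payoff[(0, 1)]
        f1 = x * payoff[(1, 0)] + (1 - x) * payoff[(1, 1)]
        x += dt * x * (1 - x) * (f0 - f1)
    return x

# A coordination-type payoff matrix (illustrative, not the paper's game).
# The interior equilibrium sits at x* = 1/3, so trajectories starting on
# either side of it converge to different pure-strategy states.
pay = {(0, 0): 2.0, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 1.0}
low = replicator(0.2, pay)    # below x*: strategy 0 dies out
high = replicator(0.5, pay)   # above x*: strategy 0 takes over
```

The paper's two-role version couples one such equation for sender strategies to one for receiver strategies; the sensitivity to initial conditions arises in the same way, from an interior equilibrium separating basins of attraction.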
27. 区间粗糙数群组 G1 法的随机聚合求解及应用.
- Author
-
梁媛媛, 刘军, 易平涛, and 李伟伟
- Abstract
Copyright of Journal of Northeastern University (Natural Science) is the property of Dongbei Daxue Xuebao and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
28. Parallel Adaptive Survivor Selection.
- Author
-
Pei, Linda, Nelson, Barry L., and Hunter, Susan R.
- Subjects
PROCESS capability ,MACHINE learning ,PARALLEL programming ,STATISTICAL errors ,NUMBER systems - Abstract
Ranking and selection (R&S) procedures in simulation optimization simulate every feasible solution to provide global statistical error control, often selecting a single solution in finite time that is optimal or near-optimal with high probability. By exploiting parallel computing advancements, large-scale problems with hundreds of thousands and even millions of feasible solutions are suitable for R&S. Naively parallelizing existing R&S methods originally designed for a serial computing setting is generally ineffective, however, as many of these conventional methods uphold family-wise error guarantees that suffer from multiplicity and require pairwise comparisons that present a computational bottleneck. Parallel adaptive survivor selection (PASS) is a new framework specifically designed for large-scale parallel R&S. By comparing systems to an adaptive "standard" that is learned as the algorithm progresses, PASS eliminates inferior solutions with false elimination rate control and with computationally efficient aggregate comparisons rather than pairwise comparisons. PASS satisfies desirable theoretical properties and performs effectively on realistic problems. We reconsider the ranking and selection (R&S) problem in stochastic simulation optimization in light of high-performance, parallel computing, where we take "R&S" to mean any procedure that simulates all systems (feasible solutions) to provide some statistical guarantee on the selected systems. We argue that when the number of systems is very large, and the parallel processing capability is also substantial, then neither the standard statistical guarantees such as probability of correct selection nor the usual observation-saving methods such as elimination via paired comparisons or complex budget allocation serve the experimenter well. 
As an alternative, we propose a guarantee on the expected false elimination rate that avoids the curse of multiplicity and a method to achieve it that is designed to scale computationally with problem size and parallel computing capacity. To facilitate this approach, we present a new mathematical representation, prove small-sample and asymptotic properties, evaluate variations of the method, and demonstrate a specific implementation on a problem with over 1,100,000 systems using only 21 parallel processors. Although we focus on inference about the best system here, our parallel adaptive survivor selection framework can be generalized to many other useful definitions of "good" systems. Funding: This work was supported by the National Science Foundation [Grants CMMI-1537060 and CMMI-1554144]. Supplemental Material: The online appendix is available at https://doi.org/10.1287/opre.2022.2343. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
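The core idea, eliminating systems by aggregate comparison against an adaptive standard rather than by all-pairs comparisons, can be caricatured in a few lines. This is a toy sketch with invented rules and parameters, not the authors' PASS procedure or its guarantees:

```python
import random

def adaptive_survivor_selection(true_means, rounds=200, batch=5,
                                slack=0.6, seed=7):
    """Toy sketch of survivor selection against an adaptive standard (NOT
    the PASS procedure itself): each round, every surviving system receives
    `batch` new noisy observations; the standard is the best survivor's
    running mean; and survivors trailing it by more than `slack` plus an
    early-round uncertainty allowance are eliminated.  All comparisons are
    against the single aggregate standard, never pairwise."""
    rng = random.Random(seed)
    k = len(true_means)
    total, n = [0.0] * k, [0] * k
    alive = set(range(k))
    for _ in range(rounds):
        for i in alive:
            for _ in range(batch):
                total[i] += rng.gauss(true_means[i], 1.0)
            n[i] += batch
        means = {i: total[i] / n[i] for i in alive}
        standard = max(means.values())
        allowance = 3.0 / min(n[i] for i in alive) ** 0.5
        alive = {i for i in alive if means[i] >= standard - slack - allowance}
    return alive

survivors = adaptive_survivor_selection([1.0, 0.9, -2.0, -3.0])
```

Because each survivor is compared only to the one adaptive standard, the per-round cost is linear in the number of survivors, which is what makes this style of elimination parallelize to very large problems.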
29. pyMPSLib: A robust and scalable open-source Python library for multiple-point statistical simulation.
- Author
-
Chen, Qiyu, Zhou, Ruihong, Liu, Cui, Huang, Qianhong, Cui, Zhesi, and Liu, Gang
- Subjects
PYTHON programming language ,PROGRAMMING languages ,SCIENTIFIC computing ,SCIENTIFIC language ,ELECTRONIC data processing - Abstract
Python has become an essential programming language for scientific computing, data analysis, and data processing. Various multiple-point statistics (MPS) algorithms are used to characterize complex heterogeneous structures and phenomena in earth sciences. However, there is currently no Python library that integrates mainstream MPS methods for simulation and computation in geosciences. Aiming to establish a stable MPS tool, we developed an open-source Python library of commonly used MPS methods, named pyMPSLib. pyMPSLib consists of ENESIM, SNESIM, and DS algorithms and provides a flexible and convenient API interface. To ensure the maintainability of pyMPSLib, the Python objects and toolkits of MPS algorithms are defined and implemented. To improve the compatibility and extensibility of the presented library, a uniform coding standard is adopted in pyMPSLib. We performed parameter sensitivity analysis under multiple configurations to validate the performance of the library. This open-source library also provides optional tools to quantitatively evaluate the realizations of the integrated MPS methods. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
30. Assessing Cyber-Physical Threats under Water Demand Uncertainty †.
- Author
-
Moraitis, Georgios, Tsoukalas, Ioannis, Kossieris, Panagiotis, Nikolopoulos, Dionysios, Karavokiros, George, Kalogeras, Dimitrios, and Makropoulos, Christos
- Subjects
WATER demand management ,CYBER physical systems ,WATER distribution ,MONTE Carlo method ,PROBABILISTIC inference - Abstract
This study presents an approach for the assessment of cyber-physical threats to water distribution networks under the prism of the uncertainty which stems from the variability and stochastic nature of nodal water demands. The proposed framework investigates a single threat scenario under a spectrum of synthetic, yet realistic, system states which are driven by an ensemble of stochastically generated nodal demands. This Monte Carlo-type experiment enables the probabilistic inference about model outputs, and hence the derivation of probabilistic estimates over consequences. The approach is showcased for a cyber-physical attack scenario against the monitoring and control system of a benchmark network. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
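The Monte-Carlo layer of this approach, evaluating one fixed threat scenario across an ensemble of stochastic demand states and summarizing consequences probabilistically, can be sketched as follows. The demand model and the one-line "hydraulics" are illustrative stand-ins for a stochastic demand generator and a real network solver:

```python
import random

def attack_consequences(n_runs=2000, n_nodes=20, supply_cap=18.0, seed=3):
    """Monte-Carlo sketch: a cyber-physical attack caps the total supply,
    while nodal demands are random, so the unserved demand (the consequence
    metric here) becomes a distribution rather than a single number."""
    rng = random.Random(seed)
    shortfalls = []
    for _ in range(n_runs):
        demands = [max(0.0, rng.gauss(1.0, 0.3)) for _ in range(n_nodes)]
        shortfalls.append(max(0.0, sum(demands) - supply_cap))
    shortfalls.sort()
    mean = sum(shortfalls) / n_runs
    p95 = shortfalls[int(0.95 * n_runs)]
    return mean, p95

mean_shortfall, p95_shortfall = attack_consequences()
```

Reporting the mean together with an upper quantile, rather than the consequence of one nominal demand pattern, is the probabilistic inference over outcomes that the abstract argues for.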
31. A Weighted Importance Sampling Method for Estimating the Time-Varying Failure Probability Function of Structures.
- Author
-
钱宇耕, 袁修开, and 陈敬强
- Abstract
Copyright of Journal of Mechanical Strength / Jixie Qiangdu is the property of Zhengzhou Research Institute of Mechanical Engineering and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2023
- Full Text
- View/download PDF
32. Assessing Markovian and Delay Models for Single-Nucleus RNA Sequencing.
- Author
-
Gorin, Gennady, Yoshida, Shawn, and Pachter, Lior
- Abstract
The serial nature of reactions involved in the RNA life-cycle motivates the incorporation of delays in models of transcriptional dynamics. The models couple a transcriptional process to a fairly general set of delayed monomolecular reactions with no feedback. We provide numerical strategies for calculating the RNA copy number distributions induced by these models, and solve several systems with splicing, degradation, and catalysis. An analysis of single-cell and single-nucleus RNA sequencing data using these models reveals that the kinetics of nuclear export do not appear to require invocation of a non-Markovian waiting time. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
33. Translation regulation by RNA stem-loops can reduce gene expression noise.
- Author
-
Çelik, Candan, Bokes, Pavol, and Singh, Abhyudai
- Subjects
HAIRPIN (Genetics) ,GENETIC translation ,GENE expression ,STOCHASTIC models ,PROTEIN expression - Abstract
Background: Stochastic modelling plays a crucial role in comprehending the dynamics of intracellular events in various biochemical systems, including gene-expression models. Cell-to-cell variability arises from the stochasticity or noise in the levels of gene products such as messenger RNA (mRNA) and protein. The sources of noise can stem from different factors, including structural elements. Recent studies have revealed that the mRNA structure can be more intricate than previously assumed. Results: Here, we focus on the formation of stem-loops and present a reinterpretation of previous data, offering new insights. Our analysis demonstrates that stem-loops that restrict translation have the potential to reduce noise. Conclusions: In conclusion, we investigate a structured/generalised version of a stochastic gene-expression model, wherein mRNA molecules can be found in one of their finite number of different states and transition between them. By characterising and deriving non-trivial analytical expressions for the steady-state protein distribution, we provide two specific examples which can be readily obtained from the structured/generalised model, showcasing the model's practical applicability. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
34. Probabilistic Risk Assessment of the Economy‐Wide Impacts From a Changing Wildfire Climate on a Regional Rural Landscape.
- Author
-
Monge, Juan J., Dowling, Leslie J., Wegner, Simon, Melia, Nathanael, Cheon, Pascal E. S., Schou, Wayne, McDonald, Garry W., Journeaux, Phil, Wakelin, Steve J., and McDonald, Nicola
- Subjects
WILDFIRES ,HAZARD mitigation ,COMPUTABLE general equilibrium models ,WILDFIRE prevention ,RISK assessment ,WILDFIRE risk ,GROSS domestic product ,VALUE at risk - Abstract
A warmer and drier future combined with rising population trends will result in increased wildfire risk. The design of robust mitigation/adaptation strategies requires the assessment of the economy‐wide risks from potentially more frequent large wildfire events under different futures. This study uses a novel interdisciplinary approach by integrating wildfire climate and land‐use forecasts into probabilistic and simulation models of wildfires to estimate direct impacts using damage curves and indirect impacts as lost Gross Domestic Product (GDP) using a computable general equilibrium model. Based on the financial concept of Value at Risk, a probabilistic measure of extreme economy‐wide impacts was developed using GDP as the representative metric, namely "GDP at Risk (G@R)," under various climatic and socio‐economic scenarios. Using the new metric in the Waikato, New Zealand as a case study, due to its primary industries' national relevance, it was identified that there is a 5% probability that the region will experience GDP losses greater than or equal to NZ$0.1–1.2 billion (similar to the regional GDP growth in 2021) over a 48‐year period from future potential large wildfires affecting vulnerable primary industries. The regional impacts result in larger national GDP losses by a factor of 5 due to the high dependence of downstream sectors on the regional primary industries. While "G@R" estimates are similar across socio‐economic scenarios, there are no discernible patterns when compared across midcentury climate scenarios. The approach developed could be used to assess the consequences from afforestation projects, driven by mitigation policies, and adaptation strategies to reduce wildfire risk. Plain Language Summary: The risk posed by wildfires will increase under a warmer and drier future. Previous studies have estimated the economic risk from wildfires on the sectors that would likely be physically and directly impacted. 
However, to design inclusive and enduring strategies to reduce or adapt to wildfires, the potential impacts from wildfires should be assessed for entire national economic networks under many different social and climatic scenarios—including sectors that demand products from, or supply to, the ones directly affected. This study uses an interdisciplinary approach by combining computational models to estimate the potential economic impacts from wildfires on multiple sectors of the economy under various relevant scenarios. Rather than focusing on expected impacts, this study focuses on the potentially large impacts from rare, large wildfires on the Waikato region—economically important to New Zealand. The economic impacts from potentially more frequent large wildfires in the region are modest when compared to a scenario without large wildfires. However, the nationwide impacts are substantially larger than the regional ones when considering the dependent sectors outside of the affected region. The approach developed could be used to jointly assess mitigation (e.g., afforestation) and adaptation policies to reduce the risk from wildfires. Key Points: There is a 5% chance that the studied region will lose 0.008%–0.06% or more of the base Gross Domestic Product (GDP) from potentially more frequent large wildfires. The regional impacts result in larger national GDP losses by a factor of 5 when considering the downstream sectors outside of the region. While the GDP loss estimates are similar across socio‐economic scenarios, there are no discernible patterns across climate scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
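The "GDP at Risk" construction, a Value-at-Risk-style quantile of simulated cumulative losses, can be sketched in a few lines. The fire occurrence rate, loss sizes, and horizon below are invented for illustration; the study derives them from wildfire climate, land-use, and CGE models:

```python
import random

def gdp_at_risk(n_sims=5000, horizon=48, fire_rate=0.1, mean_loss=0.3,
                alpha=0.05, seed=11):
    """Sketch of a 'GDP at Risk' computation: simulate cumulative GDP losses
    from randomly occurring large wildfires over a planning horizon, then
    report the loss level exceeded with probability alpha."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        loss = 0.0
        for _ in range(horizon):
            if rng.random() < fire_rate:                  # a large fire this year
                loss += rng.expovariate(1.0 / mean_loss)  # its GDP impact
        totals.append(loss)
    totals.sort()
    return totals[int((1.0 - alpha) * n_sims)]

gar = gdp_at_risk()   # loss exceeded in the worst 5% of simulated futures
```

Focusing on an upper quantile rather than the mean is what lets the metric capture rare, large events whose expected value alone would look modest.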
35. Efficient credit portfolios under IFRS 9.
- Author
-
Brito, Rui Pedro and Júdice, Pedro
- Subjects
BANK management ,BUSINESS cycles ,GROSS domestic product ,ROBUST optimization ,MARKOV processes ,BANK employees - Abstract
In this paper, we devise a forward‐looking methodology to determine efficient credit portfolios under the IFRS 9 framework. We define and implement a credit loss model based on prospective point‐in‐time probabilities of default. We determine these probabilities of default and the credits' stage allocation through a credit stochastic simulation. This simulation is based on the estimation of transition matrices. Using data from 1981 to 2019, in a non‐homogeneous Markov chain setting, we estimate transition matrices conditional on the global real gross domestic product growth. This allows considering the effects of the economic cycle, which are of great importance in bank management. Finally, we develop a robust optimization model that allows the bank manager to analyze the trade‐off between the annual average portfolio income and the corresponding portfolio volatility. According to the proposed bi‐objective model, we compute the efficient credit portfolios constructed based on 10‐year maturity credits. We compare their structure to those generated by the IAS 39 and CECL accounting frameworks. The results indicate that the IFRS 9 and CECL frameworks generate efficient credit portfolios whose structure penalizes riskier‐rated credits. In turn, the riskier efficient credit portfolios under the IAS 39 framework concentrate entirely on speculative‐grade credits. This pattern is also encountered in efficient credit portfolios constructed based on credits with different maturities, namely 5 and 15 years. Moreover, the longer the maturity of the credits that enter into the composition of the efficient portfolios, the more the speculative‐grade credits tend to be penalized. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
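The building block of the credit stochastic simulation above, propagating ratings through an estimated transition matrix with default as an absorbing state, can be sketched as follows (the matrix is invented for illustration; the study estimates matrices from 1981–2019 data, conditional on GDP growth):

```python
import random

# Illustrative one-year rating transition probabilities (states A, B, C and
# an absorbing default state D).  Rows sum to 1.
P = {
    "A": [("A", 0.90), ("B", 0.08), ("C", 0.015), ("D", 0.005)],
    "B": [("A", 0.05), ("B", 0.85), ("C", 0.08), ("D", 0.02)],
    "C": [("A", 0.01), ("B", 0.09), ("C", 0.82), ("D", 0.08)],
}

def default_probability(start, years, n_paths=20000, seed=5):
    """Estimate the cumulative default probability over `years` by
    simulating rating paths through the transition matrix."""
    rng = random.Random(seed)
    defaults = 0
    for _ in range(n_paths):
        state = start
        for _ in range(years):
            if state == "D":
                break
            u, acc = rng.random(), 0.0
            for nxt, prob in P[state]:
                acc += prob
                if u <= acc:
                    state = nxt
                    break
        if state == "D":
            defaults += 1
    return defaults / n_paths

pd_a = default_probability("A", 10)
pd_c = default_probability("C", 10)
```

Point-in-time probabilities of default of this kind drive both the expected-credit-loss model and the stage allocation under IFRS 9 in the paper; making the matrix non-homogeneous (one matrix per GDP-growth regime) adds the business-cycle dependence.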
36. Simulation and Analysis of Line 1 of Mexico City's Metrobus: Evaluating System Performance through Passenger Satisfaction.
- Author
-
Rodriguez, Jose Pablo and Muñoz, David F.
- Subjects
SIMULATION methods & models ,INFRASTRUCTURE (Economics) ,BUS rapid transit ,ENERGY consumption ,STOCHASTIC analysis - Abstract
The Mexico City Metrobus is one of the most popular forms of public transportation in the city. Since its opening in 2005, it has become a vital piece of infrastructure, and its optimal functioning is of key importance to Mexico City, as it plays a crucial role in moving millions of passengers every day. This paper presents a model to simulate Line 1 of the Mexico City Metrobus, which can be adapted to simulate other bus rapid transit (BRT) systems. We give a detailed description of the model development so that the reader can replicate our model. We developed various response variables in order to evaluate the system's performance, which focused on passenger satisfaction and measured the maximum occupancy that a passenger experiences inside the buses, as well as the time passengers spend in queues at the stations. The results of the experiments show that it is possible to increase passenger satisfaction by considering different combinations of routes while maintaining the same fuel consumption. It was shown that, by considering an appropriate combination of routes, the average passenger satisfaction could surpass the satisfaction levels obtained by a 10% increase in total fuel consumption. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
37. Estimation of Expectations and Variance Components in Two-Level Nested Simulation Experiments.
- Author
-
Muñoz, David Fernando
- Subjects
DECOMPOSITION method ,CENTRAL limit theorem ,CONFIDENCE intervals ,BAYESIAN analysis ,SIMULATION methods & models - Abstract
When there is uncertainty in the value of parameters of the input random components of a stochastic simulation model, two-level nested simulation algorithms are used to estimate the expectation of performance variables of interest. In the outer level of the algorithm n observations are generated for the parameters, and in the inner level m observations of the simulation model are generated with the values of parameters fixed at the values generated in the outer level. In this article, we consider the case in which the observations at both levels of the algorithm are independent and show how the variance of the observations can be decomposed into the sum of a parametric variance and a stochastic variance. Next, we derive central limit theorems that allow us to compute asymptotic confidence intervals to assess the accuracy of the simulation-based estimators for the point forecast and the variance components. Under this framework, we derive analytical expressions for the point forecast and the variance components of a Bayesian model to forecast sporadic demand, and we use these expressions to illustrate the validity of our theoretical results by performing simulation experiments with this forecast model. We found that, given a fixed number of total observations nm, the choice of only one replication in the inner level (m = 1) is recommended to obtain a more accurate estimator for the expectation of a performance variable. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
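The two-level scheme and the variance decomposition can be sketched on a toy model where both components are known in closed form (this example is mine, not the article's Bayesian demand model): with theta ~ N(0, 1) and Y | theta ~ N(theta, 2^2), the parametric variance Var(E[Y|theta]) is 1 and the stochastic variance E[Var(Y|theta)] is 4.

```python
import random

def nested_simulation(n_outer=4000, m_inner=10, seed=9):
    """Two-level nested simulation: the outer level draws the uncertain
    parameter theta ~ N(0, 1); the inner level draws m_inner observations
    Y | theta ~ N(theta, 2^2).  Returns the point estimate of E[Y] and the
    decomposition of the output variance into a parametric component
    Var(E[Y|theta]) and a stochastic component E[Var(Y|theta)]."""
    rng = random.Random(seed)
    inner_means, inner_vars = [], []
    for _ in range(n_outer):
        theta = rng.gauss(0.0, 1.0)                     # parameter uncertainty
        ys = [rng.gauss(theta, 2.0) for _ in range(m_inner)]
        mu = sum(ys) / m_inner
        inner_means.append(mu)
        inner_vars.append(sum((y - mu) ** 2 for y in ys) / (m_inner - 1))
    point = sum(inner_means) / n_outer
    stochastic = sum(inner_vars) / n_outer              # estimates E[Var(Y|theta)] = 4
    grand_var = sum((m - point) ** 2 for m in inner_means) / (n_outer - 1)
    # Var(inner means) = parametric + stochastic/m, so correct for the bias:
    parametric = grand_var - stochastic / m_inner       # estimates Var(E[Y|theta]) = 1
    return point, parametric, stochastic

point, parametric, stochastic = nested_simulation()
```

The decomposition also makes the article's m = 1 recommendation transparent: Var(point) = parametric/n + stochastic/(nm), so for a fixed budget nm the point estimate is most accurate when all observations go to the outer level.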
38. Experimental and Numerical Study on Rolling Friction in Tower Saddle of Suspension Bridges.
- Author
-
Zhang, Chong, Wang, Ren-gui, Tao, Mu-Xuan, Fan, Jian-sheng, and Wei, Le-yong
- Subjects
SUSPENSION bridges ,ROLLING friction ,TOWERS ,SADDLERY - Abstract
Rolling friction pairs in the tower saddle of suspension bridges could significantly reduce the horizontal load that is transferred to the tower. To investigate the dependence of the rolling friction coefficient (μ) of a rolling friction pair on the normal load (fn) and the cylinder radius (R), an experiment was conducted where a sandwich-like mechanism measured μ. Based on the μ–fn–R relationship, a numerical model, which used a one-dimensional (1D) Gaussian random nonplanar surface representation, was proposed in this research to calculate the overall rolling friction coefficient (μT) of a multiroller plate system. The experimental results showed that the dependence of μ on fn and R could be divided into three stages: (1) roughness; (2) elastic; and (3) inelastic. In addition, μ was proportional to fn/R in the elastic stage. The proposed numerical model could accurately calculate μT for a multiroller plate system. The μT rose with increasing surface nonplanarity and slightly increased with the number of rollers (N). This clarified that the μ–fn–R relationship and proposed numerical model could help to quickly determine the size of the rollers and the flatness of plates during design and reduce the scale of the experiments required. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
39. Stochastic analysis for bending capacity of precast prestressed concrete bridge piers using Monte-Carlo simulation and gradient boosted regression trees algorithm.
- Author
-
Lai, Xiaopan, Lu, Zhao, Xu, Xinyu, and Yu, Chuanjin
- Subjects
PRESTRESSED concrete bridges ,MONTE Carlo method ,PRECAST concrete ,STOCHASTIC analysis ,BRIDGE foundations & piers ,PRESTRESSED concrete beams ,BENDING moment - Abstract
The use of precast prestressed concrete bridge piers is rapidly evolving and widely applied. Nevertheless, the probabilistic behavior of the bending performance of precast prestressed concrete bridge piers has often been overlooked. This study aims to address this issue by utilizing actual precast bridge piers as the engineering context. Through the implementation of the Monte-Carlo simulation and Gradient Boosted Regression Trees (GBRT) algorithm, the stochastic distribution of the bending performance and their critical factors are identified. The results show that the normal distribution is the most suitable for the random distribution of bending performance indicators. The variability of the elastic modulus of ordinary steel bars, initial strain of prestressed steel hinge wires, and constant load axial force has little effect on the bending moment performance, while the yield stress of ordinary steel bars, elastic modulus of concrete, compressive strength of unrestrained concrete, and elastic modulus of prestressed steel hinge wires have a greater impact on the bending performance. Additionally, the compressive strength of unrestrained concrete has a significant influence on the equivalent bending moment of the cross-section that concerns designers. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
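The Monte-Carlo half of the approach, sampling material variables and propagating them through a section-capacity formula to see which inputs drive the scatter, can be sketched as follows. The formula is a simplified textbook rectangular-section expression and the distributions are invented; the paper uses actual precast pier sections and ranks factors with a GBRT model rather than the plain correlations shown here:

```python
import random

def simulate_moments(n=5000, seed=13):
    """Monte-Carlo sketch of bending-capacity variability for a simplified
    rectangular RC section: M = As*fy*(d - a/2) with a = As*fy/(0.85*fc*b).
    Yield stress fy and concrete strength fc are sampled as normals."""
    rng = random.Random(seed)
    b, d, As = 0.4, 0.6, 0.002        # width (m), effective depth (m), steel area (m^2)
    fys, fcs, ms = [], [], []
    for _ in range(n):
        fy = rng.gauss(400e6, 30e6)   # steel yield stress (Pa)
        fc = rng.gauss(30e6, 4e6)     # concrete compressive strength (Pa)
        a = As * fy / (0.85 * fc * b) # depth of the compression block (m)
        fys.append(fy)
        fcs.append(fc)
        ms.append(As * fy * (d - a / 2.0))   # bending capacity (N*m)
    return fys, fcs, ms

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

fys, fcs, ms = simulate_moments()
```

Even in this toy version, capacity correlates strongly with yield stress and only weakly with concrete strength, echoing the abstract's ranking of critical factors; a GBRT fit on the same samples would provide the nonlinear feature importances used in the study.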
40. Trends, Shifting, or Oscillations? Stochastic Modeling of Nonstationary Time Series for Future Water‐Related Risk Management.
- Author
-
Lee, Taesam and Ouarda, Taha B. M. J.
- Subjects
FLOOD warning systems ,TIME series analysis ,WATER management ,HILBERT-Huang transform ,STOCHASTIC models ,FLOOD damage ,FLOODS ,FLOOD risk - Abstract
Hydrological time series often present nonstationarities such as trends, shifts, or oscillations due to anthropogenic effects and hydroclimatological variations, including global climate change. For water managers, it is crucial to recognize and define the nonstationarities in hydrological records. The nonstationarities must be appropriately modeled and stochastically simulated according to the characteristics of observed records to evaluate the adequacy of flood risk mitigation measures and future water resources management strategies. Therefore, in the current study, three approaches were suggested to address stochastically nonstationary behaviors, especially in the long‐term variability of hydrological variables: as an overall trend, a shifting mean, or a long‐term oscillation. To represent these options for hydrological variables, the autoregressive model with an overall trend, shifting mean level (SML), and empirical mode decomposition with nonstationary oscillation resampling (EMD‐NSOR) were employed in the hydrological series of the net basin supply in the Lake Champlain‐River Richelieu basin, a basin recently managed by the International Joint Committee, where significant flood damage occurred from long, consistent high flows. The detailed results indicate that the EMD‐NSOR model can be an appropriate option by reproducing long‐term dependence statistics and generating manageable scenarios, while the SML model does not properly reproduce the observed long‐term dependence, which is critical for simulating sustained flood events. The trend model projects excessive future flood risk but no drought risk. The overall results conclude that the nonstationarities in hydrological series should be carefully handled in stochastic simulation models to appropriately manage future water‐related risks. 
Plain Language Summary: The current study explores and develops three possible approaches to simulate nonstationarities in hydrometeorological variables for future management strategies of water‐related disasters, such as floods. These include the simulation as trends, shifting mean models, and oscillatory signals. Depending on the characteristics of the series, the three stochastic simulation models are tested, and conclusions are formulated. Key Points: Stochastic modeling of nonstationary hydrological time series was proposed for future flood management strategies. Autoregressive with a trend, shifting mean level, and nonstationary oscillation modes were tested. Stochastic models were validated with performance statistics, and the series were adequately fitted with the oscillation model. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
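The three representations of long-term nonstationarity compared in this entry (overall trend, shifting mean level, and long-term oscillation) can be sketched with a toy generator. This is an illustrative sketch only: the AR(1) coefficient, jump frequency, and oscillation period below are assumed values, not the paper's fitted models.

```python
import numpy as np

rng = np.random.default_rng(42)

def ar1_noise(n, phi=0.5, sigma=1.0, rng=rng):
    """AR(1) innovations: x_t = phi * x_{t-1} + e_t."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    return x

n = 200
t = np.arange(n)

# (1) Overall trend: deterministic linear drift plus AR(1) noise.
trend_series = 10.0 + 0.05 * t + ar1_noise(n)

# (2) Shifting mean level (SML): the mean jumps at random epochs
#     and stays constant in between (illustrative sojourn distribution).
levels, pos = [], 0
while pos < n:
    dur = rng.geometric(0.02)              # mean sojourn of ~50 steps
    levels.extend([rng.normal(10.0, 2.0)] * dur)
    pos += dur
sml_series = np.array(levels[:n]) + ar1_noise(n)

# (3) Long-term oscillation: a low-frequency sinusoid plus AR(1) noise.
osc_series = 10.0 + 2.0 * np.sin(2 * np.pi * t / 60.0) + ar1_noise(n)

for name, s in [("trend", trend_series), ("SML", sml_series), ("osc", osc_series)]:
    print(f"{name}: mean={s.mean():.2f}, sd={s.std():.2f}")
```

The point of the comparison in the abstract is that these three generators produce very different future risk profiles from similar-looking historical records, so the choice among them matters for flood planning.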
41. Probabilistic Integration of Geomechanical and Geostatistical Inferences for Mapping Natural Fracture Networks.
- Author
-
Chandna, Akshat and Srinivasan, Sanjay
- Subjects
CRACK propagation (Fracture mechanics) ,INFERENTIAL statistics ,EARTHQUAKE hazard analysis - Abstract
Geomechanical modeling of the fracturing process accounts for the physical factors that inform the propagation and termination of the fractures. However, the resultant models may not honor the fracture statistics derived from auxiliary sources such as outcrop images. Stochastic algorithms, on the other hand, generate natural fracture maps based purely on statistical inferences from outcrop images excluding the effects of any physical processes guiding the propagation and termination of fractures. This paper, therefore, focuses on presenting a methodology for combining information from geomechanical and stochastic approaches necessary to obtain a fracture modeling approach that is geologically realistic as well as consistent with the geomechanical conditions for fracture propagation. As a prerequisite for this integration approach, a multi-point statistics-based stochastic simulation algorithm is implemented that yields the probability of fracture propagation along various paths. The application and effectiveness of this probability integration paradigm are demonstrated on a synthetic fracture set. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
42. Toward Probabilistic Risk Assessment of Wildland–Urban Interface Communities for Wildfires.
- Author
-
Masoudvaziri, Nima, Elhami-Khorasani, Negar, and Sun, Kang
- Subjects
WILDLAND-urban interface ,COMMUNITIES ,WILDFIRE prevention ,CUMULATIVE distribution function ,WILDFIRES ,RISK assessment - Abstract
The number of wildfire incidents affecting communities in Wildland–Urban Interface (WUI) areas has been rapidly increasing. Understanding fire spread between structures and evaluating the response of communities to possible wildfire scenarios are crucial for proper risk management in existing and future communities. This paper discusses a stochastic methodology to evaluate a community's response to potential wildfire scenarios. The methodology has three primary features: (1) it is based on stochastic modeling of fire spread; (2) it breaks the wildfire incident into two consecutive segments: spread inside the wildland and spread inside the community; (3) it integrates the two spread models in the form of a conditional probability. The paper focuses on fire spread inside the community and applies the proposed methodology to two case studies in California, US. The two case studies demonstrate variations in fire spread within the communities for the given fire scenarios approaching from the wildland. The performance of communities is characterized using cumulative distribution functions of the number of ignited buildings over time. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
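The within-community segment of the methodology (stochastic spread between structures, summarized by CDFs of ignited buildings) can be illustrated with a minimal probabilistic cellular automaton. The grid layout, spread probability, and edge ignition point below are hypothetical stand-ins, not the paper's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_spread(n_side=20, p_spread=0.3, n_steps=30, rng=rng):
    """One stochastic realization of fire spread on a square grid of
    buildings: each burning building ignites each of its four neighbours
    independently with probability p_spread per step."""
    burning = np.zeros((n_side, n_side), dtype=bool)
    burning[0, n_side // 2] = True          # ignition at the community edge
    counts = []
    for _ in range(n_steps):
        new = burning.copy()
        for i, j in np.argwhere(burning):
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                a, b = i + di, j + dj
                if 0 <= a < n_side and 0 <= b < n_side and not burning[a, b]:
                    if rng.random() < p_spread:
                        new[a, b] = True
        burning = new
        counts.append(int(burning.sum()))
    return counts

# Monte Carlo over realizations -> empirical CDF of ignited buildings.
finals = [simulate_spread()[-1] for _ in range(100)]
xs = np.sort(finals)
cdf = np.arange(1, len(xs) + 1) / len(xs)
print("median ignited buildings:", int(np.median(finals)))
```

Conditioning this kind of within-community model on scenarios arriving from a separate wildland spread model is what the paper's conditional-probability integration refers to.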
43. Assessment of the Spatial Variability and Uncertainty of Shreddable Pruning Biomass in an Olive Grove Based on Canopy Volume and Tree Projected Area.
- Author
-
Rodríguez-Lizana, Antonio, Ramos, Alzira, Pereira, María João, Soares, Amílcar, and Ribeiro, Manuel Castro
- Subjects
PEARSON correlation (Statistics) ,OLIVE ,PLANT biomass ,CIRCULAR economy ,CARBON cycle ,SOIL protection - Abstract
Olive pruning residues are a by-product that can be applied to soil or used for energy production in a circular economy model. Their benefits depend on the amount of pruning, which varies greatly within farms. This study aimed to investigate the spatial variability of shreddable olive pruning in a traditional olive grove in Córdoba (Spain) with an area of 15 ha and trees distanced 12.5 m from each other. To model the spatial variability of shreddable olive pruning, geostatistical methods of stochastic simulation were applied to three correlated variables measured on sampled trees: the crown projected area (n = 928 trees), the crown volume (n = 167) and the amount of shreddable pruning (n = 59). Pearson's correlation between pairs of variables varied from 0.71 to 0.76. The amount of pruning showed great variability, ranging from 7.6 to 76 kg tree⁻¹, with a mean value of 37 kg tree⁻¹. Using exponential and spherical variogram models, the spatial continuity of the variables under study was established. Shreddable dry pruning weight values showed spatial autocorrelation up to 180 m. The spatial uncertainty of the estimation was obtained using sequential simulation algorithms. Stochastic simulation algorithms provided 150 possible images of the amount of shreddable pruning on the farm, using tree projected area and crown volume as secondary information. The interquartile range and 90% prediction interval were used as indicators of the uncertainty around the mean value. Uncertainty validation was performed using accuracy plots and the associated G-statistic. Results indicate with high confidence (i.e., low uncertainty) that shreddable dry pruning weight in the mid-western area of the farm will be much lower than in the rest of the farm. In the same way, results show with high confidence that dry pruning weight will be much higher in a small area in the mid-eastern part of the farm. The values of the G-statistic ranged between 0.89 and 0.90 in the tests performed. The joint use of crown volume and projected areas is valuable in estimating the spatial variability of the amount of pruning. The study shows that the use of prediction intervals enables the evaluation of farm areas and informed management decisions with a low level of risk. The methodology proposed in this work can be extrapolated to other 3D crops without requiring modifications. On a larger scale, it can be useful for predicting optimal locations for biomass plants, areas with high potential as carbon sinks or areas requiring special soil protection measures. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
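Sequential simulation of the kind used in this entry, where many equally probable realizations quantify spatial uncertainty, can be sketched in one dimension with a simple-kriging-based sequential Gaussian simulation. The exponential covariance parameters below are illustrative assumptions, not the variograms fitted to the olive grove data, and the sketch is unconditional (no secondary variables).

```python
import numpy as np

rng = np.random.default_rng(1)

def exp_cov(h, sill=1.0, a=30.0):
    """Exponential covariance model C(h) = sill * exp(-|h| / a)."""
    return sill * np.exp(-np.abs(h) / a)

def sgs_1d(x, sill=1.0, a=30.0, rng=rng):
    """Unconditional sequential Gaussian simulation on 1D coordinates x:
    visit nodes in random order and draw each value from its normal
    distribution conditional on all previously simulated nodes
    (simple kriging mean and variance)."""
    n = len(x)
    sim = np.full(n, np.nan)
    done = []
    for k in rng.permutation(n):
        if not done:
            sim[k] = rng.normal(0.0, np.sqrt(sill))
        else:
            xd = x[done]
            C = exp_cov(xd[:, None] - xd[None, :], sill, a)
            c = exp_cov(x[k] - xd, sill, a)
            w = np.linalg.solve(C + 1e-10 * np.eye(len(done)), c)  # kriging weights
            mu = w @ sim[done]
            var = max(sill - w @ c, 1e-12)
            sim[k] = rng.normal(mu, np.sqrt(var))
        done.append(k)
    return sim

x = np.arange(0.0, 200.0, 2.0)
realizations = np.array([sgs_1d(x) for _ in range(20)])  # 20 equally probable maps
print("ensemble sd (theoretical sill sd = 1):", realizations.std())
```

The per-location spread across realizations plays the role of the interquartile range and 90% prediction interval reported in the abstract.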
44. Stochastic Simulation of the Leaching Range in the In Situ Leaching Process Considering Uncertainty in the Permeability Coefficient.
- Author
-
纪文贵, 罗跃, 刘金辉, 李寻, 李立尧, 余东原, and 吴慧
- Subjects
DISTRIBUTION (Probability theory) ,URANIUM mining ,INJECTION wells ,RANDOM fields ,CHANNEL flow - Abstract
Copyright of Atomic Energy Science & Technology is the property of Editorial Board of Atomic Energy Science & Technology and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2023
- Full Text
- View/download PDF
45. Stochastic Approach for 2D Superficial Seismic Amplification Based on Quad4M; City of L'Aquila (Italy) Test Case.
- Author
-
Pasculli, Antonio, Sciarra, Nicola, and Mangifesta, Massimo
- Subjects
MONTE Carlo method ,SEISMIC response ,SHEAR waves ,DISTRIBUTION (Probability theory) ,FREQUENCY spectra - Abstract
The values of the physical–mechanical properties of any soil are affected by uncertainties, both due to experimental measurements and the impossibility of knowing them in detail at every point of the spatial domain. Accordingly, this work focuses on uncertainty in shear wave velocity (Vs) and its impact on the seismic response. The Monte Carlo method, based on pseudo-random number generation, was selected. To understand which random distributions could identify the site's real conditions, the Fourier spectrum frequencies were calculated for each realization and were compared with the predominant natural site frequency. The experimental range data were used to calculate the spectral average acceleration and the horizontal amplification factors. The simulations were performed and interpreted by a modified version of VisualQ4M software based on 2D Quad4M, including the generation of pseudo-random numbers and pre- and post-data processing. A site at a small scale, in the territory of the city of L'Aquila (Italy), was selected as the test case. This paper demonstrates, from a numerical point of view, that both a simple local topographic modification due to excavation and the uncertainties of the numerical values, even of the shear wave velocity alone, can have an important impact on the local seismic amplification. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
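Although the paper's analysis uses a modified 2D Quad4M model, the core Monte Carlo step, propagating Vs uncertainty into site-response quantities, can be sketched with a single-layer 1D site and an impedance-ratio amplification proxy. All layer parameters and the lognormal Vs distribution below are hypothetical assumptions, not the L'Aquila test-case values.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy single-layer site model (hypothetical parameters):
H = 30.0                                  # soil layer thickness [m]
vs_mean, vs_cov = 300.0, 0.2              # mean shear-wave velocity and CoV
rho_s, rho_r, v_r = 1800.0, 2400.0, 1500.0  # soil/rock densities, bedrock Vs

n = 10_000
# Lognormal sampling keeps Vs strictly positive.
sigma_ln = np.sqrt(np.log(1 + vs_cov**2))
vs = rng.lognormal(np.log(vs_mean) - 0.5 * sigma_ln**2, sigma_ln, n)

f0 = vs / (4 * H)                         # fundamental site frequency [Hz]
amp = (rho_r * v_r) / (rho_s * vs)        # impedance-ratio amplification proxy

print(f"f0: median={np.median(f0):.2f} Hz, 5-95%: "
      f"{np.percentile(f0, 5):.2f}-{np.percentile(f0, 95):.2f} Hz")
print(f"amplification proxy: median={np.median(amp):.2f}")
```

Comparing the distribution of f0 against the measured predominant site frequency is the kind of check the abstract describes for selecting a plausible random distribution for Vs.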
46. Multilevel emulation for stochastic computer models with application to large offshore wind farms.
- Author
-
Kennedy, Jack C, Henderson, Daniel A, and Wilson, Kevin J
- Subjects
COMPUTER simulation ,OFFSHORE wind power plants ,STOCHASTIC models ,APPLICATION software ,GAUSSIAN processes ,WIND power plants - Abstract
Renewable energy projects, such as large offshore wind farms, are critical to achieving low-emission targets set by governments. Stochastic computer models allow us to explore future scenarios to aid decision making while considering the most relevant uncertainties. Complex stochastic computer models can be prohibitively slow, and thus an emulator may be constructed and deployed to allow for efficient computation. We present a novel heteroscedastic Gaussian Process emulator that exploits cheap approximations to a stochastic offshore wind farm simulator. We also conduct a probabilistic sensitivity analysis to understand the influence of key parameters in the wind farm model, which will help us to plan a probability elicitation in the future. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
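A minimal "stochastic kriging"-style sketch of the emulation idea: run a cheap toy stochastic simulator with replicates at a few design points, then fit a Gaussian process to the sample means with a heteroscedastic nugget equal to each point's sample-mean variance. This is a simplified stand-in with a made-up simulator and fixed kernel hyperparameters, not the authors' multilevel wind-farm emulator.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulator(x, rng=rng):
    """Toy stochastic simulator: noisy response with input-dependent
    (heteroscedastic) noise, standing in for a slow wind-farm model."""
    return np.sin(3 * x) + rng.normal(0.0, 0.05 + 0.2 * x)

# Design: few points, many cheap replicates per point.
X = np.linspace(0.0, 1.0, 8)
reps = 50
Y = np.array([[simulator(x) for _ in range(reps)] for x in X])
y_bar = Y.mean(axis=1)
noise_var = Y.var(axis=1, ddof=1) / reps   # variance of each sample mean

def k(a, b, ell=0.25, s2=1.0):
    """Squared-exponential covariance (fixed hyperparameters)."""
    return s2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

# GP posterior mean at test points, with heteroscedastic nugget.
K = k(X, X) + np.diag(noise_var)
Xs = np.linspace(0.0, 1.0, 101)
mean = k(Xs, X) @ np.linalg.solve(K, y_bar)

print("max abs emulation error vs true mean response:",
      np.max(np.abs(mean - np.sin(3 * Xs))))
```

In practice the kernel hyperparameters and the noise model would be estimated rather than fixed, and the "cheap approximation" levels of the multilevel scheme would enter as additional correlated inputs.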
47. Research on Reservoir Safety Design Based on the Flood Control Return Period.
- Author
-
陶昌弟 and 刘旻
- Subjects
FLOOD control ,FLOOD routing ,COPULA functions ,SAFETY standards ,FLOOD risk ,WATERSHEDS ,WATER levels ,RESERVOIRS - Abstract
Copyright of China Rural Water & Hydropower is the property of China Rural Water & Hydropower Editorial Office and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2023
- Full Text
- View/download PDF
48. Assessment of long-term trends in genetic mean and variance after the introduction of genomic selection in layers: a simulation study.
- Author
-
Pocrnic, Ivan, Obšteter, Jana, Gaynor, R. Chris, Wolc, Anna, and Gorjanc, Gregor
- Subjects
CHICKEN breeds ,GENETIC variation ,INBREEDING ,POULTRY breeding ,VARIANCES - Abstract
Nucleus-based breeding programs are characterized by intense selection that results in high genetic gain, which inevitably means reduction of genetic variation in the breeding population. Therefore, genetic variation in such breeding systems is typically managed systematically, for example, by avoiding mating the closest relatives to limit progeny inbreeding. However, intense selection requires maximum effort to make such breeding programs sustainable in the long term. The objective of this study was to use simulation to evaluate the long-term impact of genomic selection on genetic mean and variance in an intense layer chicken breeding program. We developed a large-scale stochastic simulation of an intense layer chicken breeding program to compare conventional truncation selection to genomic truncation selection optimized with either minimization of progeny inbreeding or full-scale optimal contribution selection. We compared the programs in terms of genetic mean, genic variance, conversion efficiency, rate of inbreeding, effective population size, and accuracy of selection. Our results confirmed that genomic truncation selection has immediate benefits compared to conventional truncation selection in all specified metrics. A simple minimization of progeny inbreeding after genomic truncation selection did not provide any significant improvements. Optimal contribution selection was successful in achieving better conversion efficiency and effective population size compared to genomic truncation selection, but it must be fine-tuned for balance between loss of genetic variance and genetic gain. In our simulation, we measured this balance using trigonometric penalty degrees between truncation selection and a balanced solution and concluded that the best results were between 45° and 65°. This balance is specific to the breeding program and depends on how much immediate genetic gain a breeding program may risk vs. save for the future. 
Furthermore, our results show that the persistence of accuracy is better with optimal contribution selection compared to truncation selection. In general, our results show that optimal contribution selection can ensure long-term success in intensive breeding programs using genomic selection. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
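The baseline compared in this entry, truncation selection, and its central side effect, loss of genetic variance alongside genetic gain, can be sketched with a toy simulation. The population size, heritability, and the simplified Mendelian-sampling term below are illustrative assumptions, not the paper's large-scale layer-breeding simulation.

```python
import numpy as np

rng = np.random.default_rng(11)

def truncation_selection(n=2000, n_parents=200, h2=0.4, gens=10, rng=rng):
    """Toy closed breeding program: each generation, select the top
    individuals on phenotype and mate them at random. Tracks genetic
    gain and the accompanying decline in genetic variance."""
    g = rng.normal(0.0, 1.0, n)                    # true breeding values
    means, variances = [], []
    for _ in range(gens):
        # Phenotype = breeding value + environmental noise scaled so that
        # heritability of the current generation equals h2.
        p = g + rng.normal(0.0, np.sqrt((1 - h2) / h2 * g.var()), n)
        parents = g[np.argsort(p)[-n_parents:]]    # truncation on phenotype
        sires = rng.choice(parents, n)
        dams = rng.choice(parents, n)
        # Offspring: mid-parent value plus a simplified Mendelian
        # sampling term (half the current population variance).
        g = (sires + dams) / 2 + rng.normal(0.0, np.sqrt(g.var() / 2), n)
        means.append(g.mean())
        variances.append(g.var())
    return means, variances

means, variances = truncation_selection()
print(f"gain after 10 generations: {means[-1]:.2f} genetic SD units")
print(f"genetic variance: 1.00 -> {variances[-1]:.2f}")
```

Optimal contribution selection, the paper's preferred alternative, replaces the hard truncation step with an optimization over parental contributions that trades a little immediate gain for slower variance loss.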
49. Stochastic Simulations of Casual Groups.
- Author
-
Fontanari, José F.
- Subjects
FIRST-order phase transitions ,SHOPPING malls ,AD hoc organizations ,NATURAL products ,RANDOM variables ,FREE groups ,PHASE transitions - Abstract
Free-forming or casual groups are groups in which individuals are in face-to-face interactions and are free to maintain or terminate contact with one another, such as clusters of people at a cocktail party, play groups in a children's playground or shopping groups in a mall. Stochastic models of casual groups assume that group sizes are the products of natural processes by which groups acquire and lose members. The size distributions predicted by these models have been the object of controversy since their derivation in the 1960s because of the neglect of fluctuations around the mean values of random variables that characterize a collection of groups. Here, we check the validity of these mean-field approximations using an exact stochastic simulation algorithm to study the processes of the acquisition and loss of group members. In addition, we consider the situation where the appeal of a group of size i to isolates is proportional to i^α. We find that, for α ≤ 1, the mean-field approximation fits the equilibrium simulation results very well, even for a relatively small population size N. However, for α > 1, this approximation scheme fails to provide a coherent description of the distribution of group sizes. We find a discontinuous phase transition at α_c > 1 that separates the regime where the variance of the group size does not depend on N from the regime where it grows linearly with N. In the latter regime, the system is composed of a single large group that coexists with a large number of isolates. Hence, the same underlying acquisition-and-loss process can explain the existence of small, temporary casual groups and of large, stable social groups. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
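An acquisition-and-loss process with group appeal proportional to i^α can be simulated exactly with a Gillespie-type algorithm. This is a rough sketch under assumed dynamics: the specific join, pair-formation, and leave rates below are illustrative, not the paper's parameterization.

```python
import numpy as np

rng = np.random.default_rng(5)

def ssa_groups(N=200, alpha=1.0, join=0.02, leave=1.0, t_end=50.0, rng=rng):
    """Exact (Gillespie-type) stochastic simulation of casual-group
    dynamics: an isolate joins a group of size g with rate join * g**alpha,
    each pair of isolates forms a new group with rate join, and each group
    member leaves with rate leave."""
    groups = []          # sizes of groups with at least two members
    isolates = N
    t = 0.0
    while t < t_end:
        events = [("join", i, isolates * join * g**alpha)
                  for i, g in enumerate(groups)]
        events += [("leave", i, leave * g) for i, g in enumerate(groups)]
        events.append(("pair", None, join * isolates * (isolates - 1) / 2))
        events = [e for e in events if e[2] > 0]
        total = sum(r for _, _, r in events)
        if total <= 0:
            break
        t += rng.exponential(1.0 / total)     # time to next event
        u, acc = rng.random() * total, 0.0
        for kind, i, r in events:             # pick event by its rate
            acc += r
            if u < acc:
                break
        if kind == "join":
            groups[i] += 1
            isolates -= 1
        elif kind == "pair":
            groups.append(2)
            isolates -= 2
        else:                                 # "leave"
            groups[i] -= 1
            isolates += 1
            if groups[i] == 1:                # a dissolving pair frees both members
                groups.pop(i)
                isolates += 1
    return isolates, groups

iso, sizes = ssa_groups()
print("isolates:", iso, "largest group:", max(sizes, default=0))
```

Running many such realizations and comparing the empirical group-size variance against the mean-field prediction, for α below and above 1, is the kind of check the abstract describes.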
50. An analysis of approximation algorithms for iterated stochastic integrals and a Julia and Matlab simulation toolbox.
- Author
-
Kastner, Felix and Rößler, Andreas
- Subjects
STOCHASTIC integrals ,ITERATED integrals ,STOCHASTIC partial differential equations ,WIENER processes ,PROGRAMMING languages ,APPROXIMATION algorithms - Abstract
For the approximation and simulation of twofold iterated stochastic integrals and the corresponding Lévy areas w.r.t. a multi-dimensional Wiener process, we review four algorithms based on a Fourier series approach. Especially, the very efficient algorithm due to Wiktorsson and a newly proposed algorithm due to Mrongowius and Rößler are considered. To put recent advances into context, we analyse the four Fourier-based algorithms in a unified framework to highlight differences and similarities in their derivation. A comparison of theoretical properties is complemented by a numerical simulation that reveals the order of convergence for each algorithm. Further, concrete instructions are given for the choice of the optimal algorithm and parameters for the simulation of solutions of stochastic (partial) differential equations. Additionally, we provide advice for an efficient implementation of the considered algorithms and have incorporated these insights into an open-source toolbox that is freely available for both the Julia and Matlab programming languages. The performance of this toolbox is analysed by comparing it to some existing implementations, where we observe a significant speed-up. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
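This is not the toolbox's Fourier-series algorithms, but a brute-force illustration of the objects involved: twofold iterated Itô integrals approximated by fine left-point Riemann sums, checked against two exact identities they must satisfy.

```python
import numpy as np

rng = np.random.default_rng(13)

T, n = 1.0, 100_000
dt = T / n
dW1 = rng.normal(0.0, np.sqrt(dt), n)   # increments of two independent
dW2 = rng.normal(0.0, np.sqrt(dt), n)   # Wiener processes on [0, T]
W1 = np.concatenate(([0.0], np.cumsum(dW1)))
W2 = np.concatenate(([0.0], np.cumsum(dW2)))

# Twofold iterated Ito integrals via left-point (Ito) Riemann sums:
I11 = np.sum(W1[:-1] * dW1)             # int_0^T W1 dW1
I12 = np.sum(W1[:-1] * dW2)             # int_0^T W1 dW2
I21 = np.sum(W2[:-1] * dW1)             # int_0^T W2 dW1

# Closed form for the diagonal term: I(1,1) = (W1(T)^2 - T) / 2.
print("I(1,1) error:", abs(I11 - (W1[-1]**2 - T) / 2))
# Cross terms satisfy I(1,2) + I(2,1) = W1(T) * W2(T).
print("product-rule residual:", abs(I12 + I21 - W1[-1] * W2[-1]))
```

Only the diagonal terms have such closed forms; the cross terms (equivalently, the Lévy area) are what the reviewed Fourier-based algorithms approximate efficiently, instead of resorting to a fine discretization as above.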