2,899 results for "Epistemic uncertainty"
Search Results
2. Optimal replacement policy for multi-state manufacturing system with economic and resource dependence under epistemic uncertainty.
- Author
Chen, Zhaoxiang, Chen, Zhen, Zhou, Di, Shao, Chi, and Pan, Ershun
- Subjects
MANUFACTURING processes, EPISTEMIC uncertainty, ECONOMIC systems, MARKOV processes, STOCHASTIC models
- Abstract
This paper develops an optimal replacement policy V* for a multi-state manufacturing system. The manufacturing system is repaired imperfectly once its performance cannot meet the production demand, and is replaced when the production demand is not met for the V*-th time. Due to imprecise state assignments and unpredictable external working conditions, the performance and transition intensity of the multi-state machine cannot be accurately identified, which inevitably leads to epistemic uncertainty. In addition, the economic dependence and resource dependence that prevail in the manufacturing system should be considered. In this paper, economic dependence is described as the time and cost saved by simultaneously repairing multiple identical machines, and resource dependence is caused by finite-capacity buffers. To take these into account, a fuzzy Markov model and a fuzzy stochastic flow manufacturing network (FSFMN) are tailored to evaluate the fuzzy reliability of machines and of the manufacturing system, respectively. To obtain the optimal replacement policy V*, we derive the expression of the long-run fuzzy profit rate under epistemic uncertainty. The replacement policy is demonstrated on a ferrite phase-shifting unit manufacturing system, and the results of the subsequent comparative study and sensitivity analysis show the effectiveness of this policy. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
3. Finite‐time attractivity analysis for nonautonomous dynamical systems with uncertainty.
- Author
Lu, Ziqiang and Chen, Xin
- Subjects
UNCERTAIN systems, DYNAMICAL systems, EPISTEMIC uncertainty
- Abstract
Uncertain dynamical systems driven by a Liu process are important for depicting the operating laws of real systems disturbed by human epistemic uncertainty. This paper mainly investigates the finite‐time attractivity of uncertain dynamical systems. New concepts of finite‐time attractivity are first introduced for uncertain dynamical systems from different perspectives, and the relationships among these concepts are revealed based on uncertainty theory. Judgement theorems for ensuring the finite‐time exponential attractivity of two classes of uncertain dynamical systems are then proposed. Several examples are provided to illustrate the main concepts and results derived. Finally, the uncertain mean‐reverting process with time‐varying parameters is considered as an application. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. Treatment of epistemic uncertainty in conjunction analysis with Dempster-Shafer theory.
- Author
Sánchez, Luis, Vasile, Massimiliano, Sanvido, Silvia, Merz, Klaus, and Taillan, Christophe
- Subjects
DEMPSTER-Shafer theory, EPISTEMIC uncertainty, TIME series analysis, STATISTICS, DATABASES
- Abstract
• New model of epistemic uncertainty in Conjunction Data Messages.
• Combination of the Dvoretzky–Kiefer–Wolfowitz inequality and Dempster-Shafer theory.
• New robust classification system for conjunction events.
• Validation of the robust classification system against real conjunction scenarios.
• Statistical analysis of high-risk and uncertain events detection in a real database.
The paper presents an approach to the modelling of epistemic uncertainty in Conjunction Data Messages (CDM) and the classification of conjunction events according to the confidence in the probability of collision. The approach proposed in this paper is based on the Dempster-Shafer Theory (DSt) of evidence and starts from the assumption that the observed CDMs are drawn from a family of unknown distributions. The Dvoretzky–Kiefer–Wolfowitz (DKW) inequality is used to construct robust bounds on such a family of unknown distributions starting from a time series of CDMs. A DSt structure is then derived from the probability boxes constructed with the DKW inequality. The DSt structure encapsulates the uncertainty in the CDMs at every point along the time series and allows the computation of the belief and plausibility in the realisation of a given probability of collision. The methodology proposed in this paper is tested on a number of real events and compared against existing practices at the European and French Space Agencies. We show that the classification system proposed in this paper is more conservative than the approach taken by the European Space Agency but provides an added quantification of uncertainty in the probability of collision. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
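A note on the construction described in entry 4 above: the Dvoretzky–Kiefer–Wolfowitz inequality gives a distribution-free band around an empirical CDF, and that band can be read as a probability box from which a Dempster-Shafer structure is then derived. The standard textbook statement is sketched below; it is not a formula quoted from the paper.

```latex
% DKW inequality for n i.i.d. samples with empirical CDF F_n and true CDF F:
\Pr\!\Big(\sup_{x}\,\lvert F_n(x)-F(x)\rvert>\varepsilon\Big)\le 2e^{-2n\varepsilon^{2}} .
% Fixing a confidence level 1-\alpha gives the band half-width
\varepsilon_n=\sqrt{\tfrac{1}{2n}\ln\tfrac{2}{\alpha}} ,
% and the probability box (lower and upper CDF bounds)
\underline{F}(x)=\max\{F_n(x)-\varepsilon_n,\,0\},\qquad
\overline{F}(x)=\min\{F_n(x)+\varepsilon_n,\,1\} .
```

Belief and plausibility values for an event can then be bounded by evaluating it against the lower and upper CDFs, which is the general route the abstract describes.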
5. A Bayesian framework for in-flight calibration and discrepancy reduction of spacecraft operational simulation models.
- Author
Antonello, Federico, Segneri, Daniele, and Eggleston, James
- Subjects
MACHINE learning, EPISTEMIC uncertainty, SPACE industrialization, SPACE vehicles, SIMULATION methods & models
- Abstract
Modeling and Simulation (M&S) tools have become indispensable for the comprehensive design, operations, and maintenance of products in the space industry. An example is the European Space Agency (ESA), which relies heavily on M&S throughout the entire lifecycle of a spacecraft. However, their use in operational settings poses significant challenges, mainly attributable to (i) the harsh, uncontrollable, and often unforeseen environmental conditions; (ii) the dramatic changes in operating conditions throughout a spacecraft's lifespan, often beyond the intended designed-for lifetime; and (iii) the presence of epistemic and aleatoric uncertainty. This results in unavoidable discrepancies between the numerical simulations and real measurements, limiting their use for delicate operational tasks. To address those challenges, we present a Bayesian framework for simultaneous calibration of M&S tools, reduction of the model discrepancy, and quantification of the process and model uncertainties. The approach leverages the Kennedy and O'Hagan (KOH) calibration, tailored for a multi-objective problem. Its effectiveness is shown by its application to flying Earth observation spacecraft data and the operational simulation models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. Residential flood risk in Metro Vancouver due to climate change using probability boxes.
- Author
Viseh, Hiva and Bristow, David N.
- Subjects
MONTE Carlo method, FLOOD risk, EPISTEMIC uncertainty, SAMPLING errors, CLIMATE change, FLOOD damage
- Abstract
To enhance the decision-making process and reduce economic losses caused by future flooding, it is critical to quantify uncertainty in each step of flood risk analysis. To address the often-missing uncertainty quantification, we propose a new methodology that combines damage functions and probability bounds analysis. We estimate the likely direct tangible damage to 375,973 residential buildings along the Fraser River through Metro Vancouver, Canada, for a range of climate change driven flood scenarios, while transparently representing the associated uncertainties caused by sampling error, imprecise measurement, and uncertainty in building height and basement conditions. Furthermore, for the purposes of developing effective management strategies, uncertainties in this study are classified into two categories, namely aleatory and epistemic. According to our findings and absent significant action, we should expect an enormous increase in flood damage to the four categories of residential buildings considered in this study by the year 2100. Moreover, the results show that second-order Monte Carlo simulation cannot adequately represent epistemic uncertainty for small sample sizes. In such a case, we recommend employing a probability box to delineate the epistemic uncertainty. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
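For readers unfamiliar with the second-order Monte Carlo scheme that entry 6 compares against probability boxes, the sketch below shows the generic nested structure: an outer loop samples epistemic (imprecisely known) parameters, an inner loop samples aleatory variability, and the envelope of the resulting CDFs can be read as a rough probability box. The distributions, parameter ranges, and damage scale are invented for illustration and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def nested_monte_carlo(n_epistemic=50, n_aleatory=2000):
    """Toy second-order Monte Carlo: an outer loop over epistemic parameter
    samples and an inner loop over aleatory variability, returning one
    empirical damage CDF per epistemic realisation."""
    grid = np.linspace(0.0, 10.0, 200)        # hypothetical damage scale
    cdfs = np.empty((n_epistemic, grid.size))
    for i in range(n_epistemic):
        # Epistemic uncertainty: imprecisely known parameters of the damage model.
        mu = rng.uniform(2.0, 4.0)
        sigma = rng.uniform(0.5, 1.5)
        # Aleatory variability: random damage outcomes given those parameters.
        damages = np.sort(rng.lognormal(mean=np.log(mu), sigma=sigma, size=n_aleatory))
        cdfs[i] = np.searchsorted(damages, grid, side="right") / n_aleatory
    return grid, cdfs

grid, cdfs = nested_monte_carlo()
# Envelope of the family of CDFs, read as a crude probability box:
lower_cdf, upper_cdf = cdfs.min(axis=0), cdfs.max(axis=0)
```

The abstract's point is that with few epistemic samples this envelope can understate the epistemic spread, which is why the authors recommend a probability box instead for small sample sizes.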
7. Uncertainty quantification in electrical resistivity tomography inversion: hybridizing block-wise bootstrapping with geostatistics.
- Author
Khabaz, Zahra Tafaghod, Ghanati, Reza, and Bérubé, Charles L
- Subjects
ELECTRICAL resistivity, CONTINUATION methods, EPISTEMIC uncertainty, TOMOGRAPHY, STATISTICS
- Abstract
Electrical resistivity tomography inversion often encounters uncertainty stemming from two primary sources: epistemic uncertainty, arising from imperfect underlying physics and improper initial approximation of model parameters, and aleatory variability in observations due to measurement errors. Despite the widespread application of electrical resistivity tomography in imaging the resistivity distribution of subsurface structures for various hydro-geophysical and engineering purposes, the assessment of uncertainty is seldom addressed within the inverted resistivity tomograms. To explore the combined impact of epistemic and aleatory uncertainty on resistivity models, we initially perturb the observed data using non-parametric block-wise bootstrap resampling with an optimal choice of the block size, generating different realizations of the field data. Subsequently, a geostatistical method is applied to stochastically generate a set of initial models for each bootstrapped data set from the previous step. Finally, we employ a globally convergent homotopic continuation method on each bootstrapped data set and initial model realization to explore the posterior resistivity models. Uncertainty information about the inversion results is provided through posterior statistical analysis. Our algorithm's simplicity enables easy integration with existing gradient-based inversion methods, requiring only minor modifications. We demonstrate the versatility of our approach through its application to various synthetic and real electrical resistivity tomography experiments. The results reveal that this approach for quantifying uncertainty is straightforward to implement and computationally efficient. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. Requirement-driven supplier selection: a multi-criteria QFD-based approach under epistemic and stochastic uncertainties.
- Author
Chang, Jian-Peng, Ren, Heng-Xin, Martínez, Luis, Pedrycz, Witold, and Chen, Zhen-Song
- Subjects
QUALITY function deployment, DISTRIBUTION (Probability theory), SUPPLY chain management, EPISTEMIC uncertainty, DIGITAL transformation
- Abstract
Supplier selection (SS) has emerged as a critical challenge for companies aiming to enhance the operational management of their supply chains, a task that has grown in complexity with the advent of Industry 4.0 and the ongoing digital transformation. Recognizing the gaps in current literature—specifically, the lack of consideration for stakeholders' expectations in guiding SS, as well as the inadequate handling of epistemic and stochastic uncertainties—this paper introduces a multiple-criteria Quality Function Deployment (QFD)-based model for SS. To address epistemic uncertainty, we put forward a novel subjective judgment representation method, named the linguistic term set integrated with discrete subjective probability distribution (LTS-DSPD), which enables decision-makers to express their judgments in a manner that is both simpler and more nuanced. Furthermore, we give elicitation methods and computing techniques for the LTS-DSPD. Then, we integrate stakeholders' requirements, along with their preferences and expectations for these requirements, to inform and guide SS. To effectively operationalize this guidance, we design QFD-based methods to transform stakeholders' inputs into the assessment criteria for SS, the weights of the criteria, and the expectations for the performance of suppliers on each criterion. To address stochastic uncertainty, we develop an innovative methodology for characterizing it and adopt prospect theory to quantify the overall utility of alternative suppliers. The paper concludes with a case study to demonstrate its practical application and effectiveness in streamlining the SS process. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
9. Uncertainty Aware Model predictive control for free-floating space manipulator based on probabilistic ensembles neural network.
- Author
Wang, Xu, Liu, Yanfang, Qi, Ji, Qi, Naiming, and Peng, Na
- Subjects
ARTIFICIAL neural networks, REINFORCEMENT learning, PREDICTIVE control systems, EPISTEMIC uncertainty, PREDICTION models
- Abstract
• Development of model predictive control for an FFSM system based on a probabilistic ensemble neural network.
• Reduction of the modeling error effect in control by model variance.
• Demonstration of different missions for a 3-DOF FFSM.
Precise control of a free-floating space manipulator (FFSM) is a great challenge due to the strong dynamic and kinematic coupling between its arms and base. This paper presents a model-based reinforcement learning framework for precise control of FFSMs with unknown dynamics. The dynamic behavior of an FFSM is predicted by a probabilistic ensembles neural network (PENN) trained off-line. The PENN employs probabilistic neural networks to handle aleatoric uncertainty, which is further combined with an ensemble method to capture epistemic uncertainty, and is used to plan action sequences on-line under the model predictive control framework. Unlike model-free methods, which train a particular policy to pursue maximum reward for the corresponding task, this framework allows the same trained PENN to be applied to various tasks with a task-specified reward function. Results of numerical experiments demonstrate the fast and robust performance of the proposed framework for both angular and end-effector position control. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
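A brief aside on the probabilistic-ensemble idea in entry 9: a common way to separate the two uncertainty types is to have each ensemble member predict a Gaussian (mean and variance), then read the average predicted variance as aleatoric uncertainty and the spread of the predicted means as epistemic uncertainty. The sketch below shows only that decomposition step with invented toy numbers; it is not the paper's PENN or its MPC loop.

```python
import numpy as np

# Toy outputs of a probabilistic ensemble at one state-action input:
# each row is one member's predicted (mean, variance) for a scalar next-state quantity.
ensemble_means = np.array([0.52, 0.48, 0.55, 0.50, 0.47])
ensemble_vars = np.array([0.010, 0.012, 0.009, 0.011, 0.013])

# Aleatoric uncertainty: noise each member believes is irreducible (average predicted variance).
aleatoric = ensemble_vars.mean()

# Epistemic uncertainty: disagreement between members about the mean (variance of predicted means).
epistemic = ensemble_means.var()

# Total predictive variance under a simple Gaussian-mixture view of the ensemble.
total = aleatoric + epistemic
print(f"aleatoric={aleatoric:.4f}  epistemic={epistemic:.4f}  total={total:.4f}")
```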
10. Epistemic uncertainty challenges aging clock reliability in predicting rejuvenation effects.
- Author
Kriukov, Dmitrii, Kuzmina, Ekaterina, Efimov, Evgeniy, Dylov, Dmitry V., and Khrameeva, Ekaterina E.
- Subjects
EPISTEMIC uncertainty, DNA methylation, REJUVENATION, EMBRYOLOGY, EPIGENETICS
- Abstract
Epigenetic aging clocks have been widely used to validate rejuvenation effects during cellular reprogramming. However, these predictions are unverifiable because the true biological age of reprogrammed cells remains unknown. We present an analytical framework to consider rejuvenation predictions from the uncertainty perspective. Our analysis reveals that the DNA methylation profiles across reprogramming are poorly represented in the aging data used to train clock models, thus introducing high epistemic uncertainty in age estimations. Moreover, predictions of different published clocks are inconsistent, with some even suggesting zero or negative rejuvenation. While not questioning the possibility of age reversal, we show that the high clock uncertainty challenges the reliability of rejuvenation effects observed during in vitro reprogramming before pluripotency and throughout embryogenesis. Conversely, our method reveals a significant age increase after in vivo reprogramming. We recommend including uncertainty estimation in future aging clock models to avoid the risk of misinterpreting the results of biological age prediction. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. Risk-Targeted Seismic Design Maps with Aleatory and Epistemic Uncertainty in Tehran and Surrounding Areas.
- Author
Zarrineghbal, Alireza, Zafarani, Hamid, Rahimian, Mohammad, Jalalalhosseini, Seyed Mostafa, and Khanmohammadi, Mohammad
- Subjects
EARTHQUAKE resistant design, MAP design, EPISTEMIC uncertainty, URBAN planners, INSURANCE companies, EARTHQUAKE hazard analysis
- Abstract
This study aims to introduce a risk-targeted probabilistic model that facilitates seismic design hazard mapping. Here, we investigate how to incorporate regional hazard variations and different structural fragility typologies so as to achieve a uniform target risk across the map. This paper takes Tehran, the capital city, and its surrounding area as a case study and explores highly digitized hazard curves conditioned on the time-dependent occurrence of an earthquake over a 50-year exposure time. We introduce a new set of parameterized quantile functions for the Risk-targeted design Intensity Measure (IMR) as the seismic design hazard. A group of IMR maps is subsequently prepared based on a 1% collapse probability during the next 50-year lifetime. Furthermore, the quantile function corresponding to each fragility type is able to show the aleatory uncertainty at each site on the region map. The uncertainty maps illustrate relatively less inherent variability than exists in the hazard curves themselves. We then address the uncertainty that the percentile hazard curves propagate into IMR, known as epistemic uncertainty, by deriving a new closed-form expression that allows a sampling-free estimation of the epistemic uncertainty. The risk-targeted seismic design hazard, with its accompanying uncertainties, offers a promising basis for seismic design with potential applications for the insurance industry, urban planners, and risk-aware building owners. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. Propagation of Modelling Uncertainties for Seismic Risk Assessment: The Effect of Sampling Techniques on Low-Rise Non-Ductile RC Frames.
- Author
Miano, Andrea, Ebrahimian, Hossein, Jalayer, Fatemeh, Vamvatsikos, Dimitrios, and Prota, Andrea
- Subjects
LATIN hypercube sampling, GROUND motion, MONTE Carlo method, SIMULATED annealing, STATISTICAL sampling
- Abstract
Quantifying the impact of modelling uncertainty on seismic performance assessment of existing buildings is non-trivial when considering the partial information available on material properties, construction details, and the uncertainty in the capacity models. This task is further complicated when uncertainty related to ground motion representation is considered. To address this issue, record-to-record variability, uncertainties in structural model parameters, and fragility model parameters due to limited sample size are propagated herein by employing a nonlinear dynamic analysis procedure based on recorded ground motions. A one-to-one sampling approach is adopted in which each recorded ground motion is paired up with a different structural model realization. Uncertainty propagation is explored by measuring the impact of different sampling techniques, such as Monte Carlo simulation with standard random sampling and Latin Hypercube sampling (with Simulated Annealing) in the presence of three alternative nonlinear dynamic analysis procedures: Incremental Dynamic Analysis (IDA), Modified Cloud Analysis (MCA), and Cloud to IDA (a highly efficient IDA-like procedure). This is all illustrated through application to an existing reinforced-concrete school building in southern Italy. It is shown that with a small subset of records, both MCA and Cloud to IDA can provide reliable structural fragility (and risk) estimates for three considered limit states, comparable to the results of more resource-intensive schemes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. A novel fuzzy algorithm for assembly precision management.
- Author
Liu, Sheng and Yu, Haidong
- Subjects
FUZZY algorithms, FUZZY sets, EPISTEMIC uncertainty, FUZZY numbers, INDUSTRIAL goods
- Abstract
• Proposed fuzzy small displacement torsor model for tolerance classification representation.
• Established an assembly reliability index for evaluating assembly precision.
• Developed constrained transformation method for efficient and high-accuracy calculation.
• Proposed a novel fuzzy algorithm for assembly precision management.
The assembly precision of a specified product is subject to epistemic uncertainty due to manufacturing and measurement errors. Efficient and accurate assembly precision management is essential for realizing smart production lines, and it relies on a robust assembly precision analysis model and calculation method. Parametric models are extensively used for assembly precision analysis of industrial products, typically integrating worst-case and statistical methods for calculations. Nevertheless, the combined application of these methods presents inherent limitations. Therefore, a novel fuzzy algorithm for assembly precision management is proposed in this paper, using fuzzy sets to quantify epistemic uncertainty in assembly precision and integrating the proposed fuzzy-based assembly precision analysis model and fuzzy-based calculation method. The proposed fuzzy small displacement torsor model is a fuzzy-based model for deviation representation and tolerance classification using decomposed fuzzy numbers, and an assembly reliability index is also established for the hierarchical evaluation and management of assembly precision. Subsequently, a comprehensive assembly precision analysis model is developed for precision prediction and contribution quantification by integrating the Jacobian model for deviation propagation. A new constrained transformation method is proposed as a fuzzy-based calculation method, offering efficient and highly accurate assembly precision computation. It accounts for torsor parameter constraints to ensure greater prediction accuracy than the worst-case method and improves computational efficiency compared to statistical methods. An assembly case of a centering pin mechanism is used to verify the superiority of the proposed fuzzy algorithm over the Jacobian-Torsor model with worst-case or statistical methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. Probabilistic Regional Liquefaction Hazard and Risk Analysis: A Case Study of Residential Buildings in Alameda, California.
- Author
Mongold, Emily and Baker, Jack W.
- Subjects
GROUND motion, EPISTEMIC uncertainty, AREA studies, DWELLINGS, RISK assessment
- Abstract
The impact of liquefaction on a regional scale is not well understood or modeled with traditional approaches. This paper presents a method to quantitatively assess liquefaction hazard and risk on a regional scale, accounting for uncertainties in soil properties, groundwater conditions, ground-shaking parameters, and empirical liquefaction potential index equations. The regional analysis is applied to a case study to calculate regional occurrence rates for the extent and severity of liquefaction and to quantify losses resulting from ground shaking and liquefaction damage to residential buildings. We present a regional-scale metric to quantify the extent and severity of liquefaction. A sensitivity analysis on epistemic uncertainty indicates that the two most important factors on output liquefaction maps are the empirical liquefaction equation, emphasizing the necessity of incorporating multiple equations in future regional studies, and the ground motion model, highlighting the same necessity for the peak ground acceleration input. Furthermore, the disaggregation of seismic sources reveals that triggering earthquakes for various extents of liquefaction originate from multiple sources, though primarily nearby faults and large magnitude ruptures. This finding indicates the value of adopting regional probabilistic analysis in future studies to capture the diverse sources and spatial distribution of liquefaction. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. Overconfidence in Probability Distributions: People Know They Don't Know, but They Don't Know What to Do About It.
- Author
Soll, Jack B., Palley, Asa B., Klayman, Joshua, and Moore, Don A.
- Subjects
BUSINESS schools, EPISTEMIC uncertainty, JUDGMENT (Psychology), DECISION making, DISTRIBUTION (Probability theory)
- Abstract
Overconfidence is pervasive in subjective probability distributions (SPDs). We develop new methods to analyze judgments that entail both a distribution of possible outcomes in a population (aleatory uncertainty) and imperfect knowledge about that distribution (epistemic uncertainty). In four experiments, we examine the extent to which subjective probability mass is concentrated in a small portion of the distribution versus spread across all possible outcomes. We find that although SPDs roughly match the concentration of the empirical, aleatory distributions, people's judgments are consistently overconfident because they fail to spread out probability mass to account for their own epistemic uncertainty about the location and shape of the distribution. Although people are aware of this lack of knowledge, they do not appropriately incorporate it into their SPDs. Our results offer new insights into the causes of overconfidence and shed light on potential ways to address this fundamental bias. This paper was accepted by Yuval Rottenstreich, behavioral economics and decision analysis. Funding: Support for this research was provided by the Fuqua School of Business at Duke University and the Haas School of Business at the University of California at Berkeley. Supplemental Material: The online appendix and data files are available at https://doi.org/10.1287/mnsc.2019.00660. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. Improving Re-Identification by Estimating and Utilizing Diverse Uncertainty Types for Embeddings.
- Author
Eisenbach, Markus, Gebhardt, Andreas, Aganian, Dustin, and Gross, Horst-Michael
- Subjects
EPISTEMIC uncertainty
- Abstract
In most re-identification approaches, embedding vectors are compared to identify the best match for a given query. However, this comparison does not take into account whether the encoded information in the embedding vectors was extracted reliably from the input images. We present the first approach that illustrates how all three types of uncertainty, namely model uncertainty (also known as epistemic uncertainty), data uncertainty (also known as aleatoric uncertainty), and distributional uncertainty, can be estimated for embedding vectors. We provide evidence that we do indeed estimate these types of uncertainty, and that each type has its own value for improving re-identification performance. In particular, while the few state-of-the-art approaches that employ uncertainty for re-identification during inference utilize only data uncertainty to improve single-shot re-identification performance, we demonstrate that the estimated model uncertainty vector can be utilized to modify the feature vector. We explore the best method for utilizing the estimated model uncertainty on the Market-1501 dataset and demonstrate that we are able to further enhance performance above the already strong baseline UAL. Additionally, we show that the estimated distributional uncertainty reflects the degree to which the current sample is out-of-distribution. To illustrate this, we divide the distractor set of the Market-1501 dataset into four classes, each representing a different degree of being out-of-distribution. By computing a score based on the estimated distributional uncertainty vector, we are able to correctly order the four distractor classes and to differentiate them from an in-distribution set to a significant extent. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. Quantifying Epistemic Uncertainty in Binary Classification via Accuracy Gain.
- Author
Qian, Christopher, Ganter, Tyler, Michalenko, Joshua, Liang, Feng, and Adams, Jason
- Subjects
EPISTEMIC uncertainty, POINT set theory, CLASSIFICATION, FORECASTING
- Abstract
Recently, there has been a surge of interest in quantifying epistemic uncertainty (EU), the reducible portion of uncertainty due to lack of data. We propose a novel EU estimator in the binary classification setting, defined as the posterior expected value of the empirical gain in accuracy between the current prediction and the optimal prediction. To validate the performance of our EU estimator, we introduce an experimental procedure in which we take an existing dataset, remove a set of points, and compare the estimated EU with the observed change in accuracy. Through real and simulated data experiments, we demonstrate the effectiveness of our proposed EU estimator. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
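To make the estimator described in entry 17 concrete, the sketch below computes, for a single binary prediction, the posterior expected gain in accuracy of the (unknown) optimal prediction over the current prediction. The Beta posterior and the Monte Carlo evaluation are assumptions made for this illustration; the paper's exact construction may differ.

```python
import numpy as np

rng = np.random.default_rng(1)

def epistemic_uncertainty_accuracy_gain(k_pos, n, n_draws=100_000):
    """Illustrative EU estimate for one region of a binary classifier: the
    posterior expected accuracy gain of the optimal prediction over the
    current prediction, under an assumed Beta(1+k_pos, 1+n-k_pos) posterior
    for p = P(y=1)."""
    p_draws = rng.beta(1 + k_pos, 1 + n - k_pos, size=n_draws)
    current_pred = int(p_draws.mean() >= 0.5)           # prediction from current knowledge
    acc_current = np.where(current_pred == 1, p_draws, 1 - p_draws)
    acc_optimal = np.maximum(p_draws, 1 - p_draws)       # accuracy if p were known
    return float(np.mean(acc_optimal - acc_current))

# Few observations -> higher epistemic uncertainty; many observations -> lower.
print(epistemic_uncertainty_accuracy_gain(k_pos=3, n=5))
print(epistemic_uncertainty_accuracy_gain(k_pos=300, n=500))
```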
18. Evidence‐based fault tree analysis of the hydraulic system in CNC machine tools.
- Author
Chen, Hong‐Xia, Xie, Sui‐Xin, Zhang, Jun‐Feng, Chen, Wang‐Hao, Niu, Bo, and Zhang, Jiao‐Teng
- Subjects
NUMERICAL control of machine tools, EPISTEMIC uncertainty, FAILURE mode & effects analysis, RELIABILITY in engineering, MACHINE parts, FAULT trees (Reliability engineering)
- Abstract
The hydraulic system is an integral part of CNC machine tools. In analyzing the reliability of machine tool hydraulic systems, their failures are influenced by both aleatory and epistemic uncertainties. This paper utilizes the fault tree analysis method to address failure modes subject to epistemic uncertainty, using the interval rough number scoring method to evaluate the probability of such failures occurring. The resulting reliability calculation is termed "subjective reliability". For failure modes influenced by aleatory uncertainty, objective data combined with the Dempster–Shafer evidence theory is used to determine their failure probability, with the corresponding reliability calculation referred to as "objective reliability". Finally, a comprehensive calculation of both subjective and objective reliability is conducted to determine the overall reliability of the hydraulic system, along with a ranking of the importance of the fault tree's basic events. This methodology covers scenarios with small samples, sufficient data, and their combinations, offering extensive application prospects. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. Fatigue reliability evaluation for impellers with consideration of multi‐source uncertainties using a WOA‐XGBoost surrogate model.
- Author
Qian, Cheng, Li, Wenjuan, Wei, Shengxing, Sun, Bo, and Ren, Yi
- Subjects
LATIN hypercube sampling, MONTE Carlo method, FINITE element method, EPISTEMIC uncertainty, MODULUS of elasticity
- Abstract
When using Monte Carlo simulation involving repeated finite element analysis (FEA) to perform fatigue reliability evaluation for an impeller, a variety of uncertainties should be considered to ensure the comprehensiveness of fatigue predictions. These uncertainties include the aleatory uncertainty from the geometry, material, and load conditions, and the epistemic uncertainty from the parameters of the physics‐of‐failure (PoF) model used to yield the fatigue prediction. However, the latter uncertainty is often ignored in fatigue reliability analysis. Moreover, the reliability assessment becomes computationally unaffordable and inefficient when many random variables are involved, as an enormous number of FEAs is demanded. To address this problem, a Whale Optimization Algorithm‐extreme gradient boosting (WOA‐XGBoost) surrogate model is developed, based on relatively few FEA results obtained using Latin hypercube sampling (LHS). Its strengths lie in the interpretability of the design variables and the effective determination of fine‐tuned hyperparameters. A case study on an impeller is conducted considering uncertainties from 11 input variables, where an efficient XGBoost model with an R2 greater than 0.93 on the test set is established using 400 samples from practical FEAs. In addition, the importance analysis indicates that the elasticity modulus and density have the greatest impact on the maximum strain, with a combined importance of 82.3%. Furthermore, the reliability assessment results under fatigue parameters derived from the Median method tend to be more conservative than those obtained from the Seeger method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. A neural network copula function approach for solving joint basic probability assignment in structural reliability analysis.
- Author
Yang, Rui‐Shi, Sun, Li‐Jun, Li, Hai‐Bin, and Yang, Yong
- Subjects
COPULA functions, STRUCTURAL reliability, EPISTEMIC uncertainty, ENGINEERING reliability theory, DATA distribution
- Abstract
When applying evidence theory to structural reliability analysis under epistemic uncertainty, it is necessary to consider the correlation of the evidence variables. In particular, solving the joint basic probability assignment (BPA) of the evidence variables is a crucial step. In this study, a method for solving the joint BPA based on a neural network copula function is proposed. The method constructs the copula function automatically through a neural network, which avoids the process of selecting an optimal copula function. Firstly, the neural network copula function is constructed based on a sample set of the evidence variables. Then, the expression for solving the joint BPA using the neural network copula function is derived in vector form. Furthermore, the expression is used to map the marginal BPAs of the evidence variables to the joint BPA, thus realizing the solution of the joint BPA. Finally, the effectiveness of the method is verified by three examples. The results show that the neural network copula function describes the data distribution better than the optimal copula function selected by the traditional method. In addition, there is an error in solving the reliability intervals using the traditional optimal copula function method, whereas the results of this paper's neural network copula function method are more accurate and better support decision making. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. LPI-based correction factor for response spectrum at liquefied sites.
- Author
Tsai, Chi-Chin, Kan, Chun-Yu, and Hwang, Yi-Wei
- Subjects
CORRECTION factors, SPECTRAL sensitivity, STRAINS & stresses (Mechanics), MOTION analysis, EPISTEMIC uncertainty
- Abstract
Liquefaction can significantly alter the ground response. However, no existing design spectrum accounts for the severity of soil liquefaction. This work aims to develop correction factors that can be used to adjust code-based design spectra to reflect the specific liquefaction susceptibility of a site. The correction factor is derived as the ratio of response spectra calculated by two types of 1D nonlinear site response analyses: effective stress analysis, which can model porewater pressure (PWP) generation, and total stress analysis. We considered seven real profiles and 200 motions in our analysis. Four combinations of soil nonlinear models and PWP generation models are also utilized to account for epistemic uncertainties. Results show that the response spectral ratio for liquefied sites typically falls below one for periods less than 1–2 s and rises above one for longer periods. Meanwhile, the response spectral ratio reflects the overall liquefaction susceptibility influenced by PWP, factor of safety, and liquefiable layer depth, while the liquefaction potential index (LPI) captures their complex interplay. Accordingly, we propose four LPI-dependent factors: three correction factors for peak ground acceleration, 0.2 s spectral acceleration (Sa), and 1.0 s Sa, and a long-period adjustment factor applicable for periods exceeding 1 s. The correction factors linearly decrease with increasing LPI, while the adjustment factor exhibits the opposite trend. A design spectrum for a liquefiable site can be readily constructed by adjusting the code-based design spectrum using the proposed correction factor, as illustrated in the example. This approach is applicable as long as LPI is available from a simplified liquefaction analysis or a liquefaction hazard map. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
22. A New Method of Failure Mode and Severity Effects Analysis for Hydrogen-Fueled Combustion Systems.
- Author
Gill, Adrian, Pielecha, Ireneusz, and Szwajca, Filip
- Subjects
FAILURE mode & effects analysis, LITERATURE reviews, FAILURE analysis, EPISTEMIC uncertainty, HYDROGEN analysis
- Abstract
This article aims to align its content with current trends in hybrid risk analysis methods while utilizing experimental research. This paper presents a hybrid methodology for analyzing the failure severity of a two-stage hydrogen-powered combustion system and details its implementation. This methodology assumes the use of the original FMESA method (Failure Mode and Effects Severity Analysis) with dedicated tabular scales of the failure severity. Obtaining results under the FMESA using experimental research is intended to reduce epistemic uncertainty, which is an important component of hazard severity or risk models. Its essence is to change the way of obtaining the results of the basic components of known methods such as FMEA/FMECA (Failure Mode and Effect Analysis/Failure Mode, Effects and Criticality Analysis). Experimental research was also used to develop the original failure severity scales for a two-stage hydrogen-fueled combustion system. The article presents a review of the literature on methods for identifying and analyzing hazards in hydrogen systems, the FMESA method with its mathematical model, results in the form of tabular scales of the failure severity, results of selected experimental tests, and quantitative results of a severity analysis of eleven failure modes of a two-stage hydrogen-fueled combustion system for a selected engine operating point. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. A logic-tree based probabilistic seismic hazard assessment for the central ionian islands of cephalonia and ithaca (Western Greece).
- Author
Kaviris, George, Zymvragakis, Angelos, Kapetanidis, Vasilis, Kouskouna, Vasiliki, Spingos, Ioannis, Sakellariou, Nikolaos, and Voulgaris, Nicholas
- Subjects
EARTHQUAKE zones, GROUND motion, LATIN hypercube sampling, EPISTEMIC uncertainty, ACCELERATION (Mechanics), EARTHQUAKE hazard analysis
- Abstract
The Central Ionian Islands of Cephalonia and Ithaca belong to the most seismically active Greek region, mainly due to the presence of the dextral Cephalonia-Lefkada Transform Fault Zone. The study area has experienced strong earthquakes in the twentieth century, including the destructive 1953 sequence with maximum intensity 9.0. The Paliki peninsula, western Cephalonia, hosted two strong earthquakes (Mw = 6.1 and 5.8) in 2014, with ground acceleration reaching ~560 cm/s² and 735 cm/s², respectively. This study updates the seismic hazard evaluation in Cephalonia and Ithaca using new data and computational techniques to reduce epistemic uncertainties. The probabilistic approach of Cornell and McGuire was used, and the uncertainties are reduced by accounting for the variability of the source models, seismicity data, and Ground Motion Prediction Equations through a logic tree approach, sampled by implementing the Latin Hypercube Sampling method. The spatial distribution of Peak Ground Acceleration and Peak Ground Velocity for return periods of 475 and 950 years indicates low variation across the entire study area and shows that the Paliki peninsula possesses the highest level of seismic hazard. Additionally, site-specific analysis across the three main towns, Lixouri and Argostoli in Cephalonia and Vathi in Ithaca, reveals that Lixouri has the highest level of seismic hazard, while Vathi has the lowest. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Probabilistic Hesitant Bifuzzy Set and its Application in Risk Assessment.
- Author
CHAUBE, SHSHANK, SINGH, MANOJ K., PANT, SANGEETA, KUMAR, ANUJ, JANA, MRINAL, KOTECHA, KETAN, and ABRAHAM, AJITH
- Subjects
AGGREGATION operators, EPISTEMIC uncertainty, RISK assessment, ENTROPY (Information theory), INFORMATION processing
- Abstract
In real-world issues, randomness and imprecision are frequent characteristics. This paper introduces the Probabilistic Hesitant Bifuzzy Set (PHBFS), a novel paradigm for dealing with aleatory and epistemic uncertainty. The PHBFS offers the depiction of both forms of uncertainty in a single framework and allows for the consideration of more conflicting information. The article outlines the basic operations of PHBFSs and develops a basic aggregation operator for PHBFSs to enable their implementation. The research also provides a basic overview of information fusion processes and suggests a visualization method based on PHBFS entropy to analyze the aggregated information and improve assessment results. The proposed method is applied to risk assessment, most notably the Arctic geopolitical risk assessment, to show how effectively it works. The merits and drawbacks of the PHBFS framework are thoroughly covered in the article's conclusion. [ABSTRACT FROM AUTHOR]
- Published
- 2024
25. Improving disaster relief plans for hurricanes with social media.
- Author
Paul, Jomon A., Zhang, Minjiao, Yang, Muer, and Xu, Chong
- Subjects
EMERGENCY management, LANDFALL, EPISTEMIC uncertainty, STOCHASTIC programming, NATURAL disasters
- Abstract
Decisions on humanitarian responses to natural disasters are subject to considerable epistemic uncertainty. This paper advocates postponing the decision point for pre-positioning relief supplies as close to landfall as possible and searching social media immediately after landfall so that demands can be estimated more accurately. We use a realistic hurricane preparedness case to demonstrate the effectiveness of our models and parametric estimation using social media data. The optimal timing for deploying relief supplies before hurricane landfall is found to be 12 h in advance, which reduces the total cost by 13% more than deploying relief supplies 18+ hours in advance. Meanwhile, utilizing social media information can reduce the total cost, as well as all specific costs considered except the point of dispensing (POD) site setup cost, by approximately 15%. As the attitude toward risk goes from optimistic, to neutral, to pessimistic, the number of PODs increases from 3 to 7, and to 8. A similar pattern can be noted in the total costs incurred by these decision-makers. Further, as aversion to risk increases, locations tend to be chosen farther from landfall, with these farther locations serving the less severe patients. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. Embracing epistemic uncertainty: a risk evaluation method for pollutants in stormwater
- Author
Vincent Pons, Merethe Strømberg, Godecke-Tobias Blecken, Franz Tscheikner-Gratl, Maria Viklander, and Tone Merete Muthanna
- Subjects
epistemic uncertainty, gaussian process, monitoring campaign, pollutant modelling, urban drainage system, Environmental technology. Sanitary engineering, TD1-1066
- Abstract
In this study, we show that pollutants of emerging concern are, by nature, prone to the emergence of epistemic uncertainty. We also show that the current uncertainty quantification methods used for pollutant modelling rely almost exclusively on parameter uncertainty, which is not adequate to tackle epistemic uncertainty affecting the model structure. We therefore suggest a paradigm shift in current pollutant modelling approaches by adding a term explicitly accounting for epistemic uncertainties. In a proof-of-concept, we use this approach to investigate the impact of epistemic uncertainty in the fluctuation of pollutants during wet-weather discharge (input information) on the distribution of mass of pollutants (output distributions). We found that the range of variability negatively impacts the tail of the output distributions. The fluctuation time, associated with high covariance between discharge and concentration, is a major driver for the output distributions. Adapting to different levels of epistemic uncertainty, our approach helps to identify critical unknown information in the fluctuation of pollutant concentration. Such information can be used in a risk management context and to design smart monitoring campaigns.
HIGHLIGHTS
• Current modelling approaches are not suitable for the deep epistemic uncertainty associated with pollutants of emerging concern.
• Variability and fluctuation of concentration and the concentration-discharge dependency can worsen the severity of an overflow event.
• Through our method for assessing the impact of epistemic uncertainty, we suggest a paradigm shift toward the design of smart stormwater quality monitoring campaigns.
- Published
- 2024
- Full Text
- View/download PDF
27. The development and implementation of design flowchart for probabilistic rock slope stability assessments: a review.
- Author
Rusydy, Ibnu, Canbulat, Ismet, Zhang, Chengguo, Wei, Chunchen, and McQuillan, Alison
- Subjects
SLOPE stability, ROCK properties, EPISTEMIC uncertainty, EARTHQUAKES, MINING engineering
- Abstract
Background: Rock slope instability is a complex geotechnical issue that is affected by site-specific rock properties, geological structures, groundwater, and earthquake load conditions. Numerous studies acknowledge these aleatory uncertainties in slope stability assessment; however, understanding of the rock behaviour could still be improved. Therefore, this paper aims to summarise the probability methods applied in rock slope stability analysis in mining and civil engineering, develop new probabilistic design and assessment methodologies for four methods, namely empirical/rock mass classification techniques, kinematic analysis, limit equilibrium (LE), and numerical methods, and introduce how to integrate all methods to determine the total probability of failure. The case studies have been conducted based on slopes from Indonesia, a seismically active country, utilising the proposed design methods. Results: Regarding the probabilistic empirical/rock mass classification (RMC) technique, this study has identified that seven of the ten most involved input parameters in RMC naturally exhibit aleatory uncertainty. Thus, the optimal way to present the output probability of RMC is as a confidence interval (CI) or as total and conditional probabilities associated with each rock mass class. In probabilistic kinematic analysis, this study presents a systematic method to compute the probabilities of different types of failure alongside the total probability of occurrence (P_tK). The probability of failure (PoF) for jointed generalized Hoek-Brown (GHB) numerical modelling was lower than that obtained through the probabilistic LE approach for a similar slope. However, the PoF of jointed GHB is higher than the LE approach when loaded with 0.1 and 0.15 earthquake coefficients. Conclusions: The variation of PoF across different failure criteria determines how epistemic uncertainty is apparent in the modelling process, while the aleatory uncertainty arises from input parameters. Furthermore, this study introduces the total probability of failure equation as a combination of kinematic and kinetic probabilities (limit equilibrium and numerical modelling). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. Hall thruster model improvement by multidisciplinary uncertainty quantification.
- Author
Eckels, Joshua D., Marks, Thomas A., Allen, Madison G., Jorns, Benjamin A., and Gorodetsky, Alex A.
- Subjects
HALL effect thruster, ENGINEERING models, EPISTEMIC uncertainty, ELECTRON transport, PREDICTION models
- Abstract
We study the analysis and refinement of a predictive engineering model for enabling rapid prediction of Hall thruster system performance across a range of operating and environmental conditions and epistemic and aleatoric uncertainties. In particular, we describe an approach by which experimentally-observed facility effects are assimilated into the model, with a specific focus on facility background pressure. We propose a multifidelity, multidisciplinary approach for Bayesian calibration of an integrated system comprised of a set of component models. Furthermore, we perform uncertainty quantification over the calibrated model to assess the effects of epistemic and aleatoric uncertainty. This approach is realized on a coupled system of cathode, thruster, and plume models that predicts global quantities of interest (QoIs) such as thrust, efficiency, and discharge current as a function of operating conditions such as discharge voltage, mass flow rate, and background chamber pressure. As part of the calibration and prediction, we propose a number of metrics for assessing predictive model quality. Based on these metrics, we found that our proposed framework produces a calibrated model that is more accurate, sometimes by an order of magnitude, than engineering models using nominal parameters found in the literature. We also found for many QoIs that the remaining uncertainty was not sufficient to account for discrepancy with experimental data, and that existing models for facility effects do not sufficiently capture experimental trends. Finally, we confirmed through a global sensitivity analysis the prior intuition that anomalous transport dominates model uncertainty, and we conclude by suggesting several paths for future model improvement. We envision that the proposed metrics and procedures can guide the refinement of future model development activities. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
29. Shake-table test on a historical masonry aggregate: prediction and postdiction using an equivalent-frame model.
- Author
Tomić, Igor and Beyer, Katrin
- Subjects
STONEMASONRY, EPISTEMIC uncertainty, FAILURE mode & effects analysis, EARTHQUAKE engineering, MASONRY testing
- Abstract
Modeling the seismic response of historical masonry buildings is challenging due to many aleatory and epistemic uncertainties. Additionally, the interaction between structural units further complicates predictions of the seismic behavior of unreinforced masonry aggregates found throughout European city centers. This motivated the experimental campaign on half-scale, double-leaf stone masonry aggregates within the SERA (Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe)—Adjacent Interacting Masonry Structures project. The experimental campaign included a blind prediction competition that provided participants with data on materials, geometry, construction details, and seismic input. After the test, the actual seismic input and all recorded and processed data on accelerations, base-shear, and displacements were shared with participants. Instead of a single analysis for the prediction phase, we performed broader stochastic incremental dynamic analyses to answer whether the common assumptions for aggregate modeling of either fully coupled or completely separated units yield safe predictions of aggregate behavior. We modeled buildings as equivalent frames in OpenSEES using a newly developed macroelement, which captures both in-plane and out-of-plane failure modes. To simulate the interaction between two units, we implemented a new material model and applied it to zero-length elements connecting the units. Our results demonstrate the importance of explicitly modeling the non-linear connection between the units and using probabilistic approaches when evaluating the aggregate response. Although modeling simplifications of the unit interaction and deterministic approaches might produce conservative results in predicted failure peak ground acceleration, we found that these simplified approaches overlook the likely damage and failure modes. Our results further stress the importance of calibrating material parameters with results from equivalent quasi-static cyclic tests and using appropriate damping models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Probabilistic Seismic Performance Assessment of RC Frame Structures Considering Dynamic Effect and Structural Parameter Uncertainties.
- Author
Li, Rou-Han, Li, Chao, Li, Hong-Nan, Xu, Wei-Xiao, and Gao, Mao
- Subjects
SHAKING table tests, STRUCTURAL frames, STRAIN rate, REINFORCED concrete, EPISTEMIC uncertainty, SEISMIC response
- Abstract
In this paper, an efficient and reliable method is developed for assessing the seismic performance of reinforced concrete (RC) frame structures by using the dynamic concentrated plastic beam–column element. Firstly, the beam–column element considering the dynamic effect caused by strain rate sensitivity of RC materials and the epistemic uncertainties in structural parameters is proposed, in which the mechanical behavior of plastic hinge is described by the damage index-based hysteretic model. Moreover, a simplified approach is employed to consider the strain rate-sensitivity of RC materials. Then the computation procedure for probabilistic seismic analysis of RC frame structures is illustrated based on the proposed element. The change in strain rate at each time step is considered by modifying the hysteretic model, which is further used in updating the matrices in dynamic equations. Finally, the probabilistic seismic response and damage analyses of a shaking table test RC frame structure are performed and the proposed method is validated with the experimental data. Furthermore, the influences of structural uncertainties on the analytical results of maximum drift ratio and collapse probability are discussed. It is indicated that both the dynamic effect and structural uncertainties need to be seriously taken into account for obtaining a more reliable seismic response and collapse assessment of RC frame structures. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. An introduction to Bayesian simulation-based inference for quantum machine learning with examples.
- Author
Nikoloska, Ivana and Simeone, Osvaldo
- Subjects
BAYESIAN field theory, MACHINE learning, EPISTEMIC uncertainty, PHENOMENOLOGICAL theory (Physics), ALGORITHMS
- Abstract
Simulation is an indispensable tool in both engineering and the sciences. In simulation-based modeling, a parametric simulator is adopted as a mechanistic model of a physical system. The problem of designing algorithms that optimize the simulator parameters is the focus of the emerging field of simulation-based inference (SBI), which is often formulated in a Bayesian setting with the goal of quantifying epistemic uncertainty. This work studies Bayesian SBI that leverages a parameterized quantum circuit (PQC) as the underlying simulator. The proposed solution follows the well-established principle that quantum computers are best suited for the simulation of certain physical phenomena. It contributes to the field of quantum machine learning by moving beyond the likelihood-based methods investigated in prior work and accounting for the likelihood-free nature of PQC training. Experimental results indicate that well-motivated quantum circuits that account for the structure of the underlying physical system are capable of simulating data from two distinct tasks. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. How dependent are quantitative volcanic ash concentration and along‐flight dosage forecasts to model structural choices?
- Author
James, Lauren A., Dacre, Helen F., and Harvey, Natalie J.
- Subjects
EXPLOSIVE volcanic eruptions, VOLCANIC ash, tuff, etc., FLIGHT planning (Aeronautics), SPATIAL resolution, EPISTEMIC uncertainty
- Abstract
Producing quantitative volcanic ash forecasts is challenging due to multiple sources of uncertainty. Careful consideration of this uncertainty is required to produce timely and robust hazard warnings. Structural uncertainty occurs when a model fails to produce accurate forecasts, despite good knowledge of the eruption source parameters, meteorological conditions and suitable parameterizations of transport and deposition processes. This uncertainty is frequently overlooked in forecasting practices. Using a Lagrangian particle dispersion model, simulations with varied output spatial resolution, temporal averaging period and particle release rate are performed to quantify the impact of these structural choices. This experiment reveals that, for the 2019 Raikoke eruption, structural choices give measurements of peak ash concentration spanning an order of magnitude, significantly impacting decision‐relevant thresholds used in aviation flight planning. Conversely, along‐flight dosage estimates exhibit less sensitivity to structural choices, suggesting it is a more robust metric to use in flight planning. Uncertainty can be reduced by eliminating structural choices that do not result in a favourable level of agreement with a high‐resolution reference simulation. Reliable forecasts require output spatial resolution ≤ 80 km, temporal averaging periods ≤ 3 h and particle release rates ≥ 5000 particles/h. This suggests that simulations with relatively small numbers of particles could be used to produce a large ensemble of simulations without significant loss of accuracy. Comparison with previous Raikoke simulations indicates that the uncertainty associated with these constrained structural choices is smaller than those associated with satellite constrained eruption source parameter and internal model parameter uncertainties. Thus, given suitable structural choices, other epistemic sources of uncertainty are likely to dominate. This insight is useful for the design of ensemble methodologies which are required to enable a shift from deterministic to probabilistic forecasting. The results are applicable to other long‐range dispersion problems and to Eulerian dispersion models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. Uncertain Time Series Analysis for the Confirmed Case of Brucellosis in China.
- Author
-
Zhang, Shanshan, Zhang, Yaxuan, Lio, Waichon, and Kang, Rui
- Subjects
- *
SYMMETRY (Biology) , *TIME series analysis , *BRUCELLOSIS , *BIOLOGICAL systems , *EPISTEMIC uncertainty - Abstract
Brucellosis, as an infectious disease that affects both humans and livestock, poses a serious threat to human health and has a severe impact on economic development. Brucellosis transmission is essentially a process within a biological system, and the epistemic uncertainty present in the data of confirmed brucellosis cases in China is significant and needs to be addressed. Therefore, this paper proposes an uncertain time series model to explore the confirmed brucellosis cases in China. Then, several methods based on uncertain statistics and the symmetry of the biological system are applied, including order estimation, parameter estimation, residual analysis, uncertain hypothesis testing, and forecasting. The proposed model is applied to the data of confirmed brucellosis cases in China from January 2017 to December 2020, and the results show that the uncertain model fits the observed data better than the probabilistic model due to the frequency instability inherent in the data. Based on the proposed model and statistical method, this paper develops an approach to rapidly forecast the number of confirmed brucellosis cases in small-sample scenarios, which can contribute to epidemic control in real applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
34. Chance, probability, and uncertainty at the edge of human reasoning: What is Knightian uncertainty?
- Author
-
Townsend, David M., Hunt, Richard A., and Rady, Judy
- Subjects
EPISTEMIC uncertainty ,ACTION theory (Psychology) ,FREE will & determinism ,ENTREPRENEURSHIP ,PROBABILITY theory - Abstract
Research Summary: For more than a century, Frank Knight's Risk, Uncertainty, and Profit has significantly influenced entrepreneurship theory development by exploring the nature of uncertainty and the epistemic limits of entrepreneurial action. Knight's work highlights how economic actors cannot fully predict the consequences of their actions. Despite its broad influence, debates persist regarding the nature of Knightian uncertainty. This study addresses these debates through a comprehensive analysis of RUP and Knight's other published and unpublished writings to offer new insights into the nature and meaning of Knightian uncertainty, revealing Knight's holistic theory that integrates "real indeterminism," "partial knowledge," and "subjective beliefs." This analysis provides much needed construct clarity to advance contemporary theories of entrepreneurial action and the role of uncertainty in business venturing processes. Managerial Summary: This article revisits Frank Knight's foundational work, Risk, Uncertainty, and Profit, a cornerstone in entrepreneurship research for over a century. We highlight Knight's holistic approach to uncertainty, which integrates the concepts of real indeterminism (the inherent unpredictability of future events), partial knowledge (the incomplete understanding of the present and future), and subjective beliefs (individual perceptions and interpretations). The study offers new perspectives on how Knightian uncertainty influences entrepreneurial decision‐making and action, highlighting how this unique type of uncertainty plays a critical role in the business venturing process. These insights provide valuable contributions to contemporary theories of entrepreneurship, emphasizing the complexity and multifaceted nature of navigating uncertainty in business. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. Constraining acyclicity of differentiable Bayesian structure learning with topological ordering.
- Author
-
Tran, Quang-Duy, Nguyen, Phuoc, Duong, Bao, and Nguyen, Thin
- Subjects
EPISTEMIC uncertainty ,DIRECTED graphs ,ORDER picking systems ,PRIOR learning ,SCALABILITY - Abstract
Bayesian approaches to structure learning, which produce distributional estimates, have advantages over point-estimate methods when handling epistemic uncertainty. Differentiable methods for Bayesian structure learning have been developed to enhance the scalability of the inference process and have achieved promising outcomes. However, in the differentiable continuous setting, constraining the acyclicity of learned graphs emerges as another challenge. Various works utilize post-hoc penalization scores to impose this constraint, which cannot assure acyclicity. The topological ordering of the variables is one type of prior knowledge that contains valuable information about the acyclicity of a directed graph. In this work, we propose a framework to guarantee the acyclicity of inferred graphs by integrating the information from the topological ordering into the inference process. Our integration framework does not interfere with the differentiable inference process while being able to strictly assure the acyclicity of learned graphs and reduce the inference complexity. Our extensive empirical experiments on both synthetic and real data demonstrate the effectiveness of our approach, with preferable results compared to related Bayesian approaches. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
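A minimal sketch of the core idea in the entry above — using a known topological ordering to guarantee acyclicity by construction — assuming the ordering is given as prior knowledge; the edge scores and threshold are illustrative placeholders, not the paper's inference procedure.

import numpy as np

rng = np.random.default_rng(0)
d = 5

# A fixed topological ordering of the variables (assumed prior knowledge):
# order[i] is the node placed in position i of the ordering.
order = np.array([2, 0, 3, 1, 4])

# Unconstrained edge scores, e.g. drawn from an approximate posterior.
scores = rng.normal(size=(d, d))

# Build a mask that only allows edges from earlier to later nodes in the
# ordering; any graph whose edges all respect an ordering is acyclic.
rank = np.empty(d, dtype=int)
rank[order] = np.arange(d)
mask = (rank[:, None] < rank[None, :]).astype(float)

adjacency = (scores * mask) > 0.5        # thresholded, masked edge scores
print(adjacency.astype(int))

Because the mask zeroes out every "backward" edge, no post-hoc acyclicity penalty is needed and the construction stays compatible with differentiable inference over the edge scores.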
36. Belief reliability of structures with hybrid uncertainties.
- Author
-
Metagudda, Sushma H. and Balu, A. S.
- Abstract
Reliability of structures is evaluated by considering uncertainties present in the system, which can be characterized as aleatory or epistemic. Inherent randomness in the physical environment leads to aleatory uncertainty, whereas insufficient knowledge about the system leads to epistemic uncertainty. For reliability evaluation, ascertaining the sources of uncertainty poses a great challenge since both types coexist widely in structural systems. Aleatory uncertainties are quantified by probabilistic measures (such as the first-order reliability method, the second-order reliability method and Monte Carlo techniques), whereas epistemic uncertainties are quantified by various non-probabilistic approaches (such as interval analysis methods, evidence theory, possibility theory and fuzzy theory). However, major issues such as the interval extension problem and duality conditions that lead to overestimation hinder the versatility of such methods; thus, uncertainty theory has emerged to overcome these limitations. Given the existing uncertainties and limitations, a hybrid strategy has been constructed and is referred to as "belief reliability". The belief reliability metric integrates three key factors—the design margin and aleatory and epistemic uncertainty factors—to evaluate the reliability of the structural system. In this paper, Monte Carlo simulation is adopted to account for aleatory uncertainty. On the other hand, epistemic uncertainty is quantified through an adjustment factor approach using FMEA (failure mode and effects analysis). Numerical examples are presented to substantiate the proposed methodology as applied to a variety of problems of both implicit and explicit nature in structural engineering. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
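A rough sketch of the two ingredients combined in the entry above: a Monte Carlo estimate of reliability for the aleatory part and a separate adjustment factor for the epistemic part. The load/resistance distributions and the value of the adjustment factor are hypothetical, and the simple multiplicative combination below is only an illustration, not the paper's belief reliability metric.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Aleatory uncertainty: random load and resistance (hypothetical distributions, kN).
load = rng.normal(50.0, 8.0, size=n)
resistance = rng.normal(80.0, 10.0, size=n)

# Probabilistic reliability from Monte Carlo simulation of the margin R - L.
margin = resistance - load
p_failure = np.mean(margin <= 0.0)

# Epistemic adjustment factor (hypothetical, e.g. informed by an FMEA-style
# assessment); values below 1 penalise the nominal reliability to reflect
# limited knowledge of the model and data.
epistemic_factor = 0.9
belief_style_reliability = epistemic_factor * (1.0 - p_failure)

print(f"Monte Carlo reliability: {1 - p_failure:.4f}")
print(f"Reliability after epistemic penalty: {belief_style_reliability:.4f}")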
37. Evidential uncertainty sampling strategies for active learning.
- Author
-
Hoarau, Arthur, Lemaire, Vincent, Le Gall, Yolande, Dubois, Jean-Christophe, and Martin, Arnaud
- Subjects
EPISTEMIC uncertainty ,LEARNING strategies ,DILEMMA - Abstract
Recent studies in active learning, particularly in uncertainty sampling, have focused on the decomposition of model uncertainty into reducible and irreducible uncertainties. In this paper, the aim is to simplify the computational process while eliminating the dependence on observations. Crucially, the inherent uncertainty in the labels is considered, i.e. the uncertainty of the oracles. Two strategies are proposed: sampling by Klir uncertainty, which tackles the exploration–exploitation dilemma, and sampling by evidential epistemic uncertainty, which extends the concept of reducible uncertainty within the evidential framework; both use the theory of belief functions. Experimental results in active learning demonstrate that our proposed method can outperform uncertainty sampling. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
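The entry above builds on splitting predictive uncertainty into reducible (epistemic) and irreducible (aleatoric) parts for query selection. The sketch below shows a common ensemble-based version of that decomposition (mutual information between predictions and model choice) as background; it is not the belief-function formulation of the paper, and the probabilities are made-up numbers.

import numpy as np

def entropy(p, axis=-1, eps=1e-12):
    # Shannon entropy along the class axis.
    return -np.sum(p * np.log(p + eps), axis=axis)

# Class probabilities from an ensemble of 4 models for 3 unlabelled points
# (hypothetical values); shape: (models, points, classes).
probs = np.array([
    [[0.90, 0.10], [0.5, 0.5], [0.2, 0.8]],
    [[0.80, 0.20], [0.5, 0.5], [0.9, 0.1]],
    [[0.85, 0.15], [0.5, 0.5], [0.1, 0.9]],
    [[0.90, 0.10], [0.5, 0.5], [0.8, 0.2]],
])

mean_probs = probs.mean(axis=0)
total = entropy(mean_probs)                 # total predictive uncertainty
aleatoric = entropy(probs).mean(axis=0)     # expected (irreducible) uncertainty
epistemic = total - aleatoric               # reducible part: model disagreement

# Active learning query: pick the point with the largest reducible uncertainty.
query_index = int(np.argmax(epistemic))
print(total.round(3), aleatoric.round(3), epistemic.round(3), query_index)

Here the second point is ambiguous but all models agree (high aleatoric, low epistemic), while the third point shows strong disagreement between models and is therefore the most informative to label.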
38. Two-Stage Hyperelliptic Kalman Filter-Based Hybrid Fault Observer for Aeroengine Actuator under Multi-Source Uncertainty.
- Author
-
Wang, Yang, Sun, Rui-Qian, and Gou, Lin-Feng
- Subjects
POLYNOMIAL chaos ,STOCHASTIC systems ,EPISTEMIC uncertainty ,STOCHASTIC models ,ACTUATORS - Abstract
During operation, an aeroengine faces multi-source uncertainty consisting of epistemic uncertainty in the engine and stochastic uncertainty in the control system. This paper investigates actuator fault estimation under multi-source uncertainty to enhance the fault diagnosis capability of aeroengine control systems in complex environments. With a polynomial chaos expansion-based quantification of the discrete stochastic model, the optimal filter under multi-source uncertainty, the Hyperelliptic Kalman Filter, is proposed. Meanwhile, by treating the actuator fault as an unknown input, the Two-stage Hyperelliptic Kalman Filter (TSHeKF) is also proposed to achieve optimal fault estimation under multi-source uncertainty. However, considering that the biases of the model are often fixed for an individual engine, the TSHeKF-based fault estimation is robust but leads to inevitable conservativeness. By additionally estimating the unknown deviation in the state function caused by probabilistic system parameters, a hybrid fault observer (HFO) is proposed based on the TSHeKF, realizing conservativeness-reduced estimation of actuator faults under multi-source uncertainty. Numerical simulations show the effectiveness and optimality of the proposed HFO in state estimation, output prediction, and fault estimation for both single- and multi-fault modes when considering multi-source uncertainty. Furthermore, Monte Carlo experiments demonstrate that the HFO-based optimal fault estimation is less conservative and more accurate than the Two-stage Kalman Filter and the TSHeKF, providing better safety and more reliable assurance of aeroengine operation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
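For readers unfamiliar with the filtering machinery extended in the entry above, here is a minimal sketch of one predict/update cycle of a standard linear Kalman filter — the building block that two-stage and hyperelliptic variants generalize. All matrices and numbers are hypothetical placeholders, not the aeroengine model.

import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # state transition
H = np.array([[1.0, 0.0]])               # measurement matrix
Q = 1e-3 * np.eye(2)                     # process-noise covariance
R = np.array([[0.05]])                   # measurement-noise covariance

x = np.array([0.0, 1.0])                 # prior state estimate
P = np.eye(2)                            # prior covariance
z = np.array([0.12])                     # new measurement

# Predict step: propagate the state estimate and its covariance.
x_pred = A @ x
P_pred = A @ P @ A.T + Q

# Update step: correct the prediction with the measurement.
S = H @ P_pred @ H.T + R                 # innovation covariance
K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
x_new = x_pred + K @ (z - H @ x_pred)
P_new = (np.eye(2) - K @ H) @ P_pred

print(x_new, np.diag(P_new))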
39. An Improving Lung Disease Detection by Combining Ensemble Deep Learning and Maximum Mean Discrepancy Transfer Learning.
- Author
-
Kousiga, Thiruvenkadasamy and Nithya, Palanisamy
- Subjects
CONVOLUTIONAL neural networks ,COMPUTED tomography ,LONG-term memory ,EPISTEMIC uncertainty ,DEEP learning - Abstract
According to the World Health Organization (WHO), various pulmonary diseases cause thousands of deaths annually, and early diagnosis is required to reduce the mortality rate. For this reason, a Convolutional Neural Network (CNN)-based Lung Disease (LD) detection system is developed to classify segregated lung sections into various pulmonary disease types. However, epistemic uncertainty in the scanned images affects the performance of detection classifiers. Hence, in this paper, a multi-modal approach is proposed to address the epistemic uncertainty issue and provide a reliable solution for rapid detection of various LD types from CXR images. In this method, CT images are additionally used to improve the model's performance, as they contain detailed information that can be exploited to provide efficient results. Initially, the collected images are segmented using a U-Net model to obtain enhanced lung Regions of Interest (ROIs). Then ResNet50, DenseNet121, InceptionResNetV2 and XceptionV3 are used to hierarchically extract informative and discriminative features from the collected CXR and CT images. The retrieved deep features are fed into the Ensemble-Convolutional Long Short-Term Memory with Extreme Learning Machine (E-conLSTM-ELM) to minimize the computational time and increase the accuracy. Moreover, a Transfer Learning (TL) model is employed to learn the weights of the E-conLSTM-ELM and to exchange knowledge about feature-class relations between CXR and CT images. The domain adaptation approach, a variant of the TL model, relies on employing similar datasets for a shared learning problem; this adaptation strategy reduces the domain shift (data dispersion) using Maximum Mean Discrepancy (MMD). The shared semantic features from CT images obtained through TL improve the in-depth learning of the softmax layer to classify different LD types. The proposed work is named the Convolutional LD Scan (CovLscan) framework. The test outcomes reveal that the CovLscan model achieves an overall accuracy of 95.46% and 96.15% on the collected ChestX-ray8 and NIH-CXR datasets, which is higher than existing models such as Automated Hierarchical Deep Learning-based LD Diagnosis (AHDL-LDD), EfficientNet version 2-Medium (EfficientNet v2-M), Lung diseases prediction Network22 (LungNet22), Chest tract disorder prediction using Dilated Convolutional Network (CDCNet) and Auction-Based Optimization Algorithm-CNN (ABOA-CNN). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
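The domain-shift reduction in the entry above relies on the Maximum Mean Discrepancy, which has a simple kernel-based estimator. The sketch below computes a biased estimate of the squared MMD between two feature sets with an RBF kernel; the random feature matrices and the kernel bandwidth stand in for real CXR/CT deep features and are illustrative assumptions.

import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of a and rows of b.
    sq = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-gamma * sq)

def mmd2(x, y, gamma=1.0):
    # Biased estimate of the squared Maximum Mean Discrepancy between samples x and y.
    return (rbf_kernel(x, x, gamma).mean()
            + rbf_kernel(y, y, gamma).mean()
            - 2 * rbf_kernel(x, y, gamma).mean())

rng = np.random.default_rng(0)
source_feats = rng.normal(0.0, 1.0, size=(200, 16))   # e.g. CXR-domain features
target_feats = rng.normal(0.5, 1.0, size=(200, 16))   # e.g. CT-domain features

print(f"MMD^2 between domains: {mmd2(source_feats, target_feats):.4f}")

In a transfer-learning setup, a term like this is typically added to the training loss so that the feature extractor learns representations whose source and target distributions are close.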
40. Static and Seismic Safety of the Inclined Tower of Portogruaro: A Preliminary Numerical Approach.
- Author
-
Shehu, Rafael
- Subjects
MECHANICAL behavior of materials ,EPISTEMIC uncertainty ,MASONRY ,VALUATION of real property ,SIMPLICITY - Abstract
Masonry towers are peculiar structures with complex structural behavior, despite the biased conclusions that their geometrical regularity and simplicity may suggest. Their geometrical features and the epistemic uncertainty inherent in masonry material strongly influence their static and seismic behavior. This paper investigates a remarkable and representative case study. The bell tower of Portogruaro (Italy) is a 57 m tall construction, built in the 12th century, with a notable inclination. The Italian Guideline for the safety assessment of masonry towers is a key focus of this paper, which highlights the pros and cons of the different suggested approaches. Some relevant proposals are presented in order to address the seismic safety assessment of masonry bell towers. The findings show that very slender structures do not meet the guideline's recommendations due to limitations in their current stress state. In addition, in similar cases, the recommended values for the mechanical properties of masonry material lead to predictions of non-withstanding structural behavior, calling into question the choice of the adopted material properties. Advanced pushover analysis has been conducted in order to investigate the results of the simplified approach in terms of failure patterns and seismic safety estimation. The simulations are implemented for four different hypothetical scenarios of the existing masonry mechanical properties. The results obtained for the case study tower reflect a different perspective on the seismic assessment of masonry towers when specific approaches are defined. The preliminary results on the safety of the Portogruaro tower show significant variability of seismic safety across the adopted scenarios, highlighting the need to pay attention to the preservation state of the present case and of similar ones. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
41. A condition‐based maintenance optimization method with oscillating uncertain degradation process.
- Author
-
Li, Shuyu, Wen, Meilin, Zu, Tianpei, and Kang, Rui
- Subjects
- *
EPISTEMIC uncertainty - Abstract
Condition‐based maintenance (CBM) has gradually gained more attention, and degradation processes have been increasingly applied to maintenance optimization models. Insufficient data and the complex degradation process of the equipment contribute to epistemic uncertainty. Besides, the implementation of maintenance introduces oscillatory features into the equipment degradation process, which then deviates from a monotonically decreasing trend and complicates the optimization of CBM. In this article, to simultaneously address epistemic uncertainty and consider the influence of inspection and maintenance, we establish a new type of degradation model based on uncertainty theory. Then an uncertain maintenance optimization model is proposed to give an optimal CBM strategy. Finally, a case study is provided to illustrate the proposed CBM optimization method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
42. Modelling Up-and-Down Moves of Binomial Option Pricing with Intuitionistic Fuzzy Numbers.
- Author
-
Andrés-Sánchez, Jorge de
- Subjects
- *
PRICES , *FUZZY numbers , *FUZZY mathematics , *EPISTEMIC uncertainty , *OPTIONS (Finance) - Abstract
Since the early 21st century, within fuzzy mathematics, there has been a stream of research in the field of option pricing that introduces vagueness in the parameters governing the movement of the underlying asset price through fuzzy numbers (FNs). This approach is commonly known as fuzzy random option pricing (FROP). In discrete time, most contributions use the binomial groundwork with up-and-down moves proposed by Cox, Ross, and Rubinstein (CRR), which introduces epistemic uncertainty associated with volatility through FNs. The present work falls within this stream of literature and contributes to it in three ways. First, analytical developments allow for the introduction of uncertainty with intuitionistic fuzzy numbers (IFNs), which are a generalization of FNs; therefore, we can introduce bipolar uncertainty in parameter modelling. Second, a methodology is proposed that allows the volatility with which the option is valued to be adjusted through an IFN. This approach is based on existing developments in the literature on adjusting statistical parameters with possibility distributions via historical data. Third, we introduce into the debate on fuzzy random binomial option pricing the analytical framework that should be used in modelling upward and downward moves. Binomial modelling is usually employed to value path-dependent options that cannot be directly evaluated with the Black–Scholes–Merton (BSM) model; thus, one way to assess the suitability of binomial moves for valuing a particular option is to approximate the results of the BSM model for a European option with the same characteristics as the option of interest. In this study, we compared the moves proposed by Rendleman and Bartter (RB) with those of CRR. We observed that, depending on the moneyness of the option, and clearly for options traded at the money, RB modelling offers greater convergence to BSM prices than does CRR modelling. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
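To illustrate the pricing machinery discussed in the entry above: a standard CRR binomial valuation of a European call, evaluated at the two endpoints of a volatility interval (e.g. one alpha-cut of a fuzzy volatility) to obtain an interval of prices. The contract parameters and the volatility interval are illustrative assumptions; the IFN machinery of the paper is not reproduced here.

import numpy as np
from scipy.stats import binom

def crr_call(s0, k, r, sigma, T, n=200):
    # European call price on a Cox-Ross-Rubinstein binomial tree.
    dt = T / n
    u = np.exp(sigma * np.sqrt(dt))          # CRR up move
    d = 1.0 / u                              # CRR down move
    q = (np.exp(r * dt) - d) / (u - d)       # risk-neutral up probability
    j = np.arange(n + 1)
    terminal = s0 * u**j * d**(n - j)        # terminal asset prices
    payoff = np.maximum(terminal - k, 0.0)
    return np.exp(-r * T) * np.sum(binom.pmf(j, n, q) * payoff)

# One alpha-cut of a fuzzy volatility is an interval; since the call price is
# increasing in volatility, pricing at the endpoints bounds the fuzzy price.
sigma_low, sigma_high = 0.18, 0.26
price_low = crr_call(100.0, 100.0, 0.03, sigma_low, 1.0)
price_high = crr_call(100.0, 100.0, 0.03, sigma_high, 1.0)
print(f"call price interval: [{price_low:.2f}, {price_high:.2f}]")

Swapping the CRR up/down moves for Rendleman–Bartter moves only changes the u, d, q lines, which is the comparison the abstract describes.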
43. Entities, Uncertainties, and Behavioral Indicators of Consciousness.
- Author
-
Johnson, L. Syd M
- Subjects
- *
CONSCIOUSNESS , *ARTIFICIAL intelligence , *EPISTEMIC uncertainty - Abstract
Two problems related to the identification of consciousness are the distribution problem—or how and among which entities consciousness is distributed in the world—and the moral status problem—or which species, entities, and individuals have moral status. The use of inferences from neurobiological and behavioral evidence, and their confounds, for identifying consciousness in nontypically functioning humans, nonhuman animals, and artificial intelligence is considered in light of significant scientific uncertainty and ethical biases, with implications for both problems. Methodological, epistemic, and ethical consensus are needed for responsible consciousness science under epistemic and ethical uncertainty. Consideration of inductive risk is proposed as a potential tool for managing both epistemic and ethical risks in consciousness science. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
44. An Uncertainty-Quantification Machine Learning Framework for Data-Driven Three-Dimensional Mineral Prospectivity Mapping.
- Author
-
Zhang, Zhiqiang, Wang, Gongwen, Carranza, Emmanuel John M., Du, Jingguo, Li, Yingjie, Liu, Xinxing, and Su, Yongjun
- Subjects
MACHINE learning ,EPISTEMIC uncertainty ,GEOLOGICAL modeling ,RANDOM forest algorithms ,THREE-dimensional modeling ,MINERALS ,CONCEPTUAL models - Abstract
The uncertainty inherent in three-dimensional (3D) mineral prospectivity mapping (MPM) encompasses (a) mineral system conceptual model uncertainty stemming from geological conceptual frameworks, (b) aleatoric uncertainty, attributable to the variability and noise in multi-source geoscience dataset collection and processing as well as in the 3D geological modeling process, and (c) epistemic uncertainty due to predictive algorithm modeling. Quantifying the uncertainty of 3D MPM is a prerequisite for accepting predictive models in exploration. Previous MPM studies centered on addressing the mineral system conceptual model uncertainty. To the best of our knowledge, few studies have quantified the aleatoric and epistemic uncertainties of 3D MPM. This study proposes a novel uncertainty-quantification machine learning framework to quantify aleatoric and epistemic uncertainties in 3D MPM using an uncertainty-quantification random forest. Another innovation of this framework is the use of the accuracy–rejection curve to provide a quantitative uncertainty threshold for exploration target delineation. Bayesian hyperparameter optimization tunes the hyperparameters of the uncertainty-quantification random forest automatically. A case study of 3D MPM for exploration target delineation in the Wulong gold district of China demonstrates the practicality of the framework. The aleatoric uncertainty of the 3D MPM indicates that the 3D Early Cretaceous dyke model is the main source of this uncertainty. The 3D exploration targets delineated by the uncertainty-quantification machine learning framework can benefit subsurface gold exploration in the study area. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
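A small sketch of the two ideas named in the entry above, under simplified assumptions: epistemic uncertainty approximated by disagreement across the trees of a random forest, and an accuracy–rejection curve built by discarding the most uncertain predictions. The synthetic dataset and the across-tree-variance proxy are illustrative, not the paper's uncertainty-quantification random forest.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Per-tree class probabilities; disagreement across trees is used here as a
# simple proxy for epistemic uncertainty.
tree_probs = np.stack([t.predict_proba(X_te) for t in rf.estimators_])
mean_probs = tree_probs.mean(axis=0)
epistemic = tree_probs.var(axis=0).sum(axis=1)

pred = mean_probs.argmax(axis=1)
correct = (pred == y_te)

# Accuracy-rejection curve: accuracy on the samples kept after rejecting the
# most uncertain fraction; the operating point sets the uncertainty threshold.
for reject in (0.0, 0.1, 0.2, 0.3):
    keep = epistemic <= np.quantile(epistemic, 1.0 - reject)
    print(f"reject {reject:.0%}: accuracy {correct[keep].mean():.3f}")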
45. The 2023 US National Seismic Hazard Model: Subduction ground-motion models.
- Author
-
Rezaeian, Sanaz, Powers, Peter M, Altekruse, Jason, Ahdi, Sean K, Petersen, Mark D, Shumway, Allison M, Frankel, Arthur D, Wirth, Erin A, Smith, James A, Moschetti, Morgan P, Withers, Kyle B, and Herrick, Julie A
- Subjects
SUBDUCTION ,SUBDUCTION zones ,SEISMIC surveys ,EPISTEMIC uncertainty ,EARTHQUAKE intensity - Abstract
The US Geological Survey National Seismic Hazard Models (NSHMs) are used to calculate earthquake ground-shaking intensities for design and rehabilitation of structures in the United States. The most recent 2014 and 2018 versions of the NSHM for the conterminous United States included major updates to ground-motion models (GMMs) for active and stable crustal tectonic settings; however, the subduction zone GMMs were largely unchanged. With the recent development of the next generation attenuation-subduction (NGA-Sub) GMMs, and recent progress in the utilization of "M9" Cascadia earthquake simulations, we now have access to improved models of ground shaking in the US subduction zones and the Seattle basin. The new NGA-Sub GMMs support multi-period response spectra calculations. They provide global models and regional terms specific to Cascadia and terms that account for deep-basin effects. This article focuses on the updates to subduction GMMs for implementation in the 2023 NSHM and compares them to the GMMs of previous NSHMs. Individual subduction GMMs, their weighted averages, and their impact on the estimated mean hazard relative to the 2018 NSHM are discussed. The updated logic trees include three of the new NGA-Sub GMMs and retain two older models to represent epistemic uncertainty in both the median and standard deviation of ground-shaking intensities at all periods of interest. Epistemic uncertainty is further represented by a three-point logic tree for the NGA-Sub median models. Finally, in the Seattle region, basin amplification factors are adjusted at long periods based on the state-of-the-art M9 Cascadia earthquake simulations. The new models increase the estimated mean hazard values at short periods and short source-to-site distances for interface earthquakes, but decrease them otherwise, relative to the 2018 NSHM. On softer soils, the new models cause decreases to the estimated mean hazard for long periods in the Puget Lowlands basin but increases within the deep Seattle portion of this basin for short periods relative to the 2018 NSHM. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
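As background to the logic-tree treatment of epistemic uncertainty described in the entry above, the sketch below combines hypothetical ground-motion model (GMM) medians through a weighted logic tree with an additional three-point epistemic branch on each median. All values and weights are invented for illustration and are not taken from the 2023 NSHM.

import numpy as np

# Hypothetical median ground motions (in g) from three GMMs for one scenario,
# with logic-tree weights summing to 1.
gmm_medians = np.array([0.21, 0.18, 0.25])
gmm_weights = np.array([0.4, 0.3, 0.3])

# A three-point branch (lower / central / upper scaling) representing
# additional epistemic uncertainty on each median, with its own weights.
scale = np.array([0.8, 1.0, 1.25])
scale_weights = np.array([0.2, 0.6, 0.2])

# Full logic tree: every combination of GMM branch and scaling branch.
branch_values = np.outer(gmm_medians, scale).ravel()
branch_weights = np.outer(gmm_weights, scale_weights).ravel()

mean_ground_motion = np.sum(branch_weights * branch_values)
print(f"total branch weight = {branch_weights.sum():.2f}, "
      f"weighted mean ground motion = {mean_ground_motion:.3f} g")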
46. Probabilistic neural transfer function estimation with Bayesian system identification.
- Author
-
Wu, Nan, Valera, Isabel, Sinz, Fabian, Ecker, Alexander, Euler, Thomas, and Qiu, Yongrong
- Subjects
- *
ARTIFICIAL neural networks , *SYSTEM identification , *WEIGHT training , *EPISTEMIC uncertainty , *ANIMAL experimentation , *DEMAND forecasting - Abstract
Neural population responses in sensory systems are driven by external physical stimuli. This stimulus-response relationship is typically characterized by receptive fields, which have been estimated by neural system identification approaches. Such models usually require a large amount of training data, yet the recording time for animal experiments is limited, giving rise to epistemic uncertainty in the learned neural transfer functions. While deep neural network models have demonstrated excellent power for neural prediction, they usually do not provide the uncertainty of the resulting neural representations and derived statistics, such as most exciting inputs (MEIs), from in silico experiments. Here, we present a Bayesian system identification approach to predict neural responses to visual stimuli and explore whether explicitly modeling network weight variability can be beneficial for identifying neural response properties. To this end, we use variational inference to estimate the posterior distribution of each model weight given the training data. Tests with different neural datasets demonstrate that this method can achieve higher or comparable performance on neural prediction, with much higher data efficiency compared to Monte Carlo dropout methods and traditional models using point estimates of the model parameters. At the same time, our variational method provides us with an effectively infinite ensemble, avoiding the idiosyncrasy of any single model, to generate MEIs. It allows us to estimate the uncertainty of the stimulus-response function, which we have found to be negatively correlated with the predictive performance at the model level and may serve to evaluate models. Furthermore, our approach enables us to identify response properties with credible intervals and to determine whether the inferred features are meaningful by performing statistical tests on MEIs. Finally, in silico experiments show that our model generates stimuli driving neuronal activity significantly better than traditional models in the limited-data regime. Author summary: Neural system identification methods learn stimulus-response functions from experimental data to predict responses. These neuronal prediction models demand large amounts of training data; however, the recording time for each experiment is restricted, introducing uncertainty about the neural features derived from trained models. Here, we present a Bayesian approach incorporating weight uncertainty to identify response functions and show that our method has higher or comparable predictive performance with higher data efficiency compared to traditional methods using point estimates of model parameters. Additionally, our model provides an effectively infinite ensemble for deriving neural features, which avoids the idiosyncrasy of a single model. In this way, our method also allows us to estimate the uncertainty of the derived features and to conduct statistical tests on them. Generally, our Bayesian approach enables us to generate many similar stimuli to investigate biological information processing. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
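The key idea in the entry above is that a posterior over model weights yields predictive uncertainty, not just a point prediction. A minimal stand-in for that idea, assuming a toy linear stimulus-response model with a closed-form Gaussian weight posterior (Bayesian linear regression) rather than the paper's variational deep network; all data and precisions are synthetic.

import numpy as np

rng = np.random.default_rng(0)

# Toy "stimulus-response" data: responses are a noisy linear function of the
# stimulus features (a stand-in for a transfer function in a small-data regime).
n, d = 30, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.3 * rng.normal(size=n)

alpha = 1.0                 # prior precision on the weights
beta = 1.0 / 0.3**2         # observation-noise precision

# Closed-form Gaussian posterior over the weights.
S_inv = alpha * np.eye(d) + beta * X.T @ X
S = np.linalg.inv(S_inv)                      # posterior covariance
m = beta * S @ X.T @ y                        # posterior mean

# Predictive mean and epistemic variance for a new stimulus: the variance term
# comes entirely from the weight posterior, i.e. from limited training data.
x_new = rng.normal(size=d)
pred_mean = x_new @ m
epistemic_var = x_new @ S @ x_new
print(f"prediction {pred_mean:.3f} with epistemic variance {epistemic_var:.3f}")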
47. Pharmacokinetics with intravenous infusion of two-compartment model based on Liu process.
- Author
-
Liu, Zhe and Kang, Rui
- Subjects
- *
INTRAVENOUS therapy , *PHARMACOKINETICS , *DRUG development , *DIFFERENTIAL equations , *EPISTEMIC uncertainty , *PROBABILITY theory - Abstract
By describing the absorption, distribution, metabolism, and excretion of drugs, pharmacokinetics helps to find optimal therapies for patients and speeds up drug development. In pharmacokinetics, dynamic uncertainties, such as fluctuations in hormone levels and external environmental factors, are ubiquitous. Due to sparse clinical data and patient disease specificity, these uncertainties are mainly epistemic rather than aleatory, and they cannot be handled well by methods based on probability theory. Therefore, several pharmacokinetic models based on uncertain differential equations under the framework of uncertainty theory have been investigated, all of which considered one-compartment models. Noting that many drugs follow two-compartment kinetics, this article proposes a two-compartment pharmacokinetic model based on uncertain differential equations. Based on the proposed model, several essential pharmacokinetic parameters are investigated, which are important information required by regulatory bodies to approve drugs for public use. Estimates of the unknown parameters in the model are given. Finally, a real data analysis using lidocaine drug concentrations illustrates our methodology in detail. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
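For orientation, the sketch below integrates the deterministic skeleton of a two-compartment model with a constant-rate intravenous infusion; the paper replaces these deterministic dynamics with an uncertain differential equation driven by a Liu process, which is not reproduced here. All rate constants, the infusion rate, and the duration are hypothetical.

import numpy as np
from scipy.integrate import solve_ivp

k10, k12, k21 = 0.15, 0.08, 0.05   # elimination and inter-compartment rates (1/h), hypothetical
rate, t_inf = 100.0, 2.0           # infusion rate (mg/h) and infusion duration (h), hypothetical

def rhs(t, a):
    # a[0], a[1]: drug amounts in the central and peripheral compartments.
    a1, a2 = a
    infusion = rate if t <= t_inf else 0.0
    da1 = infusion - (k10 + k12) * a1 + k21 * a2
    da2 = k12 * a1 - k21 * a2
    return [da1, da2]

sol = solve_ivp(rhs, (0.0, 24.0), [0.0, 0.0], t_eval=np.linspace(0.0, 24.0, 97))
central = sol.y[0]
print(f"peak amount in central compartment approx. {central.max():.1f} mg "
      f"at t approx. {sol.t[central.argmax()]:.2f} h")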
48. An uncertainty-aware domain adaptive semantic segmentation framework.
- Author
-
Yin, Huilin, Wang, Pengyu, Liu, Boyu, and Yan, Jun
- Subjects
EPISTEMIC uncertainty ,AUTONOMOUS vehicles ,EMPIRICAL research ,DEEP learning - Abstract
Semantic segmentation is significant for realizing scene understanding in autonomous driving. Due to the lack of annotated real-world data, domain adaptation is applied so that the model is trained on synthetic data and performs inference on real data. However, this domain gap leads to aleatoric and epistemic uncertainty. These uncertainties are linked to potential safety issues for autonomous driving in both normal and adverse weather. In this study, we explore this scientific problem, which has previously received sparse attention. We postulate that the Dual Attention module can mitigate the uncertainty in the task of semantic segmentation and provide an empirical study to validate it. Furthermore, the utilization of Kullback-Leibler divergence (KL divergence) helps the estimation of aleatoric uncertainty and boosts the robustness of the segmentation model. Our empirical study on diverse semantic segmentation datasets demonstrates the effectiveness of our method in normal and adverse weather. Our code is available at: https://github.com/liubo629/Seg-Uncertainty-dual-attention. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
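The KL divergence mentioned in the entry above can be used to compare per-pixel class distributions from two forward passes (e.g. different augmentations or perturbations) of a segmentation model. The sketch below computes such a per-pixel KL disagreement map in NumPy; the tensor shapes and the use of random logits are illustrative assumptions, not the repository's exact loss.

import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def kl_div(p, q, eps=1e-12):
    # KL(p || q) per pixel, summed over the class dimension.
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

rng = np.random.default_rng(0)
# Logits for the same image from two perturbed forward passes,
# shape (H, W, num_classes); sizes are hypothetical.
logits_a = rng.normal(size=(4, 4, 3))
logits_b = logits_a + 0.5 * rng.normal(size=(4, 4, 3))

p, q = softmax(logits_a), softmax(logits_b)
disagreement = kl_div(p, q)          # high values flag uncertain pixels
print(disagreement.round(3))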
49. Confidence in Probabilistic Risk Assessment.
- Author
-
Zanetti, Luca
- Subjects
- *
EPISTEMIC uncertainty , *RISK assessment , *PROBABILITY theory , *CONFIDENCE , *ARGUMENT - Abstract
Epistemic uncertainties are included in probabilistic risk assessment (PRA) as second-order probabilities that represent the degrees of belief of the scientists that a model is correct. In this article, I propose an alternative approach that incorporates the scientist's confidence in a probability set for a given quantity. First, I give some arguments against the use of precise probabilities to estimate scientific uncertainty in risk analysis. I then extend the "confidence approach" developed by Brian Hill and Richard Bradley to PRA. Finally, I claim that this approach represents model uncertainty better than the standard (Bayesian) model does. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
50. Anomaly Detection in Optical Coherence Tomography Angiography (OCTA) with a Vector-Quantized Variational Auto-Encoder (VQ-VAE).
- Author
-
Jebril, Hana, Esengönül, Meltem, and Bogunović, Hrvoje
- Subjects
- *
OPTICAL coherence tomography , *EPISTEMIC uncertainty , *DEEP learning , *BLOOD flow , *ANGIOGRAPHY - Abstract
Optical coherence tomography angiography (OCTA) provides detailed information on retinal blood flow and perfusion. Abnormal retinal perfusion indicates possible ocular or systemic disease. We propose a deep learning-based anomaly detection model to identify such anomalies in OCTA. It utilizes two deep learning approaches: first, representation learning with a Vector-Quantized Variational Auto-Encoder (VQ-VAE) followed by Auto-Regressive (AR) modeling; second, epistemic uncertainty estimates from a Bayesian U-Net employed to segment the vasculature on OCTA en face images. Evaluation on two large public datasets, DRAC and OCTA-500, demonstrates effective anomaly detection (an AUROC of 0.92 for DRAC and an AUROC of 0.75 for OCTA-500) and localization (a mean Dice score of 0.61 for DRAC) on this challenging task. To our knowledge, this is the first work that addresses anomaly detection in OCTA. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF