2,899 results for "Epistemic uncertainty"
Search Results
152. Influence of Epistemic Uncertainty on the Seismic Vulnerability of Indian Code-Compliant RC Frame Building
- Author
Gondaliya, Kaushik, Bhaiya, Vishisht, Vasanwala, Sandip, Desai, Atul, Amin, Jignesh, di Prisco, Marco, Series Editor, Chen, Sheng-Hong, Series Editor, Vayas, Ioannis, Series Editor, Kumar Shukla, Sanjay, Series Editor, Sharma, Anuj, Series Editor, Kumar, Nagesh, Series Editor, Wang, Chien Ming, Series Editor, Shrikhande, Manish, editor, Agarwal, Pankaj, editor, and Kumar, P. C. Ashwin, editor
- Published
- 2023
- Full Text
- View/download PDF
153. Getting It Right? The Site Selection Process for Canada’s High-level Nuclear Waste
- Author
Bratt, Duane, Larkin, Patricia, Deschênes-Philion, Xavier, and Gattinger, Monica, editor
- Published
- 2023
- Full Text
- View/download PDF
154. Handling Uncertainties with and Within Digital Twins
- Author
Abdoune, Farah, Rifi, Leah, Fontanili, Franck, Cardin, Olivier, Kacprzyk, Janusz, Series Editor, Borangiu, Theodor, editor, Trentesaux, Damien, editor, and Leitão, Paulo, editor
- Published
- 2023
- Full Text
- View/download PDF
155. Uncertainty Quantification of Structures Using Belief Theory
- Author
Metagudda, Sushma H., Balu, A. S., di Prisco, Marco, Series Editor, Chen, Sheng-Hong, Series Editor, Vayas, Ioannis, Series Editor, Kumar Shukla, Sanjay, Series Editor, Sharma, Anuj, Series Editor, Kumar, Nagesh, Series Editor, Wang, Chien Ming, Series Editor, Saha, Suman, editor, Sajith, A. S., editor, Sahoo, Dipti Ranjan, editor, and Sarkar, Pradip, editor
- Published
- 2023
- Full Text
- View/download PDF
156. Optimal Dispatch of Integrated Energy System Based on Flexibility of Thermal Load
- Author
HU Bo, CHENG Xin, SHAO Changzheng, HUANG Wei, SUN Yue, XIE Kaigui
- Subjects
demand flexibility, epistemic uncertainty, heat and electricity integrated energy system (HE-IES), optimal dispatch, Engineering (General). Civil engineering (General), TA1-2040, Chemical engineering, TP155-156, Naval architecture. Shipbuilding. Marine engineering, VM1-989
- Abstract
The flexibility of building thermal loads is a valuable balancing resource for the operation of the heat and electricity integrated energy system (HE-IES). Because building thermal loads are large in number but small in individual capacity, non-intrusive data-driven methods have become an effective means to quantify their flexibility. However, due to model inaccuracy or a lack of data, such methods inevitably produce errors and introduce epistemic uncertainty into the optimal dispatch of the HE-IES. An optimal dispatch model of the HE-IES that accommodates the epistemic uncertainty in the demand flexibility of building thermal loads is proposed. First, a data-driven flexible demand assessment method for building thermal loads is described. The measurement errors are modeled as epistemic uncertainty, and the multiple error sources are combined using D-S evidence theory. Then, representative scenarios describing the epistemic uncertainty of the demand flexibility are selected based on the Latin hypercube sampling (LHS) method, and the scenarios are reduced by fuzzy clustering. Finally, the representative scenarios are embedded in the coordinated and optimized dispatch of the HE-IES so that the building thermal load flexibility and its epistemic uncertainty are considered jointly. The results demonstrate that considering the epistemic uncertainty of the thermal load demand is crucial for reducing wind power curtailment and improving the operational flexibility of the HE-IES.
- Published
- 2023
- Full Text
- View/download PDF
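The combination of multiple error sources by D-S evidence theory, mentioned in the abstract above, can be illustrated with Dempster's rule of combination. This is a generic sketch, not the authors' implementation; the two mass functions over coarse flexibility levels are hypothetical.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two Dempster-Shafer mass functions (dicts mapping
    frozensets of hypotheses to masses) with Dempster's rule."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("Sources are totally conflicting")
    # Renormalize by the non-conflicting mass
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical evidence from two error sources over levels {low, mid, high}
m_sensor = {frozenset({"low"}): 0.6, frozenset({"low", "mid"}): 0.4}
m_model = {frozenset({"mid"}): 0.5, frozenset({"low", "mid", "high"}): 0.5}
print(dempster_combine(m_sensor, m_model))
```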
157. A new type of P-box for structural reliability analysis.
- Author
Solovev, Sergey and Ilichev, Evgeniy
- Subjects
STRUCTURAL reliability, DISTRIBUTION (Probability theory), EPISTEMIC uncertainty, STANDARD deviations, STRUCTURAL failures
- Abstract
The article presents a new type of p-box (probability box) for structural reliability analysis. A p-box is an area bounded by two cumulative distribution functions; the real (but unknown in advance) cumulative distribution function lies inside this area. The p-box can be used as a model of random variables. In p-box approaches to structural reliability analysis, reliability is expressed as an interval of no-failure probabilities. Only two moments of a random variable are used to construct the proposed p-box: the expected value and the standard deviation. The new p-box model is based on Chebyshev's inequality, the Kolmogorov-Smirnov bounds, and confidence intervals for the expected value, standard deviation and median. A numerical example shows that the presented p-box model makes it possible to reduce epistemic uncertainty by about 10-15 percent. An even greater effect can be achieved with algebraic operations on p-boxes when they are discretized into Dempster-Shafer structures; the article demonstrates this using the well-known algorithm for summing two p-boxes. A further advantage of the proposed model is that there is no need to confirm a hypothesis about the type of probability distribution: once an assumption about the probability distribution is violated, the subsequent structural reliability analysis becomes doubtful and meaningless. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
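A two-moment p-box of the kind described above can be sketched with the one-sided Chebyshev (Cantelli) inequality, which bounds the CDF using only the expected value and standard deviation. This is a minimal illustration, not the authors' model, which additionally tightens the bounds with Kolmogorov-Smirnov limits and confidence intervals; the example numbers are invented.

```python
import numpy as np

def cantelli_pbox(mu, sigma, x):
    """Lower/upper CDF bounds from the one-sided Chebyshev (Cantelli)
    inequality: P(X - mu >= t) <= sigma^2 / (sigma^2 + t^2) for t > 0."""
    x = np.asarray(x, dtype=float)
    t = x - mu
    lower = np.where(t > 0, t**2 / (sigma**2 + t**2), 0.0)      # F(x) >= ...
    upper = np.where(t < 0, sigma**2 / (sigma**2 + t**2), 1.0)  # F(x) <= ...
    return lower, upper

# Example: a resistance with mean 30 and standard deviation 3 (units arbitrary)
xs = np.linspace(15, 45, 7)
lo, up = cantelli_pbox(30.0, 3.0, xs)
for x, l, u in zip(xs, lo, up):
    print(f"x = {x:5.1f}:  {l:.3f} <= F(x) <= {u:.3f}")
```

Evaluating the two bounds at the limit-state threshold turns the failure probability into an interval, matching the abstract's interval of no-failure probabilities.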
158. Sobre o conatus de Thomas Hobbes e as artes retóricas [On Thomas Hobbes's conatus and the rhetorical arts].
- Author
Nakayama, Patrícia
- Subjects
SEVENTEENTH century, PHENOMENOLOGICAL theory (Physics), TRADITION (Philosophy), ENGLISH language, RHETORIC, EPISTEMIC uncertainty
- Abstract
This study argues that, for Hobbes, rhetoric was an important linguistic and conceptual apparatus available in his time, especially for the description of physical phenomena in the nascent experimental science of the English 17th century. Considering that the epistemic parameters for describing this science were yet to be developed, Hobbes sought solutions in the doctrines of the arts of good speech, starting with his anthropology. The notion of conatus in Hobbes indicates the weight of the classical rhetorical tradition in his philosophy, with emphasis on Aristotle, the Stoics and Quintilian. We will see how rhetoric, articulated with physics and ethics, constituted a relevant paradigm of expression and research for Thomas Hobbes on conatus and went beyond the ornamentation of the text in his study of bodies. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
159. Asphalt concrete dynamic modulus prediction: Bayesian neural network approach.
- Author
Asadi, Babak, Hajj, Ramez, and Al-Qadi, Imad L.
- Subjects
ASPHALT concrete, BAYESIAN analysis, MINERAL aggregates, EPISTEMIC uncertainty, WEB-based user interfaces
- Abstract
This paper presents a probabilistic model for predicting the dynamic modulus |E*| of asphalt concrete (AC). A Bayesian Neural Network (BNN) trained on a substantial dataset collected from various states was employed. This approach accounts for the inherent stochasticity in the data generation process (i.e. aleatoric uncertainty) and addresses epistemic or model uncertainty as well. The model successfully predicted dynamic moduli for unseen testing datasets. In practice, predicted moduli could be as accurate and effective as measured values, considering the model uncertainty and |E*| test variability. However, elevated epistemic uncertainty would be expected at extremely low and high |E*| ranges, where data points are relatively few. To enhance interpretability, Shapley Additive Explanations (SHAP) analysis was utilised, demonstrating adherence to physical laws. The primary factors influencing predicted moduli, as identified by the analysis, are temperature, frequency, voids in mineral aggregates, reclaimed asphalt pavement content, and binder low-performance grade. The BNN model was deployed in a web application, making it accessible for practical use. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
160. Bayesian neural networks with physics‐aware regularization for probabilistic travel time modeling.
- Author
Olivier, Audrey, Mohammadi, Sevin, Smyth, Andrew W., and Adams, Matt
- Subjects
TRAVEL time (Traffic engineering), BAYESIAN analysis, EMERGENCY medical services, EPISTEMIC uncertainty, FIRE departments
- Abstract
The integration of data‐driven models such as neural networks for high‐consequence decision making has been largely hindered by their lack of predictive power away from training data and their inability to quantify the uncertainties often prevalent in engineering applications. This article presents an ensembling method with function‐space regularization, which makes it possible to integrate prior information about the function of interest, thus improving generalization performance, while enabling quantification of aleatory and epistemic uncertainties. This framework is applied to build a probabilistic ambulance travel time predictor, leveraging historical ambulance data provided by the Fire Department of New York City. Results show that the integration of a non‐Gaussian likelihood and prior information from a road network analysis yields appropriate probabilistic predictions of travel times, which could be further leveraged for emergency medical service (EMS) decision making. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
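The split into aleatory and epistemic uncertainty that ensembles such as the one above provide is commonly computed with the law of total variance: each ensemble member predicts a mean and a variance, and the two components are aggregated as below. This is a generic sketch with hypothetical member outputs, not the article's travel time model.

```python
import numpy as np

def decompose_uncertainty(means, variances):
    """means, variances: arrays of shape (n_members, n_points) holding each
    member's predicted mean and predictive variance at every input point."""
    aleatoric = variances.mean(axis=0)   # E_theta[ sigma^2(x, theta) ]
    epistemic = means.var(axis=0)        # Var_theta[ mu(x, theta) ]
    total = aleatoric + epistemic        # law of total variance
    return aleatoric, epistemic, total

rng = np.random.default_rng(0)
means = rng.normal(8.0, 0.5, size=(5, 3))       # 5 members, 3 trips (minutes)
variances = rng.uniform(1.0, 2.0, size=(5, 3))  # per-member predicted variance
print(decompose_uncertainty(means, variances))
```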
161. Similarity-Based Remaining Useful Lifetime Prediction Method Considering Epistemic Uncertainty.
- Author
Wu, Wenbo, Zou, Tianji, Zhang, Lu, Wang, Ke, and Li, Xuzhi
- Subjects
REMAINING useful life, EPISTEMIC uncertainty, GENOME editing
- Abstract
Measuring the similarity between two trajectories is fundamental and essential for similarity-based remaining useful life (RUL) prediction. Most previous methods do not adequately account for the epistemic uncertainty caused by asynchronous sampling, while others impose strong assumptions, such as limiting the positional deviation of sampling points to a fixed threshold, which biases the results considerably. To address the issue, an uncertain ellipse model based on uncertainty theory is proposed to model the location of sampling points as an observation drawn from an uncertain distribution. Based on this, we propose a novel and effective similarity metric for any two degradation trajectories. Then, the Stacked Denoising Autoencoder (SDA) model is proposed for RUL prediction, in which the models can be first trained on the most similar degradation data and then fine-tuned on the target dataset. Experimental results show that the predictive performance of the new method is superior to prior methods based on edit distance on real sequence (EDR), longest common subsequence (LCSS), or dynamic time warping (DTW) and is more robust at different sampling rates. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
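For reference, the dynamic time warping (DTW) baseline the abstract compares against can be written in a few lines. This is the textbook version, not the paper's uncertain-ellipse metric; the two degradation trajectories are invented and deliberately sampled at different rates.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D trajectories."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible warping steps
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

slow = np.array([0.0, 0.1, 0.25, 0.45, 0.7, 1.0])  # densely sampled degradation
fast = np.array([0.0, 0.3, 0.8, 1.0])              # coarsely sampled degradation
print(dtw_distance(slow, fast))
```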
162. Health status assessment of radar systems at aerospace launch sites by fuzzy analytic hierarchy process.
- Author
Li, Chunyang, Chen, Hong, Xiahou, Tangfan, Zhang, Qin, and Liu, Yu
- Subjects
ANALYTIC hierarchy process, EPISTEMIC uncertainty, FUZZY numbers
- Abstract
The radar system is a fundamental part of the launch site ground equipment, and its health status is critical to the success of testing tasks and launch missions at aerospace launch sites. The health status assessment of radar systems is, however, challenging due to the multiple health indicators of its components and the epistemic uncertainty associated with the elicited data. In this article, a fuzzy analytic hierarchy process method, which combines the analytic hierarchy process (AHP) and fuzzy comprehensive evaluation methods, is put forth to assess the health status of radar systems at aerospace launch sites. First, the four‐layer hierarchical structure, comprising the system, subsystem, performance indicator, and parameter index layers, of a real‐world radar system at an aerospace launch site is constructed. The AHP is then leveraged to identify the weights of each layer of the hierarchical structure. By eliciting the data of parameter indices as fuzzy numbers and using the fuzzy comprehensive evaluation method to calculate the layer‐by‐layer health status of the hierarchical structure, the health grade membership of the radar system is calculated. The health status of the radar system is then assessed in accordance with the maximum membership principle. A new importance metric is proposed to assess the influence of parameter indices on the current system's health status. The results of the importance metric can provide valuable insight for reliability enhancement and health management of the radar system. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
163. The Objectivity of Science.
- Author
Sankey, Howard
- Subjects
OBJECTIVITY, REALISM, SCIENTIFIC method, EPISTEMIC uncertainty
- Abstract
The idea that science is objective, or able to achieve objectivity, is in large part responsible for the role that science plays within society. But what is objectivity? The idea of objectivity is ambiguous. This paper distinguishes between three basic forms of objectivity. The first form of objectivity is ontological objectivity: the world as it is in itself does not depend upon what we think about it; it is independent of human thought, language, conceptual activity or experience. The second form of objectivity is the objectivity of truth: truth does not depend upon what we believe or justifiably believe; truth depends upon the way reality itself is. The third form of objectivity is epistemic objectivity: this form of objectivity resides in the scientific method which ensures that subjective factors are excluded, and only epistemically relevant factors play a role in scientific inquiry. The paper considers two problems that arise for the notion of epistemic objectivity: the theorydependence of observation and the variability of the methods of science. It is argued that the use of shared standard procedures ensures the objectivity of observation despite theory-dependence. It is argued that the variability of methods need not lead to an epistemic relativism about science. The paper concludes with the realist suggestion that the best explanation of the success of the sciences is that the methods employed in the sciences are highly reliable truth-conducive tools of inquiry. The objectivity of the methods of the sciences leads to the objective truth about the objective world. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
164. Nuclear Data Uncertainty Propagation for the Molten Salt Fast Reactor Design.
- Author
Abrate, Nicolò, Aimetta, Alex, Dulla, Sandra, and Pedroni, Nicola
- Subjects
MOLTEN salt reactors, TECHNOLOGY assessment, MONTE Carlo method, PERTURBATION theory, FUSED salts, DATA libraries, EPISTEMIC uncertainty
- Abstract
The development of new reactor technologies requires careful assessments of the various sources of epistemic uncertainties. In this work, nuclear data uncertainties featuring the main isotopes of the U/Th molten salt fast reactor (MSFR) design are propagated through Monte Carlo calculations to quantify the final uncertainty on some relevant integral parameters. In the first part of this paper, some best-estimate calculations are performed by selecting different nuclear data libraries, showing the remarkable impact of this choice on the final responses. Then the Generalized Perturbation Theory routine available in Serpent 2 is adopted for a preliminary sensitivity and uncertainty analyses with respect to keff, highlighting a significant discrepancy between the covariance of the JEFF-3.3 and ENDF/B-VIII.0 libraries. After the selection of a few relevant nuclides, namely, 7Li, 19F, 232Th, and 233U, the Total Monte Carlo method and the unscented transform (UT) are then adopted to estimate the uncertainty of other responses of interest like the conversion ratio and some multigroup constants. Some potential issues of the UT are highlighted, and a mitigation strategy is applied. A relevant result of this analysis concerns the need for better data evaluations for the nuclides constituting the circulating salt for an effective deployment of the MSFR technology. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
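The unscented transform (UT) used above propagates a mean and covariance through a nonlinear response with deterministically chosen sigma points. Below is a standard textbook implementation, a sketch only; the quadratic-looking test response and the covariance numbers are hypothetical, not a reactor model.

```python
import numpy as np

def unscented_transform(mu, P, f, alpha=1.0, beta=2.0, kappa=0.0):
    """Propagate N(mu, P) through a scalar response f with the scaled UT."""
    n = len(mu)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)            # matrix square root
    pts = [mu] + [mu + S[:, i] for i in range(n)] \
               + [mu - S[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    ys = np.array([f(p) for p in pts])
    y_mean = wm @ ys
    y_var = wc @ (ys - y_mean) ** 2
    return y_mean, y_var

# Hypothetical smooth response to two uncertain nuclear-data scaling factors
mu = np.array([1.0, 1.0])                   # nominal scalings
P = np.diag([0.02**2, 0.03**2])             # covariance taken from a library
f = lambda x: 1.005 * x[0]**0.7 * x[1]**0.3 # made-up k_eff-like response
mean, var = unscented_transform(mu, P, f)
print(f"mean = {mean:.5f}, std = {np.sqrt(var):.5f}")
```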
165. Designing structures with polymorphic uncertainty: Enhanced decision making using information reduction measures to quantify robustness.
- Author
Böttcher, Maria, Graf, Wolfgang, and Kaliske, Michael
- Subjects
INFORMATION measurement, DECISION making, EPISTEMIC uncertainty, ENGINEERING design, DATA modeling
- Abstract
The application of information reduction measures (IRMs) can provide valuable insight and enhance the process of design optimization when dealing with data uncertainty. For the engineering task of designing structures or products, an adequate modeling of data uncertainty is required. Therefore, a consideration of both aleatoric and epistemic uncertainty in combined form as polymorphic uncertain input variables is utilized. The resulting uncertain output quantities are post‐processed to provide relevant insights into robustness and performance for the design optimization. To this end, IRMs are applied, categorized into representative and uncertainty quantifying measures. Various IRMs exist, but clear recommendations or explanations of why certain uncertainty quantifying measures are chosen are scarce, although different features of uncertain quantities are considered with different measures. The aim of this contribution is to give an insight to commonly applied IRMs and the specific information of the uncertain quantity they reflect. Additionally, handling results of nested uncertainty analyses of polymorphic uncertain quantities regarding robustness towards aleatoric and epistemic uncertainty is investigated. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
166. Logic tree branches' weights in the probabilistic seismic hazard analysis: the need to combine inter-subjective and propensity probability interpretations.
- Author
Motaghed, Sasan, Eftekhari, Nasrollah, Mohammadi, Mohammad, and Khazaee, Mozhgan
- Subjects
EARTHQUAKE hazard analysis, TREE branches, GROUND motion, PROBABILITY theory, EPISTEMIC uncertainty, LOGIC, HAZARD mitigation
- Abstract
Probabilistic seismic hazard analysis (PSHA) is the primary method for determining the earthquake forces as input to structural seismic evaluation and design. Epistemic uncertainty has been incorporated into the PSHA process using a logic tree. One of the main challenges in using logic trees is determining ground motion prediction equations (GMPEs) and their branches' weights. In this paper, regarding the different definitions of probability, the philosophy of GMPE selection and logic tree branches' weight allocation in the PSHA is investigated. The results show that the classical and frequency definitions of probability are not applicable in the selection and weight allocation process. We suggest that the best way to allocate weight can be obtained by combining the inter-subjective and propensity probability definitions. To evaluate the effect of weight allocation on the PSHA results, PSHA was performed for a site in Tehran using different selection and weighting approaches. The results of the numerical example show up to a 50% variation in the spectral acceleration in the range of common building periods. We show that the issue of GMPE selection and weight allocation has not been adequately addressed in the current procedures of PSHA. So, it is necessary to develop specific agendas in this field. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
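Once GMPE branch weights are assigned, the logic tree is collapsed by weighting the hazard curves branch by branch, typically reporting a weighted mean and fractile curves. A minimal sketch follows; the curves, levels, and weights are invented for illustration and are not the Tehran case study.

```python
import numpy as np

# Annual exceedance probabilities at 5 spectral-acceleration levels for
# 4 hypothetical GMPE branches
hazard = np.array([
    [1.0e-2, 3.0e-3, 8.0e-4, 2.0e-4, 4.0e-5],
    [1.2e-2, 4.0e-3, 1.0e-3, 2.5e-4, 5.0e-5],
    [8.0e-3, 2.5e-3, 6.0e-4, 1.5e-4, 3.0e-5],
    [1.5e-2, 5.0e-3, 1.5e-3, 4.0e-4, 8.0e-5],
])
weights = np.array([0.3, 0.3, 0.2, 0.2])   # branch weights, sum to 1

mean_curve = weights @ hazard               # weighted-mean hazard curve

def weighted_fractile(curves, w, q):
    """Weighted q-fractile across logic-tree branches, level by level."""
    out = np.empty(curves.shape[1])
    for j in range(curves.shape[1]):
        order = np.argsort(curves[:, j])
        cum = np.cumsum(w[order])
        idx = min(np.searchsorted(cum, q), len(w) - 1)
        out[j] = curves[order[idx], j]
    return out

print(mean_curve)
print(weighted_fractile(hazard, weights, 0.84))  # 84th-percentile curve
```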
167. On complementing unsupervised learning with uncertainty quantification.
- Author
Kazemi, Ehsan, Taherkhani, Fariborz, and Wang, Liqiang
- Subjects
CONFIRMATION bias, EPISTEMIC uncertainty, SUPERVISED learning, SECURE Sockets Layer (Computer network protocol), GENERALIZATION
- Abstract
Pseudo-label-based semi-supervised learning (SSL) has recently achieved significant success in unlabeled data utilization. Recent success of pseudo-label-based SSL methods crucially hinges on thresholded pseudo-labeling and consistency regularization for the unlabeled data. However, most of the existing methods do not measure and incorporate the uncertainties due to noisy pseudo-labels or out-of-distribution unlabeled samples. Therefore, the model's discernment becomes noisier in real-life applications that involve a substantial amount of out-of-distribution unannotated data. This leads to slow convergence in the training process and poor generalization performance. Inspired by recent developments in SSL, our goal in this paper is to propose a novel unsupervised uncertainty-aware objective and threshold-mediated pseudo-labeling scheme that rely on uncertainty quantification from aleatoric and epistemic sources. By incorporating recent techniques in SSL, our proposed uncertainty-aware framework can mitigate the issue of confirmation bias and the impact of noisy pseudo-labels, resulting in improved training efficiency and enhanced generalization performance within the SSL domain. Despite its simplicity and computational efficiency, our approach demonstrates improved performance compared to state-of-the-art SSL methods on challenging datasets, such as CIFAR-100 and the real-world dataset Semi-iNat. • We introduce an uncertainty-aware SSL loss using aleatoric and epistemic uncertainty. • The goal is to align the uncertainty distributions of labeled and unlabeled data. • The proposed uncertainty-aware framework improves confidence thresholding in SSL. • Experiments verify that alleviating noisy pseudo-labeling leads to more robust learning. • Our method can be used with any pseudo-label-based SSL method to mitigate confirmation bias. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
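Threshold-mediated pseudo-labeling of the kind discussed above typically keeps only unlabeled samples whose top-class probability is high and whose predictive uncertainty is low. A schematic numpy version follows; the simple entropy gate stands in for the paper's more elaborate aleatoric/epistemic quantification, and all thresholds are illustrative.

```python
import numpy as np

def select_pseudo_labels(probs, conf_thresh=0.95, entropy_thresh=0.3):
    """probs: (n_samples, n_classes) softmax outputs for unlabeled data.
    Returns indices and hard labels for samples passing both gates."""
    conf = probs.max(axis=1)                            # top-class confidence
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)  # predictive entropy
    keep = (conf >= conf_thresh) & (entropy <= entropy_thresh)
    return np.where(keep)[0], probs[keep].argmax(axis=1)

rng = np.random.default_rng(1)
logits = rng.normal(size=(6, 3)) * 4.0
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
idx, labels = select_pseudo_labels(probs)
print(idx, labels)
```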
168. Cohesion: A Measure of Organisation and Epistemic Uncertainty of Incoherent Ensembles.
- Author
Davey, Timothy
- Subjects
EPISTEMIC uncertainty, COHESION, DYNAMICAL systems, ATTRACTORS (Mathematics)
- Abstract
This paper offers a measure of how organised a system is, as defined by self-consistency. Complex dynamics such as tipping points and feedback loops can cause systems with identical initial parameters to vary greatly by their final state. These systems can be called non-ergodic or incoherent. This lack of consistency (or replicability) of a system can be seen to drive an additional form of uncertainty, beyond the variance that is typically considered. However, certain self-organising systems can be shown to have some self-consistency around these tipping points, when compared with systems that find no consistent final states. Here, we propose a measure of this self-consistency that is used to quantify our confidence in the outcomes of agent-based models, simulations or experiments of dynamical systems, which may or may not contain multiple attractors. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
169. Using Decision Analysis to Determine the Feasibility of a Conservation Translocation.
- Author
Keating, Laura M., Randall, Lea, Stanton, Rebecca, McCormack, Casey, Lucid, Michael, Seaborn, Travis, Converse, Sarah J., Canessa, Stefano, and Moehrenschlager, Axel
- Subjects
DECISION making, WILDLIFE reintroduction, EPISTEMIC uncertainty, ENVIRONMENTAL degradation, SUSTAINABILITY, LAND management
- Abstract
Conservation translocations, intentional movements of species to protect against extinction, have become widespread in recent decades and are projected to increase further as biodiversity loss continues worldwide. The literature abounds with analyses to inform translocations and assess whether they are successful, but the fundamental question of whether they should be initiated at all is rarely addressed formally. We used decision analysis to assess northern leopard frog reintroduction in northern Idaho, with success defined as a population that persists for at least 50 years. The Idaho Department of Fish and Game was the decision maker (i.e., the agency that will use this assessment to inform their decisions). Stakeholders from government, indigenous groups, academia, land management agencies, and conservation organizations also participated. We built an age-structured population model to predict how management alternatives would affect probability of success. In the model, we explicitly represented epistemic uncertainty around a success criterion (probability of persistence) characterized by aleatory uncertainty. For the leading alternative, the mean probability of persistence was 40%. The distribution of the modelling results was bimodal, with most parameter combinations resulting in either very low (<5%) or relatively high (>95%) probabilities of success. Along with other considerations, including cost, the Idaho Department of Fish and Game will use this assessment to inform a decision regarding reintroduction of northern leopard frogs. Conservation translocations may benefit greatly from more widespread use of decision analysis to counter the complexity and uncertainty inherent in these decisions. History: This paper has been accepted for the Decision Analysis Special Issue on Decision Analysis to Advance Environmental Sustainability. Funding: This work was supported by the Wilder Institute/Calgary Zoo, the U.S. Fish and Wildlife Service [Grant F18AS00095], the NSF Idaho EPSCoR Program and the National Science Foundation [Grant OIA-1757324], and the Hunt Family Foundation. Supplemental Material: The online appendix is available at https://doi.org/10.1287/deca.2023.0472. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
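The persistence probability quoted above comes from simulating a population model forward 50 years many times. The flavor of such a calculation can be seen in this deliberately simplified, unstructured stochastic sketch; the real assessment used an age-structured model with parameter uncertainty, and all rates below are hypothetical.

```python
import numpy as np

def persistence_probability(n0=200, years=50, n_sims=10_000,
                            mean_growth=1.0, sd_growth=0.25, seed=0):
    """Fraction of simulations in which the population stays above zero."""
    rng = np.random.default_rng(seed)
    n = np.full(n_sims, float(n0))
    for _ in range(years):
        # Environmental stochasticity: lognormal year-to-year growth
        growth = rng.lognormal(np.log(mean_growth), sd_growth, n_sims)
        # Demographic stochasticity: integer-valued Poisson census
        n = rng.poisson(n * growth).astype(float)
    return (n > 0).mean()

print(persistence_probability())
```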
170. On crosswords and jigsaw puzzles: the epistemic limits of the EU Courts and a board of appeal in handling empirical uncertainty.
- Author
Krajewski, Michał
- Subjects
CROSSWORD puzzles, JIGSAW puzzles, APPELLATE courts, JUDICIAL review, JUDGES, HELP-seeking behavior, EPISTEMIC uncertainty, WISDOM
- Abstract
This Article sheds new light on the long-running debate in EU legal studies about how intense the EU judicial review of complex and uncertain assessments requiring specialist knowledge could and should be. It argues that it is necessary to move beyond formulas and concepts hammered out in the judicial statements of reasons and consider how the institutional context affects legal epistemology. How likely is it that the judges form an independent opinion about the probative value of the presented evidence and the soundness of the administration's specialist reasoning? How likely is it that their opinion is reliable? Answering these questions helps appraise the boundaries in which judicial review or proliferating administrative review by partly specialised boards of appeal foster the rule of law understood as the pursuit of non-arbitrariness. The Article examines recent case law of the EU Courts and the Board of Appeal of the European Chemical Agency concerning public health and environmental issues, in which complex and uncertain specialist assessments were prevalent. It contends that, due to institutional limitations of EU adjudicatory bodies, a further expansion of the rule of law in EU decision-making requiring specialist knowledge should be pursued through extra-judicial means fostering transparency, inclusiveness, and accountability. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
171. Uncertainty Source Analysis in LCA: Case Studies—3D Printed versus Conventionally Manufactured Building Components.
- Author
Mrazovic, Natasa, Fischer, Martin, and Lepech, Michael
- Subjects
ENVIRONMENTAL impact analysis, PRODUCT life cycle assessment, EPISTEMIC uncertainty, COMPUTER software development, THREE-dimensional printing
- Abstract
Life cycle assessment (LCA), standardized by ISO 14040, is today's most commonly used method for analysis of environmental impact and sustainability and an aid to decision-making in the industry. However, LCA’s inherent flexibility and associated uncertainty sources are causing inconsistencies in the assessment results if different assessors with similar backgrounds analyze identical input data and use identical LCA software. The authors identified and quantified this problem at Stanford University while using LCA for environmental impact analysis of additive manufacturing (AM), popularly known as 3D printing, compared to conventional manufacturing, for the production of metallic building components. To address the identified inconsistencies, the authors formalized a semiautomated assessment framework that improved the consistency of the LCA results by a factor of four in total, but specific inconsistencies remained. The authors then conducted an uncertainty analysis to identify the uncertainty sources causing inconsistencies, quantify their impact on the LCA results, and propose mitigations. The uncertainty analysis found that the use of the formalized framework reduces the impact of the behavioral uncertainty source category, such as decision-making and preferences, and the interaction category, such as teamwork. The proposed mitigations include the development of direct automated links to design and LCA software and the development of product models of existing AM processes for LCA databases. The proposed mitigations cannot fully influence the impact of the epistemic behavioral uncertainty sources, such as the learning process. Future work includes testing the methodology and the findings with more cases and on other similar process-based LCAs. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
172. Uncertainty-aware hierarchical reinforcement learning for long-horizon tasks.
- Author
Hu, Wenning, Wang, Hongbin, He, Ming, and Wang, Nianbin
- Subjects
REINFORCEMENT learning, MACHINE learning, EPISTEMIC uncertainty, HORIZON
- Abstract
Hierarchical reinforcement learning excels at dividing difficult task goals into easily achievable subgoals. It provides an effective means to solve long-horizon planning tasks that are trapped in high-dimensional complex environments. However, because it is challenging to train multiple levels of policies simultaneously, hierarchical reinforcement learning often suffers from the training non-stationary problem. Existing work analyzing the training non-stationary problem focuses on the noisy data created by the changes in low-level policy, which makes the high-level policy with aleatoric uncertainty. But the uncertain factors leading to the instability of high-level policy training are manifold. First, the randomness of the environments also generates noise in the high-level replay buffer, forming aleatoric uncertainty. Second, the limited transitions due to the agent's insufficient exploration ability constitute the high-level policy's epistemic uncertainty. In this paper, we first comprehensively examine the causes of the instability of hierarchical reinforcement learning training to address the uncertainty of high-level policy networks. On this basis, we propose uncertainty-aware hierarchical reinforcement learning (UAHRL), a novel framework to solve long-horizon tasks with stable learning. UAHRL constructs an action uncertainty estimation network based on deep ensembles to capture both uncertainties. The calculated uncertainties are then considered in the high-level training process to reduce non-stationary phenomena. The experiment results demonstrate that UAHRL outperforms the state-of-the-art hierarchical reinforcement learning algorithms in terms of sampling efficiency while also performing better on a series of long-horizon tasks with continuous action and state space. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
173. Effets de la caractérisation probabiliste des paramètres du sol et de l'ISS sur la vulnérabilité sismique d'une structure en béton armé [Effects of the probabilistic characterization of soil parameters and soil-structure interaction on the seismic vulnerability of a reinforced concrete structure].
- Author
Henni, Fodil, Zoutat, Meriem, Mekki, Mohammed, Hemsas, Miloud, and Hentri, Mohammed
- Subjects
SOIL-structure interaction, EPISTEMIC uncertainty, SOIL classification, REINFORCED concrete, SOILS
- Published
- 2023
174. Study on Impact Resistance of Steel Slag Fine Aggregate Concrete Based on Drop Hammer Impact Test.
- Author
LIU Qiuyu, XUE Gang, TAN Junqing, and LI Jingjun
- Subjects
IMPACT testing, SLAG, CONCRETE, WEIBULL distribution, STEEL, EPISTEMIC uncertainty
- Abstract
To investigate the impact resistance of steel slag fine aggregate concrete, concrete specimens were prepared by replacing fine aggregate with an equal volume of steel slag at replacement rates of 10%, 20% and 30%, respectively. The impact resistance test was carried out with a drop hammer impact device to study the impact life and the damage evolution law, and the impact resistance of the specimens was compared with that of ordinary concrete. The results show that the addition of steel slag fine aggregate can improve the impact resistance of concrete, and the impact life is markedly improved when the drop weight is small. The results of the concrete impact resistance test show a certain degree of dispersion, and the impact life of steel slag concrete conforms to a Weibull distribution. Based on the Weibull distribution, the impact life under different failure probabilities is estimated, and an impact damage evolution equation is established to calculate the number of impacts at which concrete with different steel slag contents reaches a given damage variable. [ABSTRACT FROM AUTHOR]
- Published
- 2023
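Fitting a two-parameter Weibull distribution to impact-life data and reading off lives at given failure probabilities, as the abstract describes, can be sketched with scipy; the impact counts below are invented for illustration, not the paper's data.

```python
import numpy as np
from scipy import stats

# Hypothetical impact counts to failure for one steel slag replacement rate
life = np.array([38, 45, 52, 57, 63, 70, 78, 91])

# Two-parameter Weibull: fix the location at zero, fit shape and scale
shape, loc, scale = stats.weibull_min.fit(life, floc=0)
print(f"shape m = {shape:.2f}, scale = {scale:.1f} impacts")

# Impact life at a given failure probability p is the Weibull p-quantile
for p in (0.05, 0.50, 0.95):
    n_p = stats.weibull_min.ppf(p, shape, loc=0, scale=scale)
    print(f"P_f = {p:.0%}: about {n_p:.0f} impacts")
```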
175. Using Bayesian Neural Networks for Uncertainty Assessment of Ore Type Boundaries in Complex Geological Models.
- Author
Jordão, Helga, Sousa, António Jorge, and Soares, Amílcar
- Subjects
CONVOLUTIONAL neural networks, BAYESIAN analysis, ARTIFICIAL neural networks, GEOLOGICAL modeling, ORES, METAL sulfides
- Abstract
Building an orebody model is a key step in the design and operation of a mine because it provides the basis for follow-up mine decisions. Recently, it was shown that convolutional neural networks can successfully reproduce the manual geological interpretation of a complex ore deposit. The deep learning approach mitigates the shortcomings of a labor-intensive process that greatly limits the speed at which geological resources can be updated. However, convolutional neural network architectures lack the ability to measure the confidence of their predictions. In this study, we tried to assess the uncertainty of the boundaries of these domains so that the characterization of metal grades within them can account for this uncertainty. We explored and compared Monte Carlo Dropout and Bayesian neural networks to assess the uncertainty of deep convolutional neural network models trained to predict geological domains conditioned to drill-hole data. Monte Carlo Dropout uncertainty maps reflect the uncertainty in geological interpretations. The uncertainty is highest in areas where the interpreter/geologist had more difficulty delineating the boundaries of geological bodies. This is known as geological interpretation uncertainty. In contrast, Bayesian neural network uncertainty is visible depending on ore type frequency, complexity, and heterogeneity. Bayesian neural networks are able to better represent the uncertainty regarding the unknown. The application example here is a real case study of several ore types from a polymetallic sulfide orebody located in the south of Portugal. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
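Monte Carlo Dropout, one of the two techniques compared above, estimates uncertainty by keeping dropout active at prediction time and aggregating several stochastic forward passes. A generic PyTorch sketch with a toy fully-connected stand-in network follows; the study itself used trained convolutional networks conditioned on drill-hole data.

```python
import torch
import torch.nn as nn

model = nn.Sequential(                 # toy stand-in for a trained classifier
    nn.Linear(10, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 3),                  # e.g. 3 ore-type logits
)

def mc_dropout_predict(model, x, n_passes=100):
    """Mean class probabilities and their std across stochastic passes."""
    model.train()                      # keep Dropout layers sampling
    with torch.no_grad():
        probs = torch.stack([
            torch.softmax(model(x), dim=-1) for _ in range(n_passes)
        ])
    return probs.mean(dim=0), probs.std(dim=0)

x = torch.randn(5, 10)                 # 5 hypothetical grid cells
mean_p, std_p = mc_dropout_predict(model, x)
print(mean_p, std_p, sep="\n")
```

Cells where the passes disagree (high std) flag uncertain boundaries, in the spirit of the interpretation-uncertainty maps described above.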
176. In 'crisis' we trust? On (un)intentional knowledge distortion and the exigency of terminological clarity in academic and political discourses on Russia's war against Ukraine.
- Author
Tyushka, Andriy
- Subjects
ACADEMIC discourse, WAR, ACADEMIC debating, CRISES, EPISTEMIC uncertainty
- Abstract
The study of Russia's evolving (c)overt aggression against Ukraine has received vast scholarly attention, with multivariate accounts offered of what the conflict is actually about, what its root causes are, its legal and political nature, and what the future may hold for its evolution and resolution. Thereby, a number of contending conceptualizations of the Russia-Ukraine conflict(s), including the hyperinflated 'Ukraine crisis' term, vividly showed that there is a clash of (factual and fictional) narratives in media, politics, and academia, a good share of which (un)intentionally contribute to the distortion, rather than production, of knowledge. With a critical introspection into the academic debate on the matter, this article seeks to uncover the dominant framings of the evolving Russia-Ukraine war and the modalities of unintentional, intentional and collateral knowledge distortion, as well as to stimulate overdue discussion on the conceptual distinction between the 'crisis', 'conflict' and 'war' paradigms in the conflict-sensitive context. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
177. Mapping Seismic Hazard for Canadian Sites Using Spatially Smoothed Seismicity Model.
- Author
Feng, Chao and Hong, Han-Ping
- Subjects
EARTHQUAKE hazard analysis, EARTHQUAKE resistant design, EPISTEMIC uncertainty, GROUND motion, EARTHQUAKES, STRUCTURAL design
- Abstract
The estimated seismic hazard based on the delineated seismic source model is used as the basis to assign the seismic design loads in Canadian structural design codes. An alternative estimation is based on a spatially smoothed source model. However, a quantification of the differences between the Canadian seismic hazard maps (CanSHMs) obtained from the delineated seismic source model and from spatially smoothed models is unavailable. Such a quantification is valuable for identifying epistemic uncertainty in the estimated seismic hazard and the degree of uncertainty in the CanSHMs. In the present study, we developed seismic source models using spatial smoothing and the historical earthquake catalogue. We quantified the differences in the estimated Canadian seismic hazard by considering the delineated source model and spatially smoothed source models. For the development of the spatially smoothed seismic source models, we considered spatial kernel smoothing techniques with or without adaptive bandwidth. The results indicate that the use of the delineated seismic source model could lead to under- or over-estimation of the seismic hazard as compared to estimates based on spatially smoothed seismic source models. This suggests that the epistemic uncertainty caused by the choice of seismic source model should be considered when mapping the seismic hazard. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
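Spatial smoothing of a historical catalogue, as used above, spreads each epicenter over a kernel and sums the contributions into a rate map. A minimal fixed-bandwidth Gaussian-kernel sketch follows; adaptive bandwidths, catalogue completeness corrections, and magnitude weighting are omitted, and the catalogue is random.

```python
import numpy as np

def smoothed_rate_map(epicenters, grid_x, grid_y, bandwidth_km=50.0):
    """Sum an isotropic 2-D Gaussian kernel over all epicenters (km coords).
    Returns relative event density on the (ny, nx) grid."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    rate = np.zeros_like(gx)
    norm = 1.0 / (2.0 * np.pi * bandwidth_km**2)
    for ex, ey in epicenters:
        d2 = (gx - ex) ** 2 + (gy - ey) ** 2
        rate += norm * np.exp(-d2 / (2.0 * bandwidth_km**2))
    return rate

rng = np.random.default_rng(2)
catalogue = rng.uniform(0, 500, size=(30, 2))   # 30 hypothetical epicenters
grid = np.linspace(0, 500, 51)
print(smoothed_rate_map(catalogue, grid, grid).shape)
```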
178. Data‐driven predictive modeling in risk assessment: Challenges and directions for proper uncertainty representation.
- Author
Stødle, Kaia, Flage, Roger, Guikema, Seth D., and Aven, Terje
- Subjects
PREDICTION models, RISK assessment, EPISTEMIC uncertainty
- Abstract
Data‐driven predictive modeling is increasingly being used in risk assessments. While such modeling may provide improved consequence predictions and probability estimates, it also comes with challenges. One is that the modeling and its output do not measure and represent uncertainty due to lack of knowledge, that is, "epistemic uncertainty." In this article, we demonstrate this point by conceptually linking the main elements and output of data‐driven predictive models with the main elements of a general risk description, thereby placing data‐driven predictive modeling on a risk science foundation. This allows for an evaluation of such modeling with reference to risk science recommendations for what constitutes a complete risk description. The evaluation leads us to conclude that, as a minimum, to cover all elements of a complete risk description a risk assessment using data‐driven predictive modeling needs to be supported by assessments of the uncertainty and risk related to the assumptions underlying the modeling. In response to this need, we discuss an approach for assessing assumptions in data‐driven predictive modeling. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
179. Seismic excitation model in probabilistic risk assessment of civil infrastructure.
- Author
Falamarz‐Sheikhabadi, Mohammad R.
- Subjects
INFRASTRUCTURE (Economics), RISK assessment, GROUND motion, PROBABILITY density function, EPISTEMIC uncertainty, ASSET management
- Abstract
Realistic risk assessment of civil infrastructure subjected to catastrophic events plays a crucial role in strategic asset management. In this process, visualization of the asset and its environment always possesses some epistemic uncertainty associated with the modeler's simplification/abstraction. The present paper aims at providing a critical perspective into the characterization of the seismic excitation model for risk assessment of civil infrastructures and identifying gaps in the present knowledge. For this purpose, the main factors that influence the development of a comprehensive framework for the simulation of seismic excitation are highlighted. The coupling effect of the spatial variability of ground motions and the substructure's footprint on the realization of the seismic excitation model is illustrated. Finally, the necessity of direct incorporation of the load analysis subtask in the probabilistic seismic risk assessment of civil infrastructures is discussed. In addition, simplified probability density functions are proposed for the simulation of the seismic excitation model. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
180. Judgment under radical uncertainty: Epistemic rational heuristics.
- Author
Grandori, Anna
- Subjects
JUDGMENT (Psychology), EPISTEMIC uncertainty, LEGAL judgments, HEURISTIC, SCIENTIFIC method, FLIGHT simulators
- Abstract
This article explores the concept of judgment under radical uncertainty and the need for rational heuristics in decision-making. It argues that traditional models of rationality are insufficient for complex and uncertain situations, and proposes the use of constructivist rationality instead. The article discusses various heuristics that can be used in conditions of radical uncertainty, and emphasizes their importance in navigating complex problems. It also considers the potential implications of these heuristics for governance structures and decision-making bodies. A viewpoint commentary further discusses how artificial intelligence can support decision-making under high levels of uncertainty. [Extracted from the article]
- Published
- 2023
- Full Text
- View/download PDF
181. Where will the next flank eruption at Etna occur? An updated spatial probabilistic assessment.
- Author
Sandri, Laura, Garcia, Alexander, Proietti, Cristina, Branca, Stefano, Ganci, Gaetana, and Cappello, Annalisa
- Subjects
VOLCANIC eruptions, EPISTEMIC uncertainty, KERNEL functions, SENSITIVITY analysis, RIFTS (Geology), VOLCANOES
- Abstract
In this paper, we propose an update of the spatial probability map for flank eruptions from Etna (Italy), based on the distribution of the flank eruptive fissures that opened in the last 4000 years. The general procedure followed is to split the fissure dataset into training and testing subsets; then we build models on the training subset under different assumptions and test them on their likelihood of the testing subset. This allows selecting objectively the best models and assumptions. Furthermore, it allows testing whether (i) unavoidable incompleteness in the mapped fissures, and (ii) possible migration through time in the location of the flank activity, have an effect on the training models that can or cannot be neglected. We used different spatial models by exploiting different Kernel functions (Exponential, Cauchy, Uniform, and Gaussian), and calculated the degree of clustering of flank fissures in the training data. The results show that neither under-recording nor possible migration in time affect significantly the informativeness of the previous flank fissures in forecasting the location of the successive ones. Our study provides a canonical map of the spatial probability for future flank eruptions at Etna based on the location of flank fissures that opened in the last 4000 years. The map confirms a preferred location along a Northeast-to-South area, corresponding to the location of the most active rifts. It also shows that the Southern flank of the volcano, which is the most urbanized one, sits downhill of the largest cumulated-probability area for flank eruption. We also run sensitivity analyses to test the effect of (i) restricting the data to the most recent 400 years, and (ii) including the information on the unclamping stress induced on the mapped fissures by sources of deformation proposed in literature for recent eruptions of Etna. The results of the sensitivity analyses confirm the main features of the canonical map, and add information on the epistemic uncertainty attached to it. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
182. Palaeoseismic crisis in the Galera Fault (S Spain). Consequences in Bronze Age settlements?
- Author
Martin-Rojas, Ivan, Medina-Cascales, Ivan, García-Tortosa, Francisco J., Rodríguez-Ariza, Maria Oliva, Molina González, Fernando, Cámara Serrano, José, and Alfaro, Pedro
- Subjects
HUMAN settlements, EARTHQUAKE hazard analysis, PALEOSEISMOLOGY, EPISTEMIC uncertainty, SEISMOGRAMS, SURFACE analysis
- Abstract
Palaeoseismological studies play a crucial role in the seismic characterization of regions with slow-moving faults. This is the case of the Central Betic Cordillera, a highly populated area where the record of prehistoric earthquakes is very scarce, despite being one of the regions with the highest seismic hazard in Spain. We present here a palaeoseismological characterization of the Galera Fault, one of the active faults accommodating deformation in the Central Betic Cordillera. We excavated and analysed several trenches along the fault trace. We quantitatively correlate the results from these trenches, resulting in a surface rupture history involving 7 or 8 events (accounting for the epistemic uncertainties) during the last ca. 24000 yr, with a recurrence interval ranging between 1520 and 1720 yr. Further analysis of this surface rupture history seems to indicate that the Galera Fault is prone to produce earthquake clusters, as we recorded five events in ca. 400 yr (ca. 1536-1126 BC) and only two events in the next ca. 3200 yr. Using the fault geometry and palaeoseismological data, we also carried out a seismogenic characterization of the fault. This analysis yielded a maximum expected magnitude of 6.7 ± 0.3 and a recurrence interval of 1857 yr. Furthermore, we also present a geodetic rupture scenario for the maximum expected event, involving displacements of up to 0.5 m. Finally, we discuss the possible impact of the deduced palaeoearthquakes on the development of Bronze Age human settlements located in the vicinity of the fault. Beyond their intrinsic value, our results will be the basis for future seismic hazard assessments carried out in the Central Betic Cordillera. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
183. Transient subordinate clauses in Balkan Turkic in its shift to Standard Average European subordination. Dialectal and historical evidence.
- Author
Keskin, Cem
- Subjects
TURKIC languages, LINGUISTIC change, INDO-Europeans, SYSTEMS theory, EPISTEMIC uncertainty, UNCERTAINTY
- Abstract
The Turkic varieties of the Balkans use two main diametrically opposed subordination strategies: (i) the Turkic model, where typical subordinate clauses are prepositive, nonfinite, contain clause-final subordinators, etc. and (ii) the Indo-European model, where typical subordinate clauses are postpositive, finite, contain clause-initial subordinators, etc. The paper observes that Balkan Turkic additionally uses several kinds of subordinate clause that allow for problematic mixtures of these two models ('X-clauses'). Spread over a spectrum between the Turkic and Indo-European extremes, X-clauses can, for instance, be prepositive but contain clause-initial subordinators. The paper, then, hypothesizes that X-clauses emerge due to uncertainties in the structural parameters of the Balkan Turkic subordination system. Such uncertainties are typical of complex systems undergoing change and arise in the present case due to the shift in Balkan Turkic away from Turkic towards Indo-European subordination. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
184. A reliability analysis technique for complex systems with retaining fuzzy information.
- Author
Zeng, Yining and Sun, Youchao
- Subjects
MONTE Carlo method, FUZZY systems, FLIGHT control systems, FAULT trees (Reliability engineering), PETRI nets
- Abstract
This paper proposes a method for the reliability analysis of systems characterized by fuzzy failure probabilities and intricate failure behaviors, while retaining the fuzzy information throughout the analysis process. Specifically, we introduce a combinational modeling method that integrates generalized stochastic Petri nets (GSPN) and dynamic fault trees (DFT) to capture dynamic failure behaviors and address the limitations of DFT modeling. This combinational approach is capable of capturing more sophisticated failure behaviors, such as competing failures and global failure propagation patterns. Furthermore, we propose a Monte Carlo technique based on endpoint sampling to enable quantitative analysis of GSPN‐based composite models, which preserves the fuzzy fault information of the system. Finally, we demonstrate the effectiveness of the proposed method through an example of a flight control system. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
185. Diversity entropy-based Bayesian deep learning method for uncertainty quantification of remaining useful life prediction in rolling bearings.
- Author
Bai, Rui, Li, Yongbo, Noman, Khandaker, and Wang, Shun
- Subjects
REMAINING useful life, ROLLER bearings, DEEP learning, EPISTEMIC uncertainty, DECISION making
- Abstract
Remaining useful life (RUL) prediction of rolling bearings plays a critical role in reducing unplanned downtime and improving machine productivity. Existing prediction methods primarily provide point estimates of RUL without quantifying uncertainty. However, uncertainty quantification of RUL is crucial for conducting reliable risk analysis and making maintenance decisions, and can significantly decrease maintenance costs. To solve the uncertainty quantification problem and improve prediction accuracy at the same time, a novel diversity entropy-based Bayesian deep learning (DE-BDL) method is proposed. First, the start degradation time (SDT) of bearings is adaptively determined using diversity entropy, which can extract early degradation information. Then, multi-scale diversity entropy (MDE) is developed to extract dynamic characteristics over multiple scales. Third, the features obtained using MDE are fed into the BDL model for degradation tracking and prediction. By doing this, the proposed DE-BDL method has merits in subsequent decision making, as it can not only provide point estimates but also offer uncertainty quantification covering both epistemic and aleatoric uncertainty. The superiority of the proposed method is validated using run-to-failure data. The experimental results and comparison with state-of-the-art prediction methods demonstrate that the proposed DE-BDL method is promising for RUL prediction of rolling bearings. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
186. Approaching the risk perception gap: effects of a subject matter knowledge-based intervention in a health context.
- Author
Heuckmann, Benedikt and Krüger, Finja
- Subjects
RISK perception, CONTROL (Psychology), ALZHEIMER'S disease, PLANNED behavior theory, HEALTH literacy, EPISTEMIC uncertainty
- Abstract
We conducted a short subject matter knowledge-based intervention in a quasi-experimental design and explored whether providing health-related subject matter knowledge affects university students' risk perception and their behavioural intentions. We chose the everyday context of using antiperspirants that contain aluminium and focused on the presumption that antiperspirants facilitate Alzheimer's disease development. This study was devoted to a risk perception gap caused by the epistemic and ontological uncertainty involved in the contextual background of using antiperspirants and developing Alzheimer's disease. By computing repeated measures ANOVA, we found that imparting subject matter knowledge as system, action-related and effectiveness health knowledge increased students' cognitive and affective risk perception. Path analyses revealed that cognitive and affective risk perception had an indirect, negative effect on behavioural intentions towards using antiperspirants. Attitudes and perceived behavioural control fully mediated the relationship between risk perception and intention. The mediation effect differed between students who received subject matter knowledge and students who did not. We discuss the findings from our study related to the role of subject matter knowledge for understanding risk perception, the ambiguity of overcoming and creating a risk perception gap when uncertainty holds and how risk perception relates to attitudes and behavioural intention. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
187. A seismic risk assessment method for cultural artifacts based on the Law of Large Numbers.
- Author
Yang, Weiguo, Zou, Xiaoguang, Liu, Pei, Wang, Meng, and Cao, Wupeng
- Subjects
LAW of large numbers, EARTHQUAKES, RISK assessment, DISTRIBUTION (Probability theory), SEISMIC response, GROUND motion, EPISTEMIC uncertainty, CULTURAL values
- Abstract
• A simple and easy-to-understand seismic risk assessment method for cultural artifacts was proposed. • The Law of Large Numbers was applied to achieve the seismic risk assessment. • Three types of common porcelain vases were taken as case studies for seismic risk assessment. • A comparison was conducted between the developed method and an existing method. Earthquakes can cause significant damage to cultural artifacts, which often hold great historical or cultural value. Seismic risk assessments can contribute to a more targeted approach by museum staff to the preventive conservation of cultural artifacts. The primary objective of this research is to suggest a new approach for assessing the seismic risk of cultural artifacts. This innovative method is founded on the Law of Large Numbers and aims to provide a more user-friendly way of evaluating the likelihood of potential seismic threats to cultural artifacts. The proposed method takes into account the statistical distribution of seismic ground motion and the seismic response characteristics of artifacts, and its accuracy and practicality are demonstrated by case studies. Compared with existing methods, the proposed method has the advantages of theoretical simplicity and low computational effort, and it is easier for museum staff to understand and master. Furthermore, the impact of sample size on the assessment results was investigated. The findings demonstrate that the proposed method represents a valuable tool for cultural heritage risk decision-makers to evaluate the seismic risk of artifacts. By using this method, they can more effectively assess the potential damage caused by seismic effects and design suitable mitigation measures accordingly. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
188. Remaining useful life prediction with imprecise observations: An interval particle filtering approach.
- Author
-
Xiahou, Tangfan, Liu, Yu, Zeng, Zhiguo, and Wu, Muchen
- Subjects
- *
REMAINING useful life , *FATIGUE crack growth , *EPISTEMIC uncertainty , *PROBABILITY density function , *INDUSTRIAL goods , *STATISTICAL smoothing , *RANDOM noise theory - Abstract
Particle Filtering (PF) has been widely used for predicting the Remaining Useful Life (RUL) of industrial products, especially those with nonlinear degradation behavior and non-Gaussian noise. Traditional PF is a recursive Bayesian filtering framework that updates the posterior probability density function of RULs when new observation data become available. In engineering practice, due to the limited accuracy of monitoring/inspection techniques, the observation data available for PF are inevitably imprecise and often need to be treated as interval data. In this article, a novel Interval Particle Filtering (IPF) approach is proposed to effectively leverage such interval-valued observations for RUL prediction. The IPF is built on three pillars: (i) an interval contractor that mitigates the error explosion problem when the epistemic uncertainty in the interval-valued observation data is propagated; (ii) an interval intersection method for constructing the likelihood function based on the interval observation data; and (iii) an interval kernel smoothing algorithm for estimating the unknown parameters in the IPF. The developed methods are applied to interval-valued capacity data of batteries and fatigue crack growth data of railroad tracks. The results demonstrate that the developed methods improve RUL prediction performance when only interval observations are available. [ABSTRACT FROM AUTHOR]
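The interval-intersection likelihood at the heart of such a filter can be sketched compactly. The example below is a simplified stand-in for the IPF (it omits the interval contractor and kernel smoothing pillars); the degradation dynamics, sensor half-width, and observation intervals are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_particles = 500
state = rng.normal(1.0, 0.05, n_particles)   # e.g., initial crack size (mm)

def overlap(a_lo, a_hi, b_lo, b_hi):
    """Length of the intersection of two intervals (0 where disjoint)."""
    return np.maximum(0.0, np.minimum(a_hi, b_hi) - np.maximum(a_lo, b_lo))

# Interval-valued observations [lo, hi] from an imprecise sensor (invented).
for obs_lo, obs_hi in [(1.08, 1.16), (1.18, 1.27), (1.30, 1.42)]:
    # Invented degradation dynamics with process noise.
    state = state * 1.1 + rng.normal(0.0, 0.02, n_particles)
    # Each particle predicts a measurement interval (sensor imprecision +/- 0.05);
    # its weight is the overlap of that interval with the observed interval.
    w = overlap(state - 0.05, state + 0.05, obs_lo, obs_hi)
    w = np.full(n_particles, 1.0 / n_particles) if w.sum() == 0 else w / w.sum()
    # Systematic resampling.
    u = (rng.random() + np.arange(n_particles)) / n_particles
    idx = np.minimum(np.searchsorted(np.cumsum(w), u), n_particles - 1)
    state = state[idx]
    print(f"obs [{obs_lo:.2f}, {obs_hi:.2f}] -> filtered mean {state.mean():.3f}")
```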
- Published
- 2023
- Full Text
- View/download PDF
189. A Phenology-guided Bayesian-CNN (PB-CNN) framework for soybean yield estimation and uncertainty analysis.
- Author
-
Zhang, Chishan and Diao, Chunyuan
- Subjects
- *
CROP yields , *AGRICULTURAL development , *EPISTEMIC uncertainty , *CENTRAL economic planning , *CROP development , *SOYBEAN - Abstract
Large-scale crop yield estimation is important for understanding the response of agricultural production to environmental forces and management practices, and plays a critical role in insurance design, trade decision making, and economic planning. Empirical models (e.g., deep learning models) have been increasingly utilized for estimating crop yields, with the ability to take into account a range of yield predictors and complex modeling relationships. Yet empirical estimation of crop yields still faces important challenges, particularly in accommodating spatio-temporal crop phenological development patterns and in handling heterogeneous yield predictors. The different types of uncertainty associated with empirical yield estimation have seldom been explored. The objective of this study is to develop a Phenology-guided Bayesian-Convolutional Neural Network (PB-CNN) framework for county-level crop yield estimation and uncertainty quantification, with soybean in the US Corn Belt as a case study. The PB-CNN framework comprises three key components: Phenology Imagery construction; multi-stream Bayesian-CNN modeling; and feature importance (i.e., yield predictor and phenological stage) and predictive uncertainty analysis (i.e., aleatoric and epistemic uncertainty). With the innovative integration of critical crop phenological stages in modeling the crop yield response to a heterogeneous set of yield predictors (i.e., satellite-based, heat-related, water-related, and soil predictors), as well as the associated uncertainties, the developed PB-CNN framework outperforms three advanced benchmark models, achieving an average RMSE of 4.622 bu/ac, an average R2 of 0.709, and an average bias of −2.057 bu/ac in estimating the county-level soybean yield of the US Corn Belt in testing years 2014–2018. Among the yield predictor groups, the satellite-based group is the most critical for soybean yield estimation, followed by the water- and heat-related groups. Throughout the growing season, the phenological stages from blooming to dropping leaves play the most crucial role in modeling soybean yield. The soil predictor group and the early growing stages can improve estimation accuracy yet potentially bring more uncertainty into the yield estimation. Further uncertainty disentanglement indicates that the dominant uncertainty in yield estimation is aleatoric, mainly stemming from the fluctuations and variations inherent in the model's input observations. The PB-CNN framework largely enhances our understanding of the complex soybean yield response to varying environmental conditions across crop phenological stages, as well as the associated uncertainties, for more sustainable agricultural development. [ABSTRACT FROM AUTHOR]
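The aleatoric/epistemic split reported by such Bayesian networks is easy to illustrate with Monte Carlo dropout, a common stand-in for full Bayesian CNN inference. Everything below is a placeholder sketch: the tiny MLP, the 8-dimensional predictor vector, and the untrained weights are assumptions, not the PB-CNN.

```python
import torch
import torch.nn as nn

class BayesianRegressor(nn.Module):
    """Heteroscedastic regressor; dropout kept active for MC sampling."""
    def __init__(self, d_in=8):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(d_in, 64), nn.ReLU(), nn.Dropout(0.2),
            nn.Linear(64, 64), nn.ReLU(), nn.Dropout(0.2))
        self.mu = nn.Linear(64, 1)         # predicted yield
        self.log_var = nn.Linear(64, 1)    # predicted data (aleatoric) noise

    def forward(self, x):
        h = self.body(x)
        return self.mu(h), self.log_var(h)

model = BayesianRegressor()
x = torch.randn(32, 8)          # placeholder predictor vectors (untrained demo)

model.train()                   # keep dropout stochastic at prediction time
with torch.no_grad():
    draws = [model(x) for _ in range(100)]
mus = torch.stack([mu for mu, _ in draws])            # (100, 32, 1)
sig2 = torch.stack([lv.exp() for _, lv in draws])     # (100, 32, 1)

aleatoric = sig2.mean(dim=0)    # average predicted data noise
epistemic = mus.var(dim=0)      # spread of means across dropout masks
print(aleatoric.mean().item(), epistemic.mean().item())
```

Averaging the predicted variances estimates the aleatoric (data) noise, while the spread of the predicted means across dropout masks estimates the epistemic (model) uncertainty, mirroring the disentanglement described above.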
- Published
- 2023
- Full Text
- View/download PDF
190. Climate Change Signal in Atlantic Tropical Cyclones Today and Near Future.
- Author
-
Lee, Chia‐Ying, Sobel, Adam H., Tippett, Michael K., Camargo, Suzana J., Wüest, Marc, Wehner, Michael, and Murakami, Hiroyuki
- Subjects
TROPICAL cyclones ,WINDSTORMS ,ATMOSPHERIC models ,CLIMATE change ,EPISTEMIC uncertainty ,GLOBAL warming - Abstract
This manuscript discusses the challenges in detecting and attributing recently observed trends in Atlantic tropical cyclone (TC) activity and the epistemic uncertainty we face in assessing future risk. We use synthetic storms downscaled from five CMIP5 models by the Columbia HAZard model (CHAZ), and directly simulated storms from high‐resolution climate models. We examine three aspects of recent TC activity: the upward trend and multi‐decadal oscillation of the annual frequency, the increase in storm wind intensity, and the decrease in forward speed. Some data sets suggest that these trends and oscillation are forced, while others suggest that they can be explained by natural variability. Projections under warming climate scenarios also show a wide range of possibilities, especially for the annual frequencies, which increase or decrease depending on the choice of moisture variable used in the CHAZ model and on the choice of climate model. The uncertainties in the annual frequency lead to epistemic uncertainties in TC risk assessment. Here, we investigate the potential for reducing these epistemic uncertainties through a statistical practice, namely likelihood analysis. We find that historical observations are more consistent with the simulations with increasing frequency than with those with decreasing frequency, but we are not able to rule out the latter. We argue that the most rational way to treat epistemic uncertainty is to consider all outcomes contained in the results. In the context of risk assessment, since the results contain possible outcomes in which TC risk is increasing, this view implies that the risk is increasing. Plain Language Summary: We use a set of computer model simulations to study recent trends in Atlantic tropical cyclones. We looked at three aspects of these storms: the number of tropical cyclones each year, which has fluctuated up and down over time (but generally increased over the last several decades); the strength of their winds, which has been increasing; and the speed at which they move, which has been decreasing. These trends could be caused either by human‐induced global warming or by natural variability; determining which cause is more important to overall risk requires us to understand how the number of tropical cyclones per year responds to warming. In our simulations, this number can either increase or decrease with warming, depending on which of two nearly identical versions of our model we use to simulate the storms. This uncertainty prevents us from reaching definitive conclusions about either present or future hurricane risk. Nonetheless, our analysis suggests that the risk of Atlantic tropical cyclones is more likely increasing than decreasing, and we argue that from a broader point of view, this is effectively equivalent to saying the risk is increasing. Key Points: Changes in Atlantic tropical cyclone (TC) risk are uncertain due to epistemic uncertainty in the projected annual frequency under global warming. Likelihood analysis shows that observations are more consistent with simulations with upward frequency projections than with those without. It is more likely that the risk of increased TC impacts in the Atlantic is increasing than that it is decreasing, though not by a large margin. [ABSTRACT FROM AUTHOR]
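The likelihood analysis described above can be illustrated with a toy counting model: score the observed annual storm counts against candidate mean-frequency trajectories and compare log-likelihoods. The numbers below are invented, and the Poisson assumption is mine, not necessarily CHAZ's.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(2)
years = np.arange(1980, 2020)
observed = rng.poisson(10 + 0.08 * (years - 1980))   # synthetic "observations"

lam_up = 10 + 0.08 * (years - 1980)    # simulation with increasing frequency
lam_down = 12 - 0.05 * (years - 1980)  # simulation with decreasing frequency

ll_up = poisson.logpmf(observed, lam_up).sum()
ll_down = poisson.logpmf(observed, lam_down).sum()
print(f"log-likelihood, increasing: {ll_up:.1f}; decreasing: {ll_down:.1f}")
# A higher log-likelihood favours the increasing-frequency simulations, but a
# modest gap does not let us rule the other hypothesis out.
```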
- Published
- 2023
- Full Text
- View/download PDF
191. Using Logistic Regression to Identify the Key Hydrologic Controls of Ice-Jam Flooding near the Peace–Athabasca Delta: Assessment of Uncertainty and Linkage with Physical Process Understanding.
- Author
-
Beltaos, Spyros
- Subjects
FLOOD control ,FLOODS ,INFERENTIAL statistics ,LOGISTIC regression analysis ,EPISTEMIC uncertainty ,STATISTICAL significance ,STATISTICS - Abstract
The Peace–Athabasca Delta (PAD) in northern Alberta is one of the world's largest inland freshwater deltas and is home to many species of fish, mammals, and birds. Over the past five decades, the PAD has experienced prolonged dry periods in between rare floods, accompanied by a reduction in the area comprised of lakes and ponds that provide a habitat for aquatic life. In the Peace sector of the PAD, this likely resulted from a reduced frequency of spring flooding caused by major ice jams that form in the lower Peace River. There is debate in the literature regarding the factors that promote or inhibit the formation of such ice jams, deriving from physical process studies, paleolimnological studies, and—recently—statistical analysis founded on logistic regression. Attempts to quantify ice-jam flood (IJF) probability by logistic regression, given the values of assumed explanatory variables, involve considerable uncertainty. Herein, different sources of uncertainty are examined and their effects on statistical inferences are evaluated. It is shown that epistemic uncertainty can be addressed by selecting direct explanatory variables, such as breakup flow and ice cover thickness, rather than more convenient, albeit weak, proxies that rely on winter precipitation and degree-days of frost. Structural uncertainty, which derives from the unknown mathematical relationship between IJF probability and the selected explanatory variables, leads to different probability predictions for different assumed relationships but does not modify assessments of statistical significance. The uncertainty associated with the relatively small sample size (number of years of record) may be complicated by known physical constraints on IJF occurrence. Overall, logistic regression corroborates physical understanding that points to breakup flow and freezeup level as primary controls of IJF occurrence. Additional influences, related to the thermal decay of the ice cover and the flow gradient during the advance of the breakup front towards the PAD, are difficult to quantify at present. Progress requires increased monitoring of processes and an enhanced numerical modelling capability. [ABSTRACT FROM AUTHOR]
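A minimal version of the statistical setup is shown below, with invented data standing in for the Peace River record: breakup flow and ice cover thickness as the direct explanatory variables, and a binary IJF outcome.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n_years = 50
flow = rng.normal(4000, 900, n_years)   # breakup flow, m^3/s (synthetic)
ice = rng.normal(1.0, 0.2, n_years)     # ice cover thickness, m (synthetic)

# Synthetic truth: higher flow and thicker ice both raise IJF probability.
logit = 0.004 * (flow - 4000) + 3.0 * (ice - 1.0) - 1.5
ijf = rng.random(n_years) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([flow, ice])
model = LogisticRegression().fit(X, ijf)
p = model.predict_proba([[5500.0, 1.3]])[0, 1]
print(f"predicted IJF probability for a high-flow, thick-ice year: {p:.2f}")
```

With real data, the small sample size (on the order of 50 years of record) is what drives the wide uncertainty on the fitted coefficients discussed above.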
- Published
- 2023
- Full Text
- View/download PDF
192. Land pressure evaluation in the Yangtze River Delta region: A perspective from production‐living‐ecology.
- Author
-
Yu, Ziqi, Chen, Longqian, Zhang, Ting, Li, Long, Yuan, Lina, Teng, Gan, Xiao, Jue, Shi, Shuai, and Chen, Longgao
- Subjects
MONTE Carlo method ,STATIC equilibrium (Physics) ,MECHANICAL models ,EPISTEMIC uncertainty - Abstract
The limited land is under unprecedented pressure from production, living, and ecology. To evaluate land pressure in the Yangtze River Delta region in 1995, 2000, 2005, 2010, 2015, and 2020 from the perspective of production, living, and ecology, this study builds a land pressure evaluation index system based on a fuzzy comprehensive evaluation model using multisource and multiscale data. To investigate trade‐offs and synergies among production, living, and ecology pressures, we use a mechanical equilibrium model from physics. We then analyze the reliability and uncertainty of the land pressure model using Monte Carlo simulations. The results show that (1) our model can effectively reveal the level of land pressure and reflect the geographical pattern of land pressure, "high in the east and low in the west, high in the south and low in the north", that characterizes the Yangtze River Delta region; (2) while living and ecology pressures tend to rise, production pressures tend to decrease; (3) except for Shanghai, the trade‐off areas are primarily concentrated in economically developed regions with high production and living pressure and low ecology pressure, while the coordinated areas are primarily found in northern Jiangsu Province and northern Anhui Province. [ABSTRACT FROM AUTHOR]
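The fuzzy comprehensive evaluation plus Monte Carlo reliability check can be sketched as follows. The weights, membership matrix, grade scores, and perturbation scale below are invented placeholders, not the paper's index system.

```python
import numpy as np

rng = np.random.default_rng(4)
weights = np.array([0.40, 0.35, 0.25])   # production, living, ecology
# Membership of each pressure dimension in the grades (low, medium, high).
R = np.array([[0.2, 0.5, 0.3],
              [0.1, 0.4, 0.5],
              [0.5, 0.4, 0.1]])
grade_values = np.array([1.0, 2.0, 3.0])  # scores for low/medium/high

scores = []
for _ in range(10_000):
    w = weights * rng.lognormal(0.0, 0.1, 3)   # perturb the weights
    w /= w.sum()
    membership = w @ R                         # weighted fuzzy composition
    scores.append(membership @ grade_values)   # defuzzified pressure score

scores = np.array(scores)
lo, hi = np.percentile(scores, [5, 95])
print(f"pressure score: {scores.mean():.2f} (90% band {lo:.2f}-{hi:.2f})")
```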
- Published
- 2023
- Full Text
- View/download PDF
193. The Cost of Imperfect Knowledge: How Epistemic Uncertainties Influence Flood Hazard Assessments.
- Author
-
Balbi, Mariano and Lallemant, David C. B.
- Subjects
EPISTEMIC uncertainty ,FLOOD warning systems ,RISK assessment ,BAYESIAN analysis ,WATER distribution ,FLOODS - Abstract
Classical approaches to flood hazard are obtained by the concatenation of a recurrence model for the events (i.e., an extreme river discharge) and an inundation model that propagates the discharge into a flood extent. The classical approach, however, uses "best‐fit" models that do not include uncertainty from incomplete knowledge or limited data availability. The inclusion of these so‐called epistemic uncertainties can significantly impact flood hazard estimates and the corresponding decision‐making process. We propose a simulation approach to robustly account for uncertainty in model parameters, while developing a useful probabilistic output of flood hazard for further risk assessments via the Bayesian predictive posterior distribution of water depths. A Peaks‐Over‐Threshold Bayesian analysis is performed for future event simulation, and a pseudo‐likelihood probabilistic approach for the calibration of the inundation model is used to compute uncertain water depths. The annual probability, averaged over all possible model parameters, is used to develop hazard maps that account for epistemic uncertainties. Results are compared to traditional hazard maps, showing that not including epistemic uncertainties can underestimate the hazard and lead to non‐conservative designs, and that this trend increases with return period. Results also show that, for the case study, the influence of the uncertainty in the future occurrence of discharge events is predominant over the inundation simulator uncertainties. Plain Language Summary: Estimating the annual probability of some flood‐depth level is a key input for risk analysis and engineering design. This is typically calculated via sophisticated probability and physics‐based models that require many parameters. However, the classical approach uses a fixed set of "best parameters" for this and does not include the degree of uncertainty, even when such uncertainties may be very high. This work proposes a method to estimate the annual probability of flood depth including the uncertainty in the parameters used to compute it. More importantly, it shows that not including this uncertainty might severely underestimate the hazard and consequently lead to unsafe designs. Key Points: Flood hazard assessments involve sophisticated probability and physics‐based models that require the specification of many parameters. We propose a Bayesian methodology to include uncertainty in model parameters in flood hazard estimates and mapping. The inclusion of uncertainty in parameters can significantly affect hazard estimates, and its omission can lead to non‐conservative planning. [ABSTRACT FROM AUTHOR]
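The headline result, that ignoring parameter uncertainty under-estimates tail hazard, can be reproduced in a few lines. The sketch below substitutes a Gumbel annual-maximum model for the paper's Peaks-Over-Threshold analysis, and the pretend posterior draws are invented; only the averaging-over-parameters step is the point.

```python
import numpy as np

rng = np.random.default_rng(5)

# Pretend posterior draws for a Gumbel discharge model (location, scale).
loc = rng.normal(800.0, 40.0, 2000)
scale = rng.normal(200.0, 25.0, 2000).clip(min=50.0)

def p_exceed(q, loc, scale):
    """Annual exceedance probability of discharge q under a Gumbel model."""
    return 1.0 - np.exp(-np.exp(-(q - loc) / scale))

q_design = 1600.0
p_best = p_exceed(q_design, 800.0, 200.0)       # "best-fit" parameters only
p_pred = p_exceed(q_design, loc, scale).mean()  # predictive: average over draws
print(f"best-fit: {p_best:.4f}  predictive: {p_pred:.4f}")
```

Because the tail exceedance probability responds more strongly to unfavourable parameter draws than to favourable ones, averaging over the posterior typically raises the estimate above the best-fit value, which is exactly the non-conservative bias the authors warn about.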
- Published
- 2023
- Full Text
- View/download PDF
194. Alternative methodology for epistemic uncertainty-based linear programming problem.
- Author
-
Behera, Diptiranjan
- Subjects
- *
LINEAR programming , *FUZZY numbers , *EPISTEMIC uncertainty , *REAL numbers , *MATHEMATICS - Abstract
An alternative method for solving linear programming problems with equality constraints under epistemic (fuzzy) uncertainty is proposed here. In this approach, the optimal solution of the considered fuzzy problem is obtained by solving a related crisp problem. The objective function of the crisp problem is obtained as a linear combination of the components of the fuzzified version of the original uncertain objective function, where the coefficients of the linear combination are strictly positive real numbers. The constraints of the crisp problem are derived equivalently from the equality constraints of the fuzzy problem. Triangular fuzzy numbers are considered for this analysis. Comparisons have been made with the results of Kumar et al. (Appl Math Model 35:817–823, 2011), Najafi and Edalatpanah (Appl Math Model 37:7865–7867, 2013), Ezzati et al. (Appl Math Model 39:3183-3193, 2015) and Ghoushchi et al. (Mathematics 9:2937, 2021. https://doi.org/10.3390/math9222937). [ABSTRACT FROM AUTHOR]
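A minimal sketch of the crisp-equivalent idea is given below. This is my own simplification, not Behera's exact formulation: triangular fuzzy objective coefficients (l, m, u) are collapsed into one crisp objective through a strictly positive linear combination, and the result is solved as an ordinary LP.

```python
import numpy as np
from scipy.optimize import linprog

# Fuzzy costs of two decision variables as triangular numbers (l, m, u).
c_fuzzy = np.array([[3.0, 4.0, 5.0],
                    [1.5, 2.0, 3.0]])
lam = np.array([0.25, 0.50, 0.25])   # strictly positive combination weights
c_crisp = c_fuzzy @ lam              # one crisp cost per variable

# Equality constraints: x1 + x2 = 10 and 2*x1 + x2 = 14, with x >= 0.
A_eq = np.array([[1.0, 1.0],
                 [2.0, 1.0]])
b_eq = np.array([10.0, 14.0])

res = linprog(c_crisp, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 2)
print("optimal x:", res.x, "crisp objective:", res.fun)
```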
- Published
- 2023
- Full Text
- View/download PDF
195. Considering uncertainty in the collapse fragility of New Zealand buildings for risk‐targeted seismic design.
- Author
-
Hulsey, Anne M., Horspool, Nick, Gerstenberger, Matthew C., Sullivan, Timothy J., and Elwood, Kenneth J.
- Subjects
BUILDING failures ,EPISTEMIC uncertainty - Abstract
The risk‐targeted seismic design framework is used to set design intensities, based on achieving a target risk level (e.g., collapse or fatality risk) with respect to an assumed building collapse response. This paper assesses the distribution of fatality risk associated with the risk‐targeted design intensity, considering uncertainty in both the hazard and the variety of buildings that can satisfy the minimum design requirement. The randomness among buildings and their response is due to other design decisions (aleatory variability with respect to the design intensity) and can be represented by a distribution of fragility curves that quantify the collapse probabilities of as‐built, code‐conforming buildings. First, a single design intensity is calculated based on a "design fragility," the mean hazard curve, and a risk target. The design fragility is taken as a conservative estimate from the distribution of collapse fragilities and the selected risk target approximates the risk associated with New Zealand's previous (non‐risk‐targeted) criteria for design intensities. Then the risk distribution is assessed, considering the aleatory variability in as‐built fragilities and the epistemic uncertainty in the new National Seismic Hazard Model. Accounting for variability in design decisions and uncertainty in the hazard model produces a risk distribution that more fully represents the potential risk associated with a given design intensity. This distribution can be compared to guidance on tolerable risk ranges, which suggest that risk can be variable among buildings but should fall within acceptable bounds. Sensitivity studies consider epistemic uncertainty in the assumed model for the distribution of as‐built fragilities. This inclusion of uncertainty to assess the risk distribution offers a powerful extension to the risk‐targeted framework. While this extension would not affect engineering practice (as the output is still a single design intensity), it allows building code developers to better understand and consider the risk implications associated with the selected design intensity. [ABSTRACT FROM AUTHOR]
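The risk integral underlying this framework is compact enough to sketch: the annual collapse rate is the fragility integrated against the hazard curve, and a distribution of as-built fragilities induces a distribution of risk. The hazard curve, fragility parameters, and over-strength spread below are invented, not taken from the New Zealand models.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)

im = np.linspace(0.01, 3.0, 600)        # intensity measure grid (g)
haz = 1e-4 * (im / 0.4) ** -2.5         # annual exceedance rate (invented)
d_haz = -np.gradient(haz, im)           # occurrence-rate density
dm = im[1] - im[0]

def collapse_rate(median, beta=0.5):
    """Annual collapse rate: lognormal fragility integrated over the hazard."""
    frag = norm.cdf(np.log(im / median) / beta)
    return float(np.sum(frag * d_haz) * dm)

# A single conservative "design fragility" versus a population of as-built
# buildings whose fragility medians scatter above the design value.
print(f"design-fragility risk: {collapse_rate(1.0):.2e} per year")
medians = rng.lognormal(np.log(1.3), 0.2, 1000)
risks = np.array([collapse_rate(m) for m in medians])
lo, hi = np.percentile(risks, [5, 95])
print(f"as-built risk, 5th-95th percentile: {lo:.2e} to {hi:.2e} per year")
```

The resulting spread of per-building risks is the kind of distribution the paper compares against tolerable-risk guidance.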
- Published
- 2023
- Full Text
- View/download PDF
196. Barriers to operational flood forecasting in complex terrain: from precipitation forecasts to probabilistic flood forecast mapping at short lead times.
- Author
-
Bacelar, Luiz, ReifeeiNasab, Arezoo, Chaney, Nathaniel, and Barros, Ana
- Subjects
FLOOD forecasting ,PRECIPITATION forecasting ,LEAD time (Supply chain management) ,HYDRAULIC structures ,FLOODS ,EPISTEMIC uncertainty - Abstract
As flood alert systems move towards higher spatial resolutions, there is a continued need for approaches that provide robust predictions of flood extent and adequately account for the uncertainties from meteorological forcing, hydrologic and hydraulic model structure, and parameter uncertainty. In flood forecasting, two primary sources of uncertainty are the quantitative precipitation forecasts (QPF) and the representation of the channel and floodplain geometry. This is especially relevant as simple approaches (e.g., HAND) are being used to map floods operationally at field scales (< 10 m). This article investigates the benefits of using a computationally efficient probabilistic precipitation forecast (PPF) approach to generate multiple flood-extent scenarios over a region of complex terrain prone to flash floods. First, we assess the limitations of using a calibrated gridded version of the WRF-Hydro model to predict an extreme flash flood event in the Greenbrier River Basin (West Virginia) on 24 June 2016. We investigated an ensemble methodology to combine operational High-Resolution Rapid Refresh (HRRR) QPF with radar-based Quantitative Precipitation Estimates, specifically MRMS QPE products. This approach was most effective at improving headwater streamflow accuracy at one-hour lead time, which is still too short to issue actionable flood warnings in operational applications. At longer lead times, success was elusive due to epistemic uncertainties in MRMS rainfall intensity and HRRR rainfall spatial patterns. Furthermore, a QPF ensemble was used to generate an ensemble of flood heights using the HAND flood mapping methodology at different spatial resolutions. Results revealed a scale dependency: dispersion among the predicted flooded areas grows as spatial resolution increases, down to 1 m. We hypothesize that the overprediction of flooded areas at higher spatial resolutions reflects the increasing number of river reaches and the need for a scale-aware representation of the river hydraulics that govern flood propagation in the river network. [ABSTRACT FROM AUTHOR]
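Turning an ensemble of stage forecasts into a probabilistic HAND flood map is conceptually simple, which is part of HAND's operational appeal. The sketch below uses a toy raster and an invented stage ensemble; a real application would derive the HAND grid from a DEM and index stages by river reach.

```python
import numpy as np

rng = np.random.default_rng(7)
hand = rng.gamma(shape=2.0, scale=1.5, size=(100, 100))  # toy HAND raster (m)

# Ensemble of forecast water stages (m), e.g., from QPF-driven hydrology runs.
stages = rng.normal(3.0, 0.8, size=30).clip(min=0.0)

# A cell floods in a member when its height above nearest drainage <= stage.
flooded = hand[None, :, :] <= stages[:, None, None]      # (member, row, col)
prob_map = flooded.mean(axis=0)                          # per-cell probability

print(f"cells flooded with >50% probability: {(prob_map > 0.5).mean():.1%}")
```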
- Published
- 2023
- Full Text
- View/download PDF
197. Thermospheric density predictions during quiet time and geomagnetic storm using a deep evidential model-based framework.
- Author
-
Wang, Yiran and Bai, Xiaoli
- Subjects
- *
MAGNETIC storms , *LOW earth orbit satellites , *GAUSSIAN processes , *EPISTEMIC uncertainty , *DENSITY - Abstract
Knowledge of the thermospheric density is essential for calculating the drag on low Earth orbit satellites. Existing models struggle to predict density accurately. In this paper, we propose thermospheric density prediction using a deep evidential model-based framework that incorporates empirical models, accelerometer-inferred density from the CHAMP satellite, and geomagnetic and solar indices. The framework is investigated under both quiet and storm conditions. Our results demonstrate that the proposed model can predict the thermospheric density with high accuracy and reliable uncertainty in both quiet and storm times. The evidential model's predictions outperform those of the Gaussian Process (GP) model used in our previous studies. Furthermore, the proposed model can also provide insightful aleatoric and epistemic uncertainties. • Predict thermospheric density with a deep evidential model-based framework. • The method is investigated on both quiet and storm periods. • The density predictions are accurate with reliable uncertainty in all conditions. • The method can provide insightful aleatoric and epistemic uncertainties. [ABSTRACT FROM AUTHOR]
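The evidential layer behind such a framework outputs the four parameters of a Normal-Inverse-Gamma distribution, from which both uncertainty types follow in closed form. The sketch below uses the standard formulas from deep evidential regression (Amini et al., 2020); the toy network, feature dimension, and untrained weights are placeholders, not the paper's model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EvidentialHead(nn.Module):
    """Maps features to Normal-Inverse-Gamma parameters (gamma, nu, alpha, beta)."""
    def __init__(self, d_in=6):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(d_in, 32), nn.ReLU(), nn.Linear(32, 4))

    def forward(self, x):
        gamma, nu, alpha, beta = self.net(x).chunk(4, dim=-1)
        nu = F.softplus(nu)                # nu > 0
        alpha = F.softplus(alpha) + 1.0    # alpha > 1
        beta = F.softplus(beta)            # beta > 0
        return gamma, nu, alpha, beta

head = EvidentialHead()
x = torch.randn(5, 6)   # placeholder geomagnetic/solar feature vectors

gamma, nu, alpha, beta = head(x)
aleatoric = beta / (alpha - 1.0)         # E[sigma^2]: noise in the data
epistemic = beta / (nu * (alpha - 1.0))  # Var[mu]: the model's own ignorance
print(gamma.squeeze(), aleatoric.squeeze(), epistemic.squeeze())
```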
- Published
- 2023
- Full Text
- View/download PDF
198. A Physics-Constrained Bayesian neural network for battery remaining useful life prediction.
- Author
-
Najera-Flores, David A., Hu, Zhen, Chadha, Mayank, and Todd, Michael D.
- Subjects
- *
REMAINING useful life , *BAYESIAN analysis , *LITHIUM-ion batteries , *DIFFERENTIAL operators , *PHYSICAL laws , *EPISTEMIC uncertainty - Abstract
• Physics-Constrained Bayesian Neural Network. • Failure prognostics of lithium-ion batteries. • Comparative study of physics-based, data-driven, and physics-constrained models. In order to predict the remaining useful life (RUL) of lithium-ion batteries, a capacity degradation model may be developed using either simplified physical laws or machine learning-based methods. It is observed that even though degradation models based on simplified physical laws are easy to implement, they may result in large errors in failure prognostics applications. While data-driven prognostics models can provide more accurate degradation forecasting, they may require a large volume of training data and may produce predictions inconsistent with physical laws. It is also very challenging for existing methods to predict the RUL at the early stages of battery life. In this paper, we propose a Bayesian physics-constrained neural network for battery RUL prediction that overcomes the limitations of current methods. In the proposed method, a neural differential operator is learned from the first 100 cycles of data. The neural differential operator is modeled with a Bayesian neural network architecture that separates the fixed history dependence from the time dependence to isolate epistemic uncertainty quantification. Using the battery dataset presented in the paper by Severson et al. as an example, we compare our proposed method with a simplified physics-based degradation forecasting model and two data-driven prognostics models. The results show that the proposed physics-constrained neural network can provide more accurate RUL estimation than the other methods with the same group of training data. Most importantly, the proposed method allows for RUL prediction at earlier stages of the battery life cycle. [ABSTRACT FROM AUTHOR]
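The overall recipe can be sketched: learn a per-cycle degradation operator from early-cycle data, integrate it forward under a physical constraint, and read epistemic uncertainty off the spread of a small ensemble. The synthetic capacity curve, the tiny MLP ensemble standing in for the Bayesian neural differential operator, and the 80% end-of-life threshold below are all assumptions for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(8)
cycles = np.arange(100)
capacity = 1.0 - 1e-3 * cycles + rng.normal(0.0, 2e-4, 100)  # synthetic fade
dC = np.diff(capacity)                    # per-cycle degradation-rate targets

ruls = []
for seed in range(10):                    # small ensemble ~ epistemic spread
    net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000,
                       random_state=seed)
    net.fit(capacity[:-1].reshape(-1, 1), dC)
    c, n = capacity[-1], int(cycles[-1])
    while c > 0.8 and n < 2000:           # integrate forward to end of life
        step = min(net.predict([[c]])[0], 0.0)  # physics: capacity never grows
        c, n = c + step, n + 1
    ruls.append(n - int(cycles[-1]))
print(f"RUL: {np.mean(ruls):.0f} cycles (+/- {np.std(ruls):.0f} over ensemble)")
```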
- Published
- 2023
- Full Text
- View/download PDF
199. Emergence of open supply chain management: the role of open innovation in the future smart industry using digital twin network.
- Author
-
Rahmanzadeh, Sajjad, Pishvaee, Mir Saman, and Govindan, Kannan
- Subjects
- *
DIGITAL twins , *OPEN innovation , *SUPPLY chain management , *EVIDENCE gaps , *EPISTEMIC uncertainty , *COMPUTERS , *SUPPLY chains - Abstract
Alongside the many research studies in the open innovation domain and on smart manufacturing systems, there is a research gap on integrating outbound individual capabilities with new smart manufacturing machines to satisfy customers' varied and uncertain requirements. In this paper, Open Supply Chain Management (OSCM) is conceptualized as a new paradigm in the evolution of SCM. Companies can benefit from integrated physical and conceptual resources to promote efficiency and flexibility throughout the supply chain's main processes, including supplying, manufacturing, distributing, and marketing. The OSCM concept is driven by several enablers that have emerged in recent decades, including crowdsourcing, open innovation, Industry 4.0, cloud manufacturing, the Internet of Things (IoT), big data, and the digital twin. To validate OSCM in practice, a subset of this concept is investigated by integrating the design process with supply chain production planning through a digital twin network. Additionally, to deal with epistemic uncertainty, a fuzzy tactical planning model is developed and studied in detail through an industrial case study in the clothes manufacturing industry. The results illustrate that the product design cost accounts for only 2% of the total supply chain cost. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
200. Incorporating uncertainty in stress‐strain data acquisition: extended model‐free data‐driven identification.
- Author
-
Zschocke, Selina, Graf, Wolfgang, and Kaliske, Michael
- Subjects
- *
DIGITAL image correlation , *STRAINS & stresses (Mechanics) , *SOLID mechanics , *ACQUISITION of data , *DISPLACEMENT (Mechanics) , *COMPUTATIONAL mechanics , *EPISTEMIC uncertainty - Abstract
Based on the increasing ability to collect and store large amounts of data and information, data‐driven methods have recently gained importance in the context of computational mechanics. Compared to the traditional approach of defining a suitable constitutive model and fitting its parameters to experimental data, these methods aim to capture complex material behavior without assuming a particular material formulation. A distinction is made between model‐based data‐driven methods, which lead to an approximation of the constitutive material description, for example by neural networks, and model‐free data‐driven methods. The approach of data‐driven computational mechanics (DDCM), introduced by Kirchdoerfer and Ortiz (2016), makes it possible to circumvent any material modeling step by directly incorporating material data into the structural analysis. A basic prerequisite for both types of data‐driven methods is a large amount of data representing the material behavior, in solid mechanics consisting of stresses and strains. Obtaining these databases numerically by multiscale approaches is computationally expensive and requires the definition of lower‐scale models. In the case of an experimental characterization, constitutive descriptions are generally required to compute the stress states corresponding to displacement fields, for example those identified by full‐field measurement techniques such as digital image correlation (DIC). The method of data‐driven identification (DDI), introduced in Leygue et al. (2018) based on the principles of DDCM, enables the determination of detailed information about the constitutive behavior from displacement fields and applied boundary conditions without a specific material model. Stresses corresponding to given strains are identified by iteratively clustering and enforcing equilibrium. The algorithm has been shown to be applicable to synthetic as well as real data, taking linear and non‐linear material behavior into account. Generalized polymorphic uncertainty models, resulting from a combination of aleatoric and epistemic uncertainty models, are utilized to take variability, imprecision, inaccuracy and incompleteness of data into account. The consideration of uncertain material properties by data‐driven simulation approaches leads to the requirement of data sets representing uncertain material behavior. In this contribution, different sources of uncertainty occurring within DDI of stress‐strain relations are addressed, and an efficient method for the identification of data sets representing uncertain material behavior based on the concept of DDI is proposed. To demonstrate the developed methods, numerical examples are carried out. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF