16 results for "Shenoy, Prakash P."
Search Results
2. An expectation operator for belief functions in the Dempster–Shafer theory.
- Author
- Shenoy, Prakash P.
- Subjects
- DEMPSTER-Shafer theory, OPERATOR functions, PROBABILITY theory, OPERATOR theory, CONVEX sets
- Abstract
The main contribution of this paper is a new definition of expected value of belief functions in the Dempster–Shafer (D–S) theory of evidence. Our definition shares many of the properties of the expectation operator in probability theory. Also, for Bayesian belief functions, our definition provides the same expected value as the probabilistic expectation operator. A traditional method of computing expected values of real-valued functions is to first transform a D–S belief function to a corresponding probability mass function, and then use the expectation operator for probability mass functions. Transforming a belief function to a probability function involves loss of information. Our expectation operator works directly with D–S belief functions. Another definition uses Choquet integration, which assumes belief functions are credal sets, i.e., convex sets of probability mass functions. Credal set semantics are incompatible with Dempster's combination rule, the centerpiece of the D–S theory. In general, our definition provides different expected values than, e.g., probabilistic expectation applied to the pignistic or plausibility transform of a belief function. Using our definition of expectation, we provide new definitions of variance, covariance, correlation, and other higher moments and describe their properties. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
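The abstract of entry 2 contrasts the paper's direct expectation operator with the traditional transform-then-average route. The paper's own operator is not spelled out in the abstract, so the sketch below only illustrates the traditional route it mentions: convert a basic probability assignment (BPA) to a probability mass function via the plausibility transform, then take the ordinary probabilistic expectation. The frame, BPA values, and function names are illustrative, not taken from the paper.

```python
def plausibility(m, event):
    """Pl(event): total mass of focal sets that intersect the event."""
    return sum(mass for focal, mass in m.items() if focal & event)

def plausibility_transform(m, frame):
    """Probability mass function from normalized singleton plausibilities."""
    pl = {x: plausibility(m, frozenset({x})) for x in frame}
    total = sum(pl.values())
    return {x: v / total for x, v in pl.items()}

def expectation_via_pl_transform(m, frame, f):
    """Ordinary probabilistic expectation of f after the plausibility transform."""
    p = plausibility_transform(m, frame)
    return sum(p[x] * f(x) for x in frame)

# Illustrative BPA on the frame {1, 2, 3}; focal sets are keyed by frozensets.
frame = {1, 2, 3}
m = {frozenset({1}): 0.5, frozenset({2, 3}): 0.3, frozenset({1, 2, 3}): 0.2}
print(expectation_via_pl_transform(m, frame, lambda x: x))   # ~1.88
```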
3. Making inferences in incomplete Bayesian networks: A Dempster-Shafer belief function approach.
- Author
- Shenoy, Prakash P.
- Subjects
- BAYESIAN analysis, BAYESIAN field theory, MISSING data (Statistics), SENSITIVITY analysis, PROBABILITY theory, EXPECTATION-maximization algorithms
- Abstract
How do you make inferences from a Bayesian network (BN) model with missing information? For example, we may not have priors for some variables or may not have conditionals for some states of the parent variables. It is well known that the Dempster-Shafer (D-S) belief function theory is a generalization of probability theory. So, a solution is to embed an incomplete BN model in a D-S belief function model, omit the missing data, and then make inferences from the belief function model. We demonstrate this using an implementation of a local computation algorithm for D-S belief function models called the "Belief function machine." One advantage of this approach is that we get interval estimates of the probabilities of interest. Using Laplacian (equally likely) or maximum-entropy priors or conditionals for missing data in a BN may lead to point estimates for the probabilities of interest, masking the uncertainty in these estimates. Bayesian methods cannot reason from an incomplete model, and a Bayesian sensitivity analysis of the missing parameters is not a substitute for a belief-function analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
4. A new definition of entropy of belief functions in the Dempster–Shafer theory.
- Author
- Jiroušek, Radim and Shenoy, Prakash P.
- Subjects
- ENTROPY (Information theory), DEMPSTER-Shafer theory, MATHEMATICAL functions, PROBABILITY theory, UNCERTAINTY (Information theory)
- Abstract
We propose a new definition of entropy of basic probability assignments (BPAs) in the Dempster–Shafer (DS) theory of belief functions, which is interpreted as a measure of total uncertainty in the BPA. Our definition is different from those proposed by Höhle, Smets, Yager, Nguyen, Dubois–Prade, Lamata–Moral, Klir–Ramer, Klir–Parviz, Pal et al., Maeda–Ichihashi, Harmanec–Klir, Abellán–Moral, Jousselme et al., Pouly et al., and Deng. We state a list of six desired properties of entropy for the DS theory of belief functions, four of which are motivated by Shannon's definition of entropy of probability functions, and the remaining two are requirements that adapt this measure to the philosophy of the DS theory. Three of our six desired properties are different from the five properties proposed by Klir and Wierman. We demonstrate that our definition satisfies all six properties in our list, whereas none of the existing definitions do. Our new definition has two components. The first component is Shannon's entropy of an equivalent probability mass function obtained using the plausibility transform, which constitutes the conflict measure of entropy. The second component is Dubois–Prade's definition of entropy of basic probability assignments in the DS theory, which constitutes the non-specificity measure of entropy. Our new definition is the sum of these two components. Our definition does not satisfy the subadditivity property. Whether there exists a definition that satisfies our six properties plus subadditivity remains an open question. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
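Entry 4's abstract states that the proposed entropy is the sum of two components: Shannon's entropy of the plausibility-transform probability (conflict) and Dubois–Prade's entropy (non-specificity). Below is a minimal sketch of that sum, assuming Dubois–Prade non-specificity is the sum of m(A)·log2|A| over focal sets; the example BPA and function names are illustrative and not taken from the paper.

```python
import math

def plausibility_transform(m, frame):
    """Probability mass function from normalized singleton plausibilities."""
    pl = {x: sum(v for A, v in m.items() if x in A) for x in frame}
    total = sum(pl.values())
    return {x: v / total for x, v in pl.items()}

def shannon_entropy(p):
    return -sum(v * math.log2(v) for v in p.values() if v > 0)

def entropy_sum(m, frame):
    """Conflict term plus non-specificity term, as described in the abstract:
    Shannon entropy of the plausibility transform + sum of m(A) * log2 |A|."""
    conflict = shannon_entropy(plausibility_transform(m, frame))
    non_specificity = sum(v * math.log2(len(A)) for A, v in m.items())
    return conflict + non_specificity

# Illustrative BPA on a three-element frame.
frame = {'a', 'b', 'c'}
m = {frozenset({'a'}): 0.4, frozenset({'a', 'b'}): 0.3, frozenset(frame): 0.3}
print(entropy_sum(m, frame))
```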
5. Causal compositional models in valuation-based systems with examples in specific theories.
- Author
- Jiroušek, Radim and Shenoy, Prakash P.
- Subjects
- PROBABILITY theory, BERNSTEIN polynomials, EPISTEMIC logic, MODAL logic, APPROXIMATE reasoning
- Abstract
We show that Pearl's causal networks can be described using causal compositional models (CCMs) in the valuation-based systems (VBS) framework. One major advantage of using the VBS framework is that as VBS is a generalization of several uncertainty theories (e.g., probability theory, a version of possibility theory where combination is the product t-norm, Spohn's epistemic belief theory, and Dempster–Shafer belief function theory), CCMs, initially described in probability theory, are now described in all uncertainty calculi that fit in the VBS framework. We describe conditioning and interventions in CCMs. Another advantage of using CCMs in the VBS framework is that both conditioning and intervention can be easily described in an elegant and unifying algebraic way for the same CCM without having to do any graphical manipulations of the causal network. We describe how conditioning and intervention can be computed for a simple example with a hidden (unobservable) variable. Also, we illustrate the algebraic results using numerical examples in some of the specific uncertainty calculi mentioned above. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
6. Knowledge Representation and Integration for Portfolio Evaluation Using Linear Belief Functions.
- Author
- Liu, Liping, Shenoy, Catherine, and Shenoy, Prakash P.
- Subjects
- EXPERT systems, ARTIFICIAL intelligence, COMPUTER systems, DEMPSTER-Shafer theory, PROBABILITY theory, DECISION support systems, PORTFOLIO management (Investments)
- Abstract
This paper proposes a linear belief function (LBF) approach to evaluate portfolio performance. By drawing on the notion of LBFs, an elementary approach to knowledge representation in expert systems is proposed. It is shown how to use basic matrices to represent market information and financial knowledge, including complete ignorance, statistical observations, subjective speculations, distributional assumptions, linear relations, and empirical asset-pricing models. The authors then appeal to Dempster's rule of combination to integrate the knowledge for assessing the overall belief of portfolio performance and updating the belief by incorporating additional evidence. An example of three gold stocks is used to illustrate the approach. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
7. Compositional models in valuation-based systems.
- Author
- Jiroušek, Radim and Shenoy, Prakash P.
- Subjects
- VALUATION theory, DISCRETE probability theory, DEMPSTER-Shafer theory, GENERALIZATION, OPERATOR theory
- Abstract
Compositional models were initially described for discrete probability theory, and later extended for possibility theory and for belief functions in the Dempster–Shafer (D–S) theory of evidence. A valuation-based system (VBS) is a unifying theoretical framework generalizing some of the well known and frequently used uncertainty calculi. This generalization enables us not only to highlight the most important theoretical properties necessary for efficient inference (analogous to Bayesian inference in the framework of a Bayesian network), but also to design efficient computational procedures. Some of the specific calculi covered by VBS are probability theory, a version of possibility theory where combination is the product t-norm, Spohn's epistemic belief theory, and D–S belief function theory. In this paper, we describe compositional models in the general framework of VBS using the semantics of no-double counting, which is central to the VBS framework. Also, we show that conditioning can be expressed using the composition operator. We define a special case of compositional models called decomposable models, again in the VBS framework, and demonstrate that for the class of decomposable compositional models, conditioning can be done using local computation. As all results are obtained for the VBS framework, they hold in all calculi that fit in the VBS framework. For the D–S theory of belief functions, the compositional model defined here differs from the one studied by Jiroušek, Vejnarová, and Daniel. The latter model can also be described in the VBS framework, but with a combination operator that is different from Dempster's rule of combination. For the version of possibility theory in which combination is the product t-norm, the compositional model defined here reduces to the one studied by Vejnarová. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
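Entry 7 describes compositional models built from a composition operator with no-double-counting semantics. Below is a minimal sketch for ordinary discrete probability, one of the calculi the abstract says fits the VBS framework, assuming the standard probabilistic composition (f composed with g) = f · g / (marginal of g over the shared variables); the paper's general VBS operator is not reproduced, and the array shapes and numbers are illustrative.

```python
import numpy as np

def compose(p_xy, p_yz):
    """Probabilistic composition with no double counting:
    (p_xy |> p_yz)(x, y, z) = p_xy(x, y) * p_yz(y, z) / p_y(y),
    where p_y is the marginal of the second factor over the shared variable Y.
    Assumes p_y(y) > 0 wherever p_xy puts mass on y."""
    p_y = p_yz.sum(axis=1)                   # marginal of p_yz over Y
    cond_z_given_y = p_yz / p_y[:, None]     # p(z | y)
    return p_xy[:, :, None] * cond_z_given_y[None, :, :]

# Illustrative 2x2 distributions P(X, Y) and P(Y, Z) sharing the variable Y.
p_xy = np.array([[0.30, 0.20],
                 [0.10, 0.40]])
p_yz = np.array([[0.25, 0.25],
                 [0.10, 0.40]])
joint = compose(p_xy, p_yz)
print(joint.sum())           # 1.0: the composition is a proper joint over (X, Y, Z)
print(joint.sum(axis=2))     # marginalizing Z recovers P(X, Y), i.e. no double counting
```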
8. Two issues in using mixtures of polynomials for inference in hybrid Bayesian networks
- Author
- Shenoy, Prakash P.
- Subjects
- POLYNOMIALS, BAYESIAN analysis, APPROXIMATION theory, PROBABILITY theory, DENSITY functionals, GAUSSIAN distribution, TAYLOR'S series
- Abstract
We discuss two issues in using mixtures of polynomials (MOPs) for inference in hybrid Bayesian networks. MOPs were proposed by Shenoy and West for mitigating the problem of integration in inference in hybrid Bayesian networks. First, in defining MOPs for multi-dimensional functions, one requirement is that the pieces where the polynomials are defined are hypercubes. In this paper, we discuss relaxing this condition so that each piece is defined on regions called hyper-rhombuses. This relaxation means that MOPs are closed under transformations required for multi-dimensional linear deterministic conditionals, such as Z = X + Y. Also, this relaxation allows us to construct MOP approximations of the probability density functions (PDFs) of the multi-dimensional conditional linear Gaussian distributions using a MOP approximation of the PDF of the univariate standard normal distribution. Second, Shenoy and West suggest using the Taylor series expansion of differentiable functions for finding MOP approximations of PDFs. In this paper, we describe a new method for finding MOP approximations based on Lagrange interpolating polynomials (LIP) with Chebyshev points. We describe how the LIP method can be used to find efficient MOP approximations of PDFs. We illustrate our methods using conditional linear Gaussian PDFs in one, two, and three dimensions, and conditional log-normal PDFs in one and two dimensions. We compare the efficiencies of the hyper-rhombus condition with the hypercube condition. Also, we compare the LIP method with the Taylor series method. [Copyright Elsevier]
- Published
- 2012
- Full Text
- View/download PDF
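Entry 8's abstract proposes finding MOP approximations of PDFs via Lagrange interpolating polynomials (LIP) with Chebyshev points. Below is a minimal one-piece sketch for the standard normal PDF; the interval, number of points, and the use of SciPy's generic Lagrange interpolation are illustrative choices, and real MOPs are piecewise polynomial, so this is not the paper's construction.

```python
import numpy as np
from scipy.interpolate import lagrange

def chebyshev_points(a, b, n):
    """Chebyshev points of the first kind, mapped from [-1, 1] to [a, b]."""
    k = np.arange(n)
    nodes = np.cos((2 * k + 1) * np.pi / (2 * n))
    return (a + b) / 2 + (b - a) / 2 * nodes

def normal_pdf(x):
    return np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)

# One piece of a MOP-style approximation: a single Lagrange interpolating
# polynomial through Chebyshev points on [-3, 3].
a, b, n_points = -3.0, 3.0, 9
nodes = chebyshev_points(a, b, n_points)
poly = lagrange(nodes, normal_pdf(nodes))      # numpy.poly1d of degree n_points - 1

grid = np.linspace(a, b, 13)
print(np.max(np.abs(poly(grid) - normal_pdf(grid))))   # rough sup-norm error on a grid
```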
9. Inference in hybrid Bayesian networks using mixtures of polynomials
- Author
- Shenoy, Prakash P. and West, James C.
- Subjects
- BAYESIAN analysis, POLYNOMIALS, PROBABILITY theory, RANDOM variables, EXPONENTIAL functions, APPROXIMATION theory
- Abstract
The main goal of this paper is to describe inference in hybrid Bayesian networks (BNs) using mixture of polynomials (MOP) approximations of probability density functions (PDFs). Hybrid BNs contain a mix of discrete, continuous, and conditionally deterministic random variables. The conditionals for continuous variables are typically described by conditional PDFs. A major hurdle in making inference in hybrid BNs is marginalization of continuous variables, which involves integrating combinations of conditional PDFs. In this paper, we suggest the use of MOP approximations of PDFs, which are similar in spirit to mixtures of truncated exponentials (MTE) approximations. MOP functions can be easily integrated, and are closed under combination and marginalization. This enables us to propagate MOP potentials in the extended Shenoy–Shafer architecture for inference in hybrid BNs that can include deterministic variables. MOP approximations have several advantages over MTE approximations of PDFs. They are easier to find, even for multi-dimensional conditional PDFs, and are applicable for a larger class of deterministic functions in hybrid BNs. [Copyright Elsevier]
- Published
- 2011
- Full Text
- View/download PDF
10. A decision theory for partially consonant belief functions
- Author
- Giang, Phan H. and Shenoy, Prakash P.
- Subjects
- DECISION theory, DEMPSTER-Shafer theory, MATHEMATICAL functions, DECISION making, PROBABILITY theory, STATISTICS
- Abstract
Partially consonant belief functions (pcb), studied by Walley, are the only class of Dempster–Shafer belief functions that are consistent with the likelihood principle of statistics. Structurally, the set of foci of a pcb is partitioned into non-overlapping groups, and within each group, foci are nested. The pcb class includes both probability functions and Zadeh's possibility functions as special cases. This paper studies decision making under uncertainty described by pcb. We prove a representation theorem for a preference relation over pcb lotteries that satisfies an axiomatic system similar in spirit to von Neumann and Morgenstern's axioms of linear utility theory. The closed-form expression of the utility of a pcb lottery is a combination of linear utility for the probabilistic lottery and two-component (binary) utility for the possibilistic lottery. In our model, the uncertainty information, risk attitude, and ambiguity attitude are represented separately. A tractable technique to extract ambiguity attitude from a decision maker's behavior is also discussed. [Copyright Elsevier]
- Published
- 2011
- Full Text
- View/download PDF
11. VALUATION-BASED SYSTEMS FOR BAYESIAN DECISION ANALYSIS.
- Author
- Shenoy, Prakash P.
- Subjects
- BAYESIAN analysis, RANDOM variables, STATISTICAL decision making, DISTRIBUTION (Probability theory), PROBABILITY theory, FACTORIZATION, MATHEMATICAL optimization, MATHEMATICAL statistics
- Abstract
This paper proposes a new method for representing and solving Bayesian decision problems. The representation is called a valuation-based system and has some similarities to influence diagrams. However, unlike influence diagrams which emphasize conditional independence among random variables, valuation-based systems emphasize factorizations of joint probability distributions. Also, whereas influence diagram representation allows only conditional probabilities, valuation-based system representation allows all probabilities. The solution method is a hybrid of local computational methods for the computation of marginals of joint probability distributions and the local computational methods for discrete optimization problems. We briefly compare our representation and solution methods to those of influence diagrams. [ABSTRACT FROM AUTHOR]
- Published
- 1992
- Full Text
- View/download PDF
12. A Theory of Coarse Utility.
- Author
- Liu, Liping and Shenoy, Prakash P.
- Subjects
- UTILITY functions, NUMERICAL analysis, HYPOTHESIS, PROBABILITY theory, LOGIC
- Abstract
This article presents a descriptive theory for complex choice problems. In line with the bounded rationality assumption, we hypothesize that decision makers modify a complex choice into some coarse approximations, each of which is a binary lottery. We define the value of a best coarse approximation to be the utility of the choice. Using this paradigm, we axiomatize and justify a new utility function called the coarse utility function. We show that the coarse utility function approximates the rank- and sign-dependent utility function. It satisfies dominance but admits violations of independence. It reduces judgmental load and allows flexible judgmental information. It accommodates phenomena associated with probability distortions and provides a better resolution to the St. Petersburg paradox than the expected and rank-dependent theories. [ABSTRACT FROM AUTHOR]
- Published
- 1995
- Full Text
- View/download PDF
13. Operations for inference in continuous Bayesian networks with linear deterministic variables
- Author
- Cobb, Barry R. and Shenoy, Prakash P.
- Subjects
- BAYESIAN analysis, GAUSSIAN distribution, DISTRIBUTION (Probability theory), PROBABILITY theory
- Abstract
An important class of continuous Bayesian networks are those that have linear conditionally deterministic variables (a variable that is a linear deterministic function of its parents). In this case, the joint density function for the variables in the network does not exist. Conditional linear Gaussian (CLG) distributions can handle such cases when all variables are normally distributed. In this paper, we develop operations required for performing inference with linear conditionally deterministic variables in continuous Bayesian networks using relationships derived from joint cumulative distribution functions. These methods allow inference in networks with linear deterministic variables and non-Gaussian distributions. [Copyright Elsevier]
- Published
- 2006
- Full Text
- View/download PDF
14. On the plausibility transformation method for translating belief function models to probability models
- Author
- Cobb, Barry R. and Shenoy, Prakash P.
- Subjects
- PROBABILITY theory, DEMPSTER-Shafer theory, PLAUSIBILITY (Logic), INFORMATION theory
- Abstract
In this paper, we propose the plausibility transformation method for translating Dempster–Shafer (D–S) belief function models to probability models, and describe some of its properties. There are many other transformation methods used in the literature for translating belief function models to probability models. We argue that the plausibility transformation method produces probability models that are consistent with D–S semantics of belief function models, and that, in some examples, the pignistic transformation method produces results that appear to be inconsistent with Dempster's rule of combination. [Copyright Elsevier]
- Published
- 2006
- Full Text
- View/download PDF
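Entry 14 proposes the plausibility transformation and contrasts it with the pignistic transformation. Below is a minimal sketch of both transforms on a small BPA, assuming the usual definitions (normalized singleton plausibilities for the plausibility transform; equal splitting of each focal set's mass for the pignistic transform); the example values are illustrative and not from the paper.

```python
def plausibility_transform(m, frame):
    """Pl_P(x): singleton plausibilities normalized to sum to one."""
    pl = {x: sum(v for A, v in m.items() if x in A) for x in frame}
    total = sum(pl.values())
    return {x: v / total for x, v in pl.items()}

def pignistic_transform(m, frame):
    """BetP(x): each focal set's mass split equally among its elements."""
    return {x: sum(v / len(A) for A, v in m.items() if x in A) for x in frame}

# Illustrative BPA; the two transforms generally yield different probabilities.
frame = {'x', 'y', 'z'}
m = {frozenset({'x'}): 0.5, frozenset({'y', 'z'}): 0.3, frozenset(frame): 0.2}
print(plausibility_transform(m, frame))   # ~{x: 0.41, y: 0.29, z: 0.29}
print(pignistic_transform(m, frame))      # ~{x: 0.57, y: 0.22, z: 0.22}
```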
15. Entropy for evaluation of Dempster-Shafer belief function models.
- Author
- Jiroušek, Radim, Kratochvíl, Václav, and Shenoy, Prakash P.
- Subjects
- ENTROPY, PROBABILITY theory, DISTRIBUTION (Probability theory), APPROXIMATION algorithms, ANALOGY
- Abstract
Applications of Dempster-Shafer (D-S) belief functions to practical problems involve difficulties arising from their high computational complexity. One can use space-saving factored approximations, such as graphical belief function models, to solve them. Using an analogy with probability distributions, we represent these approximations in the form of compositional models. Since no theoretical apparatus similar to probabilistic information theory exists for D-S belief functions (e.g., a dissimilarity measure analogous to the Kullback-Leibler divergence), problems arise not only in connection with the design of algorithms seeking optimal approximations but also in connection with a criterion for comparing two different approximations. In this respect, the analogy with probability theory fails. Therefore, in this paper, we conduct synthetic experiments and describe results designed to reveal whether some belief function entropy definitions described in the literature can detect optimal approximations, i.e., achieve their minimum for an optimal approximation. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
16. On conditional belief functions in directed graphical models in the Dempster-Shafer theory.
- Author
- Jiroušek, Radim, Kratochvíl, Václav, and Shenoy, Prakash P.
- Subjects
- DEMPSTER-Shafer theory, MODEL theory, PROBABILITY theory, BAYESIAN analysis, CONDITIONAL probability
- Abstract
The primary goal is to define conditional belief functions in the Dempster-Shafer theory. We do so similarly to probability theory's notion of conditional probability tables. Conditional belief functions are necessary for constructing directed graphical belief function models in the same sense as conditional probability tables are necessary for constructing Bayesian networks. We provide examples of conditional belief functions, including those obtained by Smets' conditional embedding. Besides defining conditional belief functions, we state and prove a few basic properties of conditionals. In the belief-function literature, conditionals are defined starting from a joint belief function. Conditionals are then defined using the removal operator, an inverse of Dempster's combination operator. When such conditionals are well-defined belief functions, we show that our definition is equivalent to these definitions. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
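Entry 16's abstract builds on Dempster's combination operator (and its inverse, the removal operator). Below is a minimal sketch of the standard normalized form of Dempster's rule for two BPAs on a common frame; the example BPAs are illustrative, and the removal operator and Smets' conditional embedding discussed in the paper are not shown.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: multiply masses over pairs of focal sets, collect the
    products on the intersections, discard the mass on the empty set
    (the conflict), and renormalize by 1 - conflict."""
    combined, conflict = {}, 0.0
    for (A, v1), (B, v2) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            combined[C] = combined.get(C, 0.0) + v1 * v2
        else:
            conflict += v1 * v2
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence; combination undefined")
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

# Illustrative BPAs on the frame {a, b}.
m1 = {frozenset({'a'}): 0.6, frozenset({'a', 'b'}): 0.4}
m2 = {frozenset({'b'}): 0.5, frozenset({'a', 'b'}): 0.5}
print(dempster_combine(m1, m2))   # {a}: 3/7, {b}: 2/7, {a, b}: 2/7
```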