48 results for "f-divergences"
Search Results
2. Refinements of discrete and integral Jensen inequalities with Jensen's gap.
- Author
- Horváth, László
- Subjects
- JENSEN'S inequality, INTEGRAL inequalities, INFORMATION theory
- Abstract
Motivated by a paper of Dragomir, we give new refinements for both discrete and integral Jensen inequalities using Jensen's gap. As applications, we give refinements of various inequalities derivable from Jensen's inequality. Topics covered: norms, quasi-arithmetic means, Hölder's inequality and f-divergences in information theory. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
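Several entries in these results refine Jensen's inequality and then specialize to f-divergences. The connection is worth recalling (this is the standard textbook definition, not taken from any single abstract above): for a convex generator f with f(1) = 0, an f-divergence is itself a Jensen gap,

```latex
D_f(P\|Q) \;=\; \sum_i q_i\, f\!\left(\frac{p_i}{q_i}\right)
\;\ge\; f\!\left(\sum_i q_i \cdot \frac{p_i}{q_i}\right) \;=\; f(1) \;=\; 0,
```

so every refinement of Jensen's inequality translates directly into a sharpened lower bound on D_f(P‖Q).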
3. Robust Validation: Confident Predictions Even When Distributions Shift.
- Author
- Cauchois, Maxime, Gupta, Suyash, Ali, Alnur, and Duchi, John C.
- Abstract
While the traditional viewpoint in machine learning and statistics assumes training and testing samples come from the same population, practice belies this fiction. One strategy, coming from robust statistics and optimization, is to build a model robust to distributional perturbations. In this article, we take a different approach and describe procedures for robust predictive inference, where a model provides uncertainty estimates on its predictions rather than point predictions. We present a method that produces prediction sets (almost exactly) giving the right coverage level for any test distribution in an f-divergence ball around the training population. The method, based on conformal inference, achieves (nearly) valid coverage in finite samples, under only the condition that the training data be exchangeable. An essential component of our methodology is to estimate the amount of expected future data shift and build robustness to it; we develop estimators and prove their consistency for protection and validity of uncertainty estimates under shifts. By experimenting on several large-scale benchmark datasets, including Recht et al.'s CIFAR-v4 and ImageNet-V2 datasets, we provide complementary empirical results that highlight the importance of robust predictive validity. Supplementary materials for this article are available online. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. Sum-of-Squares Relaxations for Information Theory and Variational Inference
- Author
- Bach, Francis
- Published
- 2024
- Full Text
- View/download PDF
5. MAUVE Scores for Generative Models: Theory and Practice.
- Author
- Pillutla, Krishna, Liu, Lang, Thickstun, John, Welleck, Sean, Swayamdipta, Swabha, Zellers, Rowan, Oh, Sewoong, Choi, Yejin, and Harchaoui, Zaid
- Subjects
- GENERATIVE artificial intelligence, LANGUAGE models, MODEL theory, VECTOR quantization, NONPARAMETRIC estimation, PROBABILISTIC generative models
- Abstract
Generative artificial intelligence has made significant strides, producing text indistinguishable from human prose and remarkably photorealistic images. Automatically measuring how close the generated data distribution is to the target distribution is central to diagnosing existing models and developing better ones. We present MAUVE, a family of comparison measures between pairs of distributions such as those encountered in the generative modeling of text or images. These scores are statistical summaries of divergence frontiers capturing two types of errors in generative modeling. We explore three approaches to statistically estimate these scores: vector quantization, non-parametric estimation, and classifier-based estimation. We provide statistical bounds for the vector quantization approach. Empirically, we find that the proposed scores paired with a range of f-divergences and statistical estimation methods can quantify the gaps between the distributions of human-written text and those of modern neural language models by correlating with human judgments and identifying known properties of the generated texts. We demonstrate in the vision domain that MAUVE can identify known properties of generated images on par with or better than existing metrics. In conclusion, we present practical recommendations for using MAUVE effectively with language and image modalities. [ABSTRACT FROM AUTHOR]
- Published
- 2023
6. Chain Rule Optimal Transport
- Author
- Nielsen, Frank and Sun, Ke (in: Nielsen, Frank, editor; series editors: Celebi, Emre, Chen, Jingdong, Gopi, E. S., Neustein, Amy, and Poor, H. Vincent)
- Published
- 2021
- Full Text
- View/download PDF
7. Uniform Treatment of Integral Majorization Inequalities with Applications to Hermite-Hadamard-Fejér-Type Inequalities and f-Divergences
- Author
- Horváth, László
- Subjects
- majorization inequalities, convex functions, signed measures, Hermite-Hadamard-Fejér-type inequalities, refinement, f-divergences
- Abstract
In this paper, we present a general framework that provides a comprehensive and uniform treatment of integral majorization inequalities for convex functions and finite signed measures. Along with new results, we present unified and simple proofs of classical statements. To apply our results, we deal with Hermite-Hadamard-Fejér-type inequalities and their refinements. We present a general method to refine both sides of Hermite-Hadamard-Fejér-type inequalities. The results of many papers on the refinement of the Hermite-Hadamard inequality, whose proofs are based on different ideas, can be treated in a uniform way by this method. Finally, we establish a necessary and sufficient condition for when a fundamental inequality of f-divergences can be refined by another f-divergence.
- Published
- 2023
- Full Text
- View/download PDF
8. Refinements of the integral Jensen’s inequality generated by finite or infinite permutations
- Author
- Horváth, László
- Subjects
- Discrete and integral Jensen's inequalities, refinements, Fejér inequality, quasi-arithmetic means, f-divergences
- Abstract
Many papers deal with applications of the so-called cyclic refinement of the discrete Jensen's inequality. A significant generalization of the cyclic refinement, based on combinatorial considerations, was recently discovered by the author. In the present paper we give the integral versions of these results. On the one hand, a new method to refine the integral Jensen's inequality is developed. On the other hand, the result contains some recent refinements of the integral Jensen's inequality as elementary cases. Finally, some applications to the Fejér inequality (especially the Hermite–Hadamard inequality), quasi-arithmetic means, and f-divergences are presented.
- Published
- 2021
- Full Text
- View/download PDF
9. A new refinement of Jensen’s inequality with applications in information theory
- Author
- Xiao, Lei and Lu, Guoxiang
- Subjects
- refinements, Jensen's inequality, information theory, Shannon's entropy, f-divergences, bounds
- Abstract
In this paper, we present a new refinement of Jensen’s inequality with applications in information theory. The refinement of Jensen’s inequality is obtained based on the general functional in the work of Popescu et al. As the applications in information theory, we provide new tighter bounds for Shannon’s entropy and some f-divergences.
- Published
- 2020
- Full Text
- View/download PDF
10. (f,Γ)-Divergences: Interpolating between f-Divergences and Integral Probability Metrics.
- Author
- Birrell, Jeremiah, Dupuis, Paul, Katsoulakis, Markos A., Pantazis, Yannis, and Rey-Bellet, Luc
- Subjects
- STATISTICAL learning, GENERATIVE adversarial networks, PROBABILITY measures, PREDICATE calculus, PROBABILITY theory, CONTINUOUS distributions
- Abstract
We develop a rigorous and general framework for constructing information-theoretic divergences that subsume both f-divergences and integral probability metrics (IPMs), such as the 1-Wasserstein distance. We prove under which assumptions these divergences, hereafter referred to as (f,Γ)-divergences, provide a notion of 'distance' between probability measures and show that they can be expressed as a two-stage mass-redistribution/mass-transport process. The (f,Γ)-divergences inherit features from IPMs, such as the ability to compare distributions which are not absolutely continuous, as well as from f-divergences, namely the strict concavity of their variational representations and the ability to control heavy-tailed distributions for particular choices of f. When combined, these features establish a divergence with improved properties for estimation, statistical learning, and uncertainty quantification applications. Using statistical learning as an example, we demonstrate their advantage in training generative adversarial networks (GANs) for heavy-tailed, not-absolutely continuous sample distributions. We also show improved performance and stability over gradient-penalized Wasserstein GAN in image generation. [ABSTRACT FROM AUTHOR]
- Published
- 2022
11. A New Information-Theoretical Distance Measure for Evaluating Community Detection Algorithms
- Author
- Haroutunian, Mariam, Mkhitaryan, Karen, and Mothe, Josiane
- Subjects
- community detection, f-divergences, evaluation measures
- Abstract
Community detection is a research area in network science dealing with the investigation of complex networks, such as social or biological networks, aiming to identify subgroups (communities) of entities (nodes) that are more closely related to each other inside the community than to the remaining entities in the network. Various community detection algorithms have been developed and used in the literature; however, evaluating automatically detected community structures is a challenging task due to varying results in different scenarios. Current evaluation measures that compare extracted community structures with the reference structure or ground truth suffer from various drawbacks, some of which have been pointed out in the literature. Information-theoretic measures form a fundamental class in this domain and have recently received increasing interest. However, even the well-employed measures (NVI and NID) share some limitations; in particular, they are biased toward the number of communities in the network. The main contribution of this paper is to introduce a new measure that overcomes this limitation while retaining the important properties of such measures. We review the mathematical properties of our measure, which is based on the χ2-divergence and inspired by the f-divergence measures of information theory. Theoretical properties as well as experimental results in various scenarios show the superiority of the proposed measure for evaluating community detection over those from the literature.
- Published
- 2019
- Full Text
- View/download PDF
12. Refinements of the integral Jensen's inequality generated by finite or infinite permutations.
- Author
- Horváth, László
- Subjects
- JENSEN'S inequality, ARITHMETIC mean, PERMUTATIONS, INTEGRALS, INTEGRAL inequalities
- Abstract
Many papers deal with applications of the so-called cyclic refinement of the discrete Jensen's inequality. A significant generalization of the cyclic refinement, based on combinatorial considerations, was recently discovered by the author. In the present paper we give the integral versions of these results. On the one hand, a new method to refine the integral Jensen's inequality is developed. On the other hand, the result contains some recent refinements of the integral Jensen's inequality as elementary cases. Finally, some applications to the Fejér inequality (especially the Hermite–Hadamard inequality), quasi-arithmetic means, and f-divergences are presented. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
13. Zipf–Mandelbrot law, f-divergences and the Jensen-type interpolating inequalities
- Author
- Lovričević, Neda, Pečarić, Ðilda, and Pečarić, Josip
- Subjects
- Jensen inequality, Zipf and Zipf–Mandelbrot law, Csiszár divergence functional, f-divergences, Kullback–Leibler divergence
- Abstract
Motivated by the method of interpolating inequalities that makes use of improved Jensen-type inequalities, in this paper we integrate this approach with the well-known Zipf–Mandelbrot law applied to various types of f-divergences and distances, such as the Kullback–Leibler divergence, Hellinger distance, Bhattacharyya distance (via its coefficient), χ2-divergence, total variation distance and triangular discrimination. Addressing these applications, we first deduce general results of this type for the Csiszár divergence functional, from which the listed divergences originate. When presenting the analyzed inequalities for the Zipf–Mandelbrot law, we accentuate its special form, the Zipf law, with its specific role in linguistics. We introduce this aspect through the Zipfian word distributions associated with the English and Russian languages, using the obtained bounds for the Kullback–Leibler divergence.
- Published
- 2018
- Full Text
- View/download PDF
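To make the Zipf-law application above concrete, here is a minimal sketch (the function names and the exponent values are illustrative choices of this summary, not the authors'): the Kullback–Leibler divergence between two Zipfian rank distributions with different exponents.

```python
import math

def zipf_pmf(n, s):
    """Zipf law over ranks 1..n with exponent s: p(k) proportional to k**(-s)."""
    weights = [k ** -s for k in range(1, n + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) in nats for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical exponents for two fitted word-rank laws:
p = zipf_pmf(1000, 1.0)
q = zipf_pmf(1000, 1.2)
print(f"D(p||q) = {kl_divergence(p, q):.4f} nats")  # strictly positive since p != q
```

The bounds derived in the paper estimate exactly this kind of quantity without summing over all ranks.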
14. Uniform Treatment of Integral Majorization Inequalities with Applications to Hermite-Hadamard-Fejér-Type Inequalities and f-Divergences
- Author
- Horváth, László
- Subjects
- majorization inequalities, convex functions, signed measures, Hermite-Hadamard-Fejér-type inequalities, refinement, f-divergences
- Abstract
In this paper, we present a general framework that provides a comprehensive and uniform treatment of integral majorization inequalities for convex functions and finite signed measures. Along with new results, we present unified and simple proofs of classical statements. To apply our results, we deal with Hermite-Hadamard-Fejér-type inequalities and their refinements. We present a general method to refine both sides of Hermite-Hadamard-Fejér-type inequalities. The results of many papers on the refinement of the Hermite-Hadamard inequality, whose proofs are based on different ideas, can be treated in a uniform way by this method. Finally, we establish a necessary and sufficient condition for when a fundamental inequality of f-divergences can be refined by another f-divergence.
- Published
- 2023
- Full Text
- View/download PDF
15. Delta Divergence: A Novel Decision Cognizant Measure of Classifier Incongruence.
- Author
- Kittler, Josef and Zor, Cemre
- Abstract
In pattern recognition, disagreement between two classifiers regarding the predicted class membership of an observation can be indicative of an anomaly and its nuance. Since, in general, classifiers base their decisions on class a posteriori probabilities, the most natural approach to detecting classifier incongruence is to use divergence. However, existing divergences are not particularly suitable to gauge classifier incongruence. In this paper, we postulate the properties that a divergence measure should satisfy and propose a novel divergence measure, referred to as delta divergence. In contrast to existing measures, it focuses on the dominant (most probable) hypotheses and, thus, reduces the effect of the probability mass distributed over the non-dominant hypotheses (clutter). The proposed measure satisfies other important properties, such as symmetry and independence of classifier confidence. The relationship of the proposed divergence to some baseline measures, and its superiority, is shown experimentally. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
16. Refinement of the Jensen integral inequality
- Author
- Dragomir, Silvestru Sever, Khan, Muhammad Adil, and Abathun, Addisalem
- Subjects
- convex functions, Jensen's inequality, f-divergences
- Abstract
In this paper we give a refinement of Jensen’s integral inequality and its generalization for linear functionals. We also present some applications in Information Theory.
- Published
- 2016
- Full Text
- View/download PDF
17. On Relations Between the Relative Entropy and χ2-Divergence, Generalizations and Applications
- Author
- Nishiyama, Tomohiro and Sason, Igal
- Subjects
- relative entropy, chi-squared divergence, f-divergences, method of types, large deviations, strong data-processing inequalities
- Abstract
The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences. Applications that are studied in this paper refer to lossless compression, the method of types and large deviations, strong data–processing inequalities, bounds on contraction coefficients and maximal correlation, and the convergence rate to stationarity of a type of discrete-time Markov chains.
- Published
- 2020
- Full Text
- View/download PDF
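The best-known pointwise relation between the two divergences studied in this entry follows from Jensen's inequality applied to the concave logarithm:

```latex
D(P\|Q) \;=\; \sum_i p_i \log\frac{p_i}{q_i}
\;\le\; \log \sum_i \frac{p_i^2}{q_i}
\;=\; \log\bigl(1 + \chi^2(P\|Q)\bigr),
```

where χ²(P‖Q) = Σᵢ (pᵢ − qᵢ)²/qᵢ. The integral relations studied in the paper go beyond this kind of one-shot bound.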
18. On f-Divergences: Integral Representations, Local Behavior, and Inequalities.
- Author
- Sason, Igal
- Subjects
- STATISTICS, RENYI'S entropy, BAYESIAN analysis, SIGNAL processing, ENTROPY (Information theory)
- Abstract
This paper focuses on f-divergences and makes three main contributions. The first introduces integral representations of a general f-divergence by means of the relative information spectrum. The second provides a new approach to the derivation of f-divergence inequalities and exemplifies their utility in the setup of Bayesian binary hypothesis testing. The last part further studies the local behavior of f-divergences. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
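The general f-divergence treated in this entry is easy to compute for discrete distributions; a minimal sketch (the helper and generator names below are this summary's own, only the formulas are standard):

```python
import math

def f_divergence(p, q, f):
    """D_f(p||q) = sum_i q_i * f(p_i / q_i), for strictly positive discrete q."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Standard generators, each convex with f(1) = 0:
kl = lambda t: t * math.log(t) if t > 0 else 0.0  # relative entropy
chi2 = lambda t: (t - 1.0) ** 2                    # chi-squared divergence
tv = lambda t: 0.5 * abs(t - 1.0)                  # total variation

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(f_divergence(p, q, kl))    # ≈ 0.0253 nats
print(f_divergence(p, q, chi2))  # = 0.05
print(f_divergence(p, q, tv))    # = 0.1
```

Other standard choices of f (e.g. squared Hellinger, Jensen–Shannon) drop into the same helper unchanged.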
19. On Data-Processing and Majorization Inequalities for f-Divergences with Applications
- Author
- Sason, Igal
- Subjects
- contraction coefficient, data-processing inequalities, f-divergences, hypothesis testing, list decoding, majorization theory, Rényi information measures, Tsallis entropy, Tunstall trees
- Abstract
This paper is focused on the derivation of data-processing and majorization inequalities for f-divergences, and their applications in information theory and statistics. For the accessibility of the material, the main results are first introduced without proofs, followed by exemplifications of the theorems with further related analytical results, interpretations, and information-theoretic applications. One application refers to the performance analysis of list decoding with either fixed or variable list sizes; some earlier bounds on the list decoding error probability are reproduced in a unified way, and new bounds are obtained and exemplified numerically. Another application is related to a study of the quality of approximating a probability mass function, induced by the leaves of a Tunstall tree, by an equiprobable distribution. The compression rates of finite-length Tunstall codes are further analyzed for asserting their closeness to the Shannon entropy of a memoryless and stationary discrete source. Almost all the analysis is relegated to the appendices, which form the major part of this manuscript.
- Published
- 2019
- Full Text
- View/download PDF
20. A new refinement of Jensen’s inequality with applications in information theory
- Author
- Lu, Guoxiang and Xiao, Lei
- Subjects
- refinements, Jensen's inequality, information theory, Shannon's entropy, bounds, f-divergences
- Abstract
In this paper, we present a new refinement of Jensen’s inequality with applications in information theory. The refinement of Jensen’s inequality is obtained based on the general functional in the work of Popescu et al. As the applications in information theory, we provide new tighter bounds for Shannon’s entropy and some f-divergences.
- Published
- 2020
21. A Note on Reverse Pinsker Inequalities.
- Author
- Binette, Olivier
- Subjects
- MATHEMATICAL equivalence, INFORMATION theory
- Abstract
A simple method is shown to provide optimal variational bounds on f-divergences with possible constraints on relative information extrema. Known results are refined or proved to be optimal as particular cases. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
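For orientation, the classical Pinsker inequality bounds total variation by relative entropy,

```latex
\|P - Q\|_{\mathrm{TV}} \;=\; \tfrac{1}{2}\sum_i |p_i - q_i|
\;\le\; \sqrt{\tfrac{1}{2}\, D(P\|Q)};
```

a reverse Pinsker inequality runs in the opposite direction, bounding D(P‖Q) by a function of the total variation, which is only possible under extra constraints such as bounded relative information.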
22. More Accurate Majorization Inequalities Obtained Via Superquadraticity and Convexity with Application to Entropies
- Author
- Ivelić Bradanović, Slavica
- Published
- 2021
- Full Text
- View/download PDF
23. A Functional Perspective on Information Measures
- Author
- Esposito, Amedeo Roberto and Gastpar, Michael Christoph
- Subjects
- duality, learning theory, Rényi's divergences, hypothesis testing, estimation theory, entropy, information theory, information measures, f-divergences
- Abstract
Since the birth of Information Theory, researchers have defined and exploited various information measures, as well as endowed them with operational meanings. Some were born as a "solution to a problem", like Shannon's Entropy and Mutual Information. Others were the fruit of generalisation and the mathematical genius of bright minds like Rényi, Csiszár and Sibson. These powerful objects allow us to manipulate probabilities intuitively and seem always to be somehow connected to concrete settings in communication, coding or estimation theory. A common theme is: take a problem in one of these areas, try to control (upper- or lower-bound) the expected value of some function of interest (often, a probability of error) and, with enough work, an information measure appears as a fundamental limit of the problem. The most striking example of this is in Shannon's seminal paper of 1948: his purpose was to characterise the smallest possible expected length of a uniquely decodable encoding that compresses the realisations of a random variable. As he brilliantly proved, the smallest expected length one can hope for is the Entropy of the random variable. In establishing this connection, another quantity needed to be implicitly controlled: the Kraft sum of the code. Seemingly unrelated before, these three objects joined forces in harmony to provide a beautiful and fundamental result. But why are they related? The answer seems to be: duality. Duality is an abstract notion commonly used in linear algebra and functional analysis. It has been expanded and generalised over the years, and several incarnations have been discovered throughout mathematics. One particular instance involves vector spaces: given two vector spaces and a "duality pairing", one can jump from one space to the other (its dual) through Legendre-Fenchel-like transforms.
In the most common settings in Information Theory, the two spaces and the pairing are, respectively: 1) the space of (probability) measures defined on X; 2) the space of bounded functions defined on X; 3) the Lebesgue integral of the function (the expected value of the function if the measure is a probability measure). Once these are set, Legendre-Fenchel-like transforms allow us to connect a) a functional acting on the space described in 1), and b) a functional acting on the space described in 2), with the anchor point being c) the (expected) value described in 3). These three pieces, a), b) and c), represent the actors of many of the results provided in Information Theory. Once they are found, one usually bounds the functional described in b) and obtains a bound connecting the expected value and the functional of measures (e.g., an information measure). Going back to Shannon's result: fixing a random variable (and thus a probability measure) and selecting the function to be the length of a code, the functional a) is the Shannon Entropy of the source; the functional b) is the Kraft sum of the code; the pairing c) is the expected length of the code. We explore this connection and this pattern throughout the thesis. We will see how it can be found in notable results like coding theorems for one-to-one codes, Campbell's Coding Theorem, Arikan's Guessing Theorem, Fano-like and Transportation-Cost Inequalities, and so on. Moreover, unearthing the pattern allows us to generalise it to other information measures and apply the technique in a variety of fields, including Learning Theory, Estimation Theory and Hypothesis Testing.
- Published
- 2022
- Full Text
- View/download PDF
24. Variational regularisation for inverse problems with imperfect forward operators and general noise models
- Author
- Bungert, Leon, Burger, Martin, Korolev, Yury, and Schönlieb, Carola-Bibiane
- Subjects
- F-divergences, Discrepancy Principle, Wasserstein Distances, Banach Lattices, Imperfect Forward Models, Bregman Distances, Kullback–Leibler Divergence
- Abstract
Funders: Cantab Capital Institute for the Mathematics of Information; National Physical Laboratory; Alan Turing Institute. We study variational regularisation methods for inverse problems with imperfect forward operators whose errors can be modelled by order intervals in a partial order of a Banach lattice. We carry out analysis with respect to existence and convex duality for general data fidelity terms and regularisation functionals. Both for a priori and a posteriori parameter choice rules, we obtain convergence rates of the regularised solutions in terms of Bregman distances. Our results apply to fidelity terms such as Wasserstein distances, φ-divergences, norms, as well as sums and infimal convolutions of those.
- Published
- 2021
- Full Text
- View/download PDF
25. More Accurate Majorization Inequalities Obtained Via Superquadraticity and Convexity with Application to Entropies
- Author
- Ivelić Bradanović, Slavica
- Subjects
- Sherman inequality, majorization inequality, f-divergences, entropy, convex function, superquadratic function, majorization, Rényi entropy
- Abstract
Different terms such as variability, inequality and dispersion, which occur in various engineering problems and scientific fields, are most simply described in mathematics by the concept of majorization, a powerful tool that reveals the existing connections between vectors. In majorization theory, majorization inequalities play an important role. In this paper, using known properties of superquadratic functions, extensions and improvements of majorization inequalities are obtained, and their converse inequalities are presented. For superquadratic functions which are not convex, results analogous to those for convex functions are presented; for superquadratic functions which are convex, improvements are given. Finally, applications to φ-divergences are discussed, and new estimates for the Rényi entropy are derived.
- Published
- 2021
26. On Relations between the Relative Entropy and χ2-Divergence, Generalizations and Applications
- Author
- Sason, Igal and Nishiyama, Tomohiro
- Subjects
- Kullback–Leibler divergence, information contraction, method of types, large deviations, chi-squared divergence, lossless compression, maximal correlation, strong data-processing inequalities, Markov chains, rate of convergence, f-divergences
- Abstract
The relative entropy and chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences. Applications that are studied in this paper refer to lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the convergence rate to stationarity of a type of discrete-time Markov chains. Comment: Published in the Entropy journal, May 18, 2020. The journal version (open access) is available at https://www.mdpi.com/1099-4300/22/5/563
- Published
- 2020
- Full Text
- View/download PDF
27. On Data-Processing and Majorization Inequalities for f-Divergences
- Author
- Sason, Igal, Lapidoth, Amos, and Moser, Stefan M.
- Subjects
- contraction coefficient, Tunstall trees, Rényi information measures, hypothesis testing, data-processing inequalities, f-divergences, list decoding, majorization, Tsallis entropy
- Abstract
International Zurich Seminar on Information and Communication (IZS 2020). Proceedings
- Published
- 2020
- Full Text
- View/download PDF
28. A GENERALIZATION OF f-DIVERGENCE MEASURE TO CONVEX FUNCTIONS DEFINED ON LINEAR SPACES.
- Author
- Dragomir, S. S.
- Subjects
- GENERALIZATION, DIVERGENCE theorem, MEASURE theory, CONVEX functions, VECTOR spaces
- Abstract
In this paper we generalize the concept of f-divergence to a convex function defined on a convex cone in a linear space. Some fundamental results are established. Applications for some well known divergence measures are provided as well. [ABSTRACT FROM AUTHOR]
- Published
- 2013
29. REVERSIBILITY CONDITIONS FOR QUANTUM OPERATIONS.
- Author
- Jenčová, Anna
- Subjects
- QUANTUM theory, MATHEMATICAL mappings, LEBESGUE-Radon-Nikodym theorems, DERIVATIVES (Mathematics), STATISTICAL hypothesis testing, FISHER information
- Abstract
We give a list of equivalent conditions for reversibility of the adjoint of a unital Schwarz map with respect to a set of quantum states. A large class of such conditions is given by preservation of distinguishability measures: f-divergences, the L1-distance, and the quantum Chernoff and Hoeffding distances. Here we summarize and extend the known results. Moreover, we prove a number of conditions in terms of the properties of a quantum Radon–Nikodym derivative and factorization of states in the given set. Finally, we show that reversibility is equivalent to preservation of a large class of quantum Fisher informations and χ2-divergences. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
30. Lower Bounds for the Minimax Risk Using f-Divergences, and Applications.
- Author
- Guntuboyina, Adityanand
- Subjects
- CHEBYSHEV approximation, PROBABILITY theory, ESTIMATION theory, MATHEMATICAL inequalities, ENTROPY, ELECTRIC noise, MATHEMATICAL functions
- Abstract
Lower bounds involving f-divergences between the underlying probability measures are proved for the minimax risk in estimation problems. Our proofs just use simple convexity facts. Special cases and straightforward corollaries of our bounds include well-known inequalities for establishing minimax lower bounds, such as Fano's inequality, Pinsker's inequality and inequalities based on global entropy conditions. Two applications are provided: a new minimax lower bound for the reconstruction of convex bodies from noisy support function measurements, and a different proof of a recent minimax lower bound for the estimation of a covariance matrix. [Footnote: After acceptance of this manuscript, Prof. A. Gushchin pointed out that Theorem II.1 appears in his paper [14]. Specifically, in a different notation, inequality (5) appears as Theorem 1 and inequality (4) appears in [14, Sec. 4.3]. The proof of Theorem II.1 presented in Section II is different from that in [14]. Also, except for Theorem II.1 and the observation that Fano's inequality is a special case of Theorem II.1 (see Example II.4), there is no other overlap between this paper and [14].] [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
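Pinsker's inequality, one of the classical bounds recovered as a special case above, can be checked numerically on discrete distributions. A minimal sketch; the distributions below are illustrative, not taken from the paper:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P||Q) in nats for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance: half the L1 distance between P and Q."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# Pinsker's inequality: TV(P, Q) <= sqrt(D(P||Q) / 2)
assert total_variation(p, q) <= math.sqrt(kl_divergence(p, q) / 2)
```

Both quantities are themselves f-divergences (for f(t) = t log t and f(t) = |t - 1|/2 respectively), which is the structure the paper's convexity arguments exploit.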
31. Robust utility maximization for complete and incomplete market models.
- Author
-
Gundel, Anne
- Subjects
UTILITY functions ,MATHEMATICAL models ,FINANCE ,EXPECTED utility ,EXPECTED returns - Abstract
We investigate the problem of maximizing the robust utility functional. We give the dual characterization for its solution for both a complete and an incomplete market model. To this end, we introduce the new notion of reverse f-projections and use techniques developed for f-divergences. This is a suitable tool to reduce the robust problem to the classical problem of utility maximization under a certain measure: the reverse f-projection. Furthermore, we give the dual characterization for a closely related problem, the minimization of expenditures given a minimum level of expected utility in a robust setting and for an incomplete market. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
32. A New Information-Theoretical Distance Measure for Evaluating Community Detection Algorithms
- Author
-
Josiane Mothe, Karen Mkhitaryan, Mariam Haroutunian, National Academy of Sciences of the Republic of Armenia [Yerevan] (NAS RA), Systèmes d’Informations Généralisées (IRIT-SIG), Institut de recherche en informatique de Toulouse (IRIT), Université Toulouse 1 Capitole (UT1), Université Fédérale Toulouse Midi-Pyrénées-Université Fédérale Toulouse Midi-Pyrénées-Université Toulouse - Jean Jaurès (UT2J)-Université Toulouse III - Paul Sabatier (UT3), Université Fédérale Toulouse Midi-Pyrénées-Centre National de la Recherche Scientifique (CNRS)-Institut National Polytechnique (Toulouse) (Toulouse INP), Université Fédérale Toulouse Midi-Pyrénées-Université Toulouse 1 Capitole (UT1), Université Fédérale Toulouse Midi-Pyrénées, Université Toulouse - Jean Jaurès (UT2J), Centre National de la Recherche Scientifique - CNRS (FRANCE), Institut National Polytechnique de Toulouse - Toulouse INP (FRANCE), Université Toulouse III - Paul Sabatier - UT3 (FRANCE), Université Toulouse - Jean Jaurès - UT2J (FRANCE), Université Toulouse 1 Capitole - UT1 (FRANCE), National Academy of Sciences of the Republic of Armenia (ARMENIA), Institut de Recherche en Informatique de Toulouse - IRIT (Toulouse, France), and Institut National Polytechnique de Toulouse - INPT (FRANCE)
- Subjects
Community detection ,Evaluation measure ,Electronic computers. Computer science ,[INFO.INFO-IR]Computer Science [cs]/Information Retrieval [cs.IR] ,Recherche d'information ,QA75.5-76.95 ,evaluation mea ,f-divergences - Abstract
International audience; Community detection is a research area from network science dealing with the investigation of complex networks such as social or biological networks, aiming to identify subgroups (communities) of entities (nodes) that are more closely related to each other inside the community than with the remaining entities in the network. Various community detection algorithms have been developed and used in the literature; however, evaluating community structures that have been automatically detected is a challenging task due to varying results in different scenarios. Current evaluation measures that compare extracted community structures with the reference structure or ground truth suffer from various drawbacks, some of which have been pointed out in the literature. Information-theoretic measures form a fundamental class in this domain and have recently received increasing interest. However, even the widely employed measures (NVI and NID) share some limitations; in particular, they are biased toward the number of communities in the network. The main contribution of this paper is to introduce a new measure that overcomes this limitation while retaining the important properties of such measures. We review the mathematical properties of our measure based on the χ²-divergence, inspired from f-divergence measures in information theory. Theoretical properties as well as experimental results in various scenarios show the superiority of the proposed measure for evaluating community detection over the ones from the literature.
- Published
- 2019
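The χ²-divergence underlying the proposed measure can be sketched on two community-size distributions. This is only an illustration of the divergence itself, not the paper's full partition-comparison measure; the partitions below are hypothetical:

```python
def chi_square_divergence(p, q):
    """Chi-square divergence: sum of (p_i - q_i)^2 / q_i over the support of Q."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q) if qi > 0)

# Hypothetical relative community sizes from two partitions of one network.
detected = [0.5, 0.3, 0.2]
ground_truth = [0.4, 0.4, 0.2]

print(chi_square_divergence(detected, ground_truth))
```

The divergence is zero exactly when the two distributions coincide, and grows as the detected partition's size profile drifts from the reference.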
33. Refinement of the Jensen integral inequality
- Author
-
Muhammad Adil Khan, Addisalem Abathun, and Silvestru Sever Dragomir
- Subjects
Kantorovich inequality ,convex functions ,Young's inequality ,General Mathematics ,010102 general mathematics ,94a17 ,01 natural sciences ,GeneralLiterature_MISCELLANEOUS ,010101 applied mathematics ,Linear inequality ,26d15 ,QA1-939 ,Calculus ,jensen’s inequality ,Log sum inequality ,Rearrangement inequality ,0101 mathematics ,Convex function ,Mathematical economics ,Jensen's inequality ,Mathematics ,f-divergences ,Karamata's inequality - Abstract
In this paper we give a refinement of Jensen’s integral inequality and its generalization for linear functionals. We also present some applications in Information Theory.
- Published
- 2016
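Jensen's integral inequality itself, φ(∫f dμ) ≤ ∫φ∘f dμ for convex φ and a probability measure μ, can be checked by simple quadrature. A sketch of the unrefined inequality only, with an arbitrary example pair φ, f:

```python
import math

def jensen_gap(phi, f, a, b, n=10000):
    """Approximate the Jensen gap
        (1/(b-a)) * ∫ phi(f(x)) dx  -  phi( (1/(b-a)) * ∫ f(x) dx )
    on [a, b] by the midpoint rule; it is nonnegative when phi is convex."""
    h = (b - a) / n
    xs = [a + (i + 0.5) * h for i in range(n)]
    mean_f = sum(f(x) for x in xs) / n
    mean_phi_f = sum(phi(f(x)) for x in xs) / n
    return mean_phi_f - phi(mean_f)

# phi(t) = t^2 is convex, so the gap must be nonnegative.
gap = jensen_gap(lambda t: t * t, math.exp, 0.0, 1.0)
assert gap >= 0
```

Refinements of the kind studied in the paper insert intermediate quantities between the two sides, so the gap is bounded below by something sharper than zero.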
34. On $f$-Divergences: Integral Representations, Local Behavior, and Inequalities
- Author
-
Igal Sason
- Subjects
Rényi divergence ,FOS: Computer and information sciences ,Inequality ,Computer science ,Computer Science - Information Theory ,media_common.quotation_subject ,Bayesian probability ,General Physics and Astronomy ,02 engineering and technology ,Article ,local behavior ,DeGroot statistical information ,0202 electrical engineering, electronic engineering, information engineering ,FOS: Mathematics ,f-divergences ,relative information spectrum ,media_common ,Binary hypothesis testing ,Information Theory (cs.IT) ,Spectrum (functional analysis) ,Probability (math.PR) ,020206 networking & telecommunications ,Algebra ,020201 artificial intelligence & image processing ,Mathematics - Probability - Abstract
This paper is focused on $f$-divergences, consisting of three main contributions. The first one introduces integral representations of a general $f$-divergence by means of the relative information spectrum. The second part provides a new approach for the derivation of $f$-divergence inequalities, and it exemplifies their utility in the setup of Bayesian binary hypothesis testing. The last part of this paper further studies the local behavior of $f$-divergences., Comment: Final edits before publication. To appear in the Entropy journal, special issue on Entropy and Information Inequalities, May 2018
- Published
- 2018
- Full Text
- View/download PDF
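One connection exemplified in the paper is between f-divergences and Bayesian binary hypothesis testing: with equal priors, the minimal Bayes error equals (1 − TV(P, Q))/2, where TV is the total variation distance. A quick numerical check of this identity (distributions chosen here for illustration):

```python
def total_variation(p, q):
    """Total variation distance between discrete P and Q."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def bayes_error(p, q):
    """Minimal error probability for deciding between P and Q with priors 1/2:
    pick the hypothesis with the larger likelihood, giving (1/2) * sum_i min(p_i, q_i)."""
    return 0.5 * sum(min(pi, qi) for pi, qi in zip(p, q))

p = [0.7, 0.2, 0.1]
q = [0.3, 0.3, 0.4]

# Identity relating the two: P_e = (1 - TV(P, Q)) / 2
assert abs(bayes_error(p, q) - (1 - total_variation(p, q)) / 2) < 1e-12
```

The identity follows from min(a, b) = (a + b − |a − b|)/2 summed over the alphabet.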
35. Viewpoint-based simplification using f-divergences
- Author
-
Castelló, P., Sbert, M., Chover, M., and Feixas, M.
- Subjects
- *
POLYGONS , *PROBABILITY theory , *MATHEMATICS education , *MATHEMATICAL analysis - Abstract
Abstract: We propose a new viewpoint-based simplification method for polygonal meshes, driven by several f-divergences such as Kullback–Leibler, Hellinger and Chi-Square. These distances are a measure of discrimination between probability distributions. The Kullback–Leibler distance between the projected and the actual area distributions of the polygons in the scene already has been used as a measure of viewpoint quality. In this paper, we use the variation in those viewpoint distances to determine the error introduced by an edge collapse. We apply the best half-edge collapse as a decimation criterion. The approximations produced by our method are close to the original model in terms of both visual and geometric criteria. Unlike many pure visibility-driven methods, our new approach does not completely remove hidden interiors in order to increase the visual quality of the simplified models. This makes our approach more suitable for applications which require exact geometry tolerance but also require high visual quality. [Copyright © Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
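The three distances used as simplification drivers can be sketched on a pair of projected vs. actual polygon-area distributions. The distributions below are hypothetical, and the Hellinger normalisation follows one common convention:

```python
import math

def kl(p, q):
    """Kullback-Leibler distance between discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def hellinger(p, q):
    """Hellinger distance, normalised so that H(P, Q) is in [0, 1]."""
    return math.sqrt(sum((math.sqrt(pi) - math.sqrt(qi)) ** 2
                         for pi, qi in zip(p, q)) / 2)

def chi_square(p, q):
    """Chi-square distance: sum of (p_i - q_i)^2 / q_i."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q) if qi > 0)

# Hypothetical normalised polygon-area distributions seen from one viewpoint.
projected = [0.25, 0.25, 0.30, 0.20]
actual    = [0.20, 0.30, 0.30, 0.20]

for name, d in [("KL", kl), ("Hellinger", hellinger), ("Chi-Square", chi_square)]:
    print(name, d(projected, actual))
```

All three vanish when the projected and actual distributions agree, which is why their variation under an edge collapse can serve as an error estimate.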
36. Variational regularisation for inverse problems with imperfect forward operators and general noise models.
- Author
-
Bungert L, Burger M, Korolev Y, and Schönlieb CB
- Abstract
We study variational regularisation methods for inverse problems with imperfect forward operators whose errors can be modelled by order intervals in a partial order of a Banach lattice. We carry out analysis with respect to existence and convex duality for general data fidelity terms and regularisation functionals. Both for a priori and a posteriori parameter choice rules, we obtain convergence rates of the regularised solutions in terms of Bregman distances. Our results apply to fidelity terms such as Wasserstein distances, φ-divergences, norms, as well as sums and infimal convolutions of those., (© 2020 The Author(s). Published by IOP Publishing Ltd.)
- Published
- 2020
- Full Text
- View/download PDF
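The convergence rates above are stated in Bregman distances: for a differentiable convex functional J, D_J(x, y) = J(x) − J(y) − ⟨∇J(y), x − y⟩. A minimal sketch for J(x) = ‖x‖²/2, where the Bregman distance reduces to half the squared Euclidean distance:

```python
def bregman_sq(x, y):
    """Bregman distance of J(v) = 0.5 * sum(v_i^2):
    D(x, y) = J(x) - J(y) - <grad J(y), x - y>, which equals 0.5 * ||x - y||^2."""
    J = lambda v: 0.5 * sum(vi * vi for vi in v)
    grad_J_y = y  # gradient of J at y is y itself
    inner = sum(g * (xi - yi) for g, xi, yi in zip(grad_J_y, x, y))
    return J(x) - J(y) - inner

x, y = [1.0, 2.0], [0.0, 0.0]
assert abs(bregman_sq(x, y) - 0.5 * (1.0 + 4.0)) < 1e-12
```

For other convex J (e.g. negative entropy, which yields the Kullback-Leibler divergence) the distance is asymmetric and not a metric, which is why rates in Bregman distances are weaker than norm rates in general.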
37. A Cramer-Rao Type Inequality for a Convex Loss Function
- Author
-
Liese, Friedrich, Kubík, Stanislav, editor, Víšek, Jan Ámos, editor, and Vísek, J. A.
- Published
- 1988
- Full Text
- View/download PDF
38. On Relations Between the Relative Entropy and χ²-Divergence, Generalizations and Applications.
- Author
-
Nishiyama T and Sason I
- Abstract
The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences. Applications that are studied in this paper refer to lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the convergence rate to stationarity of a type of discrete-time Markov chains.
- Published
- 2020
- Full Text
- View/download PDF
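A basic relation of the kind studied between these two divergences is D(P‖Q) ≤ log(1 + χ²(P‖Q)), which follows from Jensen's inequality applied to the logarithm. A numerical check on illustrative distributions:

```python
import math

def kl(p, q):
    """Relative entropy D(P||Q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chi2(p, q):
    """Chi-squared divergence: sum of p_i^2 / q_i, minus 1."""
    return sum(pi * pi / qi for pi, qi in zip(p, q) if qi > 0) - 1.0

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# Known bound relating the two divergences: D(P||Q) <= log(1 + chi2(P||Q)).
assert kl(p, q) <= math.log(1.0 + chi2(p, q))
```

Since D(P‖Q) = E_P[log(dP/dQ)] and 1 + χ²(P‖Q) = E_P[dP/dQ], the bound is just Jensen's inequality for the concave logarithm.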
39. Minimal Entropy Martingale Measure for Lévy Processes
- Author
-
Krol, Katja and Küchler, Uwe
- Subjects
mathematical finance ,martingale measures ,incomplete markets ,Lévy processes ,relative entropy ,minimal entropy martingale measure ,510 Mathematik ,ddc:510 ,Esscher martingale transform ,f-divergences
Let X be a real-valued Lévy process under P in its natural filtration. The minimal entropy martingale measure is defined as an absolutely continuous martingale measure that minimizes the relative entropy with respect to P. We show in this paper that the sufficient conditions for its existence, known in the literature, are also necessary, and give an explicit formula for the infimum of the relative entropy.
- Published
- 2011
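In a simple one-period discrete sketch, the entropy-minimising martingale measure is an exponential tilting (Esscher-type transform) of P: q_i ∝ p_i·e^{θx_i}, with θ chosen so the tilted mean of the price change x is zero. This is only a toy illustration of the tilting idea, not the paper's Lévy-process construction, and the model values below are hypothetical:

```python
import math

def esscher_tilt(p, x, theta):
    """Exponentially tilted measure: q_i proportional to p_i * exp(theta * x_i)."""
    w = [pi * math.exp(theta * xi) for pi, xi in zip(p, x)]
    z = sum(w)
    return [wi / z for wi in w]

def find_martingale_theta(p, x, lo=-50.0, hi=50.0, tol=1e-12):
    """Bisection for theta such that E_q[x] = 0 (the martingale condition);
    the tilted mean is increasing in theta, so bisection applies."""
    def tilted_mean(theta):
        q = esscher_tilt(p, x, theta)
        return sum(qi * xi for qi, xi in zip(q, x))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if tilted_mean(mid) > 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Hypothetical one-period price changes and real-world probabilities.
x = [-1.0, 0.5, 2.0]
p = [0.2, 0.5, 0.3]
theta = find_martingale_theta(p, x)
q = esscher_tilt(p, x, theta)
assert abs(sum(qi * xi for qi, xi in zip(q, x))) < 1e-9
```

Since the real-world mean of x is positive here, the tilting parameter comes out negative, shifting mass toward the down state until the discounted price is a martingale.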
40. Divergence measures for statistical data processing
- Author
-
Basseville, Michèle, Institut de Recherche en Informatique et Systèmes Aléatoires (IRISA), Université de Rennes (UR)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), Institut National des Sciences Appliquées (INSA)-Institut National des Sciences Appliquées (INSA)-Université de Bretagne Sud (UBS)-École normale supérieure - Rennes (ENS Rennes)-Institut National de Recherche en Informatique et en Automatique (Inria)-Télécom Bretagne-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS), CentraleSupélec-Télécom Bretagne-Université de Rennes 1 (UR1), Université de Rennes (UNIV-RENNES)-Université de Rennes (UNIV-RENNES)-Institut National de Recherche en Informatique et en Automatique (Inria)-École normale supérieure - Rennes (ENS Rennes)-Université de Bretagne Sud (UBS)-Centre National de la Recherche Scientifique (CNRS)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), and Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-Institut National des Sciences Appliquées (INSA)
- Subjects
divergence-based statistical inference ,spectral divergence measures ,[INFO.INFO-OH]Computer Science [cs]/Other [cs.OH] ,Distance measures ,divergences ,barycenters ,GeneralLiterature_REFERENCE(e.g.,dictionaries,encyclopedias,glossaries) ,Bregman divergences ,f-divergences
This note provides a bibliography of investigations based on or related to divergence measures for theoretical and applied inference problems.
- Published
- 2010
41. On surrogate loss functions and f-divergences
- Author
-
XuanLong Nguyen, Michael I. Jordan, and Martin J. Wainwright
- Subjects
Statistics and Probability ,FOS: Computer and information sciences ,68Q32 ,Computer Science - Information Theory ,Binary number ,Mathematics - Statistics Theory ,02 engineering and technology ,Statistics Theory (math.ST) ,01 natural sciences ,010104 statistics & probability ,Bayes' theorem ,statistical machine learning ,Discriminant function analysis ,Consistency (statistics) ,62K05 ,Covariate ,0202 electrical engineering, electronic engineering, information engineering ,FOS: Mathematics ,Applied mathematics ,Empirical risk minimization ,Ali-Silvey divergences ,0101 mathematics ,62G10, 68Q32, 62K05 (Primary) ,Binary classification ,Mathematics ,Information Theory (cs.IT) ,Mathematical statistics ,020206 networking & telecommunications ,discriminant analysis ,Bayes consistency ,nonparametric decentralized detection ,Statistics, Probability and Uncertainty ,surrogate losses ,f-divergences ,quantizer design ,62G10 - Abstract
The goal of binary classification is to estimate a discriminant function $\gamma$ from observations of covariate vectors and corresponding binary labels. We consider an elaboration of this problem in which the covariates are not available directly but are transformed by a dimensionality-reducing quantizer $Q$. We present conditions on loss functions such that empirical risk minimization yields Bayes consistency when both the discriminant function and the quantizer are estimated. These conditions are stated in terms of a general correspondence between loss functions and a class of functionals known as Ali-Silvey or $f$-divergence functionals. Whereas this correspondence was established by Blackwell [Proc. 2nd Berkeley Symp. Probab. Statist. 1 (1951) 93--102. Univ. California Press, Berkeley] for the 0--1 loss, we extend the correspondence to the broader class of surrogate loss functions that play a key role in the general theory of Bayes consistency for binary classification. Our result makes it possible to pick out the (strict) subset of surrogate loss functions that yield Bayes consistency for joint estimation of the discriminant function and the quantizer., Comment: Published in at http://dx.doi.org/10.1214/08-AOS595 the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
- Published
- 2009
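The surrogate losses in question replace the 0-1 loss with a convex upper bound in the margin y·f(x). A small sketch comparing hinge and logistic surrogates against the 0-1 loss; the margin values are made up for illustration:

```python
import math

def zero_one(margin):
    """0-1 loss in margin form; margin = y * f(x) with labels y in {-1, +1}."""
    return 1.0 if margin <= 0 else 0.0

def hinge(margin):
    """Hinge loss, the SVM surrogate."""
    return max(0.0, 1.0 - margin)

def logistic(margin):
    """Logistic loss, the surrogate behind logistic regression."""
    return math.log(1.0 + math.exp(-margin))

# Hypothetical margins y*f(x) produced by some discriminant function.
margins = [2.0, 0.5, -0.3, 1.2, -1.5]

def empirical_risk(loss):
    return sum(loss(m) for m in margins) / len(margins)

# Convex surrogates upper-bound the 0-1 loss pointwise (logistic after
# rescaling by 1/log 2), so the surrogate risk bounds the classification error.
assert all(hinge(m) >= zero_one(m) for m in margins)
assert all(logistic(m) / math.log(2) >= zero_one(m) for m in margins)
assert empirical_risk(hinge) >= empirical_risk(zero_one)
```

The paper's result characterises which such surrogates yield Bayes consistency when the quantizer is estimated jointly, via their correspondence with f-divergences; the code above only illustrates the losses themselves.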
42. Viewpoint-based simplification using f-divergences
- Author
-
Miguel Chover, Pascual Castelló, Mateu Sbert, and Miquel Feixas
- Subjects
Information Systems and Management ,media_common.quotation_subject ,Collapse (topology) ,Measure (mathematics) ,Simplification ,Theoretical Computer Science ,Level-of-detail ,Artificial Intelligence ,Quality (business) ,Polygon mesh ,media_common ,Mathematics ,Decimation ,business.industry ,Viewpoint selection ,Pattern recognition ,Computer Science Applications ,Control and Systems Engineering ,Probability distribution ,Artificial intelligence ,Enhanced Data Rates for GSM Evolution ,business ,Algorithm ,Software ,Level of detail ,f-divergences - Abstract
We propose a new viewpoint-based simplification method for polygonal meshes, driven by several f-divergences such as Kullback-Leibler, Hellinger and Chi-Square. These distances are a measure of discrimination between probability distributions. The Kullback-Leibler distance between the projected and the actual area distributions of the polygons in the scene already has been used as a measure of viewpoint quality. In this paper, we use the variation in those viewpoint distances to determine the error introduced by an edge collapse. We apply the best half-edge collapse as a decimation criterion. The approximations produced by our method are close to the original model in terms of both visual and geometric criteria. Unlike many pure visibility-driven methods, our new approach does not completely remove hidden interiors in order to increase the visual quality of the simplified models. This makes our approach more suitable for applications which require exact geometry tolerance but also require high visual quality. © 2008 Elsevier Inc. All rights reserved.
- Published
- 2008
43. Robust utility maximization, f-projections, and risk constraints
- Author
-
Gundel, Anne, Schied, Alexander, Föllmer, Hans, and Schweizer, Martin
- Subjects
27 Mathematik ,Risiko-Nebenbedingung ,f-projections ,utility maximization ,510 Mathematik ,Modellunsicherheit ,robuste Nutzenfunktionale ,robust utility functionals ,f-Projektionen ,risk constraints ,model uncertainty ,ddc:510 ,f-Divergenzen ,Nutzenmaximierung ,f-divergences - Abstract
Finding payoff profiles that maximize the expected utility of an agent under some budget constraint is a key issue in financial mathematics. We characterize optimal contingent claims for an agent who is uncertain about the market model. The dual approach that we use leads to a minimization problem for a certain convex functional over two sets of measures, which we first have to solve. Finally, we incorporate a second constraint that limits the risk that the agent is allowed to take. We proceed as follows: Chapter 1. Given a convex function f, we consider the problem of minimizing the f-divergence f(P|Q) over these two sets of measures. We show that, if the first set is closed and the second set is weakly compact, a minimizer exists if f(∞)/∞ = ∞. Furthermore, we show that if the second set of measures is weakly compact and f(∞)/∞ = 0, then there is a minimizer in a class of extended martingale measures. Chapter 2. The existence results in Chapter 1 lead to the existence of a contingent claim which maximizes the robust utility functional inf E_Q[u(X)] over some set of affordable contingent claims, where the infimum is taken over a set of subjective or model measures. The key idea is to identify the minimizing measures from the first chapter as certain worst-case measures. Chapter 3. Finally, we require the risk of the contingent claims to be bounded. We solve the robust problem in an incomplete market for a utility function that is only defined on the positive halfline. In an example we compare the optimal claim under this risk constraint with the optimal claims without a risk constraint and under a value-at-risk constraint.
- Published
- 2006
44. Information theoretic refinement criteria for image synthesis
- Author
-
Rigau Vilalta, Jaume, Sbert, Mateu, Feixas Feixas, Miquel, and Universitat Politècnica de Catalunya. Departament de Llenguatges i Sistemes Informàtics
- Subjects
radiosity ,ray tracing ,Informàtica [Àrees temàtiques de la UPC] ,computer graphics ,Ordinadors -- Aplicacions científiques ,refinement criteria ,adaptive sampling ,complexity ,mutual information ,entropy ,information theory ,f-divergences
This work is framed within the context of computer graphics starting out from the intersection of three fields: rendering, information theory, and complexity. Initially, the concept of scene complexity is analysed considering three perspectives from a geometric visibility point of view: complexity at an interior point, complexity of an animation, and complexity of a region. The main focus of this dissertation is the exploration and development of new refinement criteria for the global illumination problem. Information-theoretic measures based on Shannon entropy and Havrda-Charvát-Tsallis generalised entropy, together with f-divergences, are analysed as kernels of refinement. We show how they give us a rich variety of efficient and highly discriminative measures which are applicable to rendering in its pixel-driven (ray-tracing) and object-space (hierarchical radiosity) approaches. Firstly, based on Shannon entropy, a set of pixel quality and pixel contrast measures are defined. They are applied to supersampling in ray-tracing as refinement criteria, obtaining a new entropy-based adaptive sampling algorithm with a high rate of quality versus cost. Secondly, based on Havrda-Charvát-Tsallis generalised entropy and generalised mutual information, three new refinement criteria are defined for hierarchical radiosity. In correspondence with three classic approaches, oracles based on transported information, information smoothness, and mutual information are presented, with very significant results for the latter.
And finally, three members of the family of Csiszár's f-divergences (Kullback-Leibler, chi-square, and Hellinger divergences) are analysed as refinement criteria showing good results for both ray-tracing and hierarchical radiosity.
- Published
- 2006
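The entropy-based adaptive sampling idea can be loosely sketched: the Shannon entropy of a pixel's normalised sample intensities serves as a homogeneity signal, and its distance from the maximum guides where to supersample. This is our simplified reading, not the thesis's exact quality and contrast measures, and the luminance values are hypothetical:

```python
import math

def pixel_entropy(samples):
    """Shannon entropy (bits) of the normalised intensities of a pixel's samples.
    Uniform samples attain the maximum log2(n); skewed samples score lower."""
    total = sum(samples)
    probs = [s / total for s in samples if s > 0]
    return -sum(p * math.log2(p) for p in probs)

flat_pixel = [0.5, 0.5, 0.5, 0.5]    # hypothetical homogeneous luminances
edge_pixel = [0.9, 0.1, 0.85, 0.05]  # hypothetical high-contrast luminances

# Uniform samples give maximal entropy log2(4) = 2 bits; a pixel straddling
# an edge scores lower, flagging it for further refinement under a
# contrast-style criterion based on the entropy deficit.
assert abs(pixel_entropy(flat_pixel) - math.log2(4)) < 1e-12
assert pixel_entropy(edge_pixel) < pixel_entropy(flat_pixel)
```

An adaptive sampler would then allocate extra rays to pixels whose entropy deficit exceeds a threshold, stopping once all pixels look sufficiently homogeneous.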
45. Zipf-Mandelbrot law, f -divergences and the Jensen-type interpolating inequalities.
- Author
-
Lovričević N, Pečarić Ð, and Pečarić J
- Abstract
Motivated by the method of interpolating inequalities that makes use of the improved Jensen-type inequalities, in this paper we integrate this approach with the well-known Zipf-Mandelbrot law applied to various types of f-divergences and distances, such as Kullback-Leibler divergence, Hellinger distance, Bhattacharyya distance (via coefficient), [Formula: see text]-divergence, total variation distance and triangular discrimination. Addressing these applications, we first deduce general results of this type for the Csiszár divergence functional from which the listed divergences originate. When presenting the analyzed inequalities for the Zipf-Mandelbrot law, we accentuate its special form, the Zipf law, with its specific role in linguistics. We introduce this aspect through the Zipfian word distributions associated to the English and Russian languages, using the obtained bounds for the Kullback-Leibler divergence., Competing Interests: The authors declare that they have no competing interests.
- Published
- 2018
- Full Text
- View/download PDF
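The Zipf-Mandelbrot law assigns rank r a probability proportional to 1/(r + q)^s. A sketch computing the Kullback-Leibler divergence between two such word-frequency profiles; the parameters are chosen for illustration and are not the paper's English/Russian fits:

```python
import math

def zipf_mandelbrot(n, q, s):
    """Zipf-Mandelbrot law on ranks 1..n: p_r proportional to 1 / (r + q)^s."""
    w = [1.0 / (r + q) ** s for r in range(1, n + 1)]
    z = sum(w)
    return [wi / z for wi in w]

def kl_divergence(p, r):
    """Kullback-Leibler divergence D(P||R) in nats."""
    return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)

# Two hypothetical parameterisations over the same 1000 ranks
# (q = 0 recovers the plain Zipf law highlighted in the abstract).
p = zipf_mandelbrot(1000, q=2.7, s=1.0)
r = zipf_mandelbrot(1000, q=0.0, s=1.1)

d = kl_divergence(p, r)
assert d > 0  # distinct parameters give strictly positive divergence
```

Bounds of the kind derived in the paper sandwich this divergence using only the Zipf-Mandelbrot parameters, without summing over the full rank distribution.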
46. Information theoretic refinement criteria for image synthesis
- Author
-
Universitat Politècnica de Catalunya. Departament de Llenguatges i Sistemes Informàtics, Sbert, Mateu, Feixas Feixas, Miquel, Rigau Vilalta, Jaume, Universitat Politècnica de Catalunya. Departament de Llenguatges i Sistemes Informàtics, Sbert, Mateu, Feixas Feixas, Miquel, and Rigau Vilalta, Jaume
- Abstract
This work is framed within the context of computer graphics starting out from the intersection of three fields: rendering, information theory, and complexity. Initially, the concept of scene complexity is analysed considering three perspectives from a geometric visibility point of view: complexity at an interior point, complexity of an animation, and complexity of a region. The main focus of this dissertation is the exploration and development of new refinement criteria for the global illumination problem. Information-theoretic measures based on Shannon entropy and Havrda-Charvát-Tsallis generalised entropy, together with f-divergences, are analysed as kernels of refinement. We show how they give us a rich variety of efficient and highly discriminative measures which are applicable to rendering in its pixel-driven (ray-tracing) and object-space (hierarchical radiosity) approaches. Firstly, based on Shannon entropy, a set of pixel quality and pixel contrast measures are defined. They are applied to supersampling in ray-tracing as refinement criteria, obtaining a new entropy-based adaptive sampling algorithm with a high rate of quality versus cost. Secondly, based on Havrda-Charvát-Tsallis generalised entropy and generalised mutual information, three new refinement criteria are defined for hierarchical radiosity. In correspondence with three classic approaches, oracles based on transported information, information smoothness, and mutual information are presented, with very significant results for the latter.
And finally, three members of the family of Csiszár's f-divergences (Kullback-Leibler, chi-square, and Hellinger divergences) are analysed as refinement criteria showing good results for both ray-tracing and hierarchical radiosity., Postprint (published version)
- Published
- 2006
47. Robust utility maximization, f-projections, and risk constraints
- Author
-
Schied, Alexander, Föllmer, Hans, Schweizer, Martin, Gundel, Anne, Schied, Alexander, Föllmer, Hans, Schweizer, Martin, and Gundel, Anne
- Abstract
Finding payoff profiles that maximize the expected utility of an agent under a budget constraint is a key issue in financial mathematics. We characterize optimal contingent claims for an agent who is uncertain about the market model. The dual approach that we use leads to a minimization problem for a certain convex functional over two sets of probability measures, which we first have to solve. Finally, we incorporate a second constraint that limits the risk the agent is allowed to take. We proceed as follows: Chapter 1. Given a convex function f, we consider the problem of minimizing the f-divergence f(P|Q) over these two sets of measures. We show that, if the first set is closed and the second set is weakly compact, a minimizer exists provided f( infinity ) / infinity = infinity. Furthermore, we show that if the second set of measures is weakly compact and f( infinity ) / infinity = 0, then a minimizer exists in a class of extended martingale measures. Chapter 2. The existence results of Chapter 1 yield the existence of a contingent claim that maximizes the robust utility functional inf E_Q[u(X)] over a set of affordable contingent claims, where the infimum is taken over a set of subjective or model measures. The key idea is to identify the minimizing measures from the first chapter as certain worst-case measures. Chapter 3. Finally, we require the risk of the contingent claims to be bounded. We solve the robust problem in an incomplete market for a utility function that is defined only on the positive half-line. In an example we compare the optimal claim under this risk constraint with the optimal claims without a risk constraint and under a value-at-risk constraint.
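The abstract above builds on minimizing the f-divergence f(P|Q) for a convex function f. For discrete distributions this quantity is D_f(P|Q) = sum_x q(x) f(p(x)/q(x)); the following is a minimal sketch (not from the cited thesis; the function name `f_divergence` and the example distributions are illustrative assumptions), using f(t) = t log t, which recovers the Kullback-Leibler divergence as a special case.

```python
import numpy as np

def f_divergence(p, q, f):
    """Compute D_f(P|Q) = sum_x q(x) * f(p(x)/q(x)) for discrete
    probability vectors p and q. Assumes q(x) > 0 wherever p(x) > 0,
    and that f is finite on the resulting ratios."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = q > 0  # by convention, terms with q(x) = 0 contribute 0 here
    return float(np.sum(q[mask] * f(p[mask] / q[mask])))

# f(t) = t*log(t) gives the Kullback-Leibler divergence KL(P|Q).
kl = lambda t: t * np.log(t)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(f_divergence(p, q, kl))  # KL(P|Q) for these two distributions
```

Convexity of f guarantees D_f(P|Q) >= f(1) = 0 for f(t) = t log t, with equality when P = Q, which is what makes such functionals suitable objectives for the projection problems described in Chapter 1.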
- Published
- 2006
48. On Surrogate Loss Functions and f-Divergences
- Author
-
Nguyen, XuanLong, Wainwright, Martin J., and Jordan, Michael I.
- Published
- 2009
- Full Text
- View/download PDF