10 results for "generalized divergence"
Search Results
2. Image Segmentation Using Level Set Driven by Generalized Divergence.
- Author
Dai, Ming, Zhou, Zhiheng, Wang, Tianlei, and Guo, Yongfan
- Subjects
IMAGE segmentation, LEVEL set methods, VISUAL fields, DISTRIBUTION (Probability theory), COMPUTER vision
- Abstract
Image segmentation is an important analysis tool in the field of computer vision. In this paper, building on the traditional level set method, a novel segmentation model using generalized divergences is proposed. The main advantage of generalized divergences is that a single formula smoothly connects several well-known and frequently used fundamental divergences. The discrepancy between the two probability distributions of the segmented image parts can therefore be measured by a generalized divergence. We also provide a way to determine the optimal divergence automatically for different images. Experimental results on a variety of synthetic and natural images demonstrate the potential of the proposed method. Compared with previous active contour models formulated to solve the same nonparametric statistical segmentation problem, our method performs better both qualitatively and quantitatively. [ABSTRACT FROM AUTHOR]
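To make the one-parameter family concrete, the sketch below computes a standard α-divergence between two normalized histograms (e.g., intensity histograms of the regions inside and outside a contour). The parameterization is one common convention, not necessarily the paper's model: α → 1 recovers KL(p‖q), α → 0 recovers KL(q‖p), and α = 1/2 gives four times the squared Hellinger distance.

```python
import numpy as np

def alpha_divergence(p, q, alpha, eps=1e-12):
    """Alpha-divergence D_alpha(p || q) between two discrete distributions.

    A small eps guards against zero bins; histograms are renormalized.
    alpha -> 1 recovers KL(p || q); alpha -> 0 recovers KL(q || p).
    """
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    if abs(alpha - 1.0) < 1e-8:
        return float(np.sum(p * np.log(p / q)))   # KL(p || q)
    if abs(alpha) < 1e-8:
        return float(np.sum(q * np.log(q / p)))   # KL(q || p)
    return float((np.sum(p ** alpha * q ** (1.0 - alpha)) - 1.0)
                 / (alpha * (alpha - 1.0)))
```

Scanning α and keeping the value that best separates the two region histograms is one plausible reading of "determine the optimal divergence automatically" in the abstract.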
- Published
- 2021
3. Properties of a Generalized Divergence Related to Tsallis Generalized Divergence.
- Author
Vigelis, Rui F., de Andrade, Luiza H. F., and Cavalcante, Charles C.
- Subjects
DIVERGENCE theorem, EXPONENTIAL functions, STATISTICAL learning, INFORMATION theory, ENTROPY (Information theory)
- Abstract
In this paper, we investigate the partition inequality, joint convexity, and Pinsker's inequality for a divergence that generalizes the Tsallis relative entropy and the Kullback–Leibler divergence. The generalized divergence is defined in terms of a deformed exponential function, which replaces the Tsallis $q$-exponential. We also construct a family of probability distributions related to the generalized divergence. We find necessary and sufficient conditions for the partition inequality to be satisfied, and establish a sufficient condition for joint convexity. We prove that the generalized divergence satisfies the partition inequality and is jointly convex if, and only if, it coincides with the Tsallis relative entropy. As an application of the partition inequality, a criterion for Pinsker's inequality is obtained. [ABSTRACT FROM AUTHOR]
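For reference, the Tsallis relative entropy that this divergence generalizes can be sketched in its standard textbook form; q → 1 recovers the Kullback–Leibler divergence:

```python
import numpy as np

def tsallis_relative_entropy(p, r, q):
    """Tsallis relative entropy D_q(p || r) = sum p * ((p/r)^(q-1) - 1) / (q - 1).

    As q -> 1 this converges to the Kullback-Leibler divergence sum p*log(p/r).
    Assumes p and r are strictly positive probability vectors.
    """
    p = np.asarray(p, float)
    r = np.asarray(r, float)
    if abs(q - 1.0) < 1e-8:
        return float(np.sum(p * np.log(p / r)))   # KL limit
    return float(np.sum(p * ((p / r) ** (q - 1.0) - 1.0)) / (q - 1.0))
```

The deformed-exponential divergence studied in the paper replaces the power function here with a general deformed exponential; this plain Tsallis form is only the special case the title refers to.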
- Published
- 2020
4. Isotropic discretization methods of Laplacian and generalized divergence operators in phase field models.
- Author
Tang, C., Wu, D.T., and Quek, S.S.
- Subjects
DISCRETIZATION methods, FOURIER analysis, PHASE space, PHENOMENOLOGICAL theory (Physics), SOLIDIFICATION, QUANTITATIVE research
- Abstract
Phase field models have been extensively used to address various physical phenomena. However, discretization-induced anisotropy remains a longstanding challenge for phase field models. Using a hexagonal mesh in 2D, we describe isotropic discretization methods for the computation of the Laplacian and generalized divergence operators. Quantitative analyses derived from discrete Fourier analysis prove that our methods are more isotropic than commonly used approaches, including isotropic methods for square meshes. To compare the performance of conventional discretization methods with ours, a specific phase field model of alloy solidification was selected for benchmark simulations. Various 2D simulations using different discretization methods were carried out to verify the accuracy and efficiency of the improved numerical methods on a hexagonal mesh. We emphasize that the improved numerical methods using a hexagonal mesh are general and may be equally applied to other physical models that include the same operators. [ABSTRACT FROM AUTHOR]
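A minimal illustration of why hexagonal lattices help: the equal-weight six-neighbor Laplacian stencil below (the standard textbook stencil, not necessarily the paper's exact scheme) is exact for quadratic fields and treats all six lattice directions identically, unlike the 5-point square-mesh stencil.

```python
import math

def hex_laplacian(f, x0, y0, h):
    """Six-neighbor Laplacian stencil on a hexagonal lattice of spacing h:

        lap f  ~  (2 / (3 h^2)) * (sum over 6 neighbors - 6 * center)

    Neighbors sit at angles 0, 60, ..., 300 degrees from the center point.
    """
    s = sum(f(x0 + h * math.cos(k * math.pi / 3),
              y0 + h * math.sin(k * math.pi / 3)) for k in range(6))
    return 2.0 * (s - 6.0 * f(x0, y0)) / (3.0 * h * h)
```

For f(x, y) = x² + y² the stencil returns the exact Laplacian 4 (up to rounding), and for any affine field it returns 0, reflecting the symmetric cancellation across the six directions.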
- Published
- 2024
5. Studying Lexical Dynamics and Language Change via Generalized Entropies: The Problem of Sample Size
- Author
Alexander Koplenig, Sascha Wolfer, and Carolin Müller-Spitzer
- Subjects
generalized entropy, generalized divergence, Jensen–Shannon divergence, sample size, text length, Zipf's law
- Abstract
Recently, it was demonstrated that generalized entropies of order α offer novel and important opportunities to quantify the similarity of symbol sequences where α is a free parameter. Varying this parameter makes it possible to magnify differences between different texts at specific scales of the corresponding word frequency spectrum. For the analysis of the statistical properties of natural languages, this is especially interesting, because textual data are characterized by Zipf’s law, i.e., there are very few word types that occur very often (e.g., function words expressing grammatical relationships) and many word types with a very low frequency (e.g., content words carrying most of the meaning of a sentence). Here, this approach is systematically and empirically studied by analyzing the lexical dynamics of the German weekly news magazine Der Spiegel (consisting of approximately 365,000 articles and 237,000,000 words that were published between 1947 and 2017). We show that, analogous to most other measures in quantitative linguistics, similarity measures based on generalized entropies depend heavily on the sample size (i.e., text length). We argue that this makes it difficult to quantify lexical dynamics and language change and show that standard sampling approaches do not solve this problem. We discuss the consequences of the results for the statistical analysis of languages.
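The similarity measure the abstract describes can be sketched as a Jensen–Shannon-type divergence built from an order-α (Tsallis/Havrda–Charvát) generalized entropy. This is one standard construction and may differ in detail from the paper's; α = 1 recovers the classical Jensen–Shannon divergence, while other α values weight rare or frequent word types differently.

```python
import numpy as np

def tsallis_entropy(p, alpha):
    """Generalized entropy of order alpha; alpha -> 1 gives Shannon entropy."""
    p = np.asarray(p, float)
    p = p[p > 0]
    if abs(alpha - 1.0) < 1e-8:
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** alpha)) / (alpha - 1.0))

def generalized_jsd(p, q, alpha):
    """Jensen-Shannon-type divergence of order alpha between two
    word-frequency distributions: H_a((p+q)/2) - (H_a(p) + H_a(q)) / 2."""
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    m = 0.5 * (p + q)
    return tsallis_entropy(m, alpha) - 0.5 * (tsallis_entropy(p, alpha)
                                              + tsallis_entropy(q, alpha))
```

The sample-size sensitivity the paper analyzes shows up directly here: estimating p and q from texts of different lengths biases the entropy terms, and the bias depends on α.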
- Published
- 2019
6. Duality in a maximum generalized entropy model.
- Author
Shinto Eguchi, Osamu Komori, and Atsumi Ohara
- Subjects
GAUSSIAN distribution, MAXIMUM entropy method, GENERALIZATION, MATHEMATICAL models, DUALITY theory (Mathematics), PROBABILITY density function
- Abstract
This paper discusses a possible generalization of the maximum entropy principle. A class of generalized entropies is introduced via generator functions, from which the maximum generalized entropy distribution model is explicitly derived, including q-Gaussian distributions, Wigner semicircle distributions, and Pareto distributions. We define a totally geodesic subspace in the space of all probability density functions in the framework of information geometry, and show that the model of maximum generalized entropy distributions is totally geodesic. The duality of the model and of estimation under the maximum generalized entropy principle is elucidated, giving an intrinsic understanding from the viewpoint of information geometry. [ABSTRACT FROM AUTHOR]
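The q-Gaussian family mentioned above is built on the deformed (q-)exponential, which the following sketch implements in its standard form (a generic illustration, not the paper's generator-function machinery):

```python
import numpy as np

def q_exp(x, q):
    """Deformed exponential exp_q(x) = [1 + (1 - q) x]_+^(1 / (1 - q)).

    As q -> 1 this converges to the ordinary exp(x); the [.]_+ cutoff
    clamps the base at zero, giving compact support for q < 1.
    """
    x = np.asarray(x, float)
    if abs(q - 1.0) < 1e-8:
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)
    with np.errstate(divide="ignore"):
        y = base ** (1.0 / (1.0 - q))
    return np.where(base > 0.0, y, 0.0)

def q_gaussian_unnormalized(x, q, beta=1.0):
    """Unnormalized q-Gaussian exp_q(-beta x^2): heavy-tailed for 1 < q < 3,
    compactly supported for q < 1, ordinary Gaussian shape as q -> 1."""
    return q_exp(-beta * np.asarray(x, float) ** 2, q)
```

Varying q thus interpolates between Gaussian-like, heavy-tailed, and compactly supported maximum-entropy shapes, which is the family of models the abstract refers to.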
- Published
- 2015
7. Universal Simulation With Fidelity Criteria.
- Author
Merhav, Neri and Weinberger, Marcelo J.
- Subjects
INFORMATION theory, HIGH-fidelity sound systems, MEASUREMENT of distances, MARKOV processes, DISTRIBUTION (Probability theory), ENTROPY (Information theory)
- Abstract
We consider the problem of universal simulation of a memoryless source (with some partial extensions to Markov sources), based on a training sequence emitted from the source. The objective is to maximize the conditional entropy of the simulated sequence given the training sequence, subject to a distance constraint between the probability distribution of the output sequence and that of the input (training) sequence. For several distance criteria, we derive single-letter expressions for the maximum attainable conditional entropy, as well as corresponding universal simulation schemes that asymptotically attain these maxima. [ABSTRACT FROM AUTHOR]
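As a toy illustration of the zero-distance special case of this problem (not the paper's general fidelity-constrained schemes): outputting a uniformly random permutation of the training sequence preserves its empirical distribution exactly while making the output as unpredictable as possible given the training data.

```python
import random

def simulate_exact_type(training, rng=None):
    """Universal simulation under an exact-type constraint (a sketch):
    emit a uniformly random permutation of the training sequence.

    The output has the same empirical distribution (type) as the input,
    and a uniform choice among permutations maximizes conditional entropy
    given the training sequence, under that exact-match constraint.
    """
    rng = rng or random.Random()
    out = list(training)
    rng.shuffle(out)
    return out
```

The paper's setting relaxes the exact-match requirement to a distance constraint between distributions, which is where the single-letter entropy expressions come in.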
- Published
- 2009
8. Conditions for the existence of a generalization of Rényi divergence.
- Author
Vigelis, Rui F., de Andrade, Luiza H.F., and Cavalcante, Charles C.
- Subjects
STIMULUS generalization, EXPONENTIAL functions, NATURAL numbers, ENTROPY, MAXIMUM entropy method
- Abstract
We give necessary and sufficient conditions for the existence of a generalization of Rényi divergence, which is defined in terms of a deformed exponential function. If the underlying measure μ is non-atomic, we find that not all deformed exponential functions can be used in the generalization of Rényi divergence; a condition involving the deformed exponential function is provided. In the case that μ is purely atomic (the counting measure on the set of natural numbers), we show that any deformed exponential function can be used in the generalization.
- Necessary and sufficient conditions for the existence of a generalized Rényi entropy.
- A wider class of probability distribution families.
- Examples of functions which allow a more general entropy model.
[ABSTRACT FROM AUTHOR]
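For orientation, the classical Rényi divergence being generalized here has the following standard form; the paper's construction replaces the power and logarithm with a deformed exponential and its inverse.

```python
import math

def renyi_divergence(p, q, alpha):
    """Classical Renyi divergence of order alpha between discrete
    distributions: D_a(p || q) = log(sum p^a * q^(1-a)) / (a - 1).

    As alpha -> 1 this converges to the Kullback-Leibler divergence.
    Assumes q > 0 wherever p > 0.
    """
    if abs(alpha - 1.0) < 1e-8:
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1.0)
```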
- Published
- 2020
10. Generalized Entropies and Legendre Duality
- Author
Uohashi, Keiko (Tohoku Gakuin University, Sendai, Japan)
- Abstract
The authors show that a simple and computationally efficient algorithm can be derived to construct alpha-Voronoi diagrams on the space of discrete probability distributions, by making use of the conformally flattened structure of alpha-geometry. They also studied 1) the geometry of q-exponential families, which are related to alpha-geometry, and its statistical applications, and 2) conformal flatness of level surfaces in Hessian domains. In particular, they studied harmonic maps between level surfaces of Hessian domains and their relation to conformally flat structures.
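A divergence-induced Voronoi partition of the probability simplex can be sketched by brute force, assigning each distribution to its nearest center under a chosen divergence. The sketch below uses KL divergence as a stand-in for the alpha-divergence; the point of the work above is that alpha-Voronoi diagrams admit a far more efficient construction via the conformally flat representation.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Kullback-Leibler divergence between discrete distributions,
    with a small eps to guard against zero entries."""
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    return float(np.sum(p * np.log(p / q)))

def divergence_voronoi_cells(points, centers, div=kl):
    """Assign each distribution in `points` to the index of its nearest
    center under the divergence `div` -- i.e., compute membership in the
    divergence-induced Voronoi partition by exhaustive comparison."""
    return [min(range(len(centers)), key=lambda j: div(p, centers[j]))
            for p in points]
```

Swapping `kl` for an alpha-divergence changes the cell boundaries, which is exactly the alpha-dependence the alpha-Voronoi construction captures.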
- Published
- 2012