Exponential weighted entropy and exponential weighted mutual information
- Source :
- Neurocomputing. 249:86-94
- Publication Year :
- 2017
- Publisher :
- Elsevier BV, 2017.
Abstract
- In this paper, the exponential weighted entropy (EWE) and exponential weighted mutual information (EWMI) are proposed as generalized forms of Shannon entropy and mutual information (MI), respectively. They are position-related, causal measures that extend the foundations of information-theoretic metrics. As special cases of the weighted entropy and the weighted mutual information, EWE and EWMI are proved to preserve the nonnegativity and concavity properties of the Shannon framework, and they can be adopted as information measures in spatial interaction modeling. In parallel with the normalized mutual information (NMI), the normalized exponential weighted mutual information (NEWMI) is also investigated. Image registration experiments demonstrate that the EWMI and NEWMI algorithms achieve higher alignment accuracy than the MI and NMI algorithms.
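- The abstract does not state the paper's exact definition of EWE, but it identifies EWE as a special case of the classical (Belis-Guiasu) weighted entropy, H_w(X) = -Σ_i w_i p_i log p_i. A minimal sketch under that assumption, with a hypothetical exponential weight function standing in for the paper's position-related weighting:

```python
import math

def weighted_entropy(probs, weights):
    """Belis-Guiasu weighted entropy: H_w = -sum_i w_i * p_i * log(p_i).

    With all weights equal to 1 this reduces to Shannon entropy (in nats).
    """
    return -sum(w * p * math.log(p) for w, p in zip(weights, probs) if p > 0)

def exp_weights(n, lam=1.0):
    """Hypothetical exponential weights over positions i = 0..n-1.

    Illustrative only -- the paper's actual EWE weighting may differ.
    """
    return [math.exp(-lam * i) for i in range(n)]

probs = [0.5, 0.25, 0.25]
h_weighted = weighted_entropy(probs, exp_weights(len(probs)))
h_shannon = weighted_entropy(probs, [1.0] * len(probs))
```

Positions with larger weights contribute more to the total, which is what makes the measure position-related rather than permutation-invariant like Shannon entropy.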
- Subjects :
- Mathematical optimization
Cognitive Neuroscience
020206 networking & telecommunications
02 engineering and technology
Mutual information
Pointwise mutual information
Joint entropy
Computer Science Applications
Exponential function
Rényi entropy
Artificial Intelligence
0202 electrical engineering, electronic engineering, information engineering
Entropy (information theory)
Applied mathematics
020201 artificial intelligence & image processing
Transfer entropy
Variation of information
Mathematics
Details
- ISSN :
- 09252312
- Volume :
- 249
- Database :
- OpenAIRE
- Journal :
- Neurocomputing
- Accession number :
- edsair.doi...........c6b9c1b2b285faceeadfdbef096a4e3d