A link of extropy to entropy for continuous random variables via the generalized ϕ-entropy.
- Source :
- Communications in Statistics: Theory & Methods. Jul 2024, p1-19. 19p. 6 Illustrations.
- Publication Year :
- 2024
Abstract
- The concepts of entropy and divergence, along with their past, residual, and interval variants, are revisited in a reliability theory context, and generalized families of them based on ϕ-functions are discussed. Special emphasis is given to the parametric family of entropies and divergences of Cressie and Read. For non-negative and absolutely continuous random variables, the extropy, the dual to the Shannon entropy measure of uncertainty, is considered and its link to a specific member of the ϕ-entropies family is shown. A number of examples demonstrate the implementation of the generalized entropies and divergences, exhibiting their utility. [ABSTRACT FROM AUTHOR]
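As a hedged illustration of the two measures the abstract contrasts (not the paper's own construction): for a non-negative, absolutely continuous random variable with density f, the extropy is standardly defined as J(X) = -(1/2) ∫ f(x)² dx, while the differential Shannon entropy is H(X) = -∫ f(x) ln f(x) dx. The sketch below approximates both numerically for an Exponential(λ) density, where closed forms J(X) = -λ/4 and H(X) = 1 - ln λ are available to check against; the function names and the midpoint-rule integration are this sketch's own choices.

```python
import math

def extropy(pdf, a, b, n=100_000):
    """Approximate the extropy J(X) = -(1/2) * integral of f(x)^2 dx
    over [a, b] with a midpoint Riemann sum."""
    h = (b - a) / n
    return -0.5 * sum(pdf(a + (i + 0.5) * h) ** 2 for i in range(n)) * h

def shannon_entropy(pdf, a, b, n=100_000):
    """Approximate the differential entropy H(X) = -integral of f(x) ln f(x) dx
    over [a, b], skipping points where the density vanishes."""
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        f = pdf(a + (i + 0.5) * h)
        if f > 0.0:
            total += f * math.log(f)
    return -total * h

# Exponential(rate = 2) density; the tail beyond x = 20 is negligible here.
lam = 2.0
pdf = lambda x: lam * math.exp(-lam * x)

print(extropy(pdf, 0.0, 20.0))          # close to -lam/4 = -0.5
print(shannon_entropy(pdf, 0.0, 20.0))  # close to 1 - ln(lam)
```

The truncation at x = 20 is safe because both integrands decay exponentially; for heavier-tailed densities the upper limit (or an adaptive quadrature such as `scipy.integrate.quad`) would need revisiting.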
- Subjects :
- *UNCERTAINTY (Information theory)
*ENGINEERING reliability theory
Details
- Language :
- English
- ISSN :
- 03610926
- Database :
- Academic Search Index
- Journal :
- Communications in Statistics: Theory & Methods
- Publication Type :
- Academic Journal
- Accession number :
- 178311167
- Full Text :
- https://doi.org/10.1080/03610926.2024.2363869