
Error Exponents in the Bee Identification Problem.

Authors :
Tamir, Ran
Merhav, Neri
Source :
IEEE Transactions on Information Theory. Oct2021, Vol. 67 Issue 10, p6564-6582. 19p.
Publication Year :
2021

Abstract

The bee identification problem is the problem of correctly recognizing a massive amount of data (a large number of bees in a beehive, for example) that has been mixed and corrupted by noise. We derive various error exponents in the bee identification problem under two different decoding rules. Under naïve decoding, which decodes each bee independently of the others, we analyze a general discrete memoryless channel and a relatively wide family of stochastic decoders. Upper and lower bounds to the random coding error exponent are derived and proved to be equal at relatively high coding rates. Then, we propose a lower bound on the error exponent of the typical random code, which improves upon the random coding exponent at low coding rates. We also derive a third bound, related to expurgated codes, which turns out to be strictly higher than the other bounds, also at relatively low rates. We show that the universal maximum mutual information decoder is optimal with respect to the typical random code and the expurgated code. Moving further, we derive error exponents under optimal decoding, the relatively wide family of symmetric channels, and the maximum likelihood decoder. We first propose a random coding lower bound, and then an improved bound which stems from an expurgation process. We show numerically that our second bound strictly improves upon the random coding bound at an intermediate range of coding rates, where a bound derived in a previous work no longer holds. [ABSTRACT FROM AUTHOR]
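To make the setting of the abstract concrete, here is a minimal sketch (not taken from the paper) of the two decoding rules it contrasts, for a toy binary symmetric channel. The codebook, crossover probability, and brute-force permutation search are illustrative assumptions: each "bee" carries a binary codeword, the observations arrive in an unknown order after noise, naïve decoding handles each observation independently (and may assign the same codeword twice), while joint decoding searches over permutations for the maximum-likelihood assignment.

```python
import itertools
import random

def bsc(word, p, rng):
    """Pass a binary tuple through a BSC with crossover probability p."""
    return tuple(b ^ (rng.random() < p) for b in word)

def hamming(a, b):
    """Hamming distance between two equal-length binary tuples."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical tiny instance: 4 "bees", each tagged with a 5-bit codeword.
rng = random.Random(0)
codebook = [(0, 0, 0, 0, 0), (1, 1, 1, 1, 1), (1, 0, 1, 0, 1), (0, 1, 1, 0, 0)]
p = 0.1

# The bees are observed in an unknown order, through the noisy channel.
perm = list(range(len(codebook)))
rng.shuffle(perm)
observations = [bsc(codebook[i], p, rng) for i in perm]

# Naive decoding: each observation is decoded independently; for a BSC,
# ML decoding is minimum Hamming distance.  Note that two observations
# may be mapped to the same codeword, which is why this rule is
# suboptimal for the identification problem.
naive = [min(range(len(codebook)),
             key=lambda j: hamming(obs, codebook[j]))
         for obs in observations]

# Joint (optimal) decoding: search over all assignments of codewords to
# observations, i.e. over permutations, for the one minimizing the total
# Hamming distance -- ML for the whole batch at once.
joint = min(itertools.permutations(range(len(codebook))),
            key=lambda s: sum(hamming(obs, codebook[j])
                              for obs, j in zip(observations, s)))
```

The exponential cost of the permutation search in this sketch is exactly why the paper's analysis of the simpler naïve rule, and of error exponents under both rules, is of interest.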

Details

Language :
English
ISSN :
00189448
Volume :
67
Issue :
10
Database :
Academic Search Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession number :
153710472
Full Text :
https://doi.org/10.1109/TIT.2021.3091762