36 results for "Venkat, Kartik"
Search Results
2. Mutual Information, Relative Entropy and Estimation Error in Semi-martingale Channels
3. Beyond Maximum Likelihood: from Theory to Practice
4. Minimax Estimation of Functionals of Discrete Distributions
5. Maximum Likelihood Estimation of Functionals of Discrete Distributions
6. Relations between Information and Estimation in Discrete-Time Lévy Channels
7. Information Measures: the Curious Case of the Binary Alphabet
8. Justification of Logarithmic Loss via the Benefit of Side Information
9. Information, Estimation, and Lookahead in the Gaussian channel
10. Reference Based Genome Compression
11. Pointwise Relations between Information and Estimation in Gaussian Noise
12. Relations Between Information and Estimation in the Presence of Feedback
13. Relations Between Information and Estimation in the Presence of Feedback
14. Mutual Information, Relative Entropy and Estimation Error in Semi-Martingale Channels
15. Maximum Likelihood Estimation of Functionals of Discrete Distributions
16. Relations Between Information and Estimation in Discrete-Time Lévy Channels
17. Information, Estimation, and Lookahead in the Gaussian Channel
18. Mutual information, relative entropy and estimation error in semi-martingale channels
19. Some Old and New Relations between Information and Estimation
20. Maximum Likelihood Estimation of information measures
21. Minimax estimation of information measures
22. CaMoDi: a new method for cancer module discovery
23. Information Measures: The Curious Case of the Binary Alphabet
24. Information divergences and the curious case of the binary alphabet
25. Justification of logarithmic loss via the benefit of side information
26. Relations between information and estimation in scalar Lévy channels
27. Minimax estimation of information measures.
28. Maximum Likelihood Estimation of information measures.
29. The role of lookahead in estimation under Gaussian noise
30. Pointwise relations between information and estimation in the Poisson channel
31. On information, estimation and lookahead
32. Pointwise Relations Between Information and Estimation in Gaussian Noise
33. Joint source-channel coding of one random variable over the Poisson channel
34. Justification of Logarithmic Loss via the Benefit of Side Information.
35. Minimax Estimation of Functionals of Discrete Distributions.
36. Pointwise relations between information and estimation in Gaussian noise.
Discovery Service for Jio Institute Digital Library