
Your search for "Information theory" returned 254 results.

Search Constraints

Descriptor: "Information theory" · Topic: entropy · Topic: entropy (information theory)

Search Results

1. A new approach to the entropy of a transitive BE-algebra with countable partitions.

2. On the Supposed Mass of Entropy and That of Information.

3. Entropy for hydrological applications: A review.

4. Geometric Insights into the Multivariate Gaussian Distribution and Its Entropy and Mutual Information.

5. The use of entropy and information analysis to estimate the milk productivity of the Black-and-White dairy breed cows depending on their lineal affiliation.

6. Philosophy and Meanings of the Information Entropy Analysis of Road Safety: Case Study of Russian Cities.

7. Sensor Management Method of Giving Priority to Confirmed Identified Targets.

8. Information theory: A foundation for complexity science.

9. Second law of information dynamics.

10. GEOENT: A Toolbox for Calculating Directional Geological Entropy.

11. Asymptotic Properties of the Plug-in Estimator of the Discrete Entropy Under Dependence.

12. Entropy and Relative Entropy From Information-Theoretic Principles.

13. The Birthday Problem and Zero-Error List Codes.

14. NEUTROSOPHIC ENTROPY MEASURES FOR THE NORMAL DISTRIBUTION: THEORY AND APPLICATIONS.

15. Generalized information entropy and generalized information dimension.

16. Structure and Sensitivity in Differential Privacy: Comparing K-Norm Mechanisms.

17. Information Entropy-Based Leakage Profiling.

18. A Scale-Invariant Generalization of the Rényi Entropy, Associated Divergences and Their Optimizations Under Tsallis’ Nonextensive Framework.

19. Recursive Algorithm to Verify Quasi-Uniform Entropy Vectors and its Applications.

20. Dynamic evolution law analysis of overburden separation and water flowing fracture under mining based on distributed optical fiber and information entropy theory.

21. Information Theory Based Probabilistic Approach to Blade Damage Detection of Turbomachine Using Sensor Data.

22. Conditional Entropy and Data Processing: An Axiomatic Approach Based on Core-Concavity.

23. Entropy vs. human ontogeny: the Shannon information measure; a reliable general indicator of human health?

24. Entropy Measures in Machine Fault Diagnosis: Insights and Applications.

25. Properties of a Generalized Divergence Related to Tsallis Generalized Divergence.

26. An Entropy Lower Bound for Non-Malleable Extractors.

27. Measuring Sample Path Causal Influences With Relative Entropy.

28. Local Entropy Statistics for Point Processes.

29. Entropy and Compression: A Simple Proof of an Inequality of Khinchin-Ornstein-Shields.

30. The Entropy Rate of Some Pólya String Models.

31. Max-flow min-cut theorems on dispersion and entropy measures for communication networks.

32. Expressions for the Entropy of Basic Discrete Distributions.

33. Determining the Number of Samples Required to Estimate Entropy in Natural Sequences.

34. A new sampling scheme combining maximum entropy and moment matching techniques for reactor physics uncertainty quantification.

35. Entropy and monotonicity in artificial intelligence.

36. Conditional entropy based classifier chains for multi-label classification.

37. Stock market daily volatility and information measures of predictability.

38. Information flow between Ibovespa and constituent companies.

39. Information theory to tachycardia therapy: electrogram entropy predicts diastolic microstructure of reentrant ventricular tachycardia.

40. Structural Analysis of Medical-Terminology Hashtag versus Lay-Language Hashtag Tweet Collections: An Information Theoretical Method with Entropy Matrix.

41. Discriminating imagined and non-imagined tasks in the motor cortex area: Entropy-complexity plane with a wavelet decomposition.

42. Inferring information flow in spike-train data sets using a trial-shuffle method.

43. Mutual Information, Relative Entropy and Estimation Error in Semi-Martingale Channels.

44. Recovery and the Data Processing Inequality for Quasi-Entropies.

45. Demystifying Fixed k-Nearest Neighbor Information Estimators.

46. Comparing Entropy Rates on Finite and Infinite Rooted Trees.

47. Extended Gray–Wyner System With Complementary Causal Side Information.

48. Bounds on Information Combining With Quantum Side Information.

49. The Shortest Possible Return Time of β-Mixing Processes.

50. Intraday Trading Volume and Non-Negative Matrix Factorization.
