0x302 Information Theory

Summary of Elements of Information Theory [1]

Introduction

Definition (entropy) The entropy $H(X)$ of a discrete random variable $X$ with alphabet $\mathcal{X}$ and probability mass function $p(x) = \Pr\{X = x\}$ is defined by

$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x)$$

The logarithm is taken to base 2, so entropy is measured in bits, and we adopt the convention $0 \log 0 = 0$ (an outcome of probability zero contributes nothing to the sum).
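As a minimal illustration of the definition, the sketch below computes $H(X)$ in bits from a probability mass function given as a list of probabilities (the function name `entropy` is our own, not from any library):

```python
import math

def entropy(pmf):
    """Shannon entropy H(X) in bits of a discrete distribution.

    `pmf` is a sequence of probabilities summing to 1. Outcomes with
    zero probability are skipped, per the convention 0 log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# A fair coin carries exactly one bit of entropy:
print(entropy([0.5, 0.5]))  # → 1.0
```

A deterministic variable, e.g. `entropy([1.0])`, gives 0 bits, the minimum; the uniform distribution over $|\mathcal{X}|$ outcomes gives the maximum, $\log_2 |\mathcal{X}|$.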

References

[1] Cover, Thomas M., and Joy A. Thomas. Elements of information theory. John Wiley & Sons, 2012.