Entropy
Information theory (Shannon)
Information provides an answer to a question (e.g., whether a coin will land heads or tails).
The information conveyed by a message x depends on its probability p(x): the less likely the message, the more information it carries. It can be measured in bits (using log base 2).
Shannon’s definition of information
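In its standard form (a sketch, assuming the usual base-2 convention implied by the sentence above):

```latex
% Self-information (surprisal) of a message x with probability p(x), in bits.
I(x) = -\log_2 p(x)
```

A fair coin toss, with p(heads) = 1/2, therefore conveys -log_2(1/2) = 1 bit.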
Entropy
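As a sketch of the definition this section builds on (assuming the standard discrete form), entropy is the expected self-information under p:

```latex
% Entropy of a discrete random variable X with distribution p(x), in bits:
% the average self-information over all outcomes.
H(X) = -\sum_{x} p(x) \log_2 p(x)
```

A fair coin has H = 1 bit; a coin that always lands heads has H = 0 bits.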
- Cross Entropy
- Relative Entropy (KL Divergence); a short numerical sketch of all three quantities follows this list
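As a minimal sketch of how these three quantities relate, assuming discrete distributions given as plain probability lists (the function names below are illustrative, not from the original post):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_x p(x) * log2 p(x), in bits."""
    return -sum(px * math.log2(px) for px in p if px > 0)

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_x p(x) * log2 q(x), in bits."""
    return -sum(px * math.log2(qx) for px, qx in zip(p, q) if px > 0)

def kl_divergence(p, q):
    """Relative entropy (KL divergence) D_KL(p || q) = H(p, q) - H(p): the
    extra bits paid for coding samples from p with a code optimized for q."""
    return cross_entropy(p, q) - entropy(p)

if __name__ == "__main__":
    p = [0.5, 0.5]  # true distribution: a fair coin
    q = [0.9, 0.1]  # model distribution: a biased guess about the same coin
    print(entropy(p))           # 1.0 bit
    print(cross_entropy(p, q))  # about 1.74 bits
    print(kl_divergence(p, q))  # about 0.74 bits
```

Cross entropy is always at least the entropy of the true distribution, and the KL divergence is exactly that gap, which is why it is zero only when p and q coincide.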