Entropy
Information theory (Shannon)
- Information provides an answer to a question (e.g., whether a coin will land heads or tails).
- The information conveyed by a message x depends on its probability p(x) and can be measured in bits (using log base 2).
- Shannon's definition of information
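Under Shannon's definition, the self-information of an outcome x is I(x) = -log2 p(x): the less likely the outcome, the more bits it conveys, and a fair coin flip carries exactly 1 bit. A minimal sketch in Python (the function name `self_information` is mine, not from the post):

```python
import math

def self_information(p: float) -> float:
    """Self-information I(x) = -log2 p(x), measured in bits."""
    return -math.log2(p)

# A fair coin flip (p = 1/2) carries exactly 1 bit.
print(self_information(0.5))    # 1.0
# A rarer outcome (p = 1/8) carries more information: 3 bits.
print(self_information(0.125))  # 3.0
```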
- Entropy
- Cross Entropy
- Relative Entropy (KL Divergence)
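These three quantities are closely related: the cross entropy H(p, q) equals the entropy H(p) plus the KL divergence D_KL(p‖q). A small sketch of the standard definitions over discrete distributions (function names are mine; distributions are illustrative):

```python
import math

def entropy(p):
    """H(p) = -sum_x p(x) log2 p(x), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) log2 q(x)."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """D_KL(p || q) = H(p, q) - H(p); zero iff p == q."""
    return cross_entropy(p, q) - entropy(p)

p = [0.5, 0.5]   # true distribution: a fair coin
q = [0.9, 0.1]   # a skewed model of that coin
print(entropy(p))          # 1.0 bit
print(cross_entropy(p, q))
print(kl_divergence(p, q))
```

Coding outcomes of p with a code optimized for q costs H(p, q) bits on average; the extra cost over the optimal H(p) bits is exactly the KL divergence.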
- Title: Entropy
- Author: wy
- Created at: 2023-06-21 11:29:05
- Updated at: 2023-07-07 10:49:37
- Link: https://yuuee-www.github.io/blog/2023/06/21/Entropy/
- License: This work is licensed under [CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0).