This work aims to help people study information theory and understand its applicability and usefulness. The repository gathers the material I collected while exploring the field. Its main content is the summary "A Not-Too-Short, Not-Quite-Long Summary of Information Theory," which condenses material from canonical books and adds Python code, animations, and visualizations to show how information theory works in practice.
To read the summary, access the following link: https://github.com/igor17400/information-theory-174/blob/main/summary/main.pdf
This summary includes, but is not limited to, the following information theory topics (a small Python sketch illustrating two of them follows the list):
- Basic Concepts in Probability Theory
- Information Measures
- Data Compression
- Noisy-Channel Coding Theorem
- Entropy and Mutual Information
- Channel Capacity
- Hypothesis Testing
- Rate-Distortion Theory
- Source Coding Theorem
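As a taste of the kind of Python examples the summary pairs with the theory, here is a minimal sketch (not taken from the repository itself) that computes Shannon entropy and mutual information for a small discrete joint distribution; the example distribution models a binary symmetric channel with crossover probability 0.1 and a uniform input.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log(0) = 0
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """Mutual information I(X; Y) in bits from a joint distribution matrix."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)  # marginal of X (rows)
    py = joint.sum(axis=0)  # marginal of Y (columns)
    # I(X; Y) = H(X) + H(Y) - H(X, Y)
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# Binary symmetric channel, crossover probability 0.1, uniform input:
# I(X; Y) = 1 - H(0.1) ≈ 0.531 bits.
joint = np.array([[0.45, 0.05],
                  [0.05, 0.45]])
print(f"H(X)    = {entropy(joint.sum(axis=1)):.3f} bits")
print(f"I(X; Y) = {mutual_information(joint):.3f} bits")
```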
Contributions to improve this summary are welcome! If you'd like to contribute:
- Fork the repository
- Create a new branch (`git checkout -b feature/your-feature-name`)
- Make your changes
- Commit your changes (`git commit -am 'Add some feature'`)
- Push to the branch (`git push origin feature/your-feature-name`)
- Create a new Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
I hope this proves to be helpful to someone other than just me!
✧⁺⸜(^-^)⸝⁺✧