Problem-solving strategies for entropy in information theory
Use this skill when working on entropy problems in information theory.
Shannon Entropy
scipy.stats.entropy(p, base=2) for discrete distributions

Entropy Properties
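A minimal numerical check of the basic bounds, assuming scipy and numpy are available: discrete entropy is zero for a deterministic distribution, at most log2(n) on n outcomes, and strictly between the two otherwise. The specific distributions are illustrative choices.

```python
import numpy as np
from scipy.stats import entropy

n = 4

# Deterministic distribution: entropy is 0 (the minimum).
assert np.isclose(entropy([1.0, 0.0, 0.0, 0.0], base=2), 0.0)

# Uniform over n outcomes: entropy is log2(n) (the maximum).
assert np.isclose(entropy(np.full(n, 1.0 / n), base=2), np.log2(n))

# Any non-degenerate, non-uniform distribution lies strictly between.
h = entropy([0.7, 0.1, 0.1, 0.1], base=2)
assert 0.0 < h < np.log2(n)
print("0 <= H(X) <= log2(n) verified; H =", h, "bits")
```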
z3_solve.py prove "entropy_nonnegative"

Joint and Conditional Entropy
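A sketch of the chain rule H(X,Y) = H(X) + H(Y|X), computed from a small joint probability table. The 2x2 joint distribution here is a made-up example; the identity holds for any valid joint table.

```python
import numpy as np
from scipy.stats import entropy

# Hypothetical joint distribution p(x, y); rows index X, columns index Y.
pxy = np.array([[0.3, 0.2],
                [0.1, 0.4]])

# Joint entropy H(X,Y) from the flattened joint table.
H_xy = entropy(pxy.ravel(), base=2)

# Marginal H(X), and H(Y|X) = sum_x p(x) * H(Y | X = x).
px = pxy.sum(axis=1)
H_x = entropy(px, base=2)
H_y_given_x = sum(px[i] * entropy(pxy[i] / px[i], base=2)
                  for i in range(len(px)))

# Chain rule: H(X,Y) = H(X) + H(Y|X).
assert np.isclose(H_xy, H_x + H_y_given_x)
print("H(X,Y) =", H_xy, "bits; H(X) + H(Y|X) =", H_x + H_y_given_x)
```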
Differential Entropy (Continuous)
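Continuous distributions in scipy expose differential entropy directly via `.entropy()`. A quick sanity check against the Gaussian closed form h(X) = (1/2) ln(2*pi*e*sigma^2), in nats; the value of sigma is arbitrary.

```python
import numpy as np
from scipy.stats import norm

sigma = 2.0

# Closed form for the differential entropy of N(0, sigma^2), in nats.
closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

# scipy's continuous distributions compute differential entropy directly.
h = norm(scale=sigma).entropy()
assert np.isclose(h, closed_form)
print("h(X) =", h, "nats")
```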
sympy_compute.py integrate "-f(x)*log(f(x))" --var x

Maximum Entropy Principle
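On a finite alphabet with no constraints beyond normalization, the uniform distribution maximizes entropy. A numerical illustration, assuming numpy and scipy: no randomly sampled distribution on n outcomes exceeds log2(n).

```python
import numpy as np
from scipy.stats import entropy

rng = np.random.default_rng(0)
n = 8
H_uniform = entropy(np.full(n, 1.0 / n), base=2)  # log2(8) = 3 bits

# Sample random distributions from the probability simplex; none beats uniform.
for _ in range(1000):
    p = rng.dirichlet(np.ones(n))
    assert entropy(p, base=2) <= H_uniform + 1e-9

print("max entropy on", n, "outcomes is log2(n) =", H_uniform, "bits")
```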
uv run python -c "from scipy.stats import entropy; p = [0.25, 0.25, 0.25, 0.25]; H = entropy(p, base=2); print('Entropy:', H, 'bits')"
uv run python -c "from scipy.stats import entropy; p = [0.5, 0.5]; q = [0.9, 0.1]; kl = entropy(p, q); print('KL divergence:', kl)"
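The KL divergence computed above is nonnegative but not symmetric, so it is not a metric. A short check of both properties with the same two distributions:

```python
from scipy.stats import entropy

p = [0.5, 0.5]
q = [0.9, 0.1]

# entropy(p, q) with two arguments computes D(p || q), in nats.
d_pq = entropy(p, q)
d_qp = entropy(q, p)

# Nonnegative (Gibbs' inequality), and asymmetric in general.
assert d_pq >= 0 and d_qp >= 0
assert abs(d_pq - d_qp) > 1e-6
print("D(p||q) =", d_pq, "vs D(q||p) =", d_qp)
```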
uv run python -m runtime.harness scripts/sympy_compute.py simplify "-p*log(p, 2) - (1-p)*log(1-p, 2)"
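The binary entropy function being simplified above peaks at p = 1/2 with value exactly 1 bit. A symbolic check with sympy (the same expression, entered directly):

```python
import sympy as sp

p = sp.symbols('p', positive=True)
H = -p * sp.log(p, 2) - (1 - p) * sp.log(1 - p, 2)

# The only stationary point of binary entropy is p = 1/2 ...
crit = sp.solve(sp.diff(H, p), p)
assert crit == [sp.Rational(1, 2)]

# ... where it attains its maximum value of 1 bit.
assert abs(float(H.subs(p, sp.Rational(1, 2))) - 1.0) < 1e-12
print("H(p) maximized at p = 1/2 with H = 1 bit")
```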
From indexed textbooks:
See .claude/skills/math-mode/SKILL.md for full tool documentation.