# Problem-solving strategies for channel capacity in information theory
Use this skill when working on channel-capacity problems in information theory.
## Mutual Information

I(X;Y) = H(X) + H(Y) - H(X,Y). In scipy terms, with `p` and `q` as the marginals:

`scipy.stats.entropy(p) + scipy.stats.entropy(q) - joint_entropy`

## Channel Model

A discrete memoryless channel is specified by its transition probabilities P(y|x) from input X to output Y.
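A minimal sketch of this computation with a concrete joint distribution (the numbers are illustrative, not from the source):

```python
import numpy as np
from scipy.stats import entropy

# Joint distribution of (X, Y); rows index X, columns index Y (illustrative values).
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

p_x = joint.sum(axis=1)   # marginal of X
p_y = joint.sum(axis=0)   # marginal of Y

# I(X;Y) = H(X) + H(Y) - H(X,Y), all in bits (base=2)
H_X = entropy(p_x, base=2)
H_Y = entropy(p_y, base=2)
H_XY = entropy(joint.ravel(), base=2)
I_XY = H_X + H_Y - H_XY
print(I_XY)  # ≈ 0.278 bits
```

Note that `base=2` is needed to get bits; `scipy.stats.entropy` defaults to nats.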
## Channel Capacity

C = max over input distributions p(x) of I(X;Y).
## Common Channels
| Channel | Capacity |
|---|---|
| Binary Symmetric (BSC) | 1 - H(p) where p = crossover prob |
| Binary Erasure (BEC) | 1 - epsilon where epsilon = erasure prob |
| AWGN | 0.5 * log2(1 + SNR) |
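The closed forms in the table can be sketched as small helpers (the function names are hypothetical; the formulas come from the table above, with SNR taken as a linear ratio):

```python
import math

def bsc_capacity(p):
    """Binary symmetric channel, crossover probability p: C = 1 - H(p)."""
    if p in (0.0, 1.0):
        return 1.0  # H(0) = H(1) = 0
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # binary entropy H(p)
    return 1.0 - h

def bec_capacity(eps):
    """Binary erasure channel, erasure probability eps: C = 1 - eps."""
    return 1.0 - eps

def awgn_capacity(snr):
    """AWGN channel, linear SNR: C = 0.5 * log2(1 + SNR) bits per channel use."""
    return 0.5 * math.log2(1.0 + snr)

print(bsc_capacity(0.11))  # ~0.5 bits
print(bec_capacity(0.25))  # 0.75 bits
print(awgn_capacity(1.0))  # 0.5 bits
```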
## Blahut-Arimoto Algorithm

Alternate between computing the posterior q(x|y) induced by the current input distribution and re-optimizing the input distribution p(x), until the capacity estimate converges.

uv run python -m runtime.harness scripts/z3_solve.py prove "capacity_upper_bound"
uv run python -c "from scipy.stats import entropy; p = [0.5, 0.5]; q = [0.6, 0.4]; H_X = entropy(p, base=2); H_Y = entropy(q, base=2); print('H(X)=', H_X, 'H(Y)=', H_Y)"
uv run python -m runtime.harness scripts/sympy_compute.py simplify "1 + p*log(p, 2) + (1-p)*log(1-p, 2)"
uv run python -m runtime.harness scripts/z3_solve.py prove "I(X;Y) <= H(X)"
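A sketch of Blahut-Arimoto for a discrete memoryless channel, assuming the standard two-step update (posterior step, then input-distribution step); the function name and the BSC test matrix are illustrative:

```python
import numpy as np

def blahut_arimoto(W, tol=1e-9, max_iter=10_000):
    """Capacity (in bits) of a DMC with row-stochastic W[x, y] = P(y|x)."""
    n_in = W.shape[0]
    p = np.full(n_in, 1.0 / n_in)        # start from the uniform input distribution
    for _ in range(max_iter):
        # q[x, y] ∝ p(x) W(y|x): the posterior P(x|y) under the current p
        q = p[:, None] * W
        q /= q.sum(axis=0, keepdims=True)
        # update: p(x) ∝ exp( sum_y W(y|x) log q(x|y) )
        logq = np.zeros_like(q)
        np.log(q, out=logq, where=q > 0)
        r = np.exp((W * logq).sum(axis=1))
        p_new = r / r.sum()
        if np.abs(p_new - p).max() < tol:
            p = p_new
            break
        p = p_new
    # evaluate I(X;Y) in bits at the final p
    py = p @ W                           # output distribution
    ratio = np.ones_like(W)
    mask = W > 0                         # py > 0 wherever W > 0 and p > 0
    ratio[mask] = (W / py[None, :])[mask]
    C = float(np.sum(p[:, None] * W * np.log2(ratio)))
    return C, p

# Illustrative check: BSC with crossover 0.11; closed form is 1 - H(0.11) ≈ 0.5
W = np.array([[0.89, 0.11],
              [0.11, 0.89]])
C, p_opt = blahut_arimoto(W)
```

For the symmetric BSC the optimizer is the uniform input, so the iteration converges immediately; asymmetric channels take more iterations.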
From indexed textbooks:
See .claude/skills/math-mode/SKILL.md for full tool documentation.