
channel-capacity

by parcadei

Context management for Claude Code. Hooks maintain state via ledgers and handoffs. MCP execution without context pollution. Agent orchestration with isolated context windows.

⭐ 3,352 · 🍴 252 · 📅 Jan 23, 2026

SKILL.md


---
name: channel-capacity
description: "Problem-solving strategies for channel capacity in information theory"
allowed-tools: [Bash, Read]
---

Channel Capacity

When to Use

Use this skill when working on channel-capacity problems in information theory.

Decision Tree

  1. Mutual Information

    • I(X;Y) = H(X) + H(Y) - H(X,Y)
    • I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
    • Symmetric: I(X;Y) = I(Y;X)
    • scipy.stats.entropy(p) + scipy.stats.entropy(q) - joint_entropy
  2. Channel Model

    • Input X, output Y, channel P(Y|X)
    • Channel matrix: rows = inputs, columns = outputs
    • Element (i,j) = P(Y=j | X=i)
  3. Channel Capacity

    • C = max_{p(x)} I(X;Y)
    • Maximize over input distribution
    • Achieved by capacity-achieving distribution
  4. Common Channels

    | Channel | Capacity |
    | --- | --- |
    | Binary Symmetric (BSC) | 1 - H(p), where p = crossover probability |
    | Binary Erasure (BEC) | 1 - ε, where ε = erasure probability |
    | AWGN | 0.5 · log2(1 + SNR) |
  5. Blahut-Arimoto Algorithm

    • Iterative algorithm to compute capacity
    • Alternates between updating the input distribution p(x) and the posterior q(x|y); the channel p(y|x) stays fixed
    • Converges to capacity
    • z3_solve.py prove "capacity_upper_bound"
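The Blahut-Arimoto iteration described in step 5 can be sketched in a few lines of numpy. This is an illustrative implementation, not the repo's own script; the function name and defaults are made up, and the channel matrix follows the rows-are-inputs convention from step 2.

```python
# Sketch of Blahut-Arimoto for a discrete memoryless channel.
# P[x, y] = P(Y=y | X=x); rows are inputs, columns are outputs.
import numpy as np

def blahut_arimoto(P, iters=500):
    """Return (capacity in bits, capacity-achieving input distribution)."""
    n_in = P.shape[0]
    p = np.full(n_in, 1.0 / n_in)              # start from the uniform input
    for _ in range(iters):
        # Posterior q(x|y) = p(x) P(y|x) / sum_x' p(x') P(y|x')
        joint = p[:, None] * P
        q = joint / joint.sum(axis=0, keepdims=True)
        # Update p(x) proportional to exp( sum_y P(y|x) ln q(x|y) )
        log_p = (P * np.log(np.clip(q, 1e-300, None))).sum(axis=1)
        p = np.exp(log_p - log_p.max())
        p /= p.sum()
    # Capacity = I(X;Y) under the final input distribution
    joint = p[:, None] * P
    p_y = joint.sum(axis=0)
    mask = joint > 0
    C = (joint[mask] * np.log2((P / p_y)[mask])).sum()
    return C, p

# BSC with crossover 0.1: capacity should match 1 - H(0.1)
P_bsc = np.array([[0.9, 0.1], [0.1, 0.9]])
C, p_star = blahut_arimoto(P_bsc)
```

For the BSC the iteration simply confirms that the uniform input is capacity-achieving; the value of the sketch is that the same loop handles asymmetric channels where no closed form exists.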

Tool Commands

Scipy_Mutual_Info

uv run python -c "from scipy.stats import entropy; p = [0.5, 0.5]; q = [0.6, 0.4]; H_X = entropy(p, base=2); H_Y = entropy(q, base=2); print('H(X)=', H_X, 'H(Y)=', H_Y)"
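The one-liner above prints only the marginal entropies H(X) and H(Y); computing I(X;Y) also requires the joint entropy from step 1. A minimal sketch with a made-up joint distribution (it happens to correspond to a BSC with crossover 0.2 and a uniform input):

```python
import numpy as np
from scipy.stats import entropy

joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])          # P(X=i, Y=j), made-up example
p_x = joint.sum(axis=1)                 # marginal of X
p_y = joint.sum(axis=0)                 # marginal of Y

# I(X;Y) = H(X) + H(Y) - H(X,Y)
I = (entropy(p_x, base=2) + entropy(p_y, base=2)
     - entropy(joint.ravel(), base=2))
print(I)                                # ≈ 0.278 bits, i.e. 1 - H(0.2)
```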

Sympy_Bsc_Capacity

uv run python -m runtime.harness scripts/sympy_compute.py simplify "1 + p*log(p, 2) + (1-p)*log(1-p, 2)"
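The expression passed to the harness is the BSC capacity C = 1 - H(p) written out in full. A quick plain-sympy sanity check (run directly, outside the repo's runtime harness) confirms it behaves as expected at the endpoints:

```python
import sympy as sp

p = sp.symbols('p', positive=True)
C = 1 + p*sp.log(p, 2) + (1 - p)*sp.log(1 - p, 2)   # = 1 - H(p)

print(float(C.subs(p, sp.Rational(1, 2))))    # 0.0: capacity vanishes at p = 1/2
print(float(C.subs(p, sp.Rational(11, 100)))) # ≈ 0.5: half a bit survives at p = 0.11
```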

Z3_Capacity_Bound

uv run python -m runtime.harness scripts/z3_solve.py prove "I(X;Y) <= H(X)"

Key Techniques

From indexed textbooks:

  • [Elements of Information Theory] Thomas M. Cover & Joy A. Thomas, *Elements of Information Theory*, 2nd ed., Wiley-Interscience, New York, NY, 2012, ISBN 9780470303153. Using a randomly generated code, Shannon showed that one can send information at any rate below the capacity C of the channel with an arbitrarily low probability of error. The idea of a randomly generated code is very unusual.

Cognitive Tools Reference

See .claude/skills/math-mode/SKILL.md for full tool documentation.

Score

Total Score

95/100

Based on repository quality metrics

SKILL.md

Contains a SKILL.md file

+20
LICENSE

A license is set

+10
Description

Has a description of 100+ characters

+10
Popularity

1,000+ GitHub stars

+15
Recent activity

Updated within the last month

+10
Forks

Forked 10+ times

+5
Issue management

Fewer than 50 open issues

+5
Language

A programming language is set

+5
Tags

At least one tag is set

+5
