
entropy

by parcadei

Context management for Claude Code. Hooks maintain state via ledgers and handoffs. MCP execution without context pollution. Agent orchestration with isolated context windows.

⭐ 3,352 · 🍴 252 · 📅 Jan 23, 2026

Use Cases

🔗

MCP Server Integration

AI tool integration using the Model Context Protocol, powered by entropy.

🔗

API Integration

Easily build API integrations with external services.

🔄

Data Synchronization

Automatically sync data between multiple systems.

SKILL.md


---
name: entropy
description: "Problem-solving strategies for entropy in information theory"
allowed-tools: [Bash, Read]
---

Entropy

When to Use

Use this skill when working on entropy problems in information theory.

Decision Tree

  1. Shannon Entropy

    • H(X) = -sum p(x) log2 p(x)
    • Maximum for uniform distribution: H_max = log2(n)
    • Minimum = 0 for deterministic (one outcome certain)
    • scipy.stats.entropy(p, base=2) for discrete (verified in the sketch after this list)
  2. Entropy Properties

    • Non-negative: H(X) >= 0
    • Concave in p
    • Chain rule: H(X,Y) = H(X) + H(Y|X)
    • z3_solve.py prove "entropy_nonnegative"
  3. Joint and Conditional Entropy

    • H(X,Y) = -sum sum p(x,y) log2 p(x,y)
    • H(Y|X) = H(X,Y) - H(X)
    • H(Y|X) <= H(Y) with equality iff independent
  4. Differential Entropy (Continuous)

    • h(X) = -integral f(x) log f(x) dx
    • Can be negative!
    • Gaussian: h(X) = 0.5 * log2(2*pi*e*sigma^2) (checked numerically in the sketch after this list)
    • sympy_compute.py integrate "-f(x)*log(f(x))" --var x
  5. Maximum Entropy Principle

    • Given constraints, max entropy distribution is least biased
    • Uniform for no constraints
    • Exponential for E[X] = mu constraint
    • Gaussian for E[X], Var[X] constraints
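
The numbered strategies above can be sanity-checked numerically. Below is a minimal sketch, assuming only numpy and scipy (which the Tool Commands below already rely on); the 2x2 joint distribution is an arbitrary illustrative example, not part of the skill:

import numpy as np
from scipy.stats import entropy
from scipy.integrate import quad

# Joint distribution p(x, y) over a 2x2 alphabet (rows = x, columns = y).
p_xy = np.array([[0.4, 0.1],
                 [0.2, 0.3]])
p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

H_X = entropy(p_x, base=2)
H_Y = entropy(p_y, base=2)
H_XY = entropy(p_xy.ravel(), base=2)  # joint entropy H(X,Y)
H_Y_given_X = H_XY - H_X              # chain rule: H(Y|X) = H(X,Y) - H(X)
print(f"H(X,Y) = {H_XY:.4f} bits, H(Y|X) = {H_Y_given_X:.4f} <= H(Y) = {H_Y:.4f}")

# The uniform distribution attains the maximum: H = log2(n).
n = 4
assert np.isclose(entropy(np.full(n, 1.0 / n), base=2), np.log2(n))

# Gaussian differential entropy: numerical integration vs. 0.5 * log2(2*pi*e*sigma^2).
sigma = 2.0
f = lambda x: np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
h_numeric, _ = quad(lambda x: -f(x) * np.log2(f(x)), -40, 40)
h_closed = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
print(f"h(X) numeric = {h_numeric:.4f} bits, closed form = {h_closed:.4f} bits")

For sigma = 2 both differential-entropy values come out to about 3.047 bits, and conditioning is seen to strictly reduce entropy here (H(Y|X) < H(Y)), as item 3 states for dependent variables.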

Tool Commands

Scipy_Entropy

uv run python -c "from scipy.stats import entropy; p = [0.25, 0.25, 0.25, 0.25]; H = entropy(p, base=2); print('Entropy:', H, 'bits')"

Scipy_Kl_Div

uv run python -c "from scipy.stats import entropy; p = [0.5, 0.5]; q = [0.9, 0.1]; kl = entropy(p, q); print('KL divergence:', kl)"
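
Note that entropy(p, q) returns D(p||q) in nats by default; pass base=2 for bits. KL divergence is not symmetric, which a one-liner in the same style (assuming only scipy) makes explicit:

uv run python -c "from scipy.stats import entropy; p = [0.5, 0.5]; q = [0.9, 0.1]; print('D(p||q) =', entropy(p, q, base=2)); print('D(q||p) =', entropy(q, p, base=2))"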

Sympy_Entropy

uv run python -m runtime.harness scripts/sympy_compute.py simplify "-p*log(p, 2) - (1-p)*log(1-p, 2)"
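
The expression passed to sympy_compute.py is the binary entropy function H(p). As a quick symbolic cross-check using plain sympy (not the repo's runtime.harness, whose interface is taken as given above), solving dH/dp = 0 confirms the maximum of exactly 1 bit at p = 1/2:

uv run python -c "import sympy as sp; p = sp.symbols('p', positive=True); H = -p*sp.log(p, 2) - (1 - p)*sp.log(1 - p, 2); print(sp.solve(sp.Eq(sp.diff(H, p), 0), p), sp.simplify(H.subs(p, sp.Rational(1, 2))))"

This should print [1/2] 1.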

Key Techniques

From indexed textbooks:

  • [Elements of Information Theory] Cover, Thomas M. & Thomas, Joy A. Elements of Information Theory, 2nd ed. Wiley-Interscience, New York, NY, 2012. ISBN 9780470303153. Indexed excerpt: "What is the channel capacity of this channel? This is the multiple-access channel solved by Liao and Ahlswede."

Cognitive Tools Reference

See .claude/skills/math-mode/SKILL.md for full tool documentation.

Score

Total Score

95/100

Based on repository quality metrics

  • SKILL.md: includes a SKILL.md file (+20)
  • LICENSE: a license is set (+10)
  • Description: the description is at least 100 characters (+10)
  • Popularity: 1,000+ GitHub stars (+15)
  • Recent activity: updated within the past month (+10)
  • Forks: forked 10 or more times (+5)
  • Issue management: fewer than 50 open issues (+5)
  • Language: a programming language is set (+5)
  • Tags: at least one tag is set (+5)
