messaging — by pluginagentmarketplace
Backend development plugin for Claude AI - FastAPI, database management, API design, and server-side development tools

Updated: Jan 7, 2026

SKILL.md


---
name: messaging
description: Message queues and event-driven backend architecture. RabbitMQ, Kafka, pub/sub patterns, and async communication.
sasmp_version: "2.0.0"
bonded_agent: 04-architecture-patterns
bond_type: SECONDARY_BOND
---

=== PRODUCTION-GRADE SKILL CONFIG (SASMP v2.0.0) ===

atomic_operations:

  • QUEUE_SETUP
  • EVENT_PUBLISHING
  • CONSUMER_CONFIGURATION
  • DLQ_HANDLING

parameter_validation:
  query:
    type: string
    required: true
    minLength: 5
    maxLength: 2000
  broker:
    type: string
    enum: [rabbitmq, kafka, redis, sqs]
    required: false

retry_logic:
  max_attempts: 3
  backoff: exponential
  initial_delay_ms: 1000

logging_hooks:
  on_invoke: "skill.messaging.invoked"
  on_success: "skill.messaging.completed"
  on_error: "skill.messaging.failed"

exit_codes:
  SUCCESS: 0
  INVALID_INPUT: 1
  CONNECTION_ERROR: 2

Messaging & Event-Driven Skill

Bonded to: architecture-patterns-agent (Secondary)


Quick Start

# Invoke messaging skill
"Set up RabbitMQ for my microservices"
"Implement event-driven order processing"
"Configure Kafka for high-throughput messaging"

Message Broker Comparison

| Broker | Best For | Throughput | Ordering |
|---|---|---|---|
| RabbitMQ | Task queues, RPC | Medium | Per queue |
| Kafka | Event streaming, logs | Very high | Per partition |
| Redis Pub/Sub | Real-time, simple | High | None |
| SQS | AWS serverless | Medium | FIFO optional |

Examples

RabbitMQ Producer/Consumer

import pika
import json

# Producer
connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()
channel.queue_declare(queue='orders', durable=True)

def publish_order(order):
    channel.basic_publish(
        exchange='',
        routing_key='orders',
        body=json.dumps(order),
        properties=pika.BasicProperties(delivery_mode=2)  # persistent
    )

# Consumer
def process_order(ch, method, properties, body):
    order = json.loads(body)
    print(f"Processing order: {order['id']}")
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_qos(prefetch_count=1)
channel.basic_consume(queue='orders', on_message_callback=process_order)
channel.start_consuming()
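For unit testing, the callback's parsing step can be separated from the pika plumbing so the business logic runs without a live broker. A minimal sketch (the `handle_order` helper is illustrative, not part of pika):

```python
import json

def handle_order(body: bytes) -> dict:
    """Pure parsing/processing step, testable without a broker."""
    order = json.loads(body)
    # ... business logic would go here ...
    return order

def process_order(ch, method, properties, body):
    """Thin pika callback delegating to the pure handler."""
    order = handle_order(body)
    print(f"Processing order: {order['id']}")
    ch.basic_ack(delivery_tag=method.delivery_tag)
```

The thin callback only acks and delegates, so tests can exercise `handle_order` directly with raw bytes.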

Kafka Event Streaming

from kafka import KafkaProducer, KafkaConsumer
import json

# Producer
producer = KafkaProducer(
    bootstrap_servers=['localhost:9092'],
    value_serializer=lambda v: json.dumps(v).encode('utf-8')
)

producer.send('user-events', {'type': 'USER_CREATED', 'user_id': '123'})
producer.flush()  # block until buffered messages are actually sent

# Consumer
consumer = KafkaConsumer(
    'user-events',
    bootstrap_servers=['localhost:9092'],
    group_id='notification-service',
    auto_offset_reset='earliest',
    value_deserializer=lambda m: json.loads(m.decode('utf-8'))
)

for message in consumer:
    print(f"Event: {message.value}")
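The "per partition" ordering from the broker table holds only when related events share a message key: Kafka routes a given key to one partition (in kafka-python, pass `key=b'user-123'` to `producer.send`). A sketch of the idea — Kafka's default partitioner uses murmur2, so the simple hash below is for illustration only:

```python
def pick_partition(key: bytes, num_partitions: int) -> int:
    """Illustrative stand-in for Kafka's key -> partition mapping
    (Kafka's default partitioner uses murmur2, not sum-of-bytes)."""
    return sum(key) % num_partitions

# The same key always maps to the same partition, so events for
# one user are consumed in the order they were produced.
p1 = pick_partition(b"user-123", 6)
p2 = pick_partition(b"user-123", 6)
assert p1 == p2
```

Events with different keys may land on different partitions and interleave; only per-key ordering is guaranteed.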

Patterns

Dead Letter Queue (DLQ)

def process_with_retry(message, max_retries=3):
    """Retry a failed message a few times, then dead-letter it.
    (process, republish_with_delay, publish_to_dlq are app-defined.)"""
    retry_count = message.headers.get('x-retry-count', 0)

    try:
        process(message)  # business logic; raises on failure
    except Exception as e:
        if retry_count < max_retries:
            # Republish with incremented retry count (ideally with backoff)
            republish_with_delay(message, retry_count + 1)
        else:
            # Retries exhausted: send to DLQ for inspection
            publish_to_dlq(message, str(e))

Troubleshooting

| Issue | Cause | Solution |
|---|---|---|
| Message loss | No persistence | Enable durable queues |
| Consumer lag | Slow processing | Scale consumers, batch processing |
| Duplicate processing | No idempotency | Implement idempotent consumers |
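The idempotency fix from the table above can be sketched with a processed-ID set; in production the set would live in Redis or behind a unique DB constraint, and the field names here are illustrative:

```python
class IdempotentConsumer:
    """Skips events whose ID has already been processed."""

    def __init__(self):
        self.seen = set()   # in production: Redis SET or unique DB key
        self.handled = []

    def handle(self, event: dict) -> bool:
        event_id = event["id"]
        if event_id in self.seen:
            return False    # duplicate delivery: ignore
        self.seen.add(event_id)
        self.handled.append(event)
        return True

c = IdempotentConsumer()
assert c.handle({"id": "e1", "type": "USER_CREATED"}) is True
assert c.handle({"id": "e1", "type": "USER_CREATED"}) is False  # duplicate
assert len(c.handled) == 1
```

This makes at-least-once delivery safe: a redelivered message is recognized by its ID and dropped rather than processed twice.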

