
---
name: adding-models
description: Guide for adding new LLM models to Letta Code. Use when the user wants to add support for a new model, needs to know valid model handles, or wants to update the model configuration. Covers models.json configuration, CI test matrix, and handle validation.
---
# Adding Models
This skill guides you through adding a new LLM model to Letta Code.
## Quick Reference

Key files:

- `src/models.json` - Model definitions (required)
- `.github/workflows/ci.yml` - CI test matrix (optional)
- `src/tools/manager.ts` - Toolset detection logic (rarely needed)
## Workflow

### Step 1: Find Valid Model Handles
Query the Letta API to see available models:

```bash
curl -s https://api.letta.com/v1/models/ | jq '.[] | .handle'
```
Or filter by provider:

```bash
curl -s https://api.letta.com/v1/models/ | jq '.[] | select(.handle | startswith("google_ai/")) | .handle'
```
Common provider prefixes:

- `anthropic/` - Claude models
- `openai/` - GPT models
- `google_ai/` - Gemini models
- `google_vertex/` - Vertex AI
- `openrouter/` - Various providers
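The same prefix filtering can be sketched in TypeScript. This is an illustrative sketch, not code from the Letta repo: the sample handles are made up for the example, and a real script would fetch the list from the API first.

```typescript
// Sketch: filter a list of model handles by provider prefix, mirroring
// the jq filter above. Sample data only; not a live API response.
const sampleHandles: string[] = [
  "anthropic/claude-sonnet-4-5",
  "openai/gpt-4.1",
  "google_ai/gemini-3-flash-preview",
];

function filterByProvider(handles: string[], prefix: string): string[] {
  return handles.filter((h) => h.startsWith(prefix));
}

console.log(filterByProvider(sampleHandles, "google_ai/"));
// In a real script you would fetch the list first, e.g.:
//   const res = await fetch("https://api.letta.com/v1/models/");
//   const handles = (await res.json()).map((m: { handle: string }) => m.handle);
```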
### Step 2: Add to models.json

Add an entry to `src/models.json`:

```jsonc
{
  "id": "model-shortname",
  "handle": "provider/model-name",
  "label": "Human Readable Name",
  "description": "Brief description of the model",
  "isFeatured": true, // Optional: shows in featured list
  "updateArgs": {
    "context_window": 180000,
    "temperature": 1.0 // Optional: provider-specific settings
  }
}
```
Field reference:

- `id`: Short identifier used with the `--model` flag (e.g., `gemini-3-flash`)
- `handle`: Full provider/model path from the API (e.g., `google_ai/gemini-3-flash-preview`)
- `label`: Display name in the model selector
- `description`: Brief description shown in the selector
- `isFeatured`: If true, appears in the featured models section
- `updateArgs`: Model-specific configuration (context window, temperature, reasoning settings, etc.)
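The field reference above can be expressed as a TypeScript shape. The field names come from the reference; the interface itself and the `isValidEntry` helper are assumptions for illustration, not code from the Letta repo.

```typescript
// Hypothetical TypeScript shape of one models.json entry.
interface ModelEntry {
  id: string;               // short identifier used with --model
  handle: string;           // "provider/model-name"
  label: string;            // display name in the model selector
  description: string;      // brief description shown in the selector
  isFeatured?: boolean;     // optional: featured-list flag
  updateArgs?: {
    context_window?: number;
    temperature?: number;
    [key: string]: unknown; // other provider-specific settings
  };
}

// Minimal sanity check: required fields present and handle looks like
// "provider/model-name".
function isValidEntry(e: ModelEntry): boolean {
  return e.id.length > 0 && e.label.length > 0 && /^[\w-]+\/.+/.test(e.handle);
}
```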
Provider prefixes:

- `anthropic/` - Anthropic (Claude models)
- `openai/` - OpenAI (GPT models)
- `google_ai/` - Google AI (Gemini models)
- `google_vertex/` - Google Vertex AI
- `openrouter/` - OpenRouter (various providers)
### Step 3: Test the Model

Test with headless mode:

```bash
bun run src/index.ts --new --model <model-id> -p "hi, what model are you?"
```
Example:

```bash
bun run src/index.ts --new --model gemini-3-flash -p "hi, what model are you?"
```
### Step 4: Add to CI Test Matrix (Optional)

To include the model in automated testing, add it to `.github/workflows/ci.yml`:

```yaml
# Find the headless job matrix around line 122
model: [gpt-5-minimal, gpt-4.1, sonnet-4.5, gemini-pro, your-new-model, glm-4.6, haiku]
```
## Toolset Detection

Models are automatically assigned toolsets based on provider:

- `openai/*` → `codex` toolset
- `google_ai/*` or `google_vertex/*` → `gemini` toolset
- Others → `default` toolset
This is handled by `isGeminiModel()` and `isOpenAIModel()` in `src/tools/manager.ts`. You typically don't need to modify this unless adding a new provider.
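The prefix-based detection described above might look like the following. This is a sketch under the assumption that the checks are simple prefix tests; the real implementation lives in `src/tools/manager.ts`, and the `toolsetFor` helper here is hypothetical.

```typescript
// Sketch of provider-based toolset detection (function bodies are
// assumptions; only the names come from the text above).
function isOpenAIModel(handle: string): boolean {
  return handle.startsWith("openai/");
}

function isGeminiModel(handle: string): boolean {
  return handle.startsWith("google_ai/") || handle.startsWith("google_vertex/");
}

// Hypothetical dispatcher mapping a handle to its toolset.
function toolsetFor(handle: string): "codex" | "gemini" | "default" {
  if (isOpenAIModel(handle)) return "codex";
  if (isGeminiModel(handle)) return "gemini";
  return "default";
}
```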
## Common Issues

**"Handle not found" error**: The model handle is incorrect. Run the validation script to see valid handles.

**Model works but wrong toolset**: Check `src/tools/manager.ts` to ensure the provider prefix is recognized.