---
name: DB migrations and schema changes
description: >-
  Workflows and commands for managing Alembic database migrations and schema
  changes in the letta-cloud core app, including using uv, just, LETTA_PG_URI,
  and switching between SQLite and Postgres.
---

# DB migrations and schema changes (letta-cloud core)

Use this skill whenever you need to change the database schema or debug Alembic migrations in apps/core of the letta-cloud repo.

This skill assumes:

  • Working directory: apps/core
  • Migrations: Alembic in apps/core/alembic
  • Python runner: uv
  • Helper: just ready for environment + DB setup

## Quick start

  1. Ensure environment is ready:
    • just ready
  2. For Postgres migrations, set:
    • export LETTA_PG_URI=postgresql+pg8000://postgres:postgres@localhost:5432/letta-core
  3. Make your ORM/schema change.
  4. Autogenerate migration:
    • uv run alembic revision --autogenerate -m "<short_message>"
  5. Apply migration:
    • uv run alembic upgrade head

See references/migration-commands.md for exact commands and variants.

## Standard workflows

### 1. Add or modify a column (ORM-first)

  1. Identify the ORM model and table.
  2. Update the SQLAlchemy model in letta/orm/...:
    • Prefer using mixins (e.g. ProjectMixin) when available instead of duplicating columns.
  3. Run just ready if dependencies or environment may have changed.
  4. Ensure LETTA_PG_URI is set if you want the migration to target Postgres.
  5. Autogenerate Alembic revision with uv.
  6. Inspect the generated file under alembic/versions/:
    • Confirm op.add_column / op.alter_column match expectations.
  7. Apply migrations with uv run alembic upgrade head.

Use this pattern for changes like adding project_id columns via ProjectMixin.
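For orientation, an autogenerated revision for a new nullable column tends to look roughly like the sketch below. The revision identifiers, table name, and column name are all illustrative, not taken from the repo; Alembic generates the real IDs and the file lives under alembic/versions/.

```python
"""add project_id to agents (illustrative example)

Revision ID: abc123
Revises: def456
"""
import sqlalchemy as sa
from alembic import op

# Illustrative identifiers; Alembic fills in real ones on --autogenerate.
revision = "abc123"
down_revision = "def456"


def upgrade() -> None:
    # What --autogenerate typically emits for a new nullable column.
    op.add_column("agents", sa.Column("project_id", sa.String(), nullable=True))


def downgrade() -> None:
    # Keep downgrade symmetric with upgrade.
    op.drop_column("agents", "project_id")
```

When inspecting the generated file, this is the shape to compare against: one op per schema change in upgrade(), mirrored in downgrade().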

### 2. Data backfill / one-off data migration

  1. Make sure the schema change (if any) is already represented in ORM + Alembic.
  2. Create a new Alembic revision without autogenerate (or edit an autogen file) and add Python logic in upgrade() that:
    • Uses op.get_bind() and SQLAlchemy Core/SQL to backfill data.
  3. Keep downgrade() simple and safe (ideally reversible).
  4. Run against Postgres with LETTA_PG_URI set, using uv run alembic upgrade head.
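The backfill pattern in step 2 can be sketched in a self-contained way. Here an in-memory SQLite engine stands in for the connection that op.get_bind() would return inside upgrade(); the agents table and project_id column are made up for illustration:

```python
import sqlalchemy as sa

# Stand-in for the bind that op.get_bind() would return inside upgrade().
engine = sa.create_engine("sqlite://")

with engine.begin() as conn:
    # Hypothetical schema: agents gained a nullable project_id column.
    conn.execute(sa.text(
        "CREATE TABLE agents (id INTEGER PRIMARY KEY, project_id TEXT)"))
    conn.execute(sa.text("INSERT INTO agents (id) VALUES (1), (2)"))

    # The backfill itself, issued the way a migration's upgrade() would:
    # textual SQL with bound parameters against the migration connection.
    conn.execute(
        sa.text("UPDATE agents SET project_id = :pid WHERE project_id IS NULL"),
        {"pid": "default-project"},
    )

    remaining = conn.execute(
        sa.text("SELECT COUNT(*) FROM agents WHERE project_id IS NULL")
    ).scalar()

print(remaining)  # → 0 (every row was backfilled)
```

Bound parameters (:pid) keep the statement driver-agnostic, which matters when the same revision must run on both SQLite and Postgres.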

### 3. Fixing a bad migration

Typical cases:

  • Migration fails only on SQLite (ALTER constraint limitations).
  • Migration was generated while pointing at SQLite instead of Postgres.

Workflow:

  1. Identify the failing revision in alembic/versions/.
  2. If failure is SQLite-specific, prefer running migrations against Postgres by exporting LETTA_PG_URI and re-running upgrade.
  3. If logic is wrong, create a new migration that fixes the problem rather than editing an applied revision (especially in shared environments).
  4. For purely local/dev history, you can delete and regenerate migrations but only if no one else depends on them.

See references/sqlite-vs-postgres-gotchas.md for SQLite-specific issues.
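If a migration genuinely must run on SQLite as well, Alembic's batch mode rewrites ALTERs as a copy-and-swap of the whole table, which sidesteps SQLite's ALTER limitations. A hedged sketch of the upgrade() body only (table, column, and types are illustrative, not from the repo):

```python
import sqlalchemy as sa
from alembic import op


def upgrade() -> None:
    # batch_alter_table recreates the table under the hood on SQLite,
    # so constraint/nullability changes that would otherwise raise
    # NotImplementedError work there too. On Postgres it degrades to
    # plain ALTER statements.
    with op.batch_alter_table("agents") as batch_op:
        batch_op.alter_column(
            "project_id", existing_type=sa.String(), nullable=False
        )
```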

### 4. Switching between SQLite and Postgres

Alembic picks the engine based on letta.settings.DatabaseChoice and environment variables.

General rules:

  • For local dev stateful runs, just ready handles baseline migrations.
  • For schema design and production-like migrations, prefer Postgres and set LETTA_PG_URI.

Workflow for Postgres-targeted migration:

  1. export LETTA_PG_URI=postgresql+pg8000://postgres:postgres@localhost:5432/letta-core
  2. From apps/core:
    • uv run alembic upgrade head
    • uv run alembic revision --autogenerate -m "..."

### 5. Resetting local Postgres for clean migration generation

If your local Postgres database has drifted from main (e.g., applied migrations that no longer exist, or has stale schema), you can reset it to generate a clean migration.

From the repo root (/Users/sarahwooders/repos/letta-cloud):

```shell
# 1. Remove postgres data directory
rm -rf ./data/postgres

# 2. Stop the running postgres container
docker stop $(docker ps -q --filter ancestor=ankane/pgvector)

# 3. Restart services (creates fresh postgres)
just start-services

# 4. Wait a moment for postgres to be ready, then apply all migrations
cd apps/core
export LETTA_PG_URI=postgresql+pg8000://postgres:postgres@localhost:5432/letta-core
uv run alembic upgrade head

# 5. Now generate your new migration
uv run alembic revision --autogenerate -m "your migration message"
```

This ensures the migration is generated against a clean database state matching main, avoiding spurious diffs from local-only schema changes.

## Troubleshooting

  • "Target database is not up to date" when autogenerating
    • First run uv run alembic upgrade head (with appropriate engine/URI).
  • SQLite NotImplementedError about ALTER CONSTRAINT
    • Switch to Postgres by setting LETTA_PG_URI and rerun.
  • Autogenerated migration missing expected changes
    • Ensure ORM imports and metadata (Base.metadata) are correct and that the changed model is imported in Alembic env context.
  • Autogenerated migration has unexpected drops/renames
    • Review model changes; consider explicit operations instead of relying on autogenerate. Reset local Postgres (see workflow 5) to get a clean baseline.

## References

  • references/migration-commands.md — canonical commands for uv, Alembic, and just.
  • references/sqlite-vs-postgres-gotchas.md — engine-specific pitfalls and how to avoid them.
