
---
name: DB migrations and schema changes
description: >-
  Workflows and commands for managing Alembic database migrations and schema
  changes in the letta-cloud core app, including using uv, just, LETTA_PG_URI,
  and switching between SQLite and Postgres.
---
# DB migrations and schema changes (letta-cloud core)

Use this skill whenever you need to change the database schema or debug Alembic
migrations in `apps/core` of the letta-cloud repo.
This skill assumes:

- Working directory: `apps/core`
- Migrations: Alembic in `apps/core/alembic`
- Python runner: `uv`
- Helper: `just ready` for environment + DB setup
## Quick start

- Ensure the environment is ready: `just ready`
- For Postgres migrations, set:
  `export LETTA_PG_URI=postgresql+pg8000://postgres:postgres@localhost:5432/letta-core`
- Make your ORM/schema change.
- Autogenerate a migration: `uv run alembic revision --autogenerate -m "<short_message>"`
- Apply the migration: `uv run alembic upgrade head`
See `references/migration-commands.md` for exact commands and variants.
## Standard workflows

### 1. Add or modify a column (ORM-first)
- Identify the ORM model and table.
- Update the SQLAlchemy model in `letta/orm/...`:
  - Prefer using mixins (e.g. `ProjectMixin`) when available instead of duplicating columns.
- Run `just ready` if dependencies or the environment may have changed.
- Ensure `LETTA_PG_URI` is set if you want the migration to target Postgres.
- Autogenerate an Alembic revision with `uv`.
- Inspect the generated file under `alembic/versions/`:
  - Confirm `op.add_column`/`op.alter_column` match expectations.
- Apply migrations with `uv run alembic upgrade head`.
Use this pattern for changes like adding `project_id` columns via `ProjectMixin`.
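
For orientation only, here is a minimal, self-contained sketch of the ORM-first pattern. It is not the repo's actual code: the `Base`, `ProjectMixin`, and `Tool` definitions below are hypothetical stand-ins for the real classes under `letta/orm/...`.

```python
# Hypothetical sketch (not letta-cloud's real models): a mixin contributes
# project_id so each model does not redeclare the column.
from sqlalchemy import String
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    """Stand-in for the repo's declarative base."""


class ProjectMixin:
    """Stand-in for the real ProjectMixin: adds a project_id column to any model."""

    project_id: Mapped[str | None] = mapped_column(String, nullable=True, index=True)


class Tool(Base, ProjectMixin):
    """Hypothetical model that picks up project_id via the mixin."""

    __tablename__ = "tools"

    id: Mapped[str] = mapped_column(String, primary_key=True)
    # A newly added column; autogenerate should emit op.add_column for it.
    description: Mapped[str | None] = mapped_column(String, nullable=True)
```

After the autogenerate step, the new file under `alembic/versions/` should contain the matching `op.add_column` calls for review.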
### 2. Data backfill / one-off data migration
- Make sure the schema change (if any) is already represented in the ORM + Alembic.
- Create a new Alembic revision without autogenerate (or edit an autogenerated file) and add Python logic in `upgrade()` that:
  - Uses `op.get_bind()` and SQLAlchemy Core/SQL to backfill data (see the sketch after this list).
- Keep `downgrade()` simple and safe (ideally reversible).
- Run against Postgres with `LETTA_PG_URI` set, using `uv run alembic upgrade head`.
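
A minimal sketch of what such a revision's `upgrade()` can look like, assuming a hypothetical `tools` table with a `project_id` column; the revision identifiers are placeholders, not real entries from `alembic/versions/`:

```python
"""Illustrative backfill revision (table, column, and revision ids are hypothetical)."""
import sqlalchemy as sa
from alembic import op

# Placeholder revision identifiers.
revision = "aaaaaaaaaaaa"
down_revision = "bbbbbbbbbbbb"
branch_labels = None
depends_on = None


def upgrade() -> None:
    # op.get_bind() returns the connection the migration is running on.
    bind = op.get_bind()

    # Lightweight table construct for Core-level updates (no ORM models needed).
    tools = sa.table(
        "tools",
        sa.column("id", sa.String),
        sa.column("project_id", sa.String),
    )

    # Backfill rows created before the column existed.
    bind.execute(
        tools.update()
        .where(tools.c.project_id.is_(None))
        .values(project_id="default-project")
    )


def downgrade() -> None:
    # Keep downgrade simple and safe; the backfill is not undone here.
    pass
```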
### 3. Fixing a bad migration
Typical cases:
- Migration fails only on SQLite (ALTER constraint limitations).
- Migration was generated while pointing at SQLite instead of Postgres.
Workflow:
- Identify the failing revision in `alembic/versions/`.
- If the failure is SQLite-specific, prefer running migrations against Postgres by exporting `LETTA_PG_URI` and re-running the upgrade.
- If the logic is wrong, create a new migration that fixes the problem rather than editing an already-applied revision (especially in shared environments).
- For purely local/dev history, you can delete and regenerate migrations, but only if no one else depends on them.
See `references/sqlite-vs-postgres-gotchas.md` for SQLite-specific issues.
### 4. Switching between SQLite and Postgres

Alembic picks the engine based on `letta.settings.DatabaseChoice` and
environment variables.
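
As a rough illustration only (this is not the repo's actual `alembic/env.py`), a common way to implement that kind of selection is to prefer `LETTA_PG_URI` when it is set and fall back to a SQLite URL otherwise:

```python
# Illustrative only; letta-cloud's real env.py goes through letta.settings.DatabaseChoice.
import os

# Placeholder fallback; the repo's actual SQLite path will differ.
DEFAULT_SQLITE_URL = "sqlite:///letta.db"


def get_migration_url() -> str:
    """Prefer Postgres when LETTA_PG_URI is exported, otherwise use local SQLite."""
    return os.environ.get("LETTA_PG_URI") or DEFAULT_SQLITE_URL


# In env.py this would typically feed the Alembic config, e.g.:
# config.set_main_option("sqlalchemy.url", get_migration_url())
```

This is why exporting `LETTA_PG_URI` before running `uv run alembic ...` is enough to retarget migrations at Postgres.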
General rules:

- For local dev stateful runs, `just ready` handles baseline migrations.
- For schema design and production-like migrations, prefer Postgres and set `LETTA_PG_URI`.
Workflow for a Postgres-targeted migration, from `apps/core`:

```bash
export LETTA_PG_URI=postgresql+pg8000://postgres:postgres@localhost:5432/letta-core
uv run alembic upgrade head
uv run alembic revision --autogenerate -m "..."
```
### 5. Resetting local Postgres for clean migration generation
If your local Postgres database has drifted from main (e.g., applied migrations that no longer exist, or has stale schema), you can reset it to generate a clean migration.
From the repo root (/Users/sarahwooders/repos/letta-cloud):
```bash
# 1. Remove postgres data directory
rm -rf ./data/postgres

# 2. Stop the running postgres container
docker stop $(docker ps -q --filter ancestor=ankane/pgvector)

# 3. Restart services (creates fresh postgres)
just start-services

# 4. Wait a moment for postgres to be ready, then apply all migrations
cd apps/core
export LETTA_PG_URI=postgresql+pg8000://postgres:postgres@localhost:5432/letta-core
uv run alembic upgrade head

# 5. Now generate your new migration
uv run alembic revision --autogenerate -m "your migration message"
```
This ensures the migration is generated against a clean database state matching main, avoiding spurious diffs from local-only schema changes.
## Troubleshooting

- "Target database is not up to date" when autogenerating
  - First run `uv run alembic upgrade head` (with the appropriate engine/URI).
- SQLite `NotImplementedError` about ALTER CONSTRAINT
  - Switch to Postgres by setting `LETTA_PG_URI` and rerun.
- Autogenerated migration missing expected changes
  - Ensure ORM imports and metadata (`Base.metadata`) are correct and that the changed model is imported in the Alembic env context (see the sketch after this list).
- Autogenerated migration has unexpected drops/renames
  - Review model changes; consider explicit operations instead of relying on autogenerate. Reset local Postgres (see workflow 5) to get a clean baseline.
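
For the "missing expected changes" case, the fragment below shows the generic shape of what the Alembic env context needs; the module names are hypothetical, not letta-cloud's actual layout:

```python
# Illustrative alembic/env.py fragment (module names are hypothetical).
# Autogenerate only diffs tables registered on target_metadata, so every
# changed model module must be imported before metadata is handed over.
from myapp.orm.base import Base      # hypothetical declarative Base
import myapp.orm.tool  # noqa: F401  # hypothetical model module; importing registers its table

target_metadata = Base.metadata
```

If a model's module is never imported (directly or via the ORM package's `__init__`), its table is absent from `Base.metadata` and autogenerate silently skips it.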
## References

- `references/migration-commands.md` — canonical commands for `uv`, Alembic, and `just`.
- `references/sqlite-vs-postgres-gotchas.md` — engine-specific pitfalls and how to avoid them.