This skill should be used when the user asks about "Atlas migrate commands", "plan migrations", "apply migrations", "lint migrations", "atlas migrate diff", "atlas migrate apply", "rollback migrations", "migration workflow", "versioned migrations", "declarative migrations", "atlas.hcl configuration", or needs guidance on executing Atlas migration commands and workflows
Learn how to plan, apply, lint, and manage database migrations using Atlas commands.
Compare current database state with desired state and generate migration plan:
# Generate a new migration file from schema changes (requires a dev database)
atlas migrate diff --env local
# Same, with an explicit migration name
atlas migrate diff migration_name --env local
# Dry-run the declarative flow: show planned SQL without applying
atlas schema apply --env local --dry-run
# View generated migration file
cat migrations/20240115120000_create_users.sql
# Expected output:
-- alter table "users" to add column "email"
ALTER TABLE users ADD COLUMN email VARCHAR(255) NOT NULL;
CREATE UNIQUE INDEX idx_email ON users(email);
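Versioned migration file names follow a timestamp-plus-name shape. A sketch of that naming scheme, for illustration only (Atlas generates these names itself via `migrate diff` / `migrate new`):

```shell
# Build an Atlas-style versioned file name: a 14-digit UTC timestamp
# prefix followed by the migration name (illustrative sketch only)
name="create_users"
version=$(date -u +%Y%m%d%H%M%S)
file="migrations/${version}_${name}.sql"
echo "$file"
```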
# Apply all pending migrations
atlas migrate apply --env local
# Apply with verbose output
atlas migrate apply --env local --verbose
# Dry-run before applying
atlas migrate apply --env local --dry-run
# Apply to specific database
atlas migrate apply --url "mysql://user:pass@localhost/mydb"
# Check migration status
atlas migrate status --env local
# Output shows applied and pending migrations
# Applied: 3 migrations
# Pending: 0 migrations
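The status output above lends itself to a CI guard. A sketch, assuming the `Pending:` line shown above; `check_pending` and the overridable `STATUS_CMD` variable are hypothetical conveniences, not Atlas features:

```shell
# Hypothetical CI guard: fail when pending migrations exist.
# STATUS_CMD is overridable so the parsing logic can be exercised
# without a live database.
STATUS_CMD="${STATUS_CMD:-atlas migrate status --env local}"

check_pending() {
  out=$($STATUS_CMD) || return 1
  if echo "$out" | grep -q "Pending: 0"; then
    echo "database is up to date"
  else
    echo "pending migrations detected" >&2
    return 1
  fi
}
```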
# Create an empty migration file with a timestamp prefix
atlas migrate new create_users --env local
# Creates a file such as:
# migrations/20240115120000_create_users.sql
# Edit the migration file and add SQL
cat > migrations/20240115120000_create_users.sql << 'EOF'
-- Create users table
CREATE TABLE users (
id INT PRIMARY KEY AUTO_INCREMENT,
email VARCHAR(255) NOT NULL UNIQUE,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
EOF
# Recalculate the directory checksum (atlas.sum) after manual edits,
# otherwise subsequent migrate commands will fail the integrity check
atlas migrate hash --dir file://migrations
# Generate a migration from the schema definition
atlas migrate diff add_posts_table --env local
# Atlas replays the migration directory on the dev database,
# compares the result with schema.hcl, and writes a new migration file
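For context, a minimal schema.hcl (PostgreSQL dialect) that such a diff could be generated from; the table and column names here are illustrative assumptions, not a fixed convention:

```hcl
schema "public" {}

table "users" {
  schema = schema.public
  column "id" {
    type = int
  }
  column "email" {
    type = varchar(255)
    null = false
  }
  primary_key {
    columns = [column.id]
  }
  index "idx_email" {
    unique  = true
    columns = [column.email]
  }
}
```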
Validate migrations for errors and best practices:
# Lint all migrations
atlas migrate lint --env local
# Lint only the latest 2 migration files
atlas migrate lint --env local --latest 2
# Check for:
# - Syntax errors
# - Unsafe operations
# - Data loss risks
# - Best practice violations
Example lint output:
Migration 001_create_users.sql:
⚠ WARNING: Dropping column "temp" without checking if data exists
✓ All other checks passed
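Lint policy can also be set in atlas.hcl instead of flags. A sketch, assuming the project-level `lint` block syntax from Atlas's linting configuration:

```hcl
# atlas.hcl - treat destructive changes (e.g. DROP TABLE, DROP COLUMN)
# as errors instead of warnings, and analyze only recent files
lint {
  destructive {
    error = true
  }
  latest = 2
}
```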
# Apply all pending migrations
atlas migrate apply --env local
# Apply only the next 2 pending migrations
atlas migrate apply 2 --env local
# Wrap the entire run in a single transaction
atlas migrate apply --env local --tx-mode all
# Revert the most recently applied migration
atlas migrate down --env local
# Revert down to a specific migration version
atlas migrate down --env local --to-version 20240101120000
env "local" {
  # Target database connection
  url = "postgres://user:password@localhost/mydb"
  # Dev database used for planning and validation (must be empty)
  dev = "postgres://user:password@localhost/mydb_dev"
  # Desired schema source
  src = "file://schema.hcl"
  migration {
    dir = "file://migrations"
  }
  # Output formatting
  format {
    migrate {
      apply = "{{ sql . }}"
      diff  = "{{ sql . }}"
    }
  }
}
env "prod" {
  url = "postgres://user:[email protected]/mydb"
  src = "file://schema.hcl"
  migration {
    dir = "file://migrations"
  }
  format {
    migrate {
      apply = "-- Migration applied\n{{ sql . }}"
    }
  }
  # Approval is controlled at run time: atlas schema apply prompts
  # for confirmation unless invoked with --auto-approve
}
| Command | Purpose |
|---|---|
| atlas migrate diff | Generate a versioned migration from schema changes |
| atlas migrate apply | Execute pending migrations |
| atlas migrate status | Show applied and pending migrations |
| atlas migrate lint | Validate migrations |
| atlas migrate new | Create an empty versioned migration |
| atlas migrate down | Revert applied migrations |
| atlas schema inspect | Inspect the current schema |
| atlas schema diff | Compare two schemas |
// atlas.hcl
env "dev" {
  url = "postgres://localhost/dev_db"
  migration { dir = "file://migrations" }
}
env "staging" {
  url = "postgres://staging.example.com/staging_db"
  migration { dir = "file://migrations" }
}
env "prod" {
  url = "postgres://prod.example.com/prod_db"
  migration {
    dir = "file://migrations"
  }
  // Approval for prod is controlled at apply time
  // (e.g. atlas schema apply --auto-approve), not in this block
}
Running migrations per environment:
atlas migrate apply --env dev
atlas migrate apply --env staging
atlas migrate apply --env prod
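The three commands above can be wrapped in a small promotion helper. A sketch, where `promote` and the overridable `ATLAS` variable are hypothetical conveniences, not part of Atlas:

```shell
# Hypothetical promotion helper: apply pending migrations to each
# environment in order, stopping at the first failure.
# ATLAS is overridable (e.g. ATLAS="echo atlas") to dry-run the loop.
ATLAS="${ATLAS:-atlas}"

promote() {
  for env in "$@"; do
    echo "==> applying migrations to $env"
    $ATLAS migrate apply --env "$env" || {
      echo "migration failed on $env; stopping" >&2
      return 1
    }
  done
}
```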
# Apply to a database that already contains objects not created by Atlas
atlas migrate apply --allow-dirty --env local
# Review specific migration
atlas migrate lint --env local --latest 1
-- 001_create_users.sql
CREATE TABLE users (id INT PRIMARY KEY);
-- 001_create_users.down.sql (paired .down.sql files are used by the
-- golang-migrate directory format, not Atlas's default format)
DROP TABLE users;
Apply down migration:
atlas migrate down --env local
Modify schema.hcl to previous state:
// schema.hcl - Remove table definition
// Atlas generates a DROP TABLE migration
Then plan and apply:
atlas migrate diff rollback --env local
atlas migrate apply --env local
# Check migration status
atlas migrate status --env local
# Reset to specific version (careful!)
atlas migrate set --env local 20240101120000
# Inspect actual database schema
atlas schema inspect --env local
# Compare the live database with the desired schema
atlas schema diff --from "postgres://user:pass@localhost/mydb" \
  --to "file://schema.hcl" --dev-url "docker://postgres/15/dev"
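As a purely illustrative sketch of what a drift check does under the hood, plain `diff` over two schema dump files (which `atlas schema diff` handles natively):

```shell
# Compare two schema snapshots with plain diff; the file contents
# here are made-up examples standing in for real inspect output
cat > expected.sql <<'EOF'
CREATE TABLE users (id INT PRIMARY KEY);
EOF
cat > actual.sql <<'EOF'
CREATE TABLE users (id INT PRIMARY KEY);
CREATE TABLE temp_backup (id INT);
EOF
if diff -u expected.sql actual.sql > drift.txt; then
  echo "no drift"
else
  echo "schema drift detected"
fi
```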
For complete migration workflow documentation, see:
- references/atlas-docs-full/docs.md - Migration overview (declarative & versioned)
- references/atlas-docs-full/guides/modern-database-ci-cd.md - Modern CI/CD patterns
- references/atlas-docs-full/guides/migration-dirs/template-directory.md - Migration directory templates
- references/README.md - Full documentation index