Confiture Overview
All 4 Mediums and when to use each one.
The Sync medium copies data from one environment to another with built-in PII anonymization, so you can debug with realistic data without exposing personal information.
```shell
# Sync with anonymization
confiture sync --from production --to local --anonymize
```

```
✓ Syncing production → local
  → Anonymizing PII columns...
  → tb_user:    12,450 rows   (6,500 rows/sec with anonymization)
  → tb_post:    45,230 rows   (70,000 rows/sec)
  → tb_comment: 128,900 rows  (70,000 rows/sec)
✓ Sync complete in 4.2s
```

```shell
# Full sync with anonymization
confiture sync --from production --to local --anonymize

# Sync specific tables only
confiture sync --from production --to staging --tables users,posts

# Sync without anonymization (staging to staging)
confiture sync --from staging --to local

# Resume an interrupted sync
confiture sync --resume --checkpoint sync.json

# Dry run — show what would be synced
confiture sync --from production --to local --anonymize --dry-run
```

Configure anonymization rules in `confiture.yaml`:
```yaml
sync:
  anonymize:
    tb_user:
      email: email          # alice@example.com → user_a1b2c3@example.com
      name: name            # Alice Johnson → User A1B2
      phone: phone          # +1-555-1234 → +1-555-4567
      bio: redact           # Any value → [REDACTED]
      password_hash: hash   # Consistent hash for testing
    tb_payment:
      card_number: redact
      billing_address: redact
```

| Strategy | Input | Output | Use Case |
|---|---|---|---|
| email | alice@example.com | user_a1b2c3@example.com | Email fields |
| name | Alice Johnson | User A1B2 | Name fields |
| phone | +1-555-1234 | +1-555-4567 | Phone numbers |
| redact | Any value | [REDACTED] | Sensitive text, addresses |
| hash | Any value | Consistent hash | Passwords, tokens (preserves uniqueness) |
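The point of the hash strategy is determinism: the same input always maps to the same output, so foreign-key joins still line up and uniqueness constraints still hold after anonymization. A quick illustration of that property using a generic SHA-256 tool (the actual hash function Confiture uses is an assumption, not documented here):

```shell
# Hashing the same value twice yields identical digests (consistency),
# while different values yield different digests (uniqueness preserved).
printf 'alice@example.com' | sha256sum
printf 'alice@example.com' | sha256sum   # same digest as the line above
printf 'bob@example.com'   | sha256sum   # different digest
```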
| Mode | Throughput | Use Case |
|---|---|---|
| Without anonymization | ~70,000 rows/sec | Staging-to-staging, non-sensitive data |
| With anonymization | ~6,500 rows/sec | Production-to-local, PII data |
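The throughput figures above give a quick way to budget sync time. A rough back-of-envelope for a hypothetical 1,000,000-row sync, assuming those approximate rates hold:

```shell
# Estimated wall-clock time at the observed rates (rates are approximate)
rows=1000000
awk -v r="$rows" 'BEGIN {
  printf "with anonymization:    ~%.0f s\n", r / 6500
  printf "without anonymization: ~%.0f s\n", r / 70000
}'
# with anonymization:    ~154 s
# without anonymization: ~14 s
```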
For large syncs (millions of rows), use checkpoints to resume if interrupted:
```shell
# Start sync with checkpoint file
confiture sync --from production --to local --anonymize --checkpoint sync.json

# Resume if interrupted
confiture sync --resume --checkpoint sync.json
```
- **Build from DDL**: for fresh databases — fast setup from DDL files in under 1 second.
- **Incremental Migrations**: for schema evolution on existing databases with data.
- **Schema-to-Schema**: zero-downtime refactoring for major production changes.