# Import & Export
TravStats keeps your data portable on purpose — bulk import to seed a fresh instance, round-trip CSV/XLSX for quick mass-edits in Excel, and a Postgres-level backup when you want a complete snapshot.
## Bulk import (JSON / CSV / XLSX)

**Flights → Import → File** takes any of three formats through the same shared pipeline (`parseImportFile`):
| Format | Use it for |
|---|---|
| JSON | Restoring a TravStats export to another instance, scripted bulk imports |
| CSV | Migrating from FlightDiary, OpenFlights, or any spreadsheet |
| XLSX | Round-tripping flights through Excel for fast mass edits |
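As a sketch of what a shared dispatch step like `parseImportFile` might do — the function name comes from the pipeline mentioned above, but this Python port and its return values are purely illustrative:

```python
from pathlib import Path

def parse_import_file(path: str) -> str:
    """Route an uploaded file to a format handler by extension.
    Illustrative only; not TravStats's actual implementation."""
    suffix = Path(path).suffix.lower()
    handlers = {".json": "json", ".csv": "csv", ".xlsx": "xlsx"}
    if suffix not in handlers:
        raise ValueError(f"unsupported import format: {suffix}")
    return handlers[suffix]
```

All three formats funnel into one code path, which is why the column layout below applies regardless of which file type you upload.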
## Round-trip XLSX (since v1.3.0)

Export the flight list to `.xlsx`, edit in Excel, and re-import the same file:
- Rows with an `id` column are matched against existing flights and updated.
- Rows without `id` are created as new flights.
- Empty cells leave the existing value alone; they don't blank fields.
Useful for one-off corrections like fixing a misspelled aircraft type on 200 flights or applying a tag to a year of trips.
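The three merge rules can be sketched as follows; `merge_row` and `import_rows` are hypothetical helpers for illustration, not TravStats's actual implementation:

```python
def merge_row(existing: dict, row: dict) -> dict:
    """Non-empty cells overwrite the stored value; empty cells
    leave the existing field untouched."""
    merged = dict(existing)
    for key, value in row.items():
        if value not in (None, ""):  # empty cell: keep existing value
            merged[key] = value
    return merged

def import_rows(flights_by_id: dict, rows: list) -> None:
    """Rows with a known 'id' update that flight in place;
    rows without 'id' become new flights."""
    next_id = max(flights_by_id, default=0) + 1
    for row in rows:
        rid = row.get("id")
        if rid in flights_by_id:
            flights_by_id[rid] = merge_row(flights_by_id[rid], row)
        else:
            flights_by_id[next_id] = {k: v for k, v in row.items()
                                      if v not in (None, "")}
            next_id += 1
```

The key property to notice is the empty-cell rule: a sparse spreadsheet that only fills the columns you want to change is a safe mass-edit tool.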
## CSV column layout

The same columns apply regardless of source:

`date`, `departureIata`, `arrivalIata`, `flightNumber`, `airline`, `aircraft`, `departureLocal`, `arrivalLocal`, `depTimezone`, `arrTimezone`, `seat`, `class`, `bookingReference`, `tags`

Times are local in the `depTimezone` / `arrTimezone` IANA zone (e.g. `Europe/Berlin`, `America/New_York`). The server converts them to canonical UTC on save — the v1.2.0 contract change means raw `departureTime` strings are no longer accepted on writes.
The FlightDiary comparison page has a column-by-column mapping for migrating in.
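The local-time-plus-IANA-zone contract can be reproduced with Python's standard `zoneinfo` module; this `to_utc` helper is an illustration of the conversion, not the server's code:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def to_utc(local: str, zone: str) -> str:
    """Interpret a local wall-clock time (e.g. departureLocal) in its
    IANA zone and return the canonical UTC instant, mirroring what
    the server does on save."""
    dt = datetime.fromisoformat(local).replace(tzinfo=ZoneInfo(zone))
    return dt.astimezone(timezone.utc).isoformat()
```

For example, `to_utc("2026-05-01T14:30", "Europe/Berlin")` yields `2026-05-01T12:30:00+00:00`, since Berlin observes CEST (UTC+2) in May; DST is resolved by the zone database, which is exactly why the CSV carries zone names rather than fixed offsets.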
## Personal export

**Settings → Data → Export** downloads everything you own from the running instance:
- All flights with full metadata (and any cruise voyages once v2 lands)
- Tags, trips, achievements
- Personal settings (UI language, units, default tags) — but not instance-level admin settings or other users’ data
The export is a single JSON file that can be re-imported into another TravStats instance via the bulk-import endpoint.
## Database backups

Personal exports are convenient; full Postgres dumps are durable. TravStats has automated backups built in.
## Built-in scheduler

**Admin → Settings → Backups**:
- Schedule — cron expression (default: daily at 02:00 UTC)
- Retention — how many backup files to keep on disk
- Location — `/app/data/backups` inside the container, persisted via the named volume
- Optional WebDAV sync — Nextcloud, HiDrive, or any WebDAV-compatible target receives an off-host copy
Backups are gzipped pg_dump outputs (full schema + data) named with
the timestamp. Restore them with the dump→psql workflow below.
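Retention pruning boils down to: keep the N newest timestamp-named dumps and delete the rest. The sketch below illustrates what the setting does; it is not TravStats's own pruning code:

```python
from pathlib import Path

def prune_backups(backup_dir: str, retain: int) -> list:
    """Delete all but the `retain` newest dumps. Timestamp-based
    names (flights-YYYY-MM-DD.sql.gz) sort chronologically, so a
    plain lexicographic sort orders them oldest-first."""
    dumps = sorted(Path(backup_dir).glob("*.sql.gz"))
    stale = dumps[:-retain] if retain > 0 else dumps
    for f in stale:
        f.unlink()
    return [f.name for f in dumps[len(stale):]]  # surviving files
```

This is also why timestamped names matter: date-encoded filenames make the "newest N" decision a sort, with no need to stat each file.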
## Manual pg_dump

For a fresh dump on demand, no scheduler involved:

```sh
docker exec travstats-db pg_dump -U flights flights | gzip > flights-$(date +%F).sql.gz
```

To restore into a fresh container:
```sh
# Wipe the existing public schema (DESTRUCTIVE — have a backup ready)
docker exec -i travstats-db psql -U flights -d flights -c \
  "DROP SCHEMA public CASCADE; CREATE SCHEMA public;"

# Replay the dump
gunzip -c flights-2026-05-01.sql.gz | docker exec -i travstats-db psql -U flights flights

# Restart the app to re-run migrations against the restored DB
docker compose -f docker-compose.prod.yml restart app
```

The app's first boot after the restore re-applies any migrations that the dump pre-dates; re-running them is idempotent.
## Migrating between instances

Same-version migration (TravStats 1.3.0 → TravStats 1.3.0):

1. `pg_dump` the source instance (above).
2. Stand up the target instance with the same major.minor version.
3. Restore the dump on the target via the steps above.
That’s it — the JWT secret, encryption keys, and instance settings are part of the dump and travel with the data.
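The same-major.minor precondition is cheap to check before restoring; `same_minor` here is a hypothetical guard, not part of TravStats:

```python
def same_minor(a: str, b: str) -> bool:
    """True when two version strings share major.minor — the
    precondition for restoring a dump straight into the target.
    Illustrative check, not shipped TravStats code."""
    return a.split(".")[:2] == b.split(".")[:2]
```

For example, `same_minor("1.3.0", "1.3.2")` passes (patch releases are fine), while `same_minor("1.2.9", "1.3.0")` fails and should send you to the cross-version procedure below.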
Cross-version migration (TravStats 1.2.x → TravStats 1.3.0):
- Restore the older dump into the older version first, upgrade that instance to the new version (which runs migrations against your real data), and only then dump again to migrate hardware.
- Don't restore a 1.2.x dump directly into a fresh 1.3.0 instance and let migrations run on top of it — the migrations expect to evolve a database holding real data, not to import a legacy schema. The order matters.
The CHANGELOG flags any release that changes the export format; between versions where the format hasn't changed, imports are schema-stable and need no special steps.