FORK of the subdirectory https://github.com/schickling/dockerfiles/tree/master/postgres-backup-s3
Why? Because the version on Docker Hub is very outdated. Extracting it into a single-image repo makes it possible to use convenient GitHub Actions build flows and to publish the image to ghcr.io.
Backup PostgreSQL to S3 (supports periodic backups)
Docker:
$ docker run \
    -e S3_ACCESS_KEY_ID=key \
    -e S3_SECRET_ACCESS_KEY=secret \
    -e S3_BUCKET=my-bucket \
    -e S3_PREFIX=backup \
    -e POSTGRES_DATABASE=dbname \
    -e POSTGRES_USER=user \
    -e POSTGRES_PASSWORD=password \
    -e POSTGRES_HOST=localhost \
    schickling/postgres-backup-s3
Docker Compose:
postgres:
  image: postgres
  environment:
    POSTGRES_USER: user
    POSTGRES_PASSWORD: password

pgbackups3:
  image: schickling/postgres-backup-s3
  depends_on:
    - postgres
  links:
    - postgres
  environment:
    SCHEDULE: '@daily'
    S3_REGION: region
    S3_ACCESS_KEY_ID: key
    S3_SECRET_ACCESS_KEY: secret
    S3_BUCKET: my-bucket
    S3_PREFIX: backup
    POSTGRES_BACKUP_ALL: "false"
    POSTGRES_HOST: host
    POSTGRES_DATABASE: dbname
    POSTGRES_USER: user
    POSTGRES_PASSWORD: password
    POSTGRES_EXTRA_OPTS: '--schema=public --blobs'
You can additionally set the SCHEDULE environment variable (e.g. -e SCHEDULE="@daily") to run the backup automatically on a recurring schedule. More information about the scheduling can be found here.
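For example, a periodic run started from the command line might look like the sketch below. All credential and host values are placeholders, and the -d flag simply keeps the container running in the background so the scheduler can fire; whether your scheduler also accepts raw cron expressions besides predefined values like @daily depends on the image's cron implementation.

```sh
# Run detached so the in-container scheduler keeps firing.
# All values below are placeholders -- substitute your own.
docker run -d \
    -e SCHEDULE="@daily" \
    -e S3_ACCESS_KEY_ID=key \
    -e S3_SECRET_ACCESS_KEY=secret \
    -e S3_BUCKET=my-bucket \
    -e S3_PREFIX=backup \
    -e POSTGRES_DATABASE=dbname \
    -e POSTGRES_USER=user \
    -e POSTGRES_PASSWORD=password \
    -e POSTGRES_HOST=postgres \
    schickling/postgres-backup-s3
```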
You can back up all available databases by setting POSTGRES_BACKUP_ALL="true". A single archive named all_<timestamp>.sql.gz will be uploaded to S3.
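A sketch of a one-off backup of every database, assuming placeholder credentials; POSTGRES_DATABASE is omitted here because all databases are dumped into the single archive:

```sh
# Placeholder values throughout -- substitute your own.
docker run \
    -e POSTGRES_BACKUP_ALL="true" \
    -e POSTGRES_HOST=postgres \
    -e POSTGRES_USER=user \
    -e POSTGRES_PASSWORD=password \
    -e S3_ACCESS_KEY_ID=key \
    -e S3_SECRET_ACCESS_KEY=secret \
    -e S3_BUCKET=my-bucket \
    -e S3_PREFIX=backup \
    schickling/postgres-backup-s3
# Result: a single all_<timestamp>.sql.gz object under s3:https://my-bucket/backup/
```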
An endpoint is the URL of the entry point for an AWS web service or an S3-compatible storage provider. You can specify an alternate endpoint by setting the S3_ENDPOINT environment variable, like protocol:https://endpoint.
Note: S3-compatible storage providers require the S3_ENDPOINT environment variable to be set.
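For instance, pointing the image at a self-hosted S3-compatible store might look like the sketch below. The hostname minio.example.com is a hypothetical placeholder, as are all credentials; the only assumption is that the provider speaks the S3 API over HTTPS at that URL.

```sh
# minio.example.com and all credentials are placeholders.
docker run \
    -e S3_ENDPOINT=https://minio.example.com \
    -e S3_ACCESS_KEY_ID=key \
    -e S3_SECRET_ACCESS_KEY=secret \
    -e S3_BUCKET=my-bucket \
    -e S3_PREFIX=backup \
    -e POSTGRES_DATABASE=dbname \
    -e POSTGRES_USER=user \
    -e POSTGRES_PASSWORD=password \
    -e POSTGRES_HOST=postgres \
    schickling/postgres-backup-s3
```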