
Backup Service: zip2cloud

The zip2cloud application is a shell script that manages backup dumps, compresses them into zip files, compares them with existing backups in remote storage, and uploads any new or updated backups to the remote storage.

Environment Variables

The script uses the following environment variables:

| Variable | Description | Default |
| --- | --- | --- |
| COMPRESSION_LEVEL | Compression level for 7z files | 0 |
| DELETE_DUMP | If set, deletes exports under $DUMP_BASE after compression | Unset |
| DUMP_BASE | Base directory for dumps | /dump/full_backup |
| DUMP_RETENTION | Number of dumps to retain | 3 |
| ENABLE_UPLOAD | Enables upload of backups to remote storage | true |
| REMOTE | Remote storage destination | remote:${BUCKET}/${BUCKETPATH} |
| SECRET | Encryption key for 7z files | (none) |
| SLACK_CHANNEL | Slack channel for notifications | Unset |
| SLACK_WEBHOOK | Slack webhook for notifications | Unset |
| ZIP_BASE | Base name for zip files | backup_full |
| ZIP_DIR | Directory for zip files | /zip |
| ZIP_RETENTION | Number of zip files to retain | 4 |

Workflow

The script performs the following steps:

  1. Cleanup: Removes old zip files and backup dumps based on the retention policies set in the environment variables.
  2. Zip: Creates .7z archives of dump dirs (formatted as YYYY-MM-DD) in the $DUMP_BASE.
  3. Checksum: Retrieves a list of remote backups and downloads the MD5 checksums for each remote backup into a temporary directory. It then compares the checksums of local zip files against the remote MD5 checksums, adding any files that don't match to an upload list.
  4. Create Upload List: Verifies and updates the list of files to upload. For each file in the upload list, it compares the local and remote MD5 checksums. If there's a mismatch, it increments the filename and adds it to the final upload list. This incrementing process continues until it finds a filename that doesn't conflict with existing files in the remote storage.
  5. Upload: Uploads the files in the final upload list to the remote storage using the rclone command.
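The checksum comparison in steps 3 and 4 can be sketched with standard coreutils. The directory layout, file names, and the `_1` rename suffix below are illustrative assumptions, not the script's exact logic:

```shell
#!/bin/sh
# Sketch of steps 3-4: compare local zips against downloaded remote .md5
# files and build an upload list. All paths and names are illustrative.
ZIP_DIR=$(mktemp -d)
MD5_DIR=$(mktemp -d)   # stands in for the temp dir of downloaded remote checksums

# A local zip whose remote counterpart has a different checksum:
echo "new contents" > "$ZIP_DIR/backup_full_2024-01-07.7z"
echo "old contents" | md5sum | cut -d' ' -f1 > "$MD5_DIR/backup_full_2024-01-07.7z.md5"
# A local zip with no remote counterpart at all:
echo "brand new" > "$ZIP_DIR/backup_full_2024-01-08.7z"

upload_list=""
for zip in "$ZIP_DIR"/*.7z; do
    name=$(basename "$zip")
    local_md5=$(md5sum "$zip" | cut -d' ' -f1)
    remote_md5_file="$MD5_DIR/$name.md5"
    if [ ! -f "$remote_md5_file" ]; then
        # New backup with no remote copy: upload as-is.
        upload_list="$upload_list $name"
    elif [ "$local_md5" != "$(cat "$remote_md5_file")" ]; then
        # Mismatch: upload under an incremented name so the existing remote
        # copy is never overwritten (the suffix convention is an assumption).
        upload_list="$upload_list ${name%.7z}_1.7z"
    fi
done
echo "to upload:$upload_list"
```

Only files that are new or whose checksum differs from the remote copy end up on the list, so unchanged backups are never re-uploaded.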

Dockerfile

The Dockerfile for this application is based on the alpine:latest image and includes the necessary binaries and files for the zip2cloud script. The Dockerfile uses a multi-stage build process to keep the final image size small.
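As a rough illustration of that shape (not the repository's actual Dockerfile), a multi-stage build might fetch the rclone binary in a throwaway stage and copy only the needed binaries into the final alpine image; the package names, script path, and rclone download URL below are assumptions:

```dockerfile
# Illustrative multi-stage sketch, not the actual Dockerfile.
FROM alpine:latest AS fetch
RUN apk add --no-cache curl unzip \
 && curl -fsSL https://downloads.rclone.org/rclone-current-linux-amd64.zip -o /tmp/rclone.zip \
 && unzip /tmp/rclone.zip -d /tmp \
 && install -m 0755 /tmp/rclone-*/rclone /usr/local/bin/rclone

FROM alpine:latest
RUN apk add --no-cache bash 7zip coreutils
COPY --from=fetch /usr/local/bin/rclone /usr/local/bin/rclone
COPY zip2cloud.sh /app/zip2cloud.sh
ENTRYPOINT ["/app/zip2cloud.sh"]
```

Because the curl/unzip toolchain stays in the first stage, the final image carries only the runtime binaries.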

GitHub Actions

The application uses GitHub Actions for continuous integration. The workflows are defined in the .github/workflows/ directory and include steps for building, tagging, and pushing Docker images, as well as scanning for vulnerabilities with Trivy.
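A minimal workflow of the kind described might look like the following; the job names, image tag scheme, and action versions are assumptions, and the actual definitions live in .github/workflows/:

```yaml
# Illustrative sketch only; see .github/workflows/ for the real workflows.
name: build
on: [push, pull_request]
jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build image
        run: docker build -t ghcr.io/${{ github.repository }}:${{ github.sha }} .
      - name: Scan with Trivy
        uses: aquasecurity/trivy-action@master
        with:
          image-ref: ghcr.io/${{ github.repository }}:${{ github.sha }}
      - name: Push image
        run: docker push ghcr.io/${{ github.repository }}:${{ github.sha }}
```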


Previous Version

db_zip2cloud

This is a simple cron container for backing up databases such as ArangoDB. It compresses the backups, then synchronizes a remote S3 bucket against a local archive of the compressed backups.

Operation

  1. [OPTIONAL] Perform a database dump based on environment variables provided, and place it in /dump/
  2. Use 7zip to compress and encrypt the contents of the /dump/ directory and put it into /zip/
    • The resulting zip will have "dump/" as the relative root directory
  3. Prune any files in /zip/ that are older than 30 days
  4. Use rclone with an AWS S3 compatible provider to synchronize /zip/ against a remote S3 bucket and directory. Currently configured for Google Cloud Storage in the rclone.conf file
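Steps 2 through 4 correspond roughly to the commands below. Only the pruning step is executed in this sketch; the 7z and rclone command lines are echoed rather than run so it works without those tools installed, and their flags are assumptions:

```shell
#!/bin/sh
# Sketch of the operation steps; paths and flags are illustrative.
ZIP_DIR=$(mktemp -d)   # stands in for /zip/ in this sketch
SECRET=example-passphrase

# Step 2: compress and encrypt /dump/ into /zip/ (echoed, not executed here)
echo 7z a -p"$SECRET" "$ZIP_DIR/backup_$(date +%F).7z" /dump/

# Step 3: prune zips older than 30 days
touch -d "45 days ago" "$ZIP_DIR/backup_old.7z"   # simulated stale backup
touch "$ZIP_DIR/backup_new.7z"                    # simulated fresh backup
find "$ZIP_DIR" -name '*.7z' -type f -mtime +30 -delete

# Step 4: sync the local archive to the remote bucket (echoed, not executed)
echo rclone --config rclone.conf sync "$ZIP_DIR" "remote:${BUCKET:-my-bucket}/${BUCKETPATH:-backups}"
```

Note that rclone sync makes the remote match the local directory, so anything pruned locally in step 3 is also removed from the bucket.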

This container requires the following secrets to be in /var/run/secrets:

  • encryption_key - Encryption key used by 7zip for encryption of compressed files
  • gcp_backup_creds - Google service credentials JSON secret for use with rclone (see rclone.conf file for service_account_file directive)

The following environment variables need to be passed into the runtime environment

  • BUCKET - The name of the bucket to be used as the destination for copying the backups
  • BUCKETPATH - Path within the bucket to deposit the zipped db files

The following volumes need to be mounted into the running container:

  • /dump/ - Directory that either contains existing DB dumps or will receive a new DB dump.
  • /zip/ - Directory for writing the compressed/encrypted DB dumps before copying to the S3 remote
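Putting the secrets, environment variables, and volumes together, an invocation could look like the following (the image name and host paths are placeholders):

```shell
docker run --rm \
  -e BUCKET=my-backup-bucket \
  -e BUCKETPATH=db/arangodb \
  -v /srv/backup/dump:/dump/ \
  -v /srv/backup/zip:/zip/ \
  -v /srv/backup/secrets:/var/run/secrets:ro \
  db_zip2cloud:latest
```

The secrets directory must contain the encryption_key and gcp_backup_creds files described above.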
