rotater.py is a simple script that solves the following problem:

Keeping daily data backups and removing old ones intelligently.

By default, it keeps one file per day for the last 14 days, one file per week for the next 12 weeks (after the 14 days), and one file per month for the next 36 months (after the 12 weeks and 14 days).

E.g. you have a ton of daily SQL db backups, and you don't want to store the very old ones, but you do want to keep some of them.
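The default schedule can be sketched in Python. This is a hypothetical illustration of the keep/delete decision, not the script's actual internals (all names here are invented):

```python
from datetime import date, timedelta

# Hypothetical sketch of the default retention schedule: backups are grouped
# into "slots", and only the newest file in each slot is kept.
def retention_slot(age_days):
    """Return a slot key for a backup of the given age, or None if too old."""
    if age_days < 14:                      # one slot per day for 14 days
        return ("day", age_days)
    if age_days < 14 + 12 * 7:             # one slot per week for the next 12 weeks
        return ("week", (age_days - 14) // 7)
    if age_days < 14 + 12 * 7 + 36 * 30:   # one slot per month for the next 36 months
        return ("month", (age_days - 14 - 12 * 7) // 30)
    return None                            # older than the schedule: delete

def files_to_keep(files, today):
    """files: {name: modification date}. Keep the newest file in each slot."""
    kept = {}
    for name, mtime in sorted(files.items(), key=lambda kv: kv[1], reverse=True):
        slot = retention_slot((today - mtime).days)
        if slot is not None and slot not in kept:
            kept[slot] = name
    return set(kept.values())
```

With this grouping, fourteen daily backups survive intact, while older files thin out to one per week and then one per month.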

It can guess the modification date from the filename (the default behaviour); this can be turned off with the "--no-date-from-filename" option.
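Guessing the date from a filename can look roughly like this; a hypothetical sketch assuming dates are embedded as YYYY-MM-DD (as in DBNAME_DUMP_2011-03-12.gz below), not the script's actual code:

```python
import re
from datetime import date, datetime

# Assumed date format: YYYY-MM-DD somewhere in the filename.
DATE_RE = re.compile(r"(\d{4})-(\d{2})-(\d{2})")

def date_from_filename(name):
    """Return the date embedded in the filename, or None if none is found."""
    m = DATE_RE.search(name)
    if not m:
        return None
    try:
        return datetime(*map(int, m.groups())).date()
    except ValueError:      # e.g. "2011-13-45" matches the regex but is invalid
        return None
```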


  • Show what the script would delete, if launched, without actually deleting anything:

    ./rotater.py --test s3://bucket/dir

  • Delete old daily backups from s3://bucket/dir

    ./rotater.py s3://bucket/dir

    You can provide Amazon credentials via arguments or environment variables; if you have a ~/.s3cfg file, it will be parsed for the AWS credentials.

    See --help and the source for more details on this.
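Reading credentials from ~/.s3cfg (the s3cmd config file, which stores access_key/secret_key under a [default] section) can be sketched as follows; an illustration, not the script's actual code:

```python
import configparser
import os

# Hypothetical sketch: pull AWS credentials out of an s3cmd-style ~/.s3cfg.
def s3cfg_credentials(path="~/.s3cfg"):
    """Return (access_key, secret_key) from an .s3cfg file, or None."""
    cfg = configparser.ConfigParser()
    if not cfg.read(os.path.expanduser(path)):
        return None                       # no config file found
    try:
        sect = cfg["default"]
        return sect["access_key"], sect["secret_key"]
    except KeyError:
        return None                       # file exists but lacks credentials
```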

  • Delete old daily backups for EVERY DIR in s3://bucket/dir recursively

    ./rotater.py s3://bucket/dir --recurse

  • Delete all files named /var/backups/mysql/DBNAME_DUMP*, e.g. DBNAME_DUMP_2011-03-12.gz, DBNAME_DUMP_2011-03-13.gz

    1. First, print which files are going to be kept:

      ./rotater.py --regex '/var/backups/mysql/DBNAME_DUMP.*' --test-keep

      The output shows which files will be kept; everything else will be deleted.

    2. Then, if everything looks OK, delete the rest:

      ./rotater.py --regex '/var/backups/mysql/DBNAME_DUMP.*'

      You can place it in a cron job easily.

  • Rotate the backups of every database in /var/backups/mysql/

    for pattern in $(ls /var/backups/mysql/ | sed -E 's/(.*)-(2010|2011).*/\1/' | uniq); do sudo /var/tmp/rotater.py --regex "/var/backups/mysql/$pattern.*"; done

    This parses the database name out of filenames like "/var/backups/mysql/wikidb_dump-2011-01-20.gz" and runs the script once for each resulting pattern.
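The grouping the shell loop performs can also be sketched in Python (a hypothetical helper, not part of rotater.py): strip the trailing date from each filename to recover the database name, then collect the files per database.

```python
import re
from collections import defaultdict

# Hypothetical sketch: group dump files by the database-name prefix that
# precedes a trailing YYYY-MM-DD date, e.g. "wikidb_dump-2011-01-20.gz".
def group_dumps(filenames):
    """Map each database-name prefix to its list of dump filenames."""
    groups = defaultdict(list)
    for name in filenames:
        m = re.match(r"(.+?)-\d{4}-\d{2}-\d{2}", name)
        if m:
            groups[m.group(1)].append(name)
    return dict(groups)
```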