Created by Matthew Pope
<?php

/*
WHAT IS THIS?

Automated backup script that pulls the database, compresses it, and syncs
that plus any user-uploaded files over to an S3 bucket. Designed for
Platform.sh.

SETUP:

- Ensure that monolog is available; if not, add it via composer.json
- Add the AWS PHP SDK (aws/aws-sdk-php) and the Platform.sh config reader
  (platformsh/config-reader) via composer.json
- Ensure that an AWS IAM user is created and has access to read/write the
  ftusa-site-backups S3 bucket
- Add the backups directory to .platform.app.yaml mounts:
      "/backups": "shared:files/backups"
- Add environment variables in Platform.sh:
  - env:AWS_ACCESS_KEY_ID
  - env:AWS_SECRET_ACCESS_KEY
  - env:LOGGLY_TOKEN (note: get from Loggly > Source Setup > Tokens)
  - env:FILES_TO_BACKUP (optional: only add if you have user-uploaded files
    to back up -- if added, use the full path, e.g. /app/storage/app/uploads)
- Deploy and test using: php ./jobs/db_backup.php
- Add a cron task to .platform.app.yaml:
      db_backup:
          spec: "0 0 * * *"
          cmd: "php ./jobs/db_backup.php"

Adapted by https://github.com/kaypro4 from an example by
https://github.com/JGrubb - Thanks John!
*/

use Monolog\Logger;
use Monolog\Handler\LogglyHandler;
use Monolog\Formatter\LogglyFormatter;

$home_dir = getenv('PLATFORM_DIR');

require_once $home_dir . '/vendor/autoload.php';

$bucket = 'ftusa-site-backups';
$fixedBranch = strtolower(preg_replace('/[\W\s\/]+/', '-', getenv('PLATFORM_BRANCH')));
$baseDirectory = 'platform/' . getenv('PLATFORM_APPLICATION_NAME') . '/' . $fixedBranch;
$branchAndProject = getenv('PLATFORM_APPLICATION_NAME') . ' > ' . $fixedBranch;

$logger = new Logger('backup_logger');
$logger->pushHandler(new LogglyHandler(getenv('LOGGLY_TOKEN') . '/tag/backup_logger', Logger::INFO));

$psh = new Platformsh\ConfigReader\Config();

if ($psh->isAvailable()) {

    // back up the db
    try {
        $sql_filename = date('Y-m-d_H:i:s') . '.gz';
        $backup_path = $home_dir . "/backups/";
        $database = $psh->relationships['database'][0];

        // pass the password via the environment so it is not exposed on the command line
        putenv("MYSQL_PWD={$database['password']}");

        exec("mysqldump --opt -h {$database['host']} -u {$database['username']} {$database['path']} | gzip > $backup_path$sql_filename");

        $s3 = new Aws\S3\S3Client([
            'version' => 'latest',
            'region' => 'us-west-1',
            'credentials' => [
                'key' => getenv('AWS_ACCESS_KEY_ID'),
                'secret' => getenv('AWS_SECRET_ACCESS_KEY')
            ]
        ]);

        $s3->putObject([
            'Bucket' => $bucket,
            'Key' => "$baseDirectory/database/$sql_filename",
            'Body' => fopen($backup_path . $sql_filename, 'r')
        ]);

        // remove local backup files that are older than 5 days
        $fileSystemIterator = new FilesystemIterator($backup_path);
        $now = time();
        foreach ($fileSystemIterator as $file) {
            if ($now - $file->getCTime() >= 60 * 60 * 24 * 5) {
                unlink($backup_path . $file->getFilename());
            }
        }

        $logger->addInfo("Successfully backed up database $sql_filename for $branchAndProject");
    } catch (Exception $e) {
        $logger->addError("Database backup error for $branchAndProject: " . $e->getMessage());
    }

    // back up any user-uploaded files using sync if the environment variable
    // exists for the environment
    if (getenv('FILES_TO_BACKUP') !== false) {
        try {
            $s3 = new Aws\S3\S3Client([
                'version' => 'latest',
                'region' => 'us-west-1',
                'credentials' => [
                    'key' => getenv('AWS_ACCESS_KEY_ID'),
                    'secret' => getenv('AWS_SECRET_ACCESS_KEY')
                ]
            ]);

            // sync the files from one directory
            $s3->uploadDirectory(getenv('FILES_TO_BACKUP'), "$bucket/$baseDirectory/files");

            $logger->addInfo("Successfully backed up files " . getenv('FILES_TO_BACKUP') . " for $branchAndProject");
        } catch (Exception $e) {
            $logger->addError("Files backup error for $branchAndProject: " . $e->getMessage());
        }
    }
}
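For reference, the two `.platform.app.yaml` additions mentioned in the setup notes (the backups mount and the cron task) could be combined like this. This is a sketch based on the paths given in the snippet; adjust the mount source and job path to your app layout:

```yaml
mounts:
    # writable directory where the script drops the gzipped dumps
    "/backups": "shared:files/backups"

crons:
    db_backup:
        # run the backup once a day at midnight
        spec: "0 0 * * *"
        cmd: "php ./jobs/db_backup.php"
```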
Comments (2)
Thanks for the script!
As a note, I think you also need to require the following with composer: platformsh/config-reader
As it is used in the script: $psh = new Platformsh\ConfigReader\Config();
Great catch Gareth, thanks! Updated the snippet to add it in setup.
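For anyone wiring this up, the three Composer dependencies mentioned in the setup notes could be declared together like this. The version constraints here are illustrative assumptions, not from the original snippet; pin whatever your project actually supports:

```json
{
    "require": {
        "aws/aws-sdk-php": "^3.0",
        "monolog/monolog": "^1.0",
        "platformsh/config-reader": "^2.0"
    }
}
```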