This is a small script I wrote to back up a Postgres database to S3 via cron. It runs nightly and keeps 30 days of backups. The README covers setup in more detail; the highlights are below.
.pgpass
A .pgpass file is required, since a password cannot be passed to this script on the command line. The format is listed below. The file lives in the home folder of the user running the script and must be readable only by that user (mode 0600). See the Postgres wiki for more information.
hostname:port:database:username:password
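For illustration, here is a minimal Ruby sketch that writes a .pgpass entry with the required permissions. The host, database, user, and password values are placeholders, and the standard libpq PGPASSFILE variable is used so the sketch does not touch a real ~/.pgpass.

```ruby
require "tmpdir"

# Placeholder credentials -- substitute your own values.
entry = "localhost:5432:myapp_production:postgres:s3cret\n"

# Write to a temp dir and point PGPASSFILE (honored by libpq tools such as
# pg_dump) at it, so this sketch does not clobber an existing ~/.pgpass.
path = File.join(Dir.mktmpdir, ".pgpass")
File.write(path, entry)
File.chmod(0600, path)  # libpq ignores the file if it is group/world readable
ENV["PGPASSFILE"] = path
```

With `--no-password` passed to pg_dump, the password is then taken from this file and never appears in the process list.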
Cron
This is the crontab entry I use. It sets PATH and RBENV_ROOT so rbenv's Ruby is picked up in cron's minimal environment.
0 0 * * * /bin/bash -c 'PATH=/opt/rbenv/shims:/opt/rbenv/bin:$PATH RBENV_ROOT=/opt/rbenv ruby /home/deploy/pg-to-s3/backup.rb'
backup.rb
#!/usr/bin/env ruby
require 'time'
require 'aws-sdk'
require 'fileutils'
require 'pathname'
# .pgpass file required, it is in the following format
# hostname:port:database:username:password
pg_user = ENV["POSTGRES_USERNAME"] || "postgres"
pg_host = ENV["POSTGRES_HOST"] || "localhost"
pg_port = ENV["POSTGRES_PORT"] || "5432"
pg_database = ENV["POSTGRES_DATABASE"]
bucket_name = ENV["BACKUP_BUCKET_NAME"]
project_name = ENV["PROJECT_NAME"]
# dump the database to a local file; the filename embeds an epoch timestamp
# that the expiry pass below relies on
time = Time.now.strftime("%Y-%m-%d")
filename = "backup.#{Time.now.to_i}.#{time}.sql.dump"
`pg_dump -Fc --username=#{pg_user} --no-password --host #{pg_host} --port #{pg_port} #{pg_database} > #{filename}`
# verify file exists and file size is > 0 bytes
unless File.exist?(filename) && File.size(filename) > 0
  raise "Database was not backed up"
end
s3 = AWS::S3.new # aws-sdk v1; credentials are read from the standard AWS_* environment variables
bucket = s3.buckets[bucket_name]
object = bucket.objects["#{project_name}/#{filename}"]
object.write(Pathname.new(filename), {
  :acl => :private,
})
# remove the local dump once the upload is confirmed
if object.exists?
  FileUtils.rm(filename)
end
# delete backups older than 30 days, scoped to this project's prefix so
# other objects in the bucket are left alone
DAYS_30 = 30 * 24 * 60 * 60
objects = bucket.objects.with_prefix("#{project_name}/").select do |object|
  # the epoch timestamp is the second dot-separated field of the filename
  time = Time.at(object.key.split("/").last.split(".")[1].to_i)
  time < Time.now - DAYS_30
end
objects.each do |object|
  object.delete
end
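The expiry pass recovers the upload time from the epoch embedded in the object key rather than from S3 metadata. A quick round-trip sketch of that naming scheme (the key below is hypothetical, built the same way backup.rb builds it):

```ruby
require "time"

epoch = Time.now.to_i
# Same scheme as backup.rb: <project>/backup.<epoch>.<YYYY-MM-DD>.sql.dump
key = "myproject/backup.#{epoch}.#{Time.at(epoch).strftime('%Y-%m-%d')}.sql.dump"

# The epoch is the second dot-separated field of the key's basename,
# which is what the select block in the script parses back out.
parsed = Time.at(key.split("/").last.split(".")[1].to_i)
parsed.to_i == epoch  # => true
```

Because the date portion uses dashes rather than dots, splitting on "." leaves the epoch cleanly in field 1; renaming the dumps to a dotted date format would silently break expiry.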