Froxlor Forum

[Poll] How are you doing your backups?


nicname

Question

Hey guys, after thinking about my backup routine I thought I'd ask how you are doing your backups.

The problem with mine is that it takes, imho, too much time.

Performance stats will follow.

 

So let's start:

MySQL

Basically, a MySQL query collects the list of all databases, and mysqldump then writes each database to its own file.

 

#!/bin/bash

BACKUPDIR="/var/backup/local/backups/mysql"
DBUSER="backup"
DBPASS="uvut5otu20"
DATESTAMP=$(date +%d%m%Y)

# wait while at least $1 (default 12) background jobs are still running
pwait() {
    if [ "$#" -eq 0 ]; then
        MAXPROC=12
    else
        MAXPROC=$1
    fi

    while [ "$(jobs -p | wc -l)" -ge "$MAXPROC" ]; do
        sleep 1
    done
}

# check and auto-repair all databases before dumping
mysqlcheck -u "$DBUSER" -p"$DBPASS" --check-only-changed --check-upgrade --medium-check --auto-repair --all-databases --silent

mkdir -p "${BACKUPDIR}/${DATESTAMP}"
cd "${BACKUPDIR}/${DATESTAMP}" || exit 1

DATABASES="$(mysql -u "$DBUSER" -p"$DBPASS" -h localhost -Bse 'show databases')"

# dump every database into its own bzip2-compressed file, 4 dumps in parallel
for db in $DATABASES
do
    mysqldump --routines --single-transaction --add-drop-table -u "$DBUSER" -p"$DBPASS" -h localhost "$db" \
        | bzip2 -c -9 > "${BACKUPDIR}/${DATESTAMP}/${db}.${DATESTAMP}-$(date +%T).bz2" &
    pwait 4
done
wait

Pros: it dumps 4 databases at the same time to save time.

Cons: it still takes too much time, imho, to back up my 55 databases totalling 844 MB.
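
I suspect most of the time goes into bzip2 -9 rather than into mysqldump itself, so one thing I want to try is pbzip2 (a parallel drop-in replacement for bzip2; this assumes it is installed) in the pipe, roughly like this:

mysqldump --routines --single-transaction --add-drop-table -u "$DBUSER" -p"$DBPASS" -h localhost "$db" \
    | pbzip2 -c -9 > "${BACKUPDIR}/${DATESTAMP}/${db}.${DATESTAMP}-$(date +%T).bz2" &

Lowering the compression level or switching to gzip would also trade some disk space for speed.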

 

 

Webspace:

Keep increments of every file for 14 days; this way I can undo mistakes I made a few days ago.

rdiff-backup --force --remove-older-than 14D /var/backup/local/backups/webs/
rdiff-backup --no-file-statistics --exclude-special-files  --no-compression /var/customers/webs /var/backup/local/backups/webs/
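
Restoring the state from a few days ago is just as simple, e.g. (the path under webs/ is only an example):

rdiff-backup -r 3D /var/backup/local/backups/webs/somecustomer/index.php /tmp/index.php.restored

-r / --restore-as-of takes the same time format as --remove-older-than.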

Quite simple:

Pro: it's easy, really easy, and since it keeps increments of the files it doesn't use much space (75 GB of backup for 70 GB of files).

Cons: rdiff-backup uses only one CPU core, and in this case the CPU is the bottleneck, so it's not fast enough for me.
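
One idea I am toying with to work around that: run one rdiff-backup process per customer directory and reuse the pwait function from the MySQL script above to keep a few of them running in parallel. A rough sketch (every customer then gets their own backup repository):

for web in /var/customers/webs/*/; do
    name=$(basename "$web")
    rdiff-backup --no-file-statistics --exclude-special-files --no-compression \
        "$web" "/var/backup/local/backups/webs/${name}/" &
    pwait 4
done
wait

I have not measured yet how much this actually helps, and the --remove-older-than run would then have to loop over the per-customer repositories as well.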

 

So this is the important part of my backup routine; the rest (users, home directories, /etc/, installed packages) is not really important here.

 

 

So how are you handling your backups?

Greetings,

Nicolas


9 answers to this question


Backup DBs:

 

For MySQL DBs I use a simple shell script, which also backs up every database to a compressed file.

Cronjob: 25 1 * * 7 /root/tools/backup_mysql.sh

 

A quick&dirty solution, but at the moment it's enough for me.
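
The core of it is nothing more than this (a trimmed-down sketch; user, password and target directory are placeholders):

#!/bin/bash
BACKUPDIR="/var/backups/mysql"
mkdir -p "$BACKUPDIR"
for db in $(mysql -u backupuser -p'secret' -Bse 'show databases'); do
    mysqldump --single-transaction -u backupuser -p'secret' "$db" \
        | gzip > "${BACKUPDIR}/${db}_$(date +%F).sql.gz"
done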

 

 

 

Web:

 

For files I'm using Tartarus, because it's very simple and does its job very well. If you have any issues or feature requests, you can easily contact the author via the Tartarus mailing list and other channels.

 

Example config (local backup):

/etc/tartarus/backup_local_root.conf

      MAXAGE="180" 

      DIRECTORY="/root"
      NAME="root"
      STORAGE_FILE_DIR="/var/backups/tartarusbackup/root"

      source /etc/tartarus/local/local_generic.inc

 

/etc/tartarus/local/local_generic.inc

       STORAGE_METHOD="FILE"
      COMPRESSION_METHOD="gzip"
STAY_IN_FILESYSTEM="yes"

      /usr/sbin/charon.local --maxage "$MAXAGE" --dir "$STORAGE_FILE_DIR" --profile "$NAME"

 

 

 

Example (backup to FTP server):

 

/etc/tartarus/ftp/backup_ftp_root.conf

       MAXAGE="60"

      DIRECTORY="/root"
      NAME="root"
STORAGE_FTP_SERVER="****.your-backup.de/root"

      source /etc/tartarus/ftp/ftp_generic.inc

 

/etc/tartarus/ftp/ftp_generic.inc


STORAGE_METHOD="FTP"
STORAGE_FTP_SERVER_HOST="*****.your-backup.de"
STORAGE_FTP_USER="*****"
STORAGE_FTP_PASSWORD="*********"
COMPRESSION_METHOD="gzip"
STAY_IN_FILESYSTEM="yes"



      TARTARUS_POST_PROCESS_HOOK() {
          echo -n "$STORAGE_FTP_PASSWORD" | /usr/sbin/charon.ftp \
          --host "$STORAGE_FTP_SERVER_HOST" \
          --user "$STORAGE_FTP_USER" --readpassword \
          --maxage "$MAXAGE" \
          --dir "$NAME" --profile "$NAME"
      }

 

 

Cronjobs:

 

 

## LOCAL BACKUPS ##

25 3 1 * * /usr/sbin/tartarus /etc/tartarus/local/backup_local_etc.conf
45 4 6 * * /usr/sbin/tartarus /etc/tartarus/local/backup_local_root.conf
45 3 15 * * /usr/sbin/tartarus /etc/tartarus/local/backup_local_usr.conf
30 4 15 * * /usr/sbin/tartarus /etc/tartarus/local/backup_local_var_customers_logs.conf
30 3 17 * * /usr/sbin/tartarus /etc/tartarus/local/backup_local_var_customers_mail.conf
30 4 17 * * /usr/sbin/tartarus /etc/tartarus/local/backup_local_var_customers_webs.conf
30 4 26 * * /usr/sbin/tartarus /etc/tartarus/local/backup_local_var_customers_webs.conf
30 3 20 * * /usr/sbin/tartarus /etc/tartarus/local/backup_local_var_log.conf
30 4 20 * * /usr/sbin/tartarus /etc/tartarus/local/backup_local_var_www.conf


## FTP BACKUPS ##

15 5 2 * * /usr/sbin/tartarus /etc/tartarus/ftp/backup_ftp_etc.conf
45 5 2 * * /usr/sbin/tartarus /etc/tartarus/ftp/backup_ftp_home.conf
05 5 7 * * /usr/sbin/tartarus /etc/tartarus/ftp/backup_ftp_root.conf
15 5 16 * * /usr/sbin/tartarus /etc/tartarus/ftp/backup_ftp_usr.conf
45 5 16 * * /usr/sbin/tartarus /etc/tartarus/ftp/backup_ftp_var_customers_logs.conf
05 5 18 * * /usr/sbin/tartarus /etc/tartarus/ftp/backup_ftp_var_customers_mail.conf
30 5 18 * * /usr/sbin/tartarus /etc/tartarus/ftp/backup_ftp_var_customers_webs.conf
15 5 21 * * /usr/sbin/tartarus /etc/tartarus/ftp/backup_ftp_var_log.conf
30 5 21 * * /usr/sbin/tartarus /etc/tartarus/ftp/backup_ftp_var_www.conf 

 

 

Yes, there are a lot of directories to back up :)

 

 

EDIT: Maybe mysqlhotcopy is a nice alternative to mysqldump?
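
Note that mysqlhotcopy only works for MyISAM (and ARCHIVE) tables and has to run on the database host itself, since it copies the table files directly. The call would look roughly like this (database name and target path are just examples):

mysqlhotcopy -u backupuser -p secret somedatabase /var/backups/mysql/hotcopy/

For InnoDB tables you still need mysqldump (or a file-level copy of a stopped server).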


@doop Any performance stats? Or recommendations about it?

 

Bacula is highly configurable (it's not a one-cronjob backup solution!), therefore I cannot give you any performance stats.

Recommendation: well, Bacula is not a 'starter' tool, and I don't know your administration skills, so I can't give any recommendations. Everyone likes something else... I just wanted to contribute a non-cronjob solution to all these scripts here.


I have a production server and a backup server in the same server farm. My customers are under /home/ (DirectAdmin/CentOS), and I have a 'backups' directory under each client (say /home/comp_a/backups).

 

I am using a "automysqlbackup.sh" script I found in the net which dumps/gzips daily, weekly and monthly databases and

they all end up in the backups directory. The script can be found here:

 

http://sourceforge.net/projects/automysqlbackup/

 

The backup server makes an rsync backup from the production one, using SSH key pairs for authentication. What I copy is everything under /home/comp_a/, excluding stats.

 

3 0 * * * rsync --exclude "stats" -avz --delete -e "ssh -p 55000 -i /home/comp_a/cron/thishost-rsync-key" comp_a@prodserver.com:/home/comp_a/* /home/comp_a/

 

This is the actual cron task on the backup server. The tedious part is creating the SSH keys and updating all files and permissions at both ends. If the first server goes down, all I need to do is update the DNS records at a third-party DNS service. I used to do DNS myself, but now I rely on external services.
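
For anyone setting this up, the key handling boils down to roughly the following (paths match the cron line above; run ssh-keygen on the backup server and append the public key on the production server):

# on the backup server: passphrase-less key pair for the cron job
ssh-keygen -t rsa -b 4096 -f /home/comp_a/cron/thishost-rsync-key -N ""

# on the production server: allow that key for the comp_a account
cat thishost-rsync-key.pub >> /home/comp_a/.ssh/authorized_keys
chmod 600 /home/comp_a/.ssh/authorized_keys

You can additionally prefix the authorized_keys entry with a forced command to restrict the key to rsync only.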

 

Additionally, I have a low-end secondary backup server which gets an incremental rsync backup of the backup server's /home/ every now and then.


Archived

This topic is now archived and is closed to further replies.


