Working with archives
From Oxxus Wiki
On Unix/Linux servers, the most commonly used archiving tools are tar and gzip.
You can use the following example to make an archive with tar:
tar cf home.tar /home
This command would pack the /home directory into the home.tar archive and create it in your current working directory.
To extract it, you need to type:
tar xvf home.tar
and it will extract the archive contents into your current working directory.
Switches are:
- x - extract
- v - verbose (list each file as it is processed)
- f - use the archive file named next on the command line
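The steps above can be tried end to end on a throwaway directory; /tmp/tar-demo is just an example path:

```shell
# Create a small directory tree to archive (example path, adjust freely).
mkdir -p /tmp/tar-demo/data
echo "hello" > /tmp/tar-demo/data/file.txt
cd /tmp/tar-demo

tar cf data.tar data    # c = create, f = write to the file data.tar
rm -rf data             # remove the original to prove extraction works
tar xvf data.tar        # x = extract, v = list files as they unpack
cat data/file.txt       # the original content is back
```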
You can also use the tar command to make archives that are first tarred and then gzipped simply by adding the z switch, which usually shrinks the archive significantly. For example:
tar czf home.tar.gz /home
would pack the /home folder into the home.tar.gz archive, and
tar zxvf home.tar.gz
would extract the archive.
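To check what a gzipped archive contains without extracting it, tar also accepts a t (list) switch alongside z. This switch is not covered above, so here is a quick sketch using an example path (/tmp/gz-demo):

```shell
# Build a small gzipped archive, then list its contents.
mkdir -p /tmp/gz-demo/docs
echo "some text" > /tmp/gz-demo/docs/readme.txt
cd /tmp/gz-demo

tar czf docs.tar.gz docs    # z = gzip-compress while creating
tar tzf docs.tar.gz         # t = list contents without extracting
```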
By combining tar with a shell script and a crontab entry that runs it at a given time, you can build a simple daily or weekly backup system. There are many online tutorials on this topic; here's an example of how to set one up.
First create a backup folder:
mkdir /backup
Then create the file /usr/sbin/backup.sh containing a script like the following:
#!/bin/sh
backuphome="/backup"
backupdir="/home"
d=`date "+%m-%d-%y"`
cd $backuphome
tar czf backup-$d.tar.gz $backupdir
Save the file, then run the following command to make it executable:
chmod +x /usr/sbin/backup.sh
Then set a crontab entry to run it, for example, once a day. The following command would take care of this:
echo "0 0 * * * root /usr/sbin/backup.sh 2>/dev/null >/dev/null" >>/etc/crontab
And that's all. Every day you will have a fresh backup of the home folder.
Notes
Please note that these backups can eventually fill up the disk and make your server unusable. To avoid this, delete old backups regularly, or automatically upload them to a remote FTP server (if you have one) and delete them from the local disk right away.
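One way to prune old backups automatically is with find. This is a sketch; the seven-day window is an assumption to adjust, and it relies on the /backup folder and backup-*.tar.gz naming used in the script above. It could be appended to backup.sh or given its own cron entry:

```shell
# Delete backup archives in /backup that are older than 7 days.
# -mtime +7 matches files last modified more than 7 days ago.
find /backup -name 'backup-*.tar.gz' -mtime +7 -delete
```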