As with most things in computing, there are multiple ways to do this. I’m going to show you how to back up an entire website with the tar command, available in every distribution of Linux. It’s fast, simple and effective. I’ll also show you how to extract it. Again, it’s fast, simple and effective.
The only problem you’ll have is where to store the backup file. I download mine by FTP and I don’t do it often. Because my website stores its data in a database, the only thing I need regularly is a database dump file.
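For the routine piece, the database dump, a single command is usually enough. This is a hedged sketch assuming a MySQL/MariaDB backend with credentials stored in ~/.my.cnf; the database name and output file are placeholders, not my real setup:

```shell
# Assumed setup: MySQL/MariaDB with login credentials in ~/.my.cnf.
# --single-transaction takes a consistent snapshot of InnoDB tables
# without locking the site; the dump is piped straight into gzip.
mysqldump --single-transaction my_site_db | gzip > my_site_db.sql.gz
```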
This is the script I use to back up a single website. Only the information within the brackets needs to be changed (along with the brackets themselves):
<?php chdir('[parent-directory]'); shell_exec('tar -czpvf [child-directory].tar.gz [child-directory]'); ?>
I could have used a shell script, but I like using what I’m familiar with and I’m familiar with PHP. The only reason you need any script at all is that tar records the paths you give it: hand it an absolute path and the whole directory chain ends up inside the archive (GNU tar strips the leading slash, but the nesting remains). The safest way is to move one directory above the one you want to archive and then run the command with a relative path.
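Here is the same backup step as a runnable shell sketch; the /tmp paths and directory names are examples standing in for the real parent and child directories:

```shell
# Example layout: a "site" directory under a parent directory.
mkdir -p /tmp/backup-demo/site
echo 'hello' > /tmp/backup-demo/site/index.html

cd /tmp/backup-demo              # move to the parent directory first
tar -czpvf site.tar.gz site      # archive the child directory by relative path
tar -tzf site.tar.gz             # list contents: every entry starts with "site/"
```

Because the archive was created from the parent directory, extracting it later recreates a clean site/ directory wherever you run it, with no absolute paths baked in.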
The same thing applies with TAR for restoring a website, so the script will be similar:
<?php chdir('[parent-directory]'); shell_exec('tar -xzpvf [child-directory].tar.gz'); ?>
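The restore step looks like this in plain shell. This sketch is self-contained, so it builds a throwaway archive first just to have something to extract; the paths are examples:

```shell
# Build a small archive to restore from (placeholder paths).
mkdir -p /tmp/restore-demo/site
echo 'hello' > /tmp/restore-demo/site/index.html
cd /tmp/restore-demo
tar -czpf site.tar.gz site

rm -rf site                      # simulate the site being gone
tar -xzpvf site.tar.gz           # recreates site/ under the current directory
```

The -p flag matters here: on extraction it preserves the file permissions stored in the archive, which is what you want when restoring a live site.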
I keep these scripts in my root directory and I run them from the command line in an SSH terminal session. I rarely do it and it’s usually when I’m getting ready to change a bunch of stuff. It’s even rarer when I take the time to download it.
The archive can get quite large. The last time I created one, it was over 300 megabytes compressed with gzip (bzip2 wouldn’t shrink it dramatically either). It’s not something I want to create or download automatically.
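If you want to compare the two formats yourself, only the compression flag changes: -z selects gzip and -j selects bzip2. A quick sketch with placeholder paths (actual savings depend entirely on what you’re archiving):

```shell
# Create some sample data, then archive it both ways and compare sizes.
mkdir -p /tmp/compress-demo/site
head -c 100000 /dev/urandom > /tmp/compress-demo/site/data.bin
cd /tmp/compress-demo
tar -czf site.tar.gz  site       # gzip
tar -cjf site.tar.bz2 site       # bzip2
ls -l site.tar.gz site.tar.bz2   # compare the two sizes
```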