
Ultimate vBulletin Backup Guide || With or Without ROOT ACCESS, for Backups Over 5GB
by intensecool 05 Jan 2011

Hey Guys,

I own a vBulletin forum, and it's a fairly big one at this point.
Here are the stats as I write this guide:
Threads :: 979,845
Posts :: 1,159,042
Members :: 181,012

All of us know that backing up a big board, or even a medium one, is literally a big pain.
All of us need a system that backs the forum up to a safe place, AUTOMATICALLY of course.


I searched a lot and tried everything possible to back things up.
But I still faced many issues like these:
  1. Incomplete or faulty backups because the database is continuously in use by traffic.
  2. The website already loads the servers, and dumping MySQL on top of that makes them so heavy that the sites stay inaccessible for a long time before things return to normal.
  3. If you do this manually while the website is getting hits, you often end up with a backup that won't restore properly.
Yes, it happened to me so many times that I was fed up with it and finally decided to write a short working script for myself. Now that it is confirmed working, I am sharing it here with you guys.

Considering the comments, the script now works:

1. With SSH LOGIN to the backup servers.
2. Without SSH login to the backup servers.
3. Via a CRON setup; scroll down for instructions.


Mission :

We will back up the database locally and then copy it up to the REMOTE BACKUP server.

With SSH LOGIN :

Requirements :

1. Dedicated Servers.
2. Root access of course.
3. A BackUp Server with root access.


Process :

As everything is to be AUTOMATED, we need to set up PASSWORDLESS logins between the SOURCE SERVER & the DEST (backup) SERVER.

How to do that :

Let's say you want to copy between two hosts, host_src and host_dest. host_src is the host where you would run the scp, ssh or rsync command, irrespective of the direction of the file copy!

1. On host_src, run this command as the user that will run scp/ssh/rsync:

$ ssh-keygen -t rsa

This will prompt for a passphrase. Just press the enter key. It'll then generate an identification (private key) and a public key. Do not ever share the private key with anyone! ssh-keygen shows where it saved the public key. This is by default ~/.ssh/id_rsa.pub:

Your public key has been saved in <your_home_dir>/.ssh/id_rsa.pub

2. Transfer the id_rsa.pub file to host_dest by either ftp, scp, rsync or any other method.
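
For example, one quick way to push it over is scp itself (backupuser here is just a placeholder for whatever remote user you plan to use):

$ scp ~/.ssh/id_rsa.pub backupuser@host_dest:~/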

3. On host_dest, login as the remote user which you plan to use when you run scp, ssh or rsync on host_src.

4. Copy the contents of id_rsa.pub to ~/.ssh/authorized_keys

$ cat id_rsa.pub >>~/.ssh/authorized_keys
$ chmod 700 ~/.ssh/authorized_keys


If this file does not exist, the above command will create it. Make sure you remove write permission on this file (and on ~/.ssh) for group and others. Even though it only holds public keys, you do not want other local users to be able to append their own keys to it, and sshd itself may refuse to honour the file if its permissions are too loose.

5. Note that ssh is often configured not to allow root to log in. If so, this has to be explicitly enabled on host_dest by editing /etc/ssh/sshd_config and changing the PermitRootLogin option from no to yes. Don't forget to restart sshd so that it reads the modified config file. Do this only if you want to use the root login.
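
In practice the change looks like this on host_dest (assuming a typical RHEL/CentOS style box; adjust the restart command to your distro):

In /etc/ssh/sshd_config:
PermitRootLogin yes

Then restart sshd:
$ service sshd restart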

Well, that's it. Now you can run scp, ssh and rsync on host_src connecting to host_dest and it won't prompt for the password. Note that it will still prompt for the password if you run the commands on host_dest connecting to host_src. You can reverse the steps above (generate the public key on host_dest and copy it to host_src) and you have a two-way setup ready!
I prefer to get a TWO-WAY SETUP done because sometimes you may need it.
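
A quick sanity check (backupuser is again just a placeholder for your remote user): the following should print the message without ever asking for a password.

$ ssh backupuser@host_dest 'echo "passwordless login works"'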

Now, we move to Backups :

All of us know about the mysqldump command, so here is the one you can use directly:

mysqldump --user Your_DB_User_Here --password=Your_DB_PASS_here Your_DB_NAME_Here > /path/to/your/backup/directory/backup_YourSiteName__`date '+%m-%d-%Y'`.sql
The file name pattern backup_YourSiteName__`date '+%m-%d-%Y'`.sql gives you a backup SQL file tagged with your website name and the date of the backup.

So, this command backs your database up into the backup directory, with your website name & the date in the file name; of course you can change the naming however you want.
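
As a quick illustration, with a (made-up) database user forum_user, password secret, database forum_db and a site called MySite, backed up into /home/backup, the command becomes the line below and, run on 5 Jan 2011, produces /home/backup/backup_MySite__01-05-2011.sql:

mysqldump --user forum_user --password=secret forum_db > /home/backup/backup_MySite__`date '+%m-%d-%Y'`.sql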

Now, we need to take the backups in a way that :
  1. It does not OVERLOAD the servers.
  2. Nothing touches the database while the dump runs, otherwise the backup is ruined.
To solve this, we can close the forums and put up SITE UNDER MAINTENANCE pages, but all of that failed for me. I did all this and it gave me backups, but it still loads the servers, because search-engine traffic keeps on visiting the forums and you cannot stop it at all.

So, I finally decided to simply make the FORUMS UNREACHABLE while the backup runs; it's only a matter of minutes to back everything up.
Now, to automate this, I wrote a small script:

cd /path/to/your/backup/directory/
cp -r /home/backup/* /home/backup_backup/
service httpd stop
rm ./*
mysqldump --user Your_DB_User_Here --password=Your_DB_PASS_here Your_DB_NAME_Here > /path/to/your/backup/directory/backup_YourSiteName__`date '+%m-%d-%Y'`.sql
service httpd start
scp *.sql your_backup_user@your_backup_server:/backup/
Now, what are these steps doing?

1. cd /path/to/your/backup/directory/
Goes to your backup directory.

2. cp -r /home/backup/* /home/backup_backup/
Copies the previous backup to ANOTHER BACKUP DIRECTORY. For my forums I prefer this; you can leave it out.

3. service httpd stop
Stops the httpd service, leaving the sites inaccessible and REDUCING THE SERVER LOAD TO A BARE MINIMUM.

4. rm ./*
mysqldump --user Your_DB_User_Here --password=Your_DB_PASS_here Your_DB_NAME_Here > /path/to/your/backup/directory/backup_YourSiteName__`date '+%m-%d-%Y'`.sql


Removes the old backup in that directory, takes a new backup, and stores it as a file tagged with the DATE.
I did not take a GZIP backup, as I have sometimes got CORRUPT BACKUPS with it; plain SQL files have always worked for me, but it's your choice (see the sketch just below if you do want compression).
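
If you would rather have a compressed dump, piping mysqldump straight into gzip is the usual way (same placeholders as above):

mysqldump --user Your_DB_User_Here --password=Your_DB_PASS_here Your_DB_NAME_Here | gzip > /path/to/your/backup/directory/backup_YourSiteName__`date '+%m-%d-%Y'`.sql.gz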

5. service httpd start
Starts the server httpd again, making the sites accessible.

6. scp *.sql your_backup_user@your_backup_server:/backup/

Since we did the PASSWORDLESS LOGIN SETUP above, this command copies your backup securely to the backup server via scp without prompting for a password, so no worries.
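
Put together, the whole thing fits in one small shell script that cron can call. A minimal sketch, using the same placeholder paths and credentials as above and saved as, say, /root/backup.sh (the cd guard and the *.sql pattern on rm are small safety tweaks of mine):

#!/bin/sh
# Keep a copy of the previous backup, then take a fresh dump with httpd stopped.
cd /path/to/your/backup/directory/ || exit 1
cp -r /home/backup/* /home/backup_backup/
service httpd stop
rm -f ./*.sql
mysqldump --user Your_DB_User_Here --password=Your_DB_PASS_here Your_DB_NAME_Here > backup_YourSiteName__`date '+%m-%d-%Y'`.sql
service httpd start
# Ship the dump to the remote backup server over the passwordless scp set up earlier.
scp *.sql your_backup_user@your_backup_server:/backup/

Make it executable with chmod 755 /root/backup.sh and point the cron line from the next section at it.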

Advantages :

1. The backups are always good & worked fine for me. I never had a faulty backup or corrupt sql file with this method.

2. The server load NEVER SHOOTS SKY HIGH, as only mysqldump is running while the sites are closed, before the servers are opened again.

3. Yes, the site becomes inaccessible for a couple of minutes, in my case 1 minute 20 seconds at present. But that's better than showing TIMED OUT errors for 10 minutes, during which the server load is so high that the site lags badly.

4. Backups are safe in local storage as well as the REMOTE STORAGE.

WITHOUT SSH Login:


How Do I Set Up a Cron Job to Back Up Data Automatically?


Just add a cron job as per your requirements; for example, this runs the backup script every day at 00:13:
13 0 * * * /home/admin/bin/ftpbackup.sh >/dev/null 2>&1

Easy to remember format:
* * * * * command to be executed
- - - - -
| | | | |
| | | | ----- Day of week (0 - 7) (Sunday=0 or 7)
| | | ------- Month (1 - 12)
| | --------- Day of month (1 - 31)
| ----------- Hour (0 - 23)
------------- Minute (0 - 59)

By default, any output produced by the command or script will be emailed to your local mail account. To stop receiving email output from cron, append >/dev/null 2>&1.
For example:
0 3 * * * /root/backup.sh >/dev/null 2>&1

To mail the output to a particular email address instead, define the MAILTO variable in your crontab (the address below is just a placeholder):

MAILTO="you@example.com"
0 3 * * * /root/backup.sh >/dev/null 2>&1
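
To actually install such an entry, edit the current user's crontab and add the line; a quick sketch (the script path is the example one from above):

$ crontab -e    # opens your crontab in an editor; add the line below, then save and exit
0 3 * * * /root/backup.sh >/dev/null 2>&1
$ crontab -l    # confirm the entry was saved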

How can I use an FTP client for this?


If you want to use an FTP client for the backup and do not have ROOT access to set up scp, you can use this instead:

1. Install the NCFTP Client.

Use the yum or apt-get package manager to install the FTP client called ncftp.

2. Replace the scp command with ncftp:

Code:
ncftp -u"your_ftp_user" -p"your_ftp_password" your_FTP_server
cd /path/to/your/backup/directory
mput *
quit
This logs in with the ncftp client, changes to the backup directory, uses the mput command to upload the backups to the FTP server, and quits the session on completion.
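
Since that session is interactive, for a cron-driven script it is usually easier to use ncftpput, which ships with the ncftp package and does the upload in one non-interactive command. A sketch with the same placeholder credentials (remote directory assumed to be /backup/):

Code:
ncftpput -u "your_ftp_user" -p "your_ftp_password" your_FTP_server /backup/ /path/to/your/backup/directory/*.sql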


And about GRACEFULLY CLOSING THE Vbulletin :


If your VB gets only medium traffic, it can be tried, but I would still prefer that you SHUT DOWN httpd for just a few minutes.

I know it makes the website inaccessible, but it ENSURES THAT backups are complete and can be SAFELY restored without any errors. As you know, when the dump starts, the database is otherwise still in use for one query or another; this slows the dump down and may lead to corrupt backups.

The script worked for me, so I am just sharing it with you.
Of course, you can ask my help for anything above and I will gladly help you. But I WON'T BE RESPONSIBLE FOR ANY DATA LOSS caused by applying any of the above.

I hope you will use the script, and if you do, I WOULD APPRECIATE A WORD OF "THANKS" in this post.

I also provide professional vBulletin HOSTING & solutions, and now backups too.
You can reach me at : email

Enjoy with VB.
Regards
