If you are running Windows, you'd set up a Scheduled Task, and if you are running Linux, you'd set up a cron job:
Windows:
http://windows.microsoft.com/en-US/w...chedule-a-task
Linux:
http://www.cyberciti.biz/faq/how-do-...-or-unix-oses/
https://help.ubuntu.com/community/CronHowto
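By way of a rough sketch, on Windows you can also create the task from the command line with schtasks (the 2 AM time and the .bat path below are just placeholders for whatever backup script you end up with); on Linux you just run crontab -e and add a line (actual example entries are further down):
Code:
schtasks /create /tn "EQEmu DB Backup" /tr "C:\eqemu\db_backup.bat" /sc daily /st 02:00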
Normally, for safety's sake, I suggest using an entirely separate physical drive / mount point / volume for the backups. It's a bad idea to keep the backups in the same location as the actual data. But sometimes there is no alternative.
Firstly, get into MySQL via Navicat, MySQLcc, Toad for MySQL, or the command-line client (called mysql). Enter this command:
Code:
show variables like "log_bin";
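In the mysql command-line client the output will look roughly like this:
Code:
+---------------+-------+
| Variable_name | Value |
+---------------+-------+
| log_bin       | OFF   |
+---------------+-------+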
If it says "
OFF" then go to your mysql configuration file (my.cnf, just my, or my.ini...) and add a line to the
[mysqld] block (or uncomment the one there already) :
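Something like this will do (the mysql-bin base name is just a common choice; the binary log files will be named with whatever you put here):
Code:
log_bin = mysql-bin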
That enables binary logging.
Then restart the database service: net stop mysql followed by net start mysql on Windows, or service mysql restart on Linux.
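As a quick sketch (on Windows the service might be registered as MySQL, MySQL5, etc., so check your services list first):
Code:
rem Windows
net stop mysql
net start mysql

# Linux
service mysql restart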
For the nightly MAIN backup CRON/scheduled task, run a command like:
Code:
mysqldump -h192.168.0.12 -u root --password=xxxx --single-transaction --all-databases --extended-insert > new_backup.sql
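If you go the cron route, that ends up as a single crontab line, something like this (the 2 AM time and the output path are just placeholders, point it at your backup drive):
Code:
0 2 * * * /usr/bin/mysqldump -h192.168.0.12 -u root --password=xxxx --single-transaction --all-databases --extended-insert > /home/eqemu/db_backups/new_backup.sql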
For the nightly easy backups, try either splitting that above new_backup.sql with this (suggested):
http://kedar.nitty-witty.com/blog/my...p-shell-script
OR
Code:
mysqldump -h192.168.0.12 -u root --password=xxxx --skip-extended-insert peqdb --tables account > BKP_account.sql
(one such command for each table).
This can be a script created manually, but each time you add or delete a table in peqdb you have to modify it, OR you can use a Perl script like:
Code:
#!/usr/bin/perl -w
#
# Dump every database that mysqlshow lists into its own .sql file.
my $backupdir = "/home/eqemu/db_backups/";

my $this_day = `date`;
open (LOGFILE, '>>/home/sql_backup/backup.log');
print LOGFILE "NEW BACKUP - ${this_day}\n";

# mysqlshow with no database argument prints the list of databases
open (DATABASELIST, "/usr/bin/mysqlshow -u root -pTOPSECRETPW_HERE |");

while (<DATABASELIST>) {
    chomp();
    # each line looks like "| dbname |", so the middle field is the name
    my ($dash1, $table, $dash2) = split(" ", $_);
    my $timestamp = `date`;
    if (defined($table) && ($table ne "Databases")) {
        print LOGFILE "${timestamp} : backing up ${table}\n";
        `/usr/bin/mysqldump -u root -pTOPSECRETPW_HERE --databases $table > ${backupdir}bkup_${table}.sql`;
    }
}
close (DATABASELIST);
close (LOGFILE);
Here is a caveat: either command will "lock" the tables for as long as the data is being "read" by the mysqldump program (not very long). But that second method might not be transaction-safe -- so if someone changes account info while the dump is already up to player_corpses, that change won't be in there (with the second method).
That second method is JUST to make your retrieval easier.
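For example, putting just the account table back from one of those per-table dumps is a one-liner (same host/user/password you used for the dump):
Code:
mysql -h192.168.0.12 -u root --password=xxxx peqdb < BKP_account.sql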
Ok, that's easy.
Now, make a new task/cron job to run every hour: just use a script where you copy the new bin-logs to that backup directory, like:
cp -fu /var/lib/mysql/binlog* /home/eqemu/db_backups/
(the -f forces overwriting and the -u copies only NEWER files).
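The hourly crontab entry for that can be as simple as this (assuming your bin-logs really are named binlog* -- have a look in /var/lib/mysql to check the prefix):
Code:
0 * * * * cp -fu /var/lib/mysql/binlog* /home/eqemu/db_backups/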
Does that help?