
Support::Windows Servers Support forum for Windows EQEMu users.

  #1  
Old 03-13-2012, 12:16 AM
chrsschb
Dragon
 
Join Date: Nov 2008
Location: GA
Posts: 904
Question about databases & restoring backups

When restoring a backup, how do you select specific rows out of a table?

For instance, I want to restore the corpses of someone whose corpses rotted. How can I restore only their items from the inventory table without overwriting everyone else's?

Using Navicat 8 for what it's worth.
  #2  
Old 03-13-2012, 09:24 AM
ProSkeptic
Fire Beetle
 
Join Date: Feb 2012
Location: New York and Charlotte
Posts: 17

I don't believe that Navicat has that ability.

Assuming you have a mysqldump (text-based backup) of the old data, grep it, case-insensitively, for INSERT lines that mention either INVENTORY or PLAYER_CORPSES and the char_id of that specific player.
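For example, the filtering could look like this (a sketch: `backup.sql`, charid `123`, and the column layout are placeholders; the toy dump below stands in for your real backup file):

```shell
# Toy stand-in for a one-INSERT-per-row dump; in practice you would
# run the greps against your real backup.sql instead.
cat > backup.sql <<'EOF'
INSERT INTO `inventory` VALUES ('123','22','1001','1','0');
INSERT INTO `inventory` VALUES ('456','22','1002','1','0');
INSERT INTO `player_corpses` VALUES ('77','123','Bob','0');
EOF

# Keep only INSERTs, only for the two tables of interest,
# and only the rows mentioning charid 123.
grep -i "INSERT INTO" backup.sql \
  | grep -iE "inventory|player_corpses" \
  | grep "'123'" > char_123_rows.sql

cat char_123_rows.sql
```

With the toy data above this keeps the first and third lines and drops the row belonging to charid 456.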

If you don't have a text backup but do have binary logging enabled, the old data could still be in the binlogs. Check the date-times of the binlogs, run (for instance) mysqlbinlog binlog.000001 > new-backup.sql, and see if the data is still in there; you may be able to grep that too. Of course, that only covers the inserts/deletes from the time you last purged the logs up to the expiry date (if you have binlogs at all).

Then manually delete the current rows from those tables for that char-id and re-insert those GREPped versions.
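A sketch of that delete-and-reinsert step, assembled into one file (charid `123`, the file names, and the `charid` column name are assumptions; check your actual schema, and the printf here just stands in for the lines you grepped out):

```shell
CHARID=123
# Stand-in for the INSERT lines grepped out of the dump:
printf "INSERT INTO \`inventory\` VALUES ('123','22','1001','1');\n" > char_${CHARID}_rows.sql

# Build one restore script: clear the character's current rows first,
# then replay the grepped INSERTs, so nobody else's rows are touched.
{
  printf 'DELETE FROM inventory WHERE charid = %s;\n' "$CHARID"
  printf 'DELETE FROM player_corpses WHERE charid = %s;\n' "$CHARID"
  cat char_${CHARID}_rows.sql
} > restore_${CHARID}.sql

# Then apply it with something like:
#   mysql -u root -p peqdb < restore_123.sql
```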

I have triggers on player_corpses and on inventory (on update, insert, and delete) that insert into auditing tables (called aud_player_corpses and aud_inventory) with just char_id, slot_id, item_id, charges, and _when (a date-time stamp). Anything older than 90 days is deleted from the audit tables, but I can go back to any point in time with the audit tables.
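One such audit trigger might look roughly like this (an illustrative sketch only: the table and column names are guesses, not the actual schema, so match them to yours before using it):

```sql
-- Hypothetical audit table and delete-trigger for inventory.
CREATE TABLE IF NOT EXISTS aud_inventory (
  charid  INT,
  slotid  INT,
  itemid  INT,
  charges INT,
  _when   DATETIME
);

DELIMITER $$
CREATE TRIGGER trg_inventory_del
AFTER DELETE ON inventory
FOR EACH ROW
BEGIN
  -- OLD.* holds the row as it was before the delete.
  INSERT INTO aud_inventory (charid, slotid, itemid, charges, _when)
  VALUES (OLD.charid, OLD.slotid, OLD.itemid, OLD.charges, NOW());
END$$
DELIMITER ;
```

Similar AFTER UPDATE and AFTER INSERT triggers would complete the trail.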
  #3  
Old 03-13-2012, 09:40 AM
chrsschb
Dragon
 
Join Date: Nov 2008
Location: GA
Posts: 904

Doesn't sound like that is much different than what I did.

I have SQL backups; I dumped the inventory table to text.

I went through the text file searching for ('xx',' (since the charid is the first value), copied all those lines to a separate text file, and when I was finished inserted them back into the inventory table.

That worked, but it's hella time-consuming copying and pasting every item, plus taking the time to delete duplicates.
  #4  
Old 03-13-2012, 12:41 PM
ProSkeptic
Fire Beetle
 
Join Date: Feb 2012
Location: New York and Charlotte
Posts: 17
Save some work...

You have several alternatives that can save you a great deal of effort.

First, arrange a backup process at two levels:

Set up a cron task to do a bulk dump (mysqldump --all-databases --extended-insert) each night. Also, every hour, back up the bin-logs.

Also, every night, do a mysqldump for each table, separately into its own file, with --skip-extended-insert (so there is one line in the file per row in the table). This lets you get data back far more quickly.
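Put together, the schedule might look something like this in a crontab (a sketch only: the paths, times, and the per-table script name are placeholders, and on Windows you'd use Scheduled Tasks instead):

```
# Nightly full dump at 03:00 (note: % must be escaped as \% in crontab)
0 3 * * * mysqldump --all-databases --extended-insert > /home/eqemu/db_backups/full_$(date +\%F).sql
# Nightly per-table dumps at 03:30 (a hypothetical script doing the --skip-extended-insert dumps)
30 3 * * * /home/eqemu/bin/per_table_backup.sh
# Hourly copy of the binary logs
0 * * * * cp -fu /var/lib/mysql/binlog* /home/eqemu/db_backups/
```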

Lastly, as I mentioned above -- consider triggers for auditing, so you can keep some backups and trails inside the database.
  #5  
Old 03-13-2012, 10:36 PM
chrsschb
Dragon
 
Join Date: Nov 2008
Location: GA
Posts: 904

Can you instruct me, or point me in the direction, where I can learn how to enable these different methods?
  #6  
Old 03-14-2012, 12:58 AM
ProSkeptic
Fire Beetle
 
Join Date: Feb 2012
Location: New York and Charlotte
Posts: 17
General

If you are running Windows, you'd set up a scheduled task; if you are running Linux, you'd set up a cron job:

Windows : http://windows.microsoft.com/en-US/w...chedule-a-task
Linux : http://www.cyberciti.biz/faq/how-do-...-or-unix-oses/
https://help.ubuntu.com/community/CronHowto

Normally, for safety's sake, I suggest using an entirely separate physical drive / mount point / volume for the backups. It's a bad idea to keep the backups in the same location as the actual data. But sometimes there is no alternative.

First, get into MySQL via Navicat, MySQLcc, Toad for MySQL, or the command-line client (called mysql). Enter this command:
Code:
show variables like "log_bin";
If it says "OFF", go to your MySQL configuration file (my.cnf or my.ini, depending on platform) and add a line to the [mysqld] block (or uncomment the one already there):
Code:
log_bin
That enables binary logging.
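The fragment in the config file could look like this (illustrative; the expire_logs_days value is just a suggestion, but without it the binlogs grow forever):

```
[mysqld]
log_bin
# optional: automatically purge binlogs older than N days
expire_logs_days = 10
```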

Then restart the database service: net stop mysql followed by net start mysql on Windows, or service mysql restart on Linux.

For the nightly MAIN backup CRON/scheduled task run a command like :
Code:
mysqldump -h192.168.0.12 -u root --password=xxxx --single-transaction --all-databases --extended-insert > new_backup.sql
For the nightly easy backups, try either splitting the new_backup.sql above with a script like (suggested):
http://kedar.nitty-witty.com/blog/my...p-shell-script
OR

Code:
mysqldump -h192.168.0.12 -u root --password=xxxx --skip-extended-insert peqdb account > BKP_account.sql
(one such command for each table).

It can be a script created manually, but each time you add or delete a table in peqdb you have to modify it. Or use a Perl script like this one (it makes one dump file per database rather than per table):

Code:
#!/usr/bin/perl -w
use strict;

my $backupdir = "/home/eqemu/db_backups/";

my $this_day = `date`;
open(LOGFILE, '>>/home/sql_backup/backup.log') or die "Cannot open log: $!";
print LOGFILE "NEW BACKUP - ${this_day}\n";

# mysqlshow with no database argument lists every database on the server,
# so this loop produces one dump file per database.
open(DATABASELIST, "/usr/bin/mysqlshow -u root -pTOPSECRETPW_HERE |") or die "Cannot run mysqlshow: $!";

while (<DATABASELIST>) {
        chomp();
        # Each line looks like "| dbname |"; grab the middle field.
        my ($dash1, $db, $dash2) = split(" ", $_);
        my $timestamp = `date`;
        if (defined($db) && ($db ne "Databases")) {
                print LOGFILE "${timestamp} : backing up ${db}\n";
                `/usr/bin/mysqldump -u root -pTOPSECRETPW_HERE --databases $db > ${backupdir}bkup_${db}.sql`;
        }
}
close(DATABASELIST);
close(LOGFILE);
Here is a caveat: either command will lock the tables for as long as the data is being read by the mysqldump program (not very long). But the second method is not transaction-safe: if someone changes account info while you are up to backing up player_corpses, that change won't be captured (with the second method).

That second method is JUST to make your retrieval easier.

Ok, that's easy.

Now, make a new task/cron job to run every hour: just a script that copies the new bin-logs to that backup directory, as:
Code:
cp -fu /var/lib/mysql/binlog* /home/eqemu/db_backups/
(-f forces overwriting; -u copies only files newer than the destination.)

Does that help?