Script for auto-applying SQL updates (Linux only)

Hi, guys.

I'm working on a shell script that automatically applies the SQL updates from TrinityCore after a git update.

It has been working and running for some weeks now, with no errors.

If anyone has corrections, let us know.


#!/bin/sh

trap "" 1 2 3 15

# Default MySQL values - change these to match your configuration

AUTH="auth"             # DB name for the auth DB
CHARACTERS="characters" # DB name for the characters DB
WORLD="world"           # DB name for the world DB

TRINITYUSER="trinityMYSQLuser" # MySQL username for TrinityCore
TRINITYPASS="trinityMYSQLpass" # MySQL password for TRINITYUSER

# Resolve the binaries we need

CAT=`whereis -b cat | cut -d " " -f 2`
ECHO=`whereis -b echo | cut -d " " -f 2`
MV=`whereis -b mv | cut -d " " -f 2`
MYSQL=`whereis -b mysql | cut -d " " -f 2`
RM=`whereis -b rm | cut -d " " -f 2`
STAT=`whereis -b stat | cut -d " " -f 2`
TOUCH=`whereis -b touch | cut -d " " -f 2`

# Append a banner with the update's name and modification time to the per-DB file

sqlinfo () {
        TSTAMP=`$STAT -c %y "$1" | cut -d . -f 1`
        $ECHO "" >> "$2"
        $ECHO "-- ###################################################################################################################" >> "$2"
        $ECHO "-- # $1 ($TSTAMP)" >> "$2"
        $ECHO "-- ###################################################################################################################" >> "$2"
}

$TOUCH auth characters world

for FILE in ~/TrinityCore/sql/updates/*/*.sql
do
        DATABASE=`echo "$FILE" | awk -F"updates/" '{print $2}' | cut -d / -f 1`
        sqlinfo "$FILE" "$DATABASE"
        $CAT "$FILE" >> "$DATABASE"
        $RM "$FILE"
done

$MYSQL --user="$TRINITYUSER" --database="$AUTH" --password="$TRINITYPASS" < auth
$MYSQL --user="$TRINITYUSER" --database="$CHARACTERS" --password="$TRINITYPASS" < characters
$MYSQL --user="$TRINITYUSER" --database="$WORLD" --password="$TRINITYPASS" < world

$MV auth auth_`date +%Y%m%d%H%M%S`.bkp
$MV characters characters_`date +%Y%m%d%H%M%S`.bkp
$MV world world_`date +%Y%m%d%H%M%S`.bkp

How it works:

Run it from the Trinity account's home directory (~/).

The script goes into sql/updates and looks for all update files (ending in .sql).

It writes them into separate files (auth, characters and world).

It then has MySQL load those files.

Finally, it saves those files with a YYYYMMDDHHMMSS suffix.

All tips will be appreciated.

This deserves a sticky! Great job. =)

bad idea!

It is a very bad idea to run the same SQL file twice on the DB. And from what I understand of that script, you will run the same SQL over and over again, because you import it into the file every time you run the script. You're asking how that can be? Well… when you do a git update, all the files the script deleted are created again, so you'll import them again.

My compile script does the same thing, but it saves the date/time of the last git update and creates the SQL file only with the updates from after that date/time.
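That last-date/time idea can be sketched with a stamp file and `find -newer`. This is a generic illustration, not the poster's actual compile script; the directory layout and stamp-file path are hypothetical:

```shell
#!/bin/sh
# Sketch: list only the .sql updates modified since the last recorded run.
# Both arguments are hypothetical paths, not taken from the scripts above.
collect_new_updates() {
        updates_dir=$1
        stamp=$2
        if [ -f "$stamp" ]; then
                # Only files changed after the previous run qualify.
                find "$updates_dir" -name '*.sql' -newer "$stamp"
        else
                # First run: everything qualifies.
                find "$updates_dir" -name '*.sql'
        fi
        touch "$stamp" # remember this run for next time
}
```

A caller could then feed the result into the mysql import, e.g. `collect_new_updates ~/TrinityCore/sql/updates ~/.last_sql_run`.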

When I update from git, I do it with:

git pull origin master

With that command, my SQL scripts are not created again. Only new SQL updates appear.

Maybe I'm doing the updates wrong?
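For anyone who wants to verify what a pull actually changed: after a `git pull`, git sets ORIG_HEAD to the pre-pull position of HEAD, so you can diff it against HEAD. A small sketch (the repository path is an assumption, not from the scripts above):

```shell
#!/bin/sh
# Sketch: list which SQL update files the last `git pull` brought in.
# REPO is a hypothetical path to your TrinityCore checkout.
REPO="${REPO:-$HOME/TrinityCore}"

if git -C "$REPO" rev-parse --verify --quiet ORIG_HEAD >/dev/null 2>&1; then
        # ORIG_HEAD marks where HEAD was before the last pull/merge.
        CHANGED=$(git -C "$REPO" diff --name-only ORIG_HEAD..HEAD -- sql/updates)
        echo "$CHANGED"
else
        CHANGED=""
        echo "no previous pull recorded in $REPO"
fi
```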

Hmm.

I did not try running the script twice, as you describe.

I will test it and update the script if needed.

Athena, I did the test like this:

Run 'git pull origin master';

Do make;

Then install with 'make install';

Run the script, to apply to MySQL the SQL updates fetched from the git repository.

After that, I ran a git update again:

git pull origin master

Nothing happened. The SQL updates did not reappear after being removed by the script.

What happens if I run this script? Could I corrupt a database with 350 users?

I ask because I accidentally deleted the trinity folder and I can't remember the date of the last SQL update I applied.

Now I will download TrinityCore again with git, but would this affect my users or break the system?

Wow!!! Your database grew by 150 users in only a couple of minutes?! Who are you, Google or something?

What happens with this script if you run it the normal way (without controlling git) is that you'll get many errors on a second pass.

For example: an SQL update drops a column from a table. If you run it twice, the first run works like a charm, but the second throws an error.

I remember one update, a long time ago, that would empty a table if you ran it twice. I don't remember it well, but I think it deleted a table if it existed and recreated it, then parsed the data from another table to populate it, and at the end deleted that second table. So, if you run it twice, it deletes the new table and recreates it, but can't populate it, because the old table no longer exists.
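One common way to make re-runs safe is to keep a plain-text log of already-applied files and skip anything listed there. A minimal sketch of the idea, with hypothetical paths and the actual mysql call left as a comment:

```shell
#!/bin/sh
# Sketch: skip SQL files that were already imported, using a plain-text log.
# Both arguments are hypothetical paths, not from the scripts in this thread.
apply_updates() {
        updates_dir=$1
        applied_log=$2
        touch "$applied_log"
        for f in "$updates_dir"/*.sql; do
                [ -f "$f" ] || continue # no matches -> pattern stayed literal
                base=$(basename "$f")
                if grep -qxF "$base" "$applied_log"; then
                        echo "skipping $base (already applied)"
                        continue
                fi
                # mysql --user=... --password=... dbname < "$f" # real import goes here
                echo "importing $base"
                echo "$base" >> "$applied_log" # remember it for the next run
        done
}
```

Running it twice imports each file once; the second pass only prints "skipping" lines instead of replaying the SQL.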

You can delete and re-create the world database as many times as you want and it will not affect any of the characters.

I think he was referring to the characters DB!

That was my doubt!! You are a great master!!! Thank you, now I can rest in peace!!

Here’s a script I wrote and use all the time; I’ve been adding some features here and there. Basically it’s just an easy way for me to keep up with changes when I update the source.

To make it easy, I put this in either /usr/bin or /usr/sbin so you can call it globally. It doesn’t use sudo/root, so it expects you to have a non-root user for your SQL database, by default “trinity”. You can set your paths and variables at the top of the script. Keep in mind I’ve tried to add a lot of conditional checks to prepare for any scenario, but since I’m the only one using it, I typically do it the same way every time.

You can name it whatever, but by default calling “import_mysql --help” will display available parameters and some options.

You can override the username/password with “-u username” and “-p password” respectively.

--force and --no-backup don’t work at the same time, which is kind of a bug because it’s in a select case, but in all reality it’s not bad that they don’t work together.

My typical usage is:

import_mysql *.sql -d trinity_world -v

Which calls the script, selects the trinity_world database “-d” and outputs verbosely “-v”

Prior to using my script, please back up your databases. Also, please use a test database until you get familiar with how it works, because this is really designed for maintaining, not “creating”. So seriously, just back everything up. I’m planning some additional features, but if it helps anyone as is, that’s great! I also typically only use this on the world database; be very careful when running it against characters/auth.

#!/bin/bash

# Written by dark1911
# ImportMysql v0.5 2014.04.19
# Import MySQL files to a MySQL database server and store logs

#=============================== Global Settings ===============================

dbName=""        # Name of database to be used. Default: Set by user.
logDir=""        # Path to log files. Default: Home directory
backupDir=""     # Path of backup directory. Default: Home directory
dbPass="trinity" # Password to connect to the database. Default: trinity
dbUser="trinity" # Username to connect to the database. Default: trinity

declare -a files # Implicitly declare 'files' as an array.

showHelp() { # Show Help. May move to the select case.
    echo "Imports MySQL files on a one-db-per-file basis."
    echo "-?, --help                 Shows this help"
    echo "-p, --password [password]  Password to connect to server."
    echo "-d, --database [name]      Database to use."
    echo "-u, --user [name]          Username to connect to server."
    echo "-a, --after [file]         Add selected file and only files newer. NOT YET IMPLEMENTED"
    echo "-v, --verbose              Be verbose."
    echo "--force                    Ignore log file and import anyway. DON'T USE THIS UNLESS YOU HAVE A REASON"
    echo "--no-backup                Do not back up database. I RECOMMEND YOU ALLOW THE BACKUP"
    echo
}

getFiles() {
    files=("${files[@]}" "$1") # Store files to be imported in the 'files' array.
}

db(){
    if [ "$1" == "get" ]; then # Method parameter 'get': Gets database name.
        echo
        read -p "Enter Database Name:" dbName # Get database name.
    elif [ "$1" == "getpass" ] || [ -z "$dbPass" ]; then # Method parameter 'getpass': Gets database password.
        read -p "Enter Database Password:" dbPass
        until mysql -u $dbUser -p$dbPass -e ";" ; do # Until we can connect, harass the user for the correct password. I type mine wrong all the time.
            read -p "Failed to connect. Please re-enter password:" dbPass
        done
    fi
    dbExists=$(mysql -u$dbUser -p$dbPass --skip-column-names -e "SHOW DATABASES LIKE '$dbName';") # The mysql call returns empty if the database is not found.
    if [ -z "$dbExists" ]; then # Check the result of the mysql call.
        echo -e "\nDatabase \e[1;31m$dbName\e[0m not found in sql server." # Because red is a very serious color... ya know.
        db get # Call db get to change the name.
    fi
    return 0
}

backupDB(){
    dumpFile="$backupDir$(date +"%Y_%m_%d")_$dbName.sql"
    echo -e "\nBeginning backup of $dbName to:"
    echo $dumpFile
    mysqldump -u$dbUser -p$dbPass $dbName > $dumpFile
    echo -e "\nDatabase backup complete."
    return 0
}

compareLog(){
    echo "Please wait, comparing files to log..."
    index=0
    while read line ; do # Loop until all lines in the log file have been read.
        previousFiles[$index]=$(echo $line | awk '{print $4;}') # Throw logged files into an array. Position 4 tracks filenames.
        let ++index;
    done < $logFile;
    if [[ $previousFiles ]]; then # Check that the log file is not empty.
        sortedFiles=" ${files[@]} " # Stringify the array.
        for item in ${previousFiles[@]}; do # Iterate over items in previousFiles from the log
            sortedFiles=${sortedFiles/ ${item} / } # Remove logged items from the string
        done
        sortedFiles=$(echo $sortedFiles | tr ' ' '\n' | sort) # In case files were manually added to the log etc., sort alphabetically.
        files=( $sortedFiles ) # Update files[] to reflect the above change.
    fi
}

# End of method declarations.

# Begin script:

if [ -z "$1" ]; then # If nothing is entered on the command line, show the usage parameters.
    echo "Usage: $0 [SQL File/s] COMMAND"
    echo -e "\nExample: $0 *.sql -d mydatabase -u $USER -v"
    exit 1
fi

while [[ $# != 0 ]] ; do # Iterate over arguments, i.e. .sql files and options.
    case $1 in
        -[?]|--help) # Display help and exit
            echo "Usage: $0 [SQL Files] [Command Line Arguments]"
            showHelp
            exit 1
            ;;
        -[d]|--database) # Set database via command line option
            dbName=$2
            shift
            ;;
        --force)
            forceLogOverride=true # Script will ignore the logfile and import regardless.
            shift                 # This is really more for when you create a new DB and haven't deleted the log yet.
            ;;                    # It's really not a best-practice scenario, but it's your option.
        --no-backup)
            explicitNoBackup=true # Script will not perform an automatic database backup.
            shift
            ;;
        -[p]|--password)
            dbPass=$2 # Set database password via command line option.
            shift
            ;;
        -[u]|--user) # Set database user via command line option.
            dbUser=$2
            shift
            ;;
        -[a]|--after) # Select files newer than the selected one, assuming files are dated correctly.
            echo $2            # Script would ignore files with date prefixes before the selected file
            # updateOnly $2    # and prompt you to add the others to the log. Not yet implemented.
            shift
            ;;
        -[v]|--verbose) # Print additional information in the terminal.
            verbose=true
            ;;
        -*) # Anything else starting with - is an unknown option.
            echo "Invalid Argument"
            exit 127
            ;;
        *) # All other input: call getFiles() and pass the input.
            getFiles $1
            ;;
    esac
    shift
done

if [[ -z $dbPass ]]; then db getpass; fi # If the database password has not been set, prompt the user for it.
if [[ -z $dbName ]]; then db get; else db check; fi # If no database is selected, prompt the user; otherwise check the server to make sure it exists.
if [[ -z $logDir ]]; then logDir=$HOME; fi # If no custom log directory is defined, set it to /home/[USER]/
logFile=$logDir/$dbName"_import.log" # Set logfile name to [DBNAME]_import.log. You can modify this format if you wish.
if [ -e $logFile ] && [ -z $forceLogOverride ]; then compareLog; fi # If a log file is present, perform the redundancy check, unless --force was specified.

if [[ $verbose ]]; then
    echo "Database:" $dbName
    echo -e "\nThe following files have already been added to the database:"
    echo ${previousFiles[@]} | tr ' ' '\n' # Display the list of previously added files from the log.
    echo -e "\nThe following NEW files will be added to the database:"
    echo ${files[@]} | tr ' ' '\n' # Display the list of files that will be added.
    echo
fi

if [[ -z ${files} ]]; then # Should the compared arrays result in an empty array...
    echo -e "\nAll selected files have already been imported according to the log file."
    echo "Should you wish to force this update, use --force or clear the entries from the log and re-run this script."
    echo -e "\nImport cancelled." >&2
    exit 1
fi

dbVerify=$(echo -e "\e[1;31m$dbName\e[0m") # Declare dbVerify just to use the red color in the read prompt, 'cause ya know, it's red and stuff.
read -p "Continue importing files to $dbVerify?" -n 1 -r # Verify the user wishes to continue with the import.

if [[ $REPLY =~ ^[Yy]$ ]]; then
    if [[ -z $backupDir ]]; then backupDir="$HOME/"; fi # If no custom backup directory is defined, set it to /home/[USER]/
    if [ -z $explicitNoBackup ]; then backupDB; fi # Unless --no-backup was given, back up this database first.
    echo -e "\nAdding files..."
    for file in "${files[@]}" ; do # Iterate over the selected files and perform the checks below.
        if [[ ! -f $file ]] ; then # Check that the file exists.
            echo "Skipping '$file' (not a file)"
        elif [[ $file == *.[sS][qQ][lL] ]]; then
            echo "$(date +"%F %T") $dbName $file" >> ${logFile} # Record the imported file in the log.
            mysql -u$dbUser -p$dbPass $dbName < $file # Call the mysql command line and import the file.
            if [[ $verbose ]]; then echo "$dbName < $file"; fi
        else echo "Skipping non-sql file $file"
        fi
    done
    echo "All done."
    exit 0
else
    echo -e "\nDatabase file import aborted. Have a nice day." >&2
    exit 0
fi

Any questions, concerns, etc., feel free to ask. I figured I’d share, since it saves me a lot of hassle trying to manually import updates.