How to free up MySQL server memory after a large batch job
I am in a Linux environment running MySQL Server, with a maximum of around 4.5 GB of memory allocated to MySQL on a virtual server that has up to 7 GB of RAM.
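For context: I have not posted my configuration file, but on a setup like this the ~4.5 GB allocation would typically be governed by my.cnf settings along these lines (illustrative values only, not my actual file):

```ini
[mysqld]
# Illustrative values; the real allocation depends on the actual my.cnf.
innodb_buffer_pool_size = 4G    # main InnoDB cache; usually the bulk of MySQL's memory
key_buffer_size         = 256M  # MyISAM index cache
max_connections         = 100   # per-connection buffers come on top of the caches above
```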
I run a script that does the following every night:
# Import data from a gzipped dump file into the database. It's around 6 GB of data and takes around 25 minutes.
Job1: zcat mysql_backup_file_sql.gz 2> error.log | mysql -u myuser -pmypassword mydatabase -hmy_database_server 2>> error.log
Job2: an SQL script that I run, containing many statements that aggregate, update, and insert data into new tables.
My problem is that after Job1 has finished, the MySQL server does not free up its memory, so I get an out-of-memory error when I try to run Job2. How can I free up the MySQL server's memory after Job1 has run?
For the time being I restart the MySQL server (sudo /etc/init.d/mysql restart) so that I can run Job2... but that is probably not the preferred way, or is it?
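To make the current nightly flow concrete, here is a minimal sketch of the wrapper script, restart workaround included. Filenames and credentials are the placeholders from above; a hypothetical DRY_RUN flag (on by default) prints the commands instead of executing them, so the flow can be inspected without touching a server:

```shell
#!/bin/bash
# Nightly batch sketch. DRY_RUN=1 (default) only prints what would be run.
DRY_RUN=${DRY_RUN:-1}

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "WOULD RUN: $*"
  else
    "$@"
  fi
}

# Job1: import the ~6 GB gzipped dump (takes ~25 minutes).
run sh -c 'zcat mysql_backup_file_sql.gz 2> error.log | mysql -u myuser -pmypassword mydatabase -hmy_database_server 2>> error.log'

# Current workaround: restart MySQL so its memory is released before Job2.
run sudo /etc/init.d/mysql restart

# Job2: the aggregate/update/insert script (job2.sql is a placeholder name).
run sh -c 'mysql -u myuser -pmypassword mydatabase -hmy_database_server < job2.sql 2>> error.log'
```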