I have a program that consists of a loop which never ends (I can't use crontab). Within the loop:
- I connect to the DB
- select count(*)
- if the count is 0, write to a log file, disconnect, and sleep
- if there are results, delete the rows, disconnect, and sleep
The result of the select is usually 0, and the program uses more and more CPU because I can't free the memory.
Is my problem clear?
Does anyone have an idea how to stop the program from using more and more CPU?
First of all, I suggest that your program not "disconnect" each time; simply sleep. I don't know whether your DBMS provides an "event" mechanism you can sleep on, or whether you simply must use a timer.
An open database connection holds an insignificant amount of resources. It's cheaper in the long run to keep the connection open than to constantly tear it down and rebuild it.
As you describe your algorithm, you intend for the process to sleep in either case, so if it does not, there must be a programming error somewhere.
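A minimal sketch of the suggested fix, opening the connection once and reusing it across polls (Python with sqlite3 for self-containment; the table name `jobs` and the helper `poll_and_clean` are illustrative assumptions, not from the original post):

```python
import sqlite3
import time

def poll_and_clean(conn, iterations=3, sleep_seconds=0.01):
    """Poll a hypothetical 'jobs' table, deleting any rows found.

    The connection is opened once by the caller and reused, so each
    iteration costs only the queries, not a full connect/disconnect.
    """
    deleted_total = 0
    for _ in range(iterations):
        (count,) = conn.execute("SELECT COUNT(*) FROM jobs").fetchone()
        if count == 0:
            # Nothing to do: a real program would write to its log file here.
            pass
        else:
            conn.execute("DELETE FROM jobs")
            conn.commit()
            deleted_total += count
        time.sleep(sleep_seconds)  # sleep in either case, as the question describes
    return deleted_total

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE jobs (id INTEGER PRIMARY KEY)")
    conn.executemany("INSERT INTO jobs (id) VALUES (?)", [(1,), (2,)])
    conn.commit()
    print(poll_and_clean(conn))  # 2
    conn.close()
```

In a real deployment you would replace sqlite3 with your DBMS's driver, but the shape is the same: connect once before the loop, sleep inside it, and disconnect only on shutdown. If each iteration still leaks memory, the bug is in the loop body, not in the polling pattern itself.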