Hey all

I'm hoping you can help me as I drew a blank over at the MySQL forums with this.

I have two database servers: a production server whose main database is Postgres, and an offline reporting server running MySQL.

Periodically a specific database on the Postgres server is populated with new data. I am trying to pipe this data into the offline MySQL database for reporting / stats / monitoring, but the only way I have managed so far is a SELECT (from Postgres) / INSERT (into MySQL) loop. This is very inefficient, and on top of that the column names can change on the source, because we run different versions of the application that populates the Postgres DB.
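Roughly what I'm doing now, as a minimal sketch — the table name `events` and columns `col_a`/`col_b` are made-up placeholders, and the connections would come from whatever drivers you use (e.g. psycopg2 for Postgres, MySQLdb for MySQL). I'm at least batching with fetchmany/executemany so it isn't one round trip per row:

```python
def batched(cursor, size=1000):
    """Yield rows from a DB cursor in fixed-size batches."""
    while True:
        rows = cursor.fetchmany(size)
        if not rows:
            return
        yield rows

def copy_events(pg_conn, my_conn):
    """Copy a hypothetical 'events' table from Postgres to MySQL."""
    src, dst = pg_conn.cursor(), my_conn.cursor()
    src.execute("SELECT col_a, col_b FROM events")  # columns hard-coded today
    for rows in batched(src):
        # executemany sends one batch per round trip instead of one INSERT per row
        dst.executemany("INSERT INTO events (col_a, col_b) VALUES (%s, %s)", rows)
    my_conn.commit()
```

The problem is obvious: the column names are hard-coded in both statements, so every schema change on the source breaks this.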

Is there any way to:

1. Automatically pipe the NEW data into MySQL (i.e. transactionally) each time changes occur

2. If not, how can I select ALL columns from the Postgres database (without having to name them) and then insert ALL columns into MySQL? (The only approach I have thought of is to read the column names from Postgres into an array and then use that array to build the column list for the MySQL INSERT.)
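This is the array idea I had in mind, sketched out (again with hypothetical connections and a made-up table name). `cursor.description` on the Postgres side carries the column names of the last SELECT, so nothing has to be hard-coded:

```python
def build_insert(table, columns):
    """Build a parameterised MySQL INSERT naming every column explicitly."""
    col_list = ", ".join("`%s`" % c for c in columns)  # assumes no backticks in names
    marks = ", ".join(["%s"] * len(columns))
    return "INSERT INTO `%s` (%s) VALUES (%s)" % (table, col_list, marks)

def copy_table(pg_conn, my_conn, table):
    """Copy every column of `table` without hard-coding any column names."""
    src, dst = pg_conn.cursor(), my_conn.cursor()
    src.execute("SELECT * FROM %s" % table)
    # cursor.description holds one entry per column of the last SELECT;
    # element 0 of each entry is the column name, so schema changes on
    # the source are picked up automatically
    columns = [d[0] for d in src.description]
    dst.executemany(build_insert(table, columns), src.fetchall())
    my_conn.commit()
```

Is this a sane way to do it, or is there something better (trigger-based, replication, etc.) that I'm missing?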

Can anyone help?

Thanks all