    Data transfer using transactions and triggers

    Hi all,

    I am using PostgreSQL 9.5.4 and am looking for a way to transfer large amounts of data from one table (table1) to another (table2). The transfer needs to go from a local database on a cRIO controller to a remote server. Once the data has been copied over to table2, the corresponding rows in table1 should be cleared to save space on the controller.
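
    For reference, the examples below assume roughly this layout for the two tables - the column names are the ones used in the trigger functions, the types are just placeholders:

    Code:
    -- local table on the controller (assumed layout)
    CREATE TABLE table1 (
        id          serial PRIMARY KEY,
        name        text,
        family_name text,
        age         integer
    );

    -- remote table mirrors table1 so rows can be copied across as-is (assumed layout)
    CREATE TABLE table2 (
        id          integer PRIMARY KEY,
        name        text,
        family_name text,
        age         integer
    );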

    So far I've been able to copy the data over from table1 to table2 and delete the row in table1 using triggers, the condition being that a new row is inserted into table1. Here's an example of what I've done:

    Created the copy function, which copies name, family_name and age (plus the id, so the clear trigger further down can match the row) into table2 in the remote database, db2:

    Code:
    CREATE FUNCTION save_table2 ()
        RETURNS TRIGGER AS $$
        BEGIN
            -- copy the new row (including its id) into table2
            INSERT INTO table2 (id, name, family_name, age)
            VALUES (NEW.id, NEW.name, NEW.family_name, NEW.age);
            -- the return value of an AFTER row trigger is ignored
            RETURN NULL;
        END $$ LANGUAGE plpgsql;
    Created the copy trigger - the above function will execute whenever a row is inserted into (or updated in) table1:

    Code:
    CREATE TRIGGER data_trigger
        AFTER INSERT OR UPDATE ON table1
        FOR EACH ROW EXECUTE PROCEDURE save_table2();
    Created a clear function - it deletes the row in table1 whose id matches the id of the row just inserted into table2:

    Code:
    CREATE FUNCTION clear_table1 ()
        RETURNS TRIGGER AS $$
        BEGIN
            -- remove the source row now that its copy is in table2
            DELETE FROM table1 WHERE id = NEW.id;
            RETURN NULL;
        END $$ LANGUAGE plpgsql;
    The above will execute when a row is inserted into table2:

    Code:
    CREATE TRIGGER data_clear
        AFTER INSERT ON table2
        FOR EACH ROW EXECUTE PROCEDURE clear_table1();
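    To check the chain end to end, I insert a test row into table1 and look at both tables afterwards (the values are just made up for testing):

    Code:
    INSERT INTO table1 (name, family_name, age) VALUES ('Anna', 'Smith', 34);

    -- the copy trigger should have placed the row in table2 ...
    SELECT * FROM table2 WHERE name = 'Anna';

    -- ... and the clear trigger should have removed it from table1 again
    SELECT * FROM table1 WHERE name = 'Anna';   -- expect no rows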
    This works well for small tables, but I fear it might not be able to handle large ones. The problem with this code is that if there is an error, a failure or any mistake, the data may be lost forever.

    Transactions seem to be a great way to avoid losing data unnecessarily. Is there a way to use triggers and functions with transactions so that there is some protection against deleting the wrong information forever? Simply putting BEGIN and COMMIT around the above code won't do anything, as the functions and triggers will already have been set up by that point. I'm looking for something that can manage the data correctly row by row.
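
    The closest I've got is wrapping the inserts themselves (not the CREATE FUNCTION / CREATE TRIGGER statements) in a transaction. As far as I understand it, the trigger bodies run inside the same transaction as the INSERT that fires them, so the copy into table2 and the delete from table1 should either all succeed or all roll back together - I just don't know whether that is enough protection for the controller-to-server case:

    Code:
    BEGIN;
    INSERT INTO table1 (name, family_name, age) VALUES ('Bob', 'Jones', 52);
    -- if either trigger raises an error here, the original INSERT, the copy
    -- into table2 and the delete from table1 are all rolled back together
    COMMIT;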

    Or does anyone have a better idea of how to transfer large amounts of data from one database to another automatically and delete the data on the first?

    Any help would be much appreciated.

    Thanks!

    Lisa
