  1. #1
    Join Date
    May 2003
    Posts
    23

    Question Unanswered: Trigger on several million row update

    I have a trigger which updates a column whenever a particular other column in the row changes. I am opening a cursor on the "inserted" table, because many rows could be updated in a transaction.

    I have 2 questions.
    1. What happens if I update 1,000,000 rows from Query Analyzer -- is there a default number of rows after which a commit is issued? I don't see a setting in Q.A.
    2. When will the trigger fire? After each commit? If 1,000,000 rows are updated, then committed, will SQL Server have built 1,000,000 rows in the "inserted" table, then perform the trigger 1,000,000 times?

    Until recently, I expected a trigger to fire once for each row updated, where applicable, but discovered (with some help here) that this is not the case.
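
    Since the trigger fires once per statement and "inserted" holds every affected row, the cursor can usually be replaced by a single set-based update. A minimal sketch, assuming a hypothetical table MyTable with a primary key PKCol, a watched column WatchedCol, and a derived column DerivedCol:

    ```sql
    -- Hedged sketch: table and column names are made up for illustration.
    -- The trigger fires once per UPDATE statement; "inserted" contains all
    -- affected rows, so one joined UPDATE replaces the row-by-row cursor.
    CREATE TRIGGER trg_MyTable_Update ON MyTable
    AFTER UPDATE
    AS
    IF UPDATE(WatchedCol)
    BEGIN
        UPDATE t
        SET t.DerivedCol = i.WatchedCol * 2   -- whatever derivation you need
        FROM MyTable t
        JOIN inserted i ON t.PKCol = i.PKCol
    END
    ```

    Because this touches all the rows in one statement instead of fetching them one at a time, it scales far better when a transaction updates millions of rows.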

  2. #2
    Join Date
    Feb 2002
    Location
    Houston, TX
    Posts
    809
    1. The default is after all rows are updated but before the commit.
    2. Triggers fire before the commit; otherwise you couldn't roll back an insert/update/delete. If you use an AFTER trigger (the default), a transaction will have been started and all the rows updated before your trigger fires. Your trigger does its thing, and unless it has intervened (e.g. by rolling back), an implicit commit is executed. If you use an INSTEAD OF trigger, ditto the above, but the trigger fires in place of the triggering SQL statement.
    Paul Young
    (Knowledge is power! Get some!)
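
    One way to see that an AFTER trigger fires once per statement rather than once per row is to count the rows in "inserted" inside the trigger. A minimal sketch, with a hypothetical table name:

    ```sql
    -- Hedged sketch: MyTable is a made-up name. A single UPDATE hitting
    -- 1,000,000 rows fires this trigger exactly once, with all 1,000,000
    -- rows visible in "inserted" at that moment.
    CREATE TRIGGER trg_CountRows ON MyTable
    AFTER UPDATE
    AS
    DECLARE @n int
    SELECT @n = COUNT(*) FROM inserted   -- every row touched by the one statement
    PRINT 'Trigger fired once; rows affected: ' + CAST(@n AS varchar(10))
    ```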

  3. #3
    Join Date
    May 2003
    Posts
    23
    Paul,
    Where are the "inserted" table rows for the trigger stored -- in the transaction log or in tempdb? If plenty of storage is available, is there any performance issue with updating one or several million rows and then executing the same number of triggers?
    Is there a way from Query Analyzer to set something like a batch size when executing one SQL statement which might update a million or more rows?
    TIA

  4. #4
    Join Date
    Feb 2002
    Location
    Houston, TX
    Posts
    809
    The inserted and deleted tables aren't stored in tempdb; SQL Server builds them from the transaction log records for the statement, so they effectively reside in memory.

    Not sure I understand your question "if plenty of storage available, is there any performance issue from updating 1 or several millions rows then executing same number of triggers?" The only thing you need to worry about when updating a large number of records is that you have plenty of log space.

    There is no setting that I am aware of that will break a long-running transaction into multiple batches.
    Paul Young
    (Knowledge is power! Get some!)
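
    Although no server setting batches a single statement, you can batch the work yourself in a loop. A minimal SQL Server 2000-era sketch using SET ROWCOUNT, with hypothetical table and column names; the WHERE predicate must exclude rows already updated, or the loop never ends:

    ```sql
    -- Hedged sketch: MyTable, WatchedCol, and SomeFlag are made-up names.
    -- Each pass updates at most 10,000 rows; in autocommit mode each pass
    -- is its own transaction, so log usage stays small and the trigger
    -- fires once per batch instead of once for the whole million rows.
    SET ROWCOUNT 10000                          -- cap each UPDATE at 10,000 rows
    WHILE 1 = 1
    BEGIN
        UPDATE MyTable
        SET WatchedCol = WatchedCol + 1,
            SomeFlag = 1                        -- mark the row as done
        WHERE SomeFlag = 0                      -- only rows not yet updated
        IF @@ROWCOUNT = 0 BREAK                 -- nothing left to update
    END
    SET ROWCOUNT 0                              -- restore the default (no limit)
    ```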
