  1. #1
    Join Date
    Apr 2011

    Unanswered: Using Constraint Purposely

This may seem like an out-of-the-box question, but what are your thoughts on deliberately setting an alternate key on a table to keep out specific records? That is obviously what keys are for, keeping out duplicate data, but this case is a little different.

We want to throw over a million inserts into a set of tables at once (done in chunks of about 25,000 rows). We know that the vast majority of these attempted inserts are duplicates that we don't want (rather, they are not complete duplicates, but a subset of the columns duplicates data we already have). The server pushing them is going to push them no matter what, so we have to deal with the inserts on our side.

At first I was flirting with a trigger on the table to detect duplicate data and reject it, but rolling back the transaction inside the trigger also rolls back the remaining inserts in that transaction without checking them. Some have suggested putting an alternate key on the subset of columns and just letting Sybase (12.5.3) fail each insert we don't want. That seems like a bad idea to me: every time this process runs we would willingly expect hundreds of thousands of failures, making it hard to spot when something is really going wrong. Thoughts???


  2. #2
    Join Date
    May 2005
    South Africa
    Provided Answers: 1
To ignore duplicates, create a unique index with the ignore_dup_key option.
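A minimal sketch of that suggestion, assuming a hypothetical staging table and key columns (table and column names here are illustrative, not from the original post):

```sql
-- Unique index on the "subset" columns that define a duplicate.
-- With ignore_dup_key, a row whose key matches an existing row is
-- silently discarded with a warning instead of raising an error and
-- aborting the batch, so the rest of each 25,000-row chunk still loads.
create unique index ix_stage_dedup
    on stage_table (acct_id, txn_date)
    with ignore_dup_key
go

-- Inserts proceed as before; duplicate-key rows are dropped, not errored.
insert into stage_table (acct_id, txn_date, payload)
select acct_id, txn_date, payload
  from inbound_feed
go
```

Because the duplicates are discarded rather than treated as failed statements, you don't accumulate hundreds of thousands of errors per run, which addresses the monitoring concern raised above.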
