I have an application that is somehow inserting duplicate records (complete with identical timestamps) into a table.
Two strange things about it:
-- it doesn't always happen
-- when it does happen, the number of duplicates has ranged from two to several thousand
This severely affected performance when the table, which should have held about 2,000 rows, blew up to 3.5 million. A few months ago I wrote a query to eliminate all the duplicates, then kept an eye on the table, and I've noticed it's happening again.
Just wondering if anyone's experienced a problem like this, and if there's something in particular that generally causes this type of issue?
You should create a unique index over the columns that define a duplicate; it will prevent the insertion of duplicates outright. Most probably there is a bug in your application. With the unique index in place, the insert will fail with an error whenever the bug fires, so you can pinpoint the part of the application that is responsible.
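To illustrate the effect, here is a minimal sketch using SQLite via Python's standard `sqlite3` module. The table and column names (`readings`, `sensor_id`, `recorded_at`) are hypothetical stand-ins for whatever columns define a "duplicate" in your schema; the point is that the second, identical insert now raises an error instead of silently adding a row:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE readings (sensor_id INTEGER, recorded_at TEXT, value REAL)"
)

# Unique index over the columns that together should identify a row.
conn.execute(
    "CREATE UNIQUE INDEX idx_readings_unique "
    "ON readings (sensor_id, recorded_at)"
)

conn.execute("INSERT INTO readings VALUES (1, '2024-01-01 00:00:00', 3.5)")

# The exact same insert again: the index rejects it with an error,
# which is the moment you can trace back to the buggy code path.
try:
    conn.execute("INSERT INTO readings VALUES (1, '2024-01-01 00:00:00', 3.5)")
except sqlite3.IntegrityError as e:
    print("duplicate rejected:", e)
```

If the table already contains duplicates, clean them out first, since creating a unique index over existing duplicate rows will fail.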