Hi guys,
need a little input on this.
We have a corrupted table with 140+ million rows on ASE 12.5, and unfortunately we can't bcp out all the data because of the corruption. The table has three unique composite indexes and one nonclustered index. We tried SELECT INTO, but we only got fewer than 600k rows. Our recent backups are also corrupted, since we don't know exactly when the errors first occurred.

So we are looking at whether Perl could accomplish this: bulk copy the data out again, and when an error is detected on specific rows, skip that corrupted data and proceed to the next range. Hoping for your inputs.
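The skip-on-error loop you describe is basically "export in key ranges, and when a range fails, bisect it until the unreadable rows are isolated and skipped." Here is a minimal sketch of that control flow (shown in Python for brevity, but the same logic ports directly to Perl with DBI). The `read_chunk` function, the row numbering, and the simulated corrupted rows are all stand-ins: in your case the read would be a range-restricted SELECT on one of the unique composite indexes, or repeated `bcp` runs using its first-row/last-row options.

```python
# Sketch of a "skip corrupted ranges" export loop. read_chunk() is a
# stand-in for whatever actually pulls rows from ASE (a range-restricted
# SELECT via DBI, or bcp with row-range options); here it just simulates
# a read error whenever the range touches a corrupted row.

CORRUPTED = {7, 8, 19}   # simulated bad rows (assumption for the demo)
TOTAL_ROWS = 25          # stand-in for the real 140M+ row count

def read_chunk(first, last):
    """Return rows first..last, raising if the range hits corruption."""
    rows = range(first, last + 1)
    if any(r in CORRUPTED for r in rows):
        raise IOError("read error in rows %d-%d" % (first, last))
    return list(rows)

def salvage(first, last, out):
    """Bisect a failing range until single unreadable rows are isolated."""
    try:
        out.extend(read_chunk(first, last))
    except IOError:
        if first == last:                 # one unreadable row: log and skip
            print("skipping corrupted row %d" % first)
            return
        mid = (first + last) // 2
        salvage(first, mid, out)
        salvage(mid + 1, last, out)

def export(total, chunk=10):
    """Walk the whole table in fixed-size chunks, salvaging each one."""
    out = []
    for start in range(1, total + 1, chunk):
        salvage(start, min(start + chunk - 1, total), out)
    return out

if __name__ == "__main__":
    good = export(TOTAL_ROWS)
    print("recovered %d of %d rows" % (len(good), TOTAL_ROWS))
```

The practical win of the bisection step is that most chunks are clean and export at full speed; only the few chunks containing damaged pages pay the cost of being split down to individual rows. With 140M+ rows you would drive the ranges off a unique composite key rather than a row counter, and write each recovered chunk straight to a bcp-format flat file as you go.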