I am importing data from Excel to a SQL table using a simple DTS package. At times the DTS fails because one of the columns in the Excel file may contain an invalid date/time entry. Sometimes the time will be an invalid negative number, which causes an overflow error during the import into the SQL table column.
Is there a way to capture the data before it is written to the table, validate it, and, if it is invalid (more specifically, a negative number), enter a default value or a null value instead?
If so, could you be specific about how to set up the DTS transformation script?
I'd probably just suck the data from the external source into a working table that had pure Unicode (NVARCHAR) character columns. Once it was there, you can "sanitize" it any way you need to using Transact-SQL.
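A minimal sketch of that approach. The table and column names here are just placeholders (assume a single date/time column called EventTime); note that Excel stores dates as numeric serial values, and its serial base differs from SQL Server's by two days, so adjust the conversion if you need exact dates:

```sql
-- Hypothetical staging table: every column is NVARCHAR, so the
-- Excel import itself can never fail on a bad value.
CREATE TABLE Staging_ExcelImport (
    RecordID  NVARCHAR(50),
    EventTime NVARCHAR(50)
)

-- Sanitize while copying to the real table: anything that is not a
-- valid date or a non-negative numeric serial becomes NULL.
INSERT INTO Target_Table (RecordID, EventTime)
SELECT RecordID,
       CASE
           WHEN ISDATE(EventTime) = 1
               THEN CONVERT(DATETIME, EventTime)
           WHEN ISNUMERIC(EventTime) = 1
                AND CONVERT(FLOAT, EventTime) >= 0
               THEN CONVERT(DATETIME, CONVERT(FLOAT, EventTime))
           ELSE NULL  -- invalid or negative: default to NULL
       END
FROM Staging_ExcelImport
```

The advantage of doing it this way is that all the validation logic lives in one Transact-SQL statement you can test and rerun on its own.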
Another option that saves on disk and keeps the package conceptually "atomic" would be to handle the exceptions within the DTS package itself. Instead of using a default "flow" transformation within the DTS column mapping, you could use script to do whatever validation suited your needs.
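A rough sketch of such an ActiveX Script transformation, assuming a source and destination column both named EventTime (substitute your actual column names from the mapping):

```vbscript
' Hypothetical DTS ActiveX transformation script.
Function Main()
    Dim vValue
    vValue = DTSSource("EventTime")

    If IsDate(vValue) Then
        ' A proper date/time value: pass it through.
        DTSDestination("EventTime") = CDate(vValue)
    ElseIf IsNumeric(vValue) And CDbl(vValue) >= 0 Then
        ' A non-negative serial value: let the destination convert it.
        DTSDestination("EventTime") = vValue
    Else
        ' Invalid or negative: write NULL (or a default) instead.
        DTSDestination("EventTime") = Null
    End If

    Main = DTSTransformStat_OK
End Function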
Hmmm, well it seems like you are both saying the same thing.
"raw table" and "working table" are they the same thing?
All fields are nvarchar, correct?
And use a column-to-column Copy Column transform, correct?