I have a DTS package that loops over the files in a directory, processing each file (that is, inserting rows into several tables of a SQL Server database). Either all files must be processed, or, if one file fails (with the error logged to a log table), all previously processed files must be 'deprocessed' (the changes made to the database must be rolled back). So some steps in the DTS package need transactions (those that change the database) while others must not (those that perform tests and just write to the log table).

My DTS package properties are configured as follows:

Advanced tab:
use transactions = checked, commit on successful package completion = checked, transaction isolation level = read committed, use OLE DB service components = checked, everything else unchecked

Logging tab:
log package execution to SQL Server = checked
fail package on first error = NOT checked
write completion status to event log = checked

---

The task flow designed in the package is this:

(listfiles)-->(get file)-->(datapump task to xfer file to table)-->(set phase)-->(test header)-->(test_A)-->(load A)-->(test B)-->(load B)-->...-->(test N)-->(load N)-->(loop phase)-->(loop file)

All names in brackets are tasks, while the arrows are 'on success' workflows; there is no 'on error' workflow.

The 'get file' task retrieves each file from a recordset stored in a global variable and filled by the prior task (listfiles). This task also performs a commit to save the changes made to the database after a file is processed successfully (with no errors, and thus no failed test task).

Workflow properties for the 'get file' task are:
join transaction if present=not checked
commit transaction on successful completion of this step= checked
rollback transaction on failure=not checked

The 'set phase' task is used to process first the Inserts and then the Updates, since each file is an incremental file.

The datapump task transfers the file into a table, so that the following tasks in the DTS package can use that table.

Each test task performs a test to determine whether the data must be loaded, and writes to the log table if it must not. If the test decides the data must be loaded, then the load task runs and the data is loaded into the database. So the test tasks must not join the DTS transaction (because the log table must not be rolled back), while the load tasks must join the DTS transaction, so that all changes made to the database are rolled back when a test task fails.

Workflow properties for all test tasks are:
join transaction if present=not checked
commit transaction on successful completion of this step=not checked
rollback transaction on failure=checked

Workflow properties for all load tasks are:
join transaction if present=checked
commit transaction on successful completion of this step=not checked
rollback transaction on failure=checked

The loop tasks just redirect the workflow to loop over phases and over the files in the UNC directory: 'loop phase' redirects the workflow to the 'set phase' task, while 'loop file' redirects it to 'get file'.

The problem: when all files are OK, the DTS package runs properly, but when a file is not OK (a test task fails, logging the error to the log table), the transaction is not rolled back, leaving the database in an inconsistent state.

My questions are:

1 - What is the reason for this behaviour (committing correctly but failing to roll back)?

2 - Because of the locks held by the transaction, I have to commit the changes after each file is processed, so I can only roll back the changes from the last file processed. But I'd like to commit or roll back the changes from all processed files, so that if I have processed 10 files OK and the 11th fails, it rolls back the changes of the 11th file and also the changes of all 10 previously processed files. Any idea how to do that?
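One possible direction (a sketch of an idea, not a DTS feature): tag every inserted row with a batch id, keep the per-file commits so locks are released, and 'deprocess' the whole batch with a compensating DELETE if any file fails. Sketched in Python with SQLite; the `batch_rows` table, its columns, and the failure check are hypothetical:

```python
import sqlite3

# Hypothetical batch-id approach: every row records which run inserted it,
# so a failure in file N can undo files 1..N-1 even though each file was
# committed separately.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE batch_rows (batch_id INTEGER, file TEXT)")
db.commit()

def load_batch(batch_id, files):
    """Commit after each file; on any failure, delete the whole batch."""
    try:
        for f in files:
            if "bad" in f:                     # stand-in for a failed test task
                raise ValueError(f)
            db.execute("INSERT INTO batch_rows VALUES (?, ?)", (batch_id, f))
            db.commit()                         # per-file commit releases locks
    except ValueError:
        # compensating delete: 'deprocess' every file of this batch
        db.execute("DELETE FROM batch_rows WHERE batch_id = ?", (batch_id,))
        db.commit()
```

The trade-off versus one long transaction is that rows of a partially loaded batch are briefly visible to other readers between the per-file commits and the compensating delete.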