I'm trying to import data from a text file into my database. Each line of the file must be inserted into one of 2 separate tables depending on characters in that line. Any ideas about how to do this without using a great deal of CPU would be appreciated.
If I understand correctly, you are trying to use one BULK INSERT command with a single file to load 2 tables at the same time, depending on some data in each line. As per my understanding, it may be possible if you have a trigger on the primary table which deletes rows from it and adds them to the secondary table (depending on the character you are looking for).
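A rough sketch of that trigger approach (table names, the RawLine column, and the leading flag character 'S' are all assumptions, not from your post). Note that BULK INSERT does not fire triggers unless you specify the FIRE_TRIGGERS option:

```sql
-- Sketch only: PrimaryTable, SecondaryTable, RawLine, and the 'S'
-- flag character are placeholders for your actual schema.
CREATE TRIGGER trg_RouteRows ON PrimaryTable
AFTER INSERT
AS
BEGIN
    -- Copy rows whose first character marks them as secondary data
    INSERT INTO SecondaryTable (RawLine)
    SELECT RawLine
    FROM inserted
    WHERE LEFT(RawLine, 1) = 'S';

    -- Remove those rows from the primary table again
    DELETE p
    FROM PrimaryTable AS p
    JOIN inserted AS i ON p.RawLine = i.RawLine
    WHERE LEFT(i.RawLine, 1) = 'S';
END
```

Then load with `BULK INSERT PrimaryTable FROM 'yourfile.txt' WITH (FIRE_TRIGGERS);`. Be aware the trigger runs row-matching work during the load, so it is not the cheapest option CPU-wise.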
Or you can insert all the rows into the primary table, use another BCP (or an INSERT ... SELECT) to load the matching rows into the second table, and later remove those rows from the first table.
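The two-step version might look like this (again, the table names, RawLine column, and 'S' flag character are assumptions):

```sql
-- Sketch: assumes PrimaryTable has already been loaded via BULK INSERT.
-- Step 1: copy the rows that belong in the secondary table.
INSERT INTO SecondaryTable (RawLine)
SELECT RawLine
FROM PrimaryTable
WHERE LEFT(RawLine, 1) = 'S';

-- Step 2: delete those rows from the primary table.
DELETE FROM PrimaryTable
WHERE LEFT(RawLine, 1) = 'S';
```

Doing it set-based like this after the bulk load is usually cheaper than per-row trigger logic during the load.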
Another problem I have is that the file contains 128-bit strings that I have to separate into columns of my tables. What would be the best way to get these substrings separated while inserting the data into a primary table? If I could do that, then I could just BCP what I need from the primary table over to my secondary one.
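One common pattern for this, if the positions within each line are fixed: bulk load each whole line into a one-column staging table, then split it apart with SUBSTRING. This is only a sketch; the table names, column names, file path, and the offsets/widths are all assumptions you would replace with your real layout:

```sql
-- Staging table holds each raw line in a single column.
CREATE TABLE Staging (RawLine CHAR(128));

-- Load the file as-is (one line per row).
BULK INSERT Staging FROM 'C:\data\import.txt';

-- Split fixed positions out into real columns.
-- The offsets (1,10 / 11,20 / 31,98) are illustrative only.
INSERT INTO PrimaryTable (ColA, ColB, ColC)
SELECT SUBSTRING(RawLine, 1, 10),
       SUBSTRING(RawLine, 11, 20),
       SUBSTRING(RawLine, 31, 98)
FROM Staging;
```

Alternatively, a BCP format file can map fixed-width fields straight to columns during the load itself, which avoids the staging step.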