We have to insert nearly 300K records into a single table, and minimizing the load time is of utmost importance.
Data for a few selected columns is supplied in a flat file with "~" as the field separator.
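For reference, one record of such a "~"-delimited file can be split into fields with a small helper like the one below. This is only a sketch: the field layout is illustrative and not from our actual schema.

```c
#include <string.h>

/* Split one "~"-delimited record in place; returns the field count.
   Empty fields (two adjacent separators) are preserved as "". */
static int split_record(char *line, char *fields[], int max_fields)
{
    int n = 0;
    char *p = line;

    while (n < max_fields) {
        fields[n++] = p;
        p = strchr(p, '~');
        if (p == NULL)
            break;
        *p++ = '\0';        /* terminate this field, step past the '~' */
    }
    return n;
}
```

For example, splitting the record "AC001~SMITH~~1500.00" yields four fields, the third of which is empty.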
We have two levels of tables:
1. Level 1 tables match the exact format of the flat file; there are no indexes or triggers on these tables.
Nomenclature: temp_collection_<table_name>
2. Level 2 tables are a superset of the columns provided in the flat file.
Nomenclature: <table_name>
Our current process is:
1. Using the BCP utility to load data from the flat file into temp_collection_<table_name>:
bcp Database.dbo.temp_collection_acctmast in <file_name> -U<username> -P<password> -S<server_name> -c -t~ -e <error_file_name>
2. Creating defaults for the columns of the level 2 tables that are not present in the level 1 tables.
3. Dropping the indexes and disabling the triggers on the level 2 tables.
4. Using DB-Library routines in a C program to BCP selected columns from the level 1 tables into the level 2 tables:
a) select * from temp_collection_<table_name>
b) dbbind the result columns into program variables
c) bcp_bind those same variables to the target table's columns
d) bcp_sendrow for each row
5. Recreating the indexes and re-enabling the triggers.
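In outline, the transfer loop of step 4 looks something like the following. This is a sketch only: error handling is omitted, and the table, column, and variable names are illustrative; bcp_init, bcp_bind, bcp_sendrow, bcp_batch, and bcp_done are the standard DB-Library bulk-copy routines.

```c
#include <sybfront.h>
#include <sybdb.h>

/* Sketch of the level 1 -> level 2 transfer (step 4).
   dbsrc reads from the staging table; dbbcp bulk-copies into
   the level 2 table.  All names here are illustrative. */
void transfer(DBPROCESS *dbsrc, DBPROCESS *dbbcp)
{
    char  acct_no[21];
    DBINT balance;
    long  rows = 0;

    /* a) select the staging rows */
    dbcmd(dbsrc, "select acct_no, balance from temp_collection_acctmast");
    dbsqlexec(dbsrc);
    dbresults(dbsrc);

    /* b) bind the result columns into program variables */
    dbbind(dbsrc, 1, NTBSTRINGBIND, sizeof(acct_no), (BYTE *)acct_no);
    dbbind(dbsrc, 2, INTBIND, 0, (BYTE *)&balance);

    /* c) bind the same variables to the target table's columns */
    bcp_init(dbbcp, "Database.dbo.acctmast", NULL, NULL, DB_IN);
    bcp_bind(dbbcp, (BYTE *)acct_no, 0, -1, (BYTE *)"", 1, SYBCHAR, 1);
    bcp_bind(dbbcp, (BYTE *)&balance, 0, -1, NULL, 0, SYBINT4, 2);

    /* d) send each row; batch periodically so that committed rows
       no longer pin the transaction log */
    while (dbnextrow(dbsrc) != NO_MORE_ROWS) {
        bcp_sendrow(dbbcp);
        if (++rows % 1000 == 0)
            bcp_batch(dbbcp);   /* commit the rows sent so far */
    }
    bcp_done(dbbcp);
}
```

Note that without the periodic bcp_batch call, all rows sent before bcp_done belong to a single open batch, so the log space they consume cannot be reclaimed until the whole transfer finishes.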
With this approach, the index segment of the database fills up very quickly, and the process eventually puts the transaction log into LOG SUSPEND mode. We can then neither run "dump transaction" nor kill the process that put the log into suspend mode.
We would like to know the following:
1. Is there any way we can increase the speed further?
2. Can we use views instead of tables when BCPing data in from the flat file? If yes, then