
Thread: Bulk insert

  1. #1
    Join Date
    Jul 2010
    Posts
    19

    Bulk Insert / Update / SELECT performance

    Hi All,

    We have SQL Server 2008 Enterprise Edition.
    HDD size: 1 TB
    Used space: 800 GB
    Database count: 25
    RAM: 4 GB

    We have large databases and heavy transaction volumes.
    How can we improve our server/database performance?
    For example:

    1. INSERT QUERY:
    Daily we bulk insert 18 GB of text files into SQL Server 2008.
    26 tables are loaded from 26 different text files; the total text file size is 18 GB.

    2. SELECT QUERY:
    Join two different tables. Each table contains (130,000,000 × 2) records.

    Creating the indexes before the SELECT queries takes a long time.
    The indexed columns are of varchar datatype.

    3. UPDATE QUERY:
    The indexes are dropped before the UPDATE query, but the update still takes a long time.

    The updated table count is 50,000 × 2 (joining two different tables).

    Thanks in advance for your full support.

    Thanks
    G Arunagiri

  2. #2
    Join Date
    Feb 2004
    Location
    One Flump in One Place
    Posts
    14,912
    Please create a new thread for new questions.

    That is not enough RAM, to my mind, for these volumes of data and this size of database. Is this 64-bit?

    The fastest bulk insert would be BCP with the TABLOCK hint, into either heaps or a clustered index ordered the same as the data in the text files, with the ORDER clause specified. Disable all nonclustered indexes before the load and rebuild them afterwards. Switching recovery to SIMPLE or BULK_LOGGED would, together with TABLOCK, ensure minimal logging.
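    A minimal T-SQL sketch of that load pattern. The database [MyDb], table dbo.MyTable, clustered key column KeyCol, nonclustered index IX_MyTable_Col, and the tab delimiter are all placeholder assumptions, not names from this thread; BULK INSERT is used here as the T-SQL counterpart of BCP and accepts the same TABLOCK and ORDER hints:

    ```sql
    -- Placeholder names throughout: [MyDb], dbo.MyTable, KeyCol, IX_MyTable_Col.
    ALTER DATABASE [MyDb] SET RECOVERY BULK_LOGGED;      -- minimal logging (with TABLOCK)
    ALTER INDEX IX_MyTable_Col ON dbo.MyTable DISABLE;   -- disable nonclustered indexes first

    BULK INSERT dbo.MyTable
    FROM 'G:\Raw_Data\TEXTFILE\UNZIP\FILE1.txt'
    WITH (
        TABLOCK,                    -- table lock -> minimally logged load
        ORDER (KeyCol ASC),         -- file is pre-sorted by the clustered key
        FIELDTERMINATOR = '\t',     -- assumed tab-delimited file
        ROWTERMINATOR   = '\n'
    );

    ALTER INDEX IX_MyTable_Col ON dbo.MyTable REBUILD;   -- rebuild after the load
    ALTER DATABASE [MyDb] SET RECOVERY FULL;
    ```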

    It is not correct to assume indexes will slow down updates - it depends on the indexed columns and which columns are updated.

    Other than that, I am not really sure what you are asking in questions 2 & 3.
    Testimonial:
    pootle flump
    ur codings are working excelent.

  3. #3
    Join Date
    Jul 2003
    Location
    San Antonio, TX
    Posts
    3,662
    SIMPLE recovery mode is the same as FULL while BCP is executing. The only difference is that under SIMPLE the committed transaction containing the loaded data will be cleared from the log at the first checkpoint, while under FULL it will stay in the log until a transaction log backup runs. So, to avoid unnecessary growth of the log during a bulk load operation, you need to control how many records comprise each transaction to be committed, so that a checkpoint can remove it from the log before the next batch saturates the log.
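    In T-SQL terms, the per-transaction record count is controlled with the BATCHSIZE option. A sketch only; the table name and the 50,000-row figure are illustrative assumptions, not recommendations:

    ```sql
    -- Each 50,000-row batch commits as its own transaction, so under SIMPLE
    -- recovery a checkpoint can clear it from the log before the next batch.
    BULK INSERT dbo.MyTable          -- placeholder table name
    FROM 'G:\Raw_Data\TEXTFILE\UNZIP\FILE1.txt'
    WITH (BATCHSIZE = 50000, TABLOCK);
    ```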
    "The data in a record depends on the Key to the record, the Whole Key, and
    nothing but the Key, so help me Codd."

  4. #4
    Join Date
    Feb 2004
    Location
    One Flump in One Place
    Posts
    14,912
    Robert - SIMPLE is the same as BULK LOGGED if TABLOCK is specified. Otherwise yes, you are correct.

  6. #6
    Join Date
    Jul 2010
    Posts
    19

    Best Query for Bulk Insert

    Dear All,

    Daily we insert 30 GB of text files into SQL Server 2008 tables.
    The 30 GB comprises 40 text files, each inserted into a separate table.

    Which of the following queries is best for the bulk insert, or is there a better alternative:

    bulk insert [dbname].dbo.[tablename]
    from 'G:\Raw_Data\TEXTFILE\UNZIP\FILE1.txt'
    with (fieldterminator = ' ', rowterminator = '\n', firstrow = 2, maxerrors = 100)


    Thanks in advance
    G Arunagiri
    Last edited by gkarung; 07-27-10 at 03:11.

  7. #7
    Join Date
    Jul 2003
    Location
    San Antonio, TX
    Posts
    3,662
    If you specify MAXERRORS, then you allow up to 100 rows to fail before the BULK operation itself fails. You should also specify ERRORFILE='<whatever_file_you_want_to_save_failed_records_to>'. Also, with such a great volume of data you want to control the rows per batch (BATCHSIZE), along with maximizing throughput by using TABLOCK and setting the database to either SIMPLE or BULK_LOGGED recovery mode.
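    For instance, the statement from post #6 with those options added might look like this (the error-file path and the batch size are illustrative assumptions, not recommendations):

    ```sql
    BULK INSERT [dbname].dbo.[tablename]
    FROM 'G:\Raw_Data\TEXTFILE\UNZIP\FILE1.txt'
    WITH (
        FIELDTERMINATOR = ' ',
        ROWTERMINATOR   = '\n',
        FIRSTROW        = 2,
        MAXERRORS       = 100,
        ERRORFILE       = 'G:\Raw_Data\errors\FILE1.err',  -- assumed path; failed rows land here
        BATCHSIZE       = 100000,                          -- illustrative batch size
        TABLOCK                                            -- table lock for throughput
    );
    ```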

  8. #8
    Join Date
    Feb 2004
    Location
    One Flump in One Place
    Posts
    14,912
    Threads merged.

  9. #9
    Join Date
    Jul 2003
    Location
    San Antonio, TX
    Posts
    3,662
    No wonder it rang the bell...

  10. #10
    Join Date
    Jul 2010
    Posts
    19

    Remove Unallocated space & Server level audit

    Dear All,
    1. How can I quickly remove unallocated space from a SQL Server database?
    2. How can I audit, at the server level, when a table is dropped in SQL Server?

    Thanks in advance
    G Arunagiri

  11. #11
    Join Date
    Feb 2004
    Location
    One Flump in One Place
    Posts
    14,912
    gkarung:
    If you have a question related to a question you have asked previously then use the original thread.
    If you have a question that is not related to a question you have asked previously then create a new thread.

    This is a new, unrelated question so it should be in a new thread.
