  1. #1
    Join Date
    Feb 2006
    Posts
    1

    Unanswered: Cursors vs. Temp Tables

    Hi,
    Can someone please list the pros and cons of cursors versus temp tables? I have come across a few people who say that cursors cause lock contention (yes, I know they do), so we shouldn't use them and should instead write procedures in a way that avoids cursors. They advocate the use of temp tables instead. I know cursors cause lock contention, and that the set-level operations temp tables make possible generally perform better, but I would be thankful if the group could shed some more light on this, or alternatively point me to some reading material on the topic.

    Thanks
    AP

  2. #2
    Join Date
    Jun 2004
    Location
    Paris, France
    Posts
    43
    If you don't NEED to do row-by-row operations, it's better to use temp tables, purely for performance.
    A quick example: suppose you have n rows to update.
    - Using a cursor, you perform n operations: you send the request n times, do the processing n times, and receive the response n times. You use a lot of I/O (Sybase's own pages, disk, AND network round trips).
    - Using a temp table, the query may be harder to write and optimize, but it is sent once, the processing is done once (even if it takes several statements, each one handles a batch of rows), and the result is returned once.

    Hope this answers your question.
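    The contrast above can be sketched roughly like this (table and column names here are hypothetical, just to show the shape of each approach):

    ```sql
    -- Cursor approach: n fetches, n single-row updates, n round trips
    declare price_cur cursor for
        select item_id from price_changes
        for read only
    go
    declare @id int
    open price_cur
    fetch price_cur into @id
    while @@sqlstatus = 0
    begin
        update products set price = price * 1.10
        where item_id = @id
        fetch price_cur into @id
    end
    close price_cur
    deallocate cursor price_cur
    go

    -- Set-based approach: one statement, one round trip, same result
    update products
    set    price = price * 1.10
    from   products, price_changes
    where  products.item_id = price_changes.item_id
    go
    ```

    The optimizer can process the single UPDATE as a join, which is where the I/O savings come from.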

  3. #3
    Join Date
    Feb 2003
    Location
    Netherlands
    Posts
    63
    Whenever you can do something in bulk it's always better; temp tables are usually the best tool for that.

    Another option you have that doesn't have the same effects as cursors is the following:

    Code:
    drop table #TMP_TEST   -- errors if the table does not exist yet; ignore on the first run
    go
    
    declare @idx numeric(15)
    declare @custname varchar(250)
    
    -- build the work table, numbering each row with an identity column
    select CU_NAME, TMP_IDX=identity(15)
    into #TMP_TEST
    from F_CUSTOMERS 
    where CU_NAME like 'A%'
    
    -- walk the rows in TMP_IDX order, without a cursor
    select @idx=isnull(min(TMP_IDX),0) from #TMP_TEST
    while (@idx<>0) begin
      select @custname=CU_NAME from #TMP_TEST where TMP_IDX=@idx
    
      print "Selected customer %1!", @custname
    
      select @idx=isnull(min(TMP_IDX),0) from #TMP_TEST where TMP_IDX>@idx
    end
    What I do here is build a temporary table with the result of my query and add an identity column to the result. Then I loop through the unique values of the identity column. If you have large result sets you may want to add an index on the identity column.
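    For instance, a unique clustered index on the identity column keeps each min() lookup in the loop cheap (the index name here is arbitrary):

    ```sql
    create unique clustered index idx_tmp_test on #TMP_TEST (TMP_IDX)
    ```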

    Of course the downside (or plus side) of this approach is that the source data doesn't get locked, so other users may be changing it while you are looping through your result set. But that isn't always a problem.
    Greetz,

    Bastiaan Olij

  4. #4
    Join Date
    May 2005
    Location
    South Africa
    Posts
    1,365
    Provided Answers: 1
    "do something in bulk its always better"
    Yes, but the while loop posted is row-by-row processing and will perform more slowly than a cursor.

  5. #5
    Join Date
    Feb 2003
    Location
    Netherlands
    Posts
    63
    A cursor is also row by row. But indeed, the loop may have slightly more overhead.

    And my example was not one of bulk processing, but just an alternative to cursors for the situation where you can't use a bulk approach and cursors are giving you contention problems.
    Greetz,

    Bastiaan Olij

  6. #6
    Join Date
    May 2004
    Posts
    35
    Just make sure your tempdb is big enough to hold the #temp table...

    If your temp table will become big and you have ample resources, you might want to create a separate tempdb for those users, and allocate its space from memory rather than from disk... vhrooom!

    IMHO, the fastest update is the one that uses neither cursors nor temp tables, so that's your point of departure.
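    For reference, on Sybase ASE 15+ the "separate tempdb" idea looks roughly like this (device, cache, size, and login names below are all hypothetical, and exact syntax varies by version, so check your manuals):

    ```sql
    /* create a separate temporary database on its own devices */
    create temporary database tempdb_batch
        on tempdev2 = 512          -- 512 MB data
        log on templogdev2 = 128   -- 128 MB log
    go

    /* give it a dedicated named cache so its pages stay in memory */
    sp_cacheconfig 'tempdb_cache', '256M'
    go
    sp_bindcache 'tempdb_cache', tempdb_batch
    go

    /* route a specific login's #temp tables to the new tempdb */
    sp_tempdb 'bind', 'lg', 'batch_user', 'DB', 'tempdb_batch'
    go
    ```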

  7. #7
    Join Date
    Feb 2003
    Location
    Netherlands
    Posts
    63
    Quote Originally Posted by Jonga
    Just make sure your tempdb is big enough to hold the #temp table...

    If your temp table will become big and you have ample resources, you might want to create a separate tempdb for those users, and allocate its space from memory rather than from disk... vhrooom!

    IMHO, the fastest update is the one that uses neither cursors nor temp tables, so that's your point of departure.
    Very true. We actually have large tempdbs, not because of this technique but because we often handle large amounts of data and need to store intermediate results for complex calculations. I've often found that to be faster than updating the target table directly, simply because it breaks a complex update into a number of simpler inserts and/or updates.

    The while loop I personally only use when I need to return data in a specifically formatted way, say multi-level hierarchical data that I want ordered correctly. I don't like cursors for some reason, although I could do the same thing with them.
    Greetz,

    Bastiaan Olij

  8. #8
    Join Date
    Aug 2009
    Posts
    1

    Similar question

    Hello all,

    An engineer converted the original query below into a stored procedure with cursors.

    Question: Would it have been better to just use a temp table only? The output is usually only 3-4k rows each night.

    Original Query:

    select product_id as "GAME",
           serial_number as "PACK",
           begin_piece as "STARTING SERIAL",
           end_piece as "ENDING SERIAL",
           location_id as "LOCATION",
           ips_text_status.status_description as "STATUS"
    from   inv_serial_tracking, ips_text_status
    where  location_type = 2  /* FIELDREP = 2 */
    and    product_id < 900
    and    ips_text_status.product_type = 1
    and    ips_text_status.gtms_product_id = 59
    and    ips_text_status.inv_status_id = inv_serial_tracking.inv_status_id
    Converted to:

    /*--------------------------------------------------------------------------------------*/
    /* populating the x_temp_salesrep_audit table with the data */
    /*--------------------------------------------------------------------------------------*/

    CREATE proc proc_build_salesrep_audit
    as
    BEGIN
        DECLARE @gtms_product_id int,
                @product_type int,
                @location_type int,
                @product_id int,
                @game int,
                @pack int,
                @begin_piece int,
                @end_piece int,
                @location int,
                @description varchar(30),
                @rec_num int

        select @gtms_product_id = 59,
               @product_type = 1,
               @location_type = 2,
               @product_id = 900

        DECLARE cur_res CURSOR for
            SELECT product_id, serial_number, begin_piece, end_piece, location_id,
                   ips_text_status.status_description
            FROM inv_serial_tracking, ips_text_status
            WHERE location_type = @location_type and
                  product_id < @product_id and
                  ips_text_status.product_type = @product_type and
                  ips_text_status.gtms_product_id = @gtms_product_id and
                  ips_text_status.inv_status_id = inv_serial_tracking.inv_status_id
            ORDER BY product_id, serial_number asc
            for read only

        open cur_res
        fetch cur_res into @game, @pack, @begin_piece, @end_piece, @location, @description

        select @rec_num = 1

        while @@sqlstatus = 0
        BEGIN
            insert into x_temp_salesrep_audit
            values (@rec_num, @game, @pack, @begin_piece, @end_piece, @location, @description)

            select @rec_num = @rec_num + 1

            fetch cur_res into @game, @pack, @begin_piece, @end_piece, @location, @description
        END

        close cur_res   /* the original omitted this; close before deallocating */
        deallocate cursor cur_res
    END

    go

    go
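    For comparison, a cursor-free version of that conversion might use the identity() technique shown earlier in this thread: a single select-into numbers the rows, replacing the whole fetch loop. This is only a sketch; it assumes x_temp_salesrep_audit has the same column order, and note that whether the identity values follow the ORDER BY is version-dependent, so verify on your server.

    ```sql
    /* one set-based statement replaces the entire cursor loop;
       identity(10) generates the running record number */
    select rec_num = identity(10),
           product_id, serial_number, begin_piece, end_piece, location_id,
           ips_text_status.status_description
    into   #salesrep_audit
    from   inv_serial_tracking, ips_text_status
    where  location_type = 2
    and    product_id < 900
    and    ips_text_status.product_type = 1
    and    ips_text_status.gtms_product_id = 59
    and    ips_text_status.inv_status_id = inv_serial_tracking.inv_status_id
    order by product_id, serial_number

    insert into x_temp_salesrep_audit
    select rec_num, product_id, serial_number, begin_piece, end_piece,
           location_id, status_description
    from   #salesrep_audit
    ```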

  9. #9
    Join Date
    Mar 2009
    Location
    Sydney, Australia
    Posts
    258
    It is always better to query (select as well as update) the tables directly, period.

    Using a cursor "instead" is very slow (turns the set-oriented engine on its head and forces it to do row-by-row processing) and uses an additional class of locks; if the lock contention was bad, it will be worse with cursors.

    Using a while loop instead of a cursor (and there are better constructs than the examples here) is better than a cursor, but still nowhere near the speed of accessing the table directly.

    Using temp tables is an abomination of a different sort. It creates and uses proportionately large, duplicate tables in tempdb, with all the attendant tempdb administration problems. Most of the time it is quite unnecessary: it is a quick fix to get around a problem, or simply a sign that the coder could not get their head around the original tables and joins. The overuse of tempdb is becoming more and more common, and is now a problem in itself.

    Point is, the question is not "cursor or temp tables", they are two different evils; the focus and goal should always be "use the original tables directly". Now if you cannot do that, or have problems doing that, post exact problem, and we can solve it. Without using either cursors or tempdb.

    Poclovis

    Your problem is different again, and no, it is not a "conversion": the second code segment populates a temp table very badly, and the first segment has no temp table at all. Can you post a new, separate thread with more complete code segments? Otherwise this thread will get complicated, with people answering two separate problems.
    Last edited by Derek Asirvadem; 08-31-09 at 21:18.
    Regards
    Derek Asirvadem (Formerly DerekA)
    Information Architect / Senior Sybase DBA
    Copyright 2009 Software Gems Pty Ltd

    I answer questions from the Original Poster only. If you have a genuine question, as the moderators have requested, start a new thread.

    http://www.softwaregems.com.au
