  1. #1
    Join Date
    Sep 2002
    Posts
    11

    Unanswered: Inserting data from multiple rows into one?

    I need to write a query to summarize transaction detail. For example, I want to build a summary table of all the winners in a Bingo game. Obviously, more than one person can win at Bingo, so I have this detail table with GamePlayID and UserID:

    GamePlayID  UserID
    ----------  ------
    333         12
    333         45
    333         56
    334         44
    335         65

    I need to get all the UserIDs who won each game and list them like this:

    GamePlayID  Winners
    ----------  ----------
    333         12, 45, 56
    334         44
    335         65


    I tried a self-join to get all the distinct winners, but the problem is that the number of winners per game varies.

    Any suggestions will be greatly appreciated.
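
    For reference, here is a minimal sketch of the detail table above that the suggestions below can be run against (the table name GameWinners is made up for the example):

    -- Hypothetical sample table matching the data above (the name GameWinners is an assumption)
    CREATE TABLE GameWinners
    (
        GamePlayID int,
        UserID     int
    )
    go
    INSERT INTO GameWinners VALUES (333, 12)
    INSERT INTO GameWinners VALUES (333, 45)
    INSERT INTO GameWinners VALUES (333, 56)
    INSERT INTO GameWinners VALUES (334, 44)
    INSERT INTO GameWinners VALUES (335, 65)
    go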

  2. #2
    Join Date
    Apr 2002
    Location
    Toronto, Canada
    Posts
    20,002

  3. #3
    Join Date
    Aug 2002
    Posts
    20
    Hi Iamcrom!

    My suggestion would be a cursor. I made an example you can check out:
    -- Test database
    CREATE DATABASE TestRdb
    go
    USE TestRdb
    go

    -- Detail table: c1 is the group (e.g. GamePlayID), c2 is the value to list (e.g. UserID)
    CREATE TABLE t2
    (
        c1 int,
        c2 int
    )
    go
    INSERT INTO t2 VALUES (1, 101)
    INSERT INTO t2 VALUES (1, 102)
    INSERT INTO t2 VALUES (1, 103)
    INSERT INTO t2 VALUES (2, 104)
    INSERT INTO t2 VALUES (3, 105)
    INSERT INTO t2 VALUES (2, 106)
    INSERT INTO t2 VALUES (2, 107)
    INSERT INTO t2 VALUES (1, 108)
    go

    -- Result table: one row per group, with all of its values in a single string
    CREATE TABLE #Res
    (
        c1 int,
        c2 varchar(100)
    )

    DECLARE @Int int
    DECLARE @String1 varchar(100)
    DECLARE @String2 varchar(100)
    SET @String2 = ' '

    -- Outer cursor: one pass per distinct group value
    DECLARE C CURSOR FOR SELECT DISTINCT c1 FROM t2
    OPEN C
    FETCH NEXT FROM C INTO @Int
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Inner cursor: all values belonging to the current group
        DECLARE C1 CURSOR FOR SELECT CONVERT(varchar(3), c2) FROM t2 WHERE c1 = @Int
        OPEN C1
        FETCH NEXT FROM C1 INTO @String1
        WHILE @@FETCH_STATUS = 0
        BEGIN
            -- Prepend each value, so the list comes out in reverse fetch order
            SELECT @String2 = @String1 + ',' + @String2
            PRINT @String2
            FETCH NEXT FROM C1 INTO @String1
        END
        INSERT INTO #Res VALUES (@Int, @String2)
        SET @String2 = ' '
        CLOSE C1
        DEALLOCATE C1
        FETCH NEXT FROM C INTO @Int
    END
    CLOSE C
    DEALLOCATE C

    SELECT * FROM #Res
    DROP TABLE #Res

    Hopefully this will help you
    /Mada

  4. #4
    Join Date
    Sep 2002
    Posts
    11
    Originally posted by Mada
    [cursor example quoted in full; see post #3 above]

    Thank you, Mada!!!

    It actually works perfectly! Many thanks to you!!!

    However, as usual, the performance is not very good. Going through a table with 3,500 entries takes about 90 to 120 seconds, so turning this into a stored procedure is problematic at this point.

    I had hoped the performance would not be that low. I'll try to limit the entries to be updated and see what happens.

    Thank you again, Mada!!!
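
    One possible lower-overhead alternative, sketched against the hypothetical GameWinners table from the first post: on SQL Server 2000 a scalar UDF can build each list through variable concatenation, avoiding the nested cursors. Note that concatenating across rows with SELECT @var = @var + ... is a common trick, but the row order it produces is not formally guaranteed.

    -- Sketch only: builds the comma-separated winner list for one game.
    -- GameWinners and fnWinnerList are hypothetical names.
    CREATE FUNCTION dbo.fnWinnerList (@GamePlayID int)
    RETURNS varchar(200)
    AS
    BEGIN
        DECLARE @List varchar(200)

        -- COALESCE keeps the first value from getting a leading separator
        SELECT @List = COALESCE(@List + ', ', '') + CONVERT(varchar(10), UserID)
        FROM GameWinners
        WHERE GamePlayID = @GamePlayID

        RETURN @List
    END
    go

    -- One row per game, with all of its winners in a single column
    SELECT GamePlayID, dbo.fnWinnerList(GamePlayID) AS Winners
    FROM GameWinners
    GROUP BY GamePlayID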

  5. #5
    Join Date
    Aug 2002
    Posts
    20
    Hi Iamcrom!

    90 to 120 seconds for 3,500 rows does sound like a lot…

    Perhaps you can add date variables to your query to narrow down the result set.

    Or see if you can add an index to your table. Create a test table, put SET STATISTICS IO ON at the start of your query and SET STATISTICS IO OFF at the end, and then start testing the effect of adding an index (a rough sketch follows below)…

    /Mada
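
    To make the indexing suggestion concrete, a rough sketch against the hypothetical GameWinners table from earlier (the index name is made up):

    -- Show logical reads for the per-game lookup before and after adding the index
    SET STATISTICS IO ON

    SELECT UserID
    FROM GameWinners
    WHERE GamePlayID = 333

    SET STATISTICS IO OFF

    -- Hypothetical index on the grouping column, so each lookup touches only
    -- the matching rows instead of scanning the whole table
    CREATE INDEX IX_GameWinners_GamePlayID ON GameWinners (GamePlayID)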
