  1. #1
    Join Date
    Jan 2004
    Posts
    70

    Unanswered: Joining a table with a file on a schema with no tempspace

    Is there any clever way to do this? Like use a cursor to cycle through each row of the file (<100 rows), join the row with the table, insert that one row into the output table, repeat?

    Because of the low volume, it was suggested that we just update the file manually, but I really don't like that idea. And if there were code to do this, it would be reusable, since stuff like this is done fairly often.

  2. #2
    Join Date
    Aug 2001
    Location
    UK
    Posts
    4,650
    Sorry, please explain "with no tempspace" in the subject line!

    You might consider using a C/Java table function to read from or write to the file ...

    Sathyaram
    Visit the new-look IDUG Website, register to gain access to the excellent content.

  3. #3
    Join Date
    Jan 2004
    Posts
    70
    I can't use any temp tables, and there is no longer a workspace/delete schema where this has to be run. And it has to be a shell script running SQL :\

  4. #4
    Join Date
    Aug 2001
    Location
    UK
    Posts
    4,650
    I'm afraid I'm a bit lost (or is it just me - not yet recovered from the Easter break)

    Can you give an example, please?

    Sathyaram

  5. #5
    Join Date
    Jan 2004
    Posts
    70
    Sorry for my poor description. I'm a noob at DB stuff.

    I don't have any examples because it's done manually right now. We are given a CSV file that has 100 rows. Someone then takes those 100 rows and manually writes 100 INSERT statements, each of which takes a column from another table, joined by a key that is included in the CSV file. The inserts are put into a sh file, and that is executed on a Unix box.

    What I wanted to do is have a sh file that reads the CSV file row by row and builds the corresponding INSERT statement in a loop.

    So the shell file would do this

    1. Take the input params
    2. Make the DB2 connection
    3. Start loop
    4. Read the next row in the CSV file
    5. Get Col from the table based on the key Val from the CSV row
    6. Insert all the data
    7. Loop next
    8. Commit
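    Something like the steps above could be sketched as follows. This is only a sketch: TARGET_TABLE, REF_TABLE, and the column names are hypothetical stand-ins, and the lookup against the other table is pushed into the generated SQL itself (INSERT ... SELECT), so no temp space is needed at generation time.

    ```shell
    #!/bin/sh
    # Sketch only -- TARGET_TABLE, REF_TABLE, and the column names below
    # are made-up placeholders; substitute the real ones.

    CSV_FILE=input.csv
    OUT_SQL=inserts.sql

    # Sample two-column file (key,value) standing in for the real CSV.
    printf 'K100,hello\nK200,world\n' > "$CSV_FILE"

    : > "$OUT_SQL"                       # truncate the output file
    while IFS=, read -r key val; do
        [ -z "$key" ] && continue        # skip blank lines
        # One INSERT per CSV row; the join key lookup is done in SQL.
        cat >> "$OUT_SQL" <<EOF
INSERT INTO TARGET_TABLE (KEY_COL, VAL_COL, LOOKUP_COL)
  SELECT '$key', '$val', R.SOME_COL
  FROM REF_TABLE R
  WHERE R.KEY_COL = '$key';
EOF
    done < "$CSV_FILE"
    echo "COMMIT;" >> "$OUT_SQL"

    # The generated file could then be run with something like:
    #   db2 connect to <db>
    #   db2 -tvf inserts.sql
    ```

    Note this naive quoting breaks if the CSV values themselves contain single quotes or commas; for a 100-row file that is usually easy to check by eye first.
    
    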



    Sorry for the pseudo code. My laptop just died so I'm typing on my BB

    Maybe I should be posting in the shell scripting forum...

  6. #6
    Join Date
    Aug 2001
    Location
    UK
    Posts
    4,650
    You can do this operation in one of three ways:

    a) Use a shell script as you have mentioned ... You can also try reading all the data from the file and folding the logic you plan to run 100 times (i.e., once per row in the input file) into one single statement. Of course, whether this 'merge' works depends on the complexity of the logic.

    b) You can create a declared global temporary table (DGTT), use a shell script to read the file contents and insert them into the DGTT, and then join it with the other 'permanent' table to produce the new rows.
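    A rough sketch of option (b), with placeholder table and column names. One caveat: in DB2 a DGTT needs a USER TEMPORARY tablespace to exist, which may be exactly the "no tempspace" constraint mentioned in the subject line.

    ```sql
    -- Sketch; table/column names are placeholders.
    DECLARE GLOBAL TEMPORARY TABLE SESSION.CSV_ROWS
      (KEY_COL VARCHAR(20), VAL_COL VARCHAR(100))
      ON COMMIT PRESERVE ROWS NOT LOGGED;

    -- Load the CSV rows (e.g. per-row INSERTs from the shell script),
    -- then join once:
    INSERT INTO TARGET_TABLE (KEY_COL, VAL_COL, LOOKUP_COL)
      SELECT S.KEY_COL, S.VAL_COL, R.SOME_COL
      FROM SESSION.CSV_ROWS S
      JOIN REF_TABLE R ON R.KEY_COL = S.KEY_COL;
    ```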

    c) You can create a table function in C or Java that reads the file and presents it to SQL as a table: the function reads the records from the file and builds a result set from them, and you can use it in SQL as if you were operating on a physical table ...
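    To illustrate option (c): once such an external table function is registered (READ_CSV here is a made-up name, not a built-in), it can be queried like any table:

    ```sql
    -- Hypothetical: READ_CSV is a user-defined external table function
    -- (written in C or Java) that was registered beforehand.
    SELECT T.KEY_COL, T.VAL_COL, R.SOME_COL
    FROM TABLE(READ_CSV('/path/to/file.csv')) AS T
    JOIN REF_TABLE R ON R.KEY_COL = T.KEY_COL;
    ```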

    HTH

    Sathyaram

  7. #7
    Join Date
    Jan 2004
    Posts
    70
    I have to do option A.

    I just have to figure out how to do all the parsing and looping in a shell script.

    So I posted over in the shell scripting forum:

    http://www.dbforums.com/showthread.php?t=1215711
