  1. #1
    Join Date
    Feb 2002
    Posts
    11

    Unanswered: Passing long text strings to a stored procedure

    Hello!

    I am attempting to insert a group of records into a SQL Server 2000 database table. The data for each of these records is the same, with the exception of a foreign key (hereafter known as the 'RepKey') and the generated primary key. To improve performance and cut down on network traffic, I pack the RepKeys into a comma-delimited string and send it as a single parameter, with the intention of parsing it in the stored procedure to obtain each individual RepKey, or of using it as a list in a 'WHERE RepKey IN (' + @RepKeyString + ')' type of query.

    My problem is that there may be thousands of items in this string. A varchar(8000) parameter is too short, while the 'text' data type does not allow me to perform any string operations on it. Any ideas on how to make one network call that inserts multiple records and breaks the 8000-character barrier?

    One thing that I cannot do is add a table so that the record is stored once, then mapped to each individual rep key. The database structure cannot change. Other solutions that I may not have considered are welcome. Thanks!
    Everett

  2. #2
    Join Date
    Oct 2001
    Location
    England
    Posts
    426
    Can you add a global temp table?
    You could write all the data to a disk file and then bcp it into the global table.
    You could also create a recordset on the global temp table, disconnect it, populate it, then reconnect it to commit the records, and work from that.

    You could use a text data type, then use SUBSTRING to parse it in chunks and use character functions on each chunk.
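
    On that last point, SUBSTRING and DATALENGTH are among the few string functions that do accept a text argument on SQL Server 2000, so the list can be walked in varchar(8000) chunks. A minimal sketch (the procedure name is hypothetical):

    ```sql
    -- Hypothetical sketch: read a text parameter in varchar(8000) chunks.
    -- SUBSTRING and DATALENGTH accept text even though most string
    -- functions (CHARINDEX, LEN, + concatenation) do not.
    CREATE PROCEDURE usp_ChunkRepKeys (@RepKeys text) AS
    BEGIN
        DECLARE @pos int, @chunk varchar(8000)
        SET @pos = 1
        WHILE @pos <= DATALENGTH(@RepKeys)
        BEGIN
            SET @chunk = SUBSTRING(@RepKeys, @pos, 8000)
            -- @chunk is plain varchar now, so CHARINDEX/LEFT work on it;
            -- a real version would trim each chunk back to its last comma
            -- so that no key is split across a chunk boundary
            SET @pos = @pos + 8000
        END
    END
    ```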

  3. #3
    Join Date
    Dec 2001
    Location
    Toronto, Canada
    Posts
    335
    Nigelrivett's idea of a text file sounds interesting. What if the parameter used in your stored procedure were the path to a file containing the list of RepKeys? Once in your procedure, you use BULK INSERT into a temporary table (a global table if needed, as nigelrivett suggested), then perform the same looping as you would have done before.

    sp_MyProc (@RepKeyFile varchar(50), .....)

    CREATE TABLE #temptable ....
    BULK INSERT #temptable FROM @RepKeyFile
    .
    .
    .
    DECLARE CURSOR on #temptable
    loop through

    The only problem is your point on:
    or to use it as a list in an 'WHERE RepKey IN (' + @RepKeyString + ')' type of query.
    I thought that you could create a local text variable and, while looping, append each RepKey to it: SET @txt = @txt + ',' + @RepKey. However, I got an error when trying to declare a local variable of type text.
    Msg 2739, Level 16, State 1, Server ATLAS, Line 1
    The text, ntext, and image data types are invalid for local variables.
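
    A fuller sketch of the file-based approach, with two caveats it has to work around: BULK INSERT will not accept a variable for the file name, so the statement must be built dynamically, and a temp table created in the procedure remains visible inside the EXEC. (Procedure and table names are hypothetical; this assumes one key per line in the file.)

    ```sql
    -- Hypothetical sketch: RepKeys arrive one per line in a text file;
    -- the procedure bulk-loads them into a temp table, so no string
    -- parsing is needed at all.
    CREATE PROCEDURE usp_LoadRepKeys (@RepKeyFile varchar(255)) AS
    BEGIN
        CREATE TABLE #RepKeys (RepKey int)

        -- BULK INSERT cannot take a variable for the file name,
        -- so build the statement dynamically
        EXEC('BULK INSERT #RepKeys FROM ''' + @RepKeyFile
             + ''' WITH (ROWTERMINATOR = ''\n'')')

        -- the keys are now ordinary rows: a set-based join can replace
        -- the cursor loop entirely
        INSERT INTO RepContact (RepKey /* , other columns */)
        SELECT r.RepKey /* , other values */
        FROM Rep r
        JOIN #RepKeys k ON k.RepKey = r.RepKey
    END
    ```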

  4. #4
    Join Date
    Feb 2002
    Posts
    11
    Thanks guys - I ended up using the SUBSTRING approach to break off chunks, then used an INSERT..SELECT statement that looks like the following:

    SET @query = 'INSERT INTO RepContact([fields]) '
    SET @query = @query + 'SELECT RepKey, [@vars] FROM Rep WHERE RepKey IN (' + @currentString + ')'
    EXEC(@query)

    @currentString is the current list of keys. Each time through the loop, as long as there are still items, the query is run.
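
    For reference, a fuller sketch of that loop, assuming the keys arrive in a text parameter @RepKeys and that no single key approaches 8000 characters (variable names other than @currentString and @query are hypothetical):

    ```sql
    DECLARE @pos int, @len int, @currentString varchar(8000), @query varchar(8000)
    SET @pos = 1
    SET @len = DATALENGTH(@RepKeys)   -- @RepKeys is the text parameter

    WHILE @pos <= @len
    BEGIN
        SET @currentString = SUBSTRING(@RepKeys, @pos, 8000)
        IF @pos + 8000 <= @len
        BEGIN
            -- trim back to the last comma so no key is split across chunks
            SET @currentString = LEFT(@currentString,
                    8000 - CHARINDEX(',', REVERSE(@currentString)))
            SET @pos = @pos + LEN(@currentString) + 1  -- skip past the comma
        END
        ELSE
            SET @pos = @pos + 8000

        SET @query = 'INSERT INTO RepContact([fields]) '
                   + 'SELECT RepKey, [@vars] FROM Rep '
                   + 'WHERE RepKey IN (' + @currentString + ')'
        EXEC(@query)
    END
    ```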

    Thanks again - if anyone has any ideas on speeding this up, it would be greatly appreciated. (The insert runs a bit slower than I would have hoped).

    Everett

  5. #5
    Join Date
    Oct 2001
    Location
    England
    Posts
    426
    I was thinking that the text file would hold the keys delimited by CRLF, so that the bcp would insert them into separate rows and you wouldn't have to do any further processing.
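
    Concretely, the file would hold one key per line, loaded with a bcp command along these lines (server name hypothetical; the global temp table must be created, and the creating connection held open, before bcp runs, since global temp tables are dropped when the last referencing session closes):

    ```
    rem rep_keys.txt -- one RepKey per CRLF-delimited line:
    1001
    1002
    1003

    rem load it into the global temp table (-c = character mode, whose
    rem default row terminator is \r\n; -T = trusted connection):
    bcp "tempdb..##RepKeys" in rep_keys.txt -c -S MYSERVER -T
    ```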

  6. #6
    Join Date
    Feb 2002
    Posts
    11
    I think that I would prefer to leave it as it is and avoid writing to and reading from disk. Wouldn't this make it slower, not faster? Anyways, thanks again.
    Everett

  7. #7
    Join Date
    Oct 2001
    Location
    England
    Posts
    426
    >> Wouldn't this make it slower, not faster?

    Depends on the data and environment.
    The bcp will be minimally logged, so the insert will be faster. It will reduce the handshaking across the network and reduce the amount of manipulation needed before the insert into the production tables.
    It would probably end up slower, but maybe not. It does give an automatic record of the data inserted from the text files, and it makes it easy to make the insert asynchronous if you need to.

  8. #8
    Join Date
    Feb 2002
    Posts
    11
    I'll try it during the week and advise you of the outcome.

    Thanks again for all of your help.
    Everett
