  1. #1
    Join Date
    Feb 2012
    Posts
    5

Unanswered: Varchar(8000) issue while transferring SQL Server table to CSV file

Hi

I am currently working on transferring the contents of a SQL Server table into a CSV file. I have created a stored procedure that does this, but the problem I am facing is that the data in the table is not clean: it contains tabs, newlines, etc., so I have to clean it first. I applied the following procedure:
Code:
declare @columns varchar(8000), @sql varchar(8000), @sql1 varchar(8000), @data_file varchar(100)
set @columns = ''

-- build a comma-separated list of REPLACE expressions, one per column,
-- stripping newline (Char(10)), carriage return (Char(13)) and tab (Char(9))
select @columns = @columns
    + 'replace(replace(replace(' + column_name + ',Char(10),''''),Char(13),''''),Char(9),'''') as ' + column_name + ', '
from information_schema.columns
where table_name = 'Table_name'
order by ORDINAL_POSITION

print @columns

-- check how long the built-up string actually is
set @sql = len(@columns)
print @sql
The table contains around 300 columns, with column names of at least 20 characters each. When I run the above query I only get a few of the columns instead of all of them, so I checked the length of @columns and it comes out as 4000.

I declared @columns as varchar(8000), so I don't know why this issue is coming up. Is there any other way I can clean the data and then transfer it into the file?

It would be of great help if anyone can help me with this.

Many thanks in advance!

  2. #2
    Join Date
    Nov 2002
    Location
    Jersey
    Posts
    10,322
Tools > Options > SQL Server > Query Results > Results to Text
    Brett
    8-)

    It's a Great Day for America everybody!

dbforums Yak Corral
Radio 'Rita
    dbForums Member List
    I'm Good Once as I ever was

    The physical order of data in a database has no meaning.

  3. #3
    Join Date
    Jan 2007
    Location
    UK
    Posts
    11,434
    Provided Answers: 10
    What version of SQL Server are you using?
    George
    Home | Blog

  4. #4
    Join Date
    Feb 2012
    Posts
    5
    George,
I am using SQL Server 2000.

  5. #5
    Join Date
    Feb 2012
    Posts
    5
Brett,

I was trying to clean the data and store it in a different table, and then apply the bcp command to save it as a CSV file.

  6. #6
    Join Date
    Jan 2007
    Location
    UK
    Posts
    11,434
    Provided Answers: 10
    Do you need to permanently clean the data (i.e. update the raw table data) or just export it clean?
    George
    Home | Blog

  7. #7
    Join Date
    Feb 2012
    Posts
    5
    George,

I want the clean data to be exported into the new table.

I was using the following bcp command:
    bcp "select * from @db_name.@table_name" queryout "M:\filextract\root\demo.txt" -c -t"$" -r"\n" -T

If a tab or newline character is encountered, it is written out as-is and the record ends up split across lines in the output file.

So I went with the approach of cleaning the table first and storing the result in a temp table, and then using bcp to transfer the temp table to a file.
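Roughly what I have in mind, as a sketch (DemoTable, col1/col2 and MyDb below are just placeholders for the real database, table and its 300-odd columns):
Code:
-- build a cleaned copy of the table, stripping tab, LF and CR from the text columns
select
    replace(replace(replace(col1, char(9), ''), char(10), ''), char(13), '') as col1,
    replace(replace(replace(col2, char(9), ''), char(10), ''), char(13), '') as col2
    -- ...and so on for the remaining varchar columns
into dbo.DemoTable_clean
from dbo.DemoTable

-- then export the cleaned copy:
-- bcp "select * from MyDb.dbo.DemoTable_clean" queryout "M:\filextract\root\demo.txt" -c -t"$" -r"\n" -T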

  8. #8
    Join Date
    Jan 2007
    Location
    UK
    Posts
    11,434
    Provided Answers: 10
Well, because the length of the expression is an issue, I would run an update statement per textual column instead, before sorting out your BCP command.

Generate the update statements with this "air-code" (i.e. written from my head without testing):
    Code:
SELECT 'UPDATE ' + table_schema + '.' + table_name + ' SET ' + column_name + ' = Replace(' + column_name + ', Char(10), '''') WHERE CharIndex(' + column_name + ', Char(10)) > 0'
    FROM   information_schema.columns
    WHERE  data_type IN (<your list of appropriate text data types>)
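If you would rather run them all in one go than copy the output into a new query window, something along these lines would do it (more air-code; I've left the CharIndex filter out here for brevity):
Code:
DECLARE @stmt varchar(8000)

-- one generated UPDATE per textual column of the table
DECLARE stmt_cursor CURSOR FOR
    SELECT 'UPDATE ' + table_schema + '.' + table_name
         + ' SET ' + column_name
         + ' = Replace(' + column_name + ', Char(10), '''')'
    FROM   information_schema.columns
    WHERE  data_type IN ('char', 'varchar', 'nchar', 'nvarchar')
      AND  table_name = 'Table_name'   -- your table here

OPEN stmt_cursor
FETCH NEXT FROM stmt_cursor INTO @stmt

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC (@stmt)                       -- run each generated statement
    FETCH NEXT FROM stmt_cursor INTO @stmt
END

CLOSE stmt_cursor
DEALLOCATE stmt_cursor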
    George
    Home | Blog

  9. #9
    Join Date
    Feb 2012
    Posts
    5
    Hi George

That worked! Thanks a lot.

My only concern here is that, since we are using update statements, will this impact performance? There will be millions of records in the table, and the table we work with has many varchar columns.

  10. #10
    Join Date
    Jan 2007
    Location
    UK
    Posts
    11,434
    Provided Answers: 10
I was under the impression that this was a one-off thing, so I didn't really think about performance beyond adding the WHERE clauses (incidentally: I got the parameters of CharIndex the wrong way round :P; it should be CharIndex(Char(10), column_name)).

    Can you run this as a job out of hours?
If you are updating more than a thousand records or so with each update statement, then consider batching them up using loops; a rough sketch follows.
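Something like this on SQL Server 2000, for one column (untested air-code again; Table_name and YourColumn are placeholders):
Code:
-- clean in batches of 1000 rows so each transaction stays small
SET ROWCOUNT 1000

WHILE 1 = 1
BEGIN
    UPDATE Table_name
    SET    YourColumn = Replace(YourColumn, Char(10), '')
    WHERE  CharIndex(Char(10), YourColumn) > 0

    IF @@ROWCOUNT = 0 BREAK   -- no dirty rows left
END

SET ROWCOUNT 0                -- back to unlimited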

    If you explain your scenario better then we may be able to help further.
    George
    Home | Blog
