  1. #1
    Join Date
    Mar 2009
    Posts
    27

    Unanswered: Dynamically export tables to flat file w/header and "" txt qual?

    SQL Server 2005

    OK, I'm in the running for most chaotic workplace this week. The 5 crisis issues I was working on have been replaced by a new (to me) requirement: an automated process, in place by Monday (preferably in T-SQL), that will export 36 tables (not all columns from every table), encrypt the files, and push them onto a different drive. The output is specified as a flat file with a header row, pipe delimited, and with "text qualifiers".

    I can dig up something to do the encryption, and the file shuffling is stock, but I was just checking the old standby BCP and it doesn't seem to have the parameters to queryout to these specs. I can see creating a view that would prepend the header row, append '"' to the beginning of the first field and to the end of the last field, and then use "|" as the field delimiter.

    However, I thought I would ask whether anyone has anything handy that would be less kludged than this?
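    For what it's worth, the view idea can be sanity-checked outside SQL Server. A minimal sketch in shell, with invented column names and data, showing what the view's output would look like if the first field starts with a quote, the last field ends with one, and "|" (quote-bar-quote) is the delimiter:

```shell
# Simulated output of the hypothetical view (names and values invented):
# the leading/trailing quotes plus the "|" delimiter qualify every field.
{
  echo '"ID"|"Name"|"Amount"'       # header row emitted by the view
  echo '"1"|"Widget"|"19.95"'       # data rows, already quote-qualified
  echo '"2"|"Sprocket"|"7.50"'
} > export.txt
cat export.txt
```

    Each line then parses as quoted fields separated by pipes, which is what most "text qualifier" consumers expect.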

  2. #2
    Join Date
    Feb 2004
    Location
    In front of the computer
    Posts
    15,579
    Provided Answers: 54
    Could you BCP out to a temporary file, then encrypt that (using whatever crypt specs they want) into the destination file? This would allow you to build a view or BCP queryout spec to keep things fast, easy, and maintainable at the cost of having a temporary file that contains the clear text version of the output.
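    A rough sketch of that two-step flow in shell. The bcp call is shown as a comment (it needs a live server) and replaced with a stand-in file; openssl is only an assumption for "whatever crypt specs they want", and every name (server, view, file, passphrase) is invented:

```shell
#!/bin/sh
# Step 1: export to a clear-text temp file. Against a real server this might be:
# bcp "MyDB.dbo.vExport" queryout orders.tmp -c -t"|" -T -S MYSERVER
echo 'sample|clear|text' > orders.tmp    # stand-in for the BCP output

# Step 2: encrypt the temp file into the destination file...
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -pass pass:changeme -in orders.tmp -out orders.enc

# ...then remove the clear-text copy, which is the cost mentioned above.
rm orders.tmp
```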

    -PatP
    In theory, theory and practice are identical. In practice, theory and practice are unrelated.

  3. #3
    Join Date
    Jun 2004
    Location
    Long Island
    Posts
    696
    To create a pipe-delimited file with no headers, if you don't want to use BCP:
    sqlcmd -E -S"server" -Q"query" -o"file.txt" -d"database" -h-1 -s"|" -w width -W

  4. #4
    Join Date
    Mar 2009
    Posts
    27
    Alas, I would have loved to use BCP, but the requirement is that the output be created with headers AND with "text qualifiers". So I played around with that in BCP for a bit yesterday, but short of creating a view of each table that added in the header (only selected columns are included in each file, not every column of every table) and appended the " to the beginning and end of every field, I couldn't make it work. Since some of these tables have 150 columns (and hundreds of millions of rows), there's no time to continue with views or format files for 36 of these whoppers.

    Today I'm hacking together an SSIS package for each table. It's not portable or easily changeable, and SSIS has a ridiculous propensity to corrupt itself, but it is fast to set up with selected columns, a header, and text qualifiers.

  5. #5
    Join Date
    Feb 2004
    Location
    In front of the computer
    Posts
    15,579
    Provided Answers: 54
    So then EVERY column in your output needs the "Text Qualifier"? If that's the case, just BCP the queryout into tempfile1, then feed tempfile1 through gawk using:
    Code:
    		{ gsub(/\|/, "\"|\""); print "\"" $0 "\"" }
    to produce tempfile2. Encrypt tempfile2, and you're done!
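    For anyone trying this, here's the filter run on a made-up sample row (plain awk accepts the same script as gawk; note the replacement string is quote-pipe-quote, with no backslash needed before the pipe inside the string):

```shell
# tempfile names and data are invented; tempfile1 stands in for the BCP output.
echo '1|Widget|19.95' > tempfile1
awk '{ gsub(/\|/, "\"|\""); print "\"" $0 "\"" }' tempfile1 > tempfile2
cat tempfile2    # → "1"|"Widget"|"19.95"
```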

    -PatP
    In theory, theory and practice are identical. In practice, theory and practice are unrelated.

  6. #6
    Join Date
    Feb 2004
    Location
    In front of the computer
    Posts
    15,579
    Provided Answers: 54
    Brain fart on my part... I'd missed that you were planning to use "|" as the separator instead of just | by itself. If you use quote-bar-quote as the separator, then you can still use my gawk script and just leave out the call to gsub.
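    In that case only the line ends need wrapping. A quick check with made-up data, assuming the export already used the three-character "|" string as its field terminator:

```shell
# Sample row as BCP might emit it with "|" (quote-bar-quote) as the terminator.
echo '1"|"Widget"|"19.95' > tempfile1
# No gsub needed; just wrap each line in the outer quotes.
awk '{ print "\"" $0 "\"" }' tempfile1 > tempfile2
cat tempfile2    # → "1"|"Widget"|"19.95"
```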

    -PatP
    In theory, theory and practice are identical. In practice, theory and practice are unrelated.

  7. #7
    Join Date
    Mar 2009
    Posts
    27
    Well, except for the header thing and setting up queries or views for 36 tables. But it's an interesting idea, and since this will turn into a weekly event in a couple of weeks, I may time-test it against the silly SSIS mess.

    The main issue with SSIS is that I can't easily change my source or destination globally without manually recreating the packages and doing a whole lot of additional work for global variables, etc. I have 3 possible sources for these tables, so it would be nice to be able to flip to a different source if one of them is underperforming or overcommitted. So far 2 tables have been exported: 97 GB and 55 GB. I'm working on automating the encryption and the push to a portable hard drive. Only 34 more tables to go...
