  1. #1
    Join Date
    Apr 2014
    Posts
    1

    Unanswered: Storing a dynamic file name table

    I have a file that is automatically generated by an external process that will always have the same name + a date stamp.

Let's say it is test_(Datestamp).txt.

What I am trying to do is use SQL to bulk insert this file, but I will not know the full file name because the date stamp also includes a time. My script takes this file every day, stores its contents temporarily in a table, then uses a trigger to edit and repopulate the table and kick out a new file that another external process uses.

Any insight would be helpful.
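
    To make it concrete, this would be easy if the name were fixed; for example (a sketch only, with a made-up path, table name, and stamp format):

    Code:
    -- Works when the name is known in advance, but here the stamp
    -- changes on every run:
    BULK INSERT dbo.StagingTable
    FROM 'C:\inbound\test_20140422_093000.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');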

  2. #2
    Join Date
    Jan 2003
    Location
    Massachusetts
    Posts
    5,800
    Provided Answers: 11
You may want to use BCP instead. BCP is a command-line utility. Set up a separate user with only INSERT permission on the target table, because you will likely have this user's password lying around in clear text, unless you can set this up with a Windows-authenticated ID.
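
    For example, the discovery and the load can both be driven from T-SQL (a sketch only: xp_cmdshell must be enabled, the server, path, and table names are placeholders, and -T assumes a Windows-authenticated ID, otherwise you are back to -U/-P and the clear-text password):

    Code:
    -- Discover the actual file name, then shell out to bcp to load it.
    DECLARE @files TABLE (fname nvarchar(260));

    INSERT INTO @files (fname)
    EXEC master.dbo.xp_cmdshell 'dir /b C:\inbound\test_*.txt';

    DECLARE @fname nvarchar(260) =
        (SELECT TOP (1) fname
           FROM @files
          WHERE fname LIKE 'test[_]%.txt'  -- escape _, a LIKE wildcard
          ORDER BY fname DESC);            -- newest first if the stamp is zero-padded

    DECLARE @cmd nvarchar(4000) =
        N'bcp MyDb.dbo.StagingTable in "C:\inbound\' + @fname
        + N'" -S MYSERVER -T -c -t,';      -- -c char mode, -t, = comma terminator

    EXEC master.dbo.xp_cmdshell @cmd;

    The same approach works from a plain scheduled batch file with a dir /b loop, if you would rather keep the whole thing out of the database.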

  3. #3
    Join Date
    Feb 2004
    Location
    In front of the computer
    Posts
    15,579
    Provided Answers: 54
Another possible solution would be to use SSIS (SQL Server Integration Services). SSIS provides a large number of ETL tools that I think are easier to use than the BCP.EXE program or the BULK INSERT statement in Transact-SQL.

    -PatP
    In theory, theory and practice are identical. In practice, theory and practice are unrelated.

  4. #4
    Join Date
    Jan 2013
    Posts
    354
    Provided Answers: 1

    Magnetic tape files live!

    I have a file that is automatically generated by an external process that will always have the same name + a date stamp.
This is how we labeled mag tapes in the 1950s! The format was "yy-ddd", and tape library systems would fire off programs to re-write or re-cycle the tapes based on that label. This was one of the Y2K problems; Google it.

    Since a table is a set of entities, we would never do this. You are saying that every one of these created tables is a totally different kind of thing in the universe of discourse. I do not think that is true.

    What I am trying to do is use SQL to bulk insert this file, but I will not know the full file name because the date stamp also includes a time. My script takes this file every day, stores its contents temporarily in a table, then uses a trigger to edit and repopulate the table and kick out a new file that another external process uses.
    Triggers? More procedural code!

    Scrub the data, then use BCP or another tool to bulk load it. If this timestamp is real data and not a meta-data tracking tool, then make it a column. Write a procedure to query the table and send data to the second process.
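
    Something along these lines (a sketch; the names and types are placeholders):

    Code:
    -- The stamp becomes a column instead of living in the file name.
    CREATE TABLE Feed_Staging
    (load_stamp DATETIME2(0) NOT NULL,
     feed_txt VARCHAR(4000) NOT NULL);  -- or properly typed columns after scrubbing
    GO

    -- The second process pulls the current load with a query, not a trigger.
    CREATE PROCEDURE Get_Current_Feed
    AS
    SELECT feed_txt
      FROM Feed_Staging
     WHERE load_stamp = (SELECT MAX(load_stamp) FROM Feed_Staging);
    GO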

  5. #5
    Join Date
    Nov 2004
    Location
    out on a limb
    Posts
    13,692
    Provided Answers: 59
Why stuff it into a temporary table inside SQL in the first place, if 'all' you are going to do is kick out another file that another process uses, unless you are actually doing something inside the db itself?

You could do this with a command-line batch processor (depends on your OS, but there's plenty of choice), or with VBScript, PHP, VB, .NET, or... From what you have said so far, there is no need to go via the db at all.


    A possible process outline could be:
    specify a directory where the input file MUST reside
    read the file and do whatever reformatting you require
    write to a log file what the process has done (what file was read, what action was taken, and so on), so you know the process has done something
    then, once you are sure the output is complete, rename the file (or move it to another directory)
    have a housekeeping function which deletes files that are known to have completed the process or have time-expired (e.g. keep the copied/renamed file for a period of time, say 30 days, after which you can get rid of them; if the user doesn't realise they are missing data after 30 days, then the data has little or no value in my book, but as I have a fundamental distrust of users, I'd actually chop the file after 60 days, giving them another 30 days of grace to spot the problem)
    Last edited by healdem; 04-22-14 at 14:35.
    I'd rather be riding on the Tiger 800 or the Norton
