  1. #1

    xp_cmdshell problems

    Once again in my development career I've hit a new area and gotten stuck, so I know to ask "the panel" here at dbforums. I am trying to run the shell script below, and it doesn't find a couple of test text files I placed there myself for testing. When I shorten the path by a couple of folders I get the second, successful result below. The first script should have found the test file. Is there a limit to how many folders you can pass in a path?

    What I ultimately want to do is point at a "main" folder that contains subfolders with these text files in them, and have a procedure search through all of the subfolders to see whether a new file has arrived in any of them. The reason I want to do it this way is that if someone sticks a new file into an old folder by mistake, that file will still get processed. The subfolders are organized by date, i.e. 201001 for Jan 2010, 201002 for Feb 2010, etc., so I was thinking of looping through these folders and searching for specific text file names. The exact names will vary, but there are a few identifying characters consistently present in each of these file types, so I can count on those to identify a file and run my procedures, one for each file type.

    If I can't even get this shell script to work, that scraps the whole idea. Currently we copy new files into a specific path and run the procedures from there. We want to minimize this step so we can just search the general folder area where the new files should land and process a file whenever it arrives. I want to set up a job to simply execute the procedure every night. Is this all a pipe dream? Thanks again guys. You are super!

    ddave


    Script:
    ---------------------------------------------
    create table #dir(fn nvarchar(300))
    insert #dir exec master..xp_cmdshell 'dir "\\ServerName\folder1\folder2\folder3\folder4\*.txt" /b'
    select * from #dir
    drop table #dir

    Results:
    --------------------------
    File Not Found
    NULL

    2nd Results:
    --------------------------
    txt.txt
    NULL

  2. #2
    Recursive folder searching....

    using System;
    using System.Data.SqlClient;
    using System.IO;

    namespace ImportSourceCode
    {
        /// <summary>
        /// Recursively walks a directory tree and loads each file's name, path,
        /// and contents into the SourceCodeSearch database.
        /// </summary>
        class ImportSourceCode
        {
            /// <summary>
            /// The main entry point for the application.
            /// </summary>
            [STAThread]
            static void Main(string[] args)
            {
                if (args.Length == 0)
                {
                    Console.WriteLine("Please enter a directory");
                    return;
                }

                string directory = args[0];

                if (directory.Length == 0)
                {
                    Console.WriteLine("No directory");
                    return;
                }

                DirectoryInfo dir = new DirectoryInfo(directory);
                FileInfo[] files = dir.GetFiles();

                SqlConnection conn = new SqlConnection(
                    "Initial Catalog=SourceCodeSearch;Data Source=MyServer;Integrated Security=SSPI;");

                SqlCommand cmd = new SqlCommand();
                cmd.Connection = conn;

                // Record the directory itself.
                try
                {
                    cmd.CommandText = "EXEC dbo.InsertDirectory @Directory = '" + directory + "';";

                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
                catch
                {
                    Console.WriteLine("Error inserting directory");
                }
                finally
                {
                    conn.Close();
                }

                // Record every file in this directory along with its contents.
                foreach (FileInfo f in files)
                {
                    try
                    {
                        string pathAndFile = Path.Combine(directory, f.Name);

                        Console.WriteLine("Path and file " + pathAndFile);

                        // Open the file read-only and read its contents into a string.
                        string s;
                        using (FileStream file = new FileStream(pathAndFile, FileMode.Open, FileAccess.Read))
                        using (StreamReader sr = new StreamReader(file))
                        {
                            s = sr.ReadToEnd();
                        }

                        // Double up single quotes so the contents survive the string-built EXEC below.
                        s = s.Replace("'", "''");

                        try
                        {
                            cmd.CommandText = "EXEC dbo.InsertSourceCode @NameOfFile = '" + f.Name +
                                              "', @PathOfFile = '" + directory +
                                              "', @ContentsOfFile = '" + s + "';";

                            conn.Open();
                            cmd.ExecuteNonQuery();

                            Console.WriteLine("Record inserted");
                        }
                        catch (Exception e)
                        {
                            Console.WriteLine(e.ToString());
                            return;
                        }
                        finally
                        {
                            conn.Close();
                        }
                    }
                    catch
                    {
                        Console.WriteLine("Could not read file");
                        return;
                    }
                }

                // Recurse into each subdirectory by calling Main again with a one-element argument array.
                DirectoryInfo[] subdirs = dir.GetDirectories();
                string[] direct = new string[1];

                foreach (DirectoryInfo x in subdirs)
                {
                    Console.WriteLine("Directory path " + x.FullName);

                    direct[0] = x.FullName;
                    ImportSourceCode.Main(direct);
                }

                return;
            }
        }
    }
    “If one brings so much courage to this world the world has to kill them or break them, so of course it kills them. The world breaks every one and afterward many are strong at the broken places. But those that will not break it kills. It kills the very good and the very gentle and the very brave impartially. If you are none of these you can be sure it will kill you too but there will be no special hurry.” Ernest Hemingway, A Farewell To Arms.

  3. #3
    I am a really crappy C# programmer and this may not be the most efficient way to do this, but I use it as a personal tool and it is not production code.

  4. #4
    Thanks, you wrote a mouthful. I will take a look at it and try to apply it.

    ddave

  5. #5
    OK, I don't know which end is up with this code. Is there a way to do this from SQL Server? I know SQL Server pretty well, but with other scripting languages I am a fish out of water. Thx.

    ddave

  6. #6
    I go back to my old analogy on this... You can write love letters using Excel, but that doesn't make it a good idea.

    Transact SQL isn't well suited to this kind of problem. It can be bludgeoned into doing the job a couple of different ways, but the process is ugly at the very best and hysterically funny when you really work at it.

    One option would be to use a job to process the files. There are a number of ways to do this, and the sheer flexibility makes it hard for me to pick one without more knowledge of what you're trying to do.

    Another possibility would be to use xp_cmdshell to do a directory listing with subdirectories and capture its output in a temporary table. Parse the temporary table to get the full directory structure. Once you've got that structure (complete with filenames and timestamps), you can compare it to previous version(s) of the structure and voila!
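
    For what it's worth, a rough sketch of that approach (the UNC path and the dbo.ProcessedFiles history table are just placeholders for illustration):
    ---------------------------------------------
    -- Capture a recursive, bare-format directory listing.
    create table #dir(fn nvarchar(4000))

    insert #dir exec master..xp_cmdshell 'dir "\\ServerName\MainFolder\*.txt" /b /s'

    -- Skip the trailing NULL row and anything already seen on earlier runs,
    -- assuming dbo.ProcessedFiles stores the full paths processed before.
    select d.fn as NewFile
    from #dir d
    where d.fn is not null
      and d.fn <> 'File Not Found'
      and not exists (select 1 from dbo.ProcessedFiles p where p.FullPath = d.fn)

    drop table #dir
    ---------------------------------------------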

    I'd strongly recommend taking a day or two to learn something like PowerShell. It could make this task roughly a thousand times easier, and it would also help with many other problems you face on a regular basis. Knowing one language makes you dangerous, knowing several languages makes you... A geek!

    -PatP
    In theory, theory and practice are identical. In practice, theory and practice are unrelated.

  7. #7
    I will look into PowerShell, whatever that is. The one good thing about this job is that my boss gives me plenty of time to do my work, including learning time. After 8 years of SQL Server I suppose I am ready for "Geekdom".

    What I am trying to do is identify any new file in a subfolder and process that file. I have three types of files, and I have written stored procedures for all three relatively successfully. Now we want to take the next step and set up a job for each file type to run during off hours on a daily basis. This job should look in the subfolders of a main directory for a particular health plan, identify any new files, and process them if they are there. There are consistent parts of the files' naming conventions, so I will look for that particular string within a name.

    I think this is the best way to do it, because when we get new files every month they are put into a new folder, chronologically by date. If someone accidentally sticks a file in a previous month's folder, this process will catch the error. FWIW, the subfolder names don't change and their naming conventions are consistent, so the 2009 folder has 12 subfolders named 200901, 200902 ... 200912. One thing I just realized as I typed this: there are also weekly folders with different values, i.e. 20090201_20090205, 20090206_20090213, etc.

    Currently I copy and paste each new file into a consistent processing folder and bulk insert from that path in my procedures, which leaves two copies of the file on the network. The other way, I suppose, is to copy the file path where the original file sits, which will change each run, into the bulk insert script I have. It runs fine, but I want to automate it to the next level. That is what I am trying to work with.
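
    Something like this is what I have in mind for the bulk insert once the full path is known (the path, staging table, and format options below are just placeholders, not my real script):
    ---------------------------------------------
    -- Build and run a BULK INSERT from a path that changes each run.
    declare @FullPath nvarchar(500)
    declare @sql nvarchar(1000)

    set @FullPath = N'\\ServerName\MainFolder\201001\Claims_20100105.txt'

    set @sql = N'bulk insert dbo.ClaimsStaging
        from ''' + @FullPath + N'''
        with (fieldterminator = ''|'', rowterminator = ''\n'', firstrow = 2)'

    exec (@sql)
    ---------------------------------------------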

    Dolfandave
    No wonder people looked at me funny when I lit birthday candles w/ my military flamethrower.

  8. #8
    Resolved:

    I used this command and added the file path and the file-name search string to it in a variable:

    dir /B /S

    then executed the command with the variable: execute master..xp_cmdshell @gofigure

    ..... where I dumped the output into a temp table and stripped out the file paths, leaving me just the file names

    ..... and off to the races.

    This was much easier than I thought. I expected to have to loop through all the folders and their respective subfolders to find every existing version of a file; not even close.
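
    Roughly, the pieces fit together like this (the UNC path and the search pattern are placeholders, not the actual values):
    ---------------------------------------------
    -- Build the dir command with the search string, run it, and keep only the file names.
    declare @gofigure varchar(1000)
    set @gofigure = 'dir "\\ServerName\MainFolder\*Claims*.txt" /B /S'

    create table #dir(fn varchar(1000))

    insert #dir exec master..xp_cmdshell @gofigure

    -- /S returns full paths, so strip everything up to the last backslash.
    select right(fn, charindex('\', reverse(fn)) - 1) as FileName
    from #dir
    where fn is not null
      and fn <> 'File Not Found'

    drop table #dir
    ---------------------------------------------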

    Thanks to you who replied.

    ddave
