  1. #1

    Unanswered: Updating existing records from external file


    I know how I want to do this but am puzzled as to how I can achieve it in BD.

    Essentially I need to take some information from a CSV or Excel file and use it to update fields in existing records. There doesn't seem to be any built-in facility for this, so I assume I need to run a script. I think I know what it needs to do (sketched in code below the list):

    1. Select set of records that are to be updated
    2. Read the first line of the csv file and store the column values in separate variables
    3. Find the record with the matching unique identifier and select it
    4. Update the relevant fields in that record with the new data
    5. Move on to the next line of the csv file ... and so on
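    For what it's worth, here is a rough sketch of those five steps in Python rather than BD script, just to pin down the logic; the column names (id, email, phone) and the in-memory stand-in for the existing records are assumptions, not anything taken from BD itself:

        import csv

        # Hypothetical stand-in for the set of existing records (step 1),
        # keyed by the unique identifier.
        records = {
            "1001": {"id": "1001", "email": "old@example.com", "phone": ""},
            "1002": {"id": "1002", "email": "old2@example.com", "phone": ""},
        }

        with open("update.csv", newline="") as f:
            reader = csv.DictReader(f)        # step 2: read a line, columns in separate variables
            for row in reader:                # step 5: move on to the next line of the file
                rec = records.get(row["id"])  # step 3: find the record with the matching identifier
                if rec is None:
                    continue                  # no matching record, skip this line
                rec["email"] = row["email"]   # step 4: update the relevant fields with the new data
                rec["phone"] = row["phone"]

        print(records)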

    I'm a bit stumped now, though. I can see there is a Read File action, but it seems that can only dump the entire file into a single variable. There is also the parser facility, but that seems only able to create new records. If anyone could point me in the right direction, that would be great.



  2. #2
    Update: On further examination, perhaps I need to use Connect to External DB? In that spirit I tried a simple test, adapting an example from the help files:

    Connect to External DB (C:\test.xls, )
    External DB Operation (Move to the first record, , , )
    [$columnread] = External DB Operation (Get Value, 1, , )
    Show Message (I read this from the csv: [$columnread], Help!, Ok)
    Close External DB ()

    All excited, I clicked the button :-( It just said "I read this from the csv:" with no value. Any ideas what is wrong? BD definitely has the file open, as Excel can't access it. And $columnread is a global variable.



    Update: OK, silly me, the first column is 0 and column 1 was empty. So I'm quite chuffed now! I really must go to bed early for a change. BD is getting a bit addictive!
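    For comparison, Python's csv module behaves the same way: a row comes back as a zero-based list, so the first column is index 0 and index 1 is the second column (the file name and contents here are made up):

        import csv

        with open(r"C:\test.csv", newline="") as f:
            reader = csv.reader(f)
            first_row = next(reader)      # move to the first record
            column_read = first_row[0]    # the first column is index 0, not 1
            print("I read this from the csv:", column_read)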
    Last edited by matty1965; 02-26-14 at 18:17.

  3. #3
    Hi Matty
    Not sure if you have this sorted, but you should be able to use your original idea (it is what I would do for this task).
    Yes, the file is read into a text variable as you say, and you would then split this mass of CSV text into lines (the for-each-line action). With each line we have one record of data to be split into fields, separated by commas in this case, though it could be any delimiter.
    There is more than one way to do this, but as an example I would use the text formatting options "UPTO" and "REPO" within a loop and set a variable for each field: extract the text up to the comma (UPTO), then replace that text in the line with null (REPO), and repeat as required for the number of fields. I would make the delimiter a variable at the beginning of the script. You might also check out the file MacHeuS made for Excel parsing.
    You need to know whether the file has field headers that need to be bypassed (count your line loops and ignore line one).
    Some time would be needed to fine-tune the script, as little things often crop up that may need fixing (maybe not for MacHeuS or James, who are fast operators).
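    A rough Python rendering of that UPTO/REPO loop, with the delimiter held in a variable at the top and line one skipped as field headers (this is plain Python, not BD script, and the file name is made up):

        # Read File dumps the whole file into a single text variable.
        raw = open("import.csv").read()
        delimiter = ","              # make the delimiter a variable at the beginning

        for line_number, line in enumerate(raw.splitlines(), start=1):
            if line_number == 1:
                continue             # bypass the field headers on line one
            fields = []
            while line:
                pos = line.find(delimiter)
                if pos == -1:        # last field on the line: no delimiter left
                    fields.append(line)
                    line = ""
                else:
                    fields.append(line[:pos])            # "UPTO": extract text up to the comma
                    line = line[pos + len(delimiter):]   # "REPO": remove that text from the line
            print(fields)            # one record's field values, ready for the update step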
    Last edited by tamcind; 03-02-14 at 03:42.

  4. #4
    Thanks David, I'll give it a go.
