  1. #1
    Join Date
    Oct 2006
    Location
    Everett, WA
    Posts
    4

    Unanswered: Hello: Access/CF issues

    Hey everyone....long time listener, first time caller. How's it going? I am looking for help on a database update issue, and am probably missing something basic, but you know how it is when you stare at a problem for hours and hours; sometimes you get lost in it. Help?

    I am working on an application for our online advertising department. I set up our ad delivery system to generate a daily CSV file containing information (Publisher, Advertiser, Ad Spot, Category, Subcategory) on all of our ad campaigns. The problem is that the reporting system won't provide the start and end dates of all the campaigns. I thus added columns in our Access 97 database of web ads for start and end dates so I could change these manually, and they come up beautifully in the Coldfusion 4.5 calendar system I built. With me so far? (If not, I can clarify.)

    Now the problem is how to update the day's advertisers, categories, etc. that come out of the report without losing the dates I've manually entered for each. What I would like to do is compare the records of the CSV file with the records already in the database and insert only the new records (the new ads) where they don't already exist. How do I do this?

    I've tried this:

    <cfloop index="FullAdRecord" list="#UploadedCSV#" delimiters="%"> // loop through rows of CSV report

    <cfif #ListGetAt(variables.FullAdRecord,1,",")# NEQ #GetAll.Publisher# AND.......<cfif #ListGetAt(variables.FullAdRecord,5,",")# NEQ #GetAll.Subcategory#> // compare field 1, field 2, etc. of each row w/ rows of data pulled by previous query
    <cfquery name="InsertNew" datasource="webads"> // if a row does not match a row already in DB, insert as new
    INSERT INTO AdStats (Publisher, Advertiser, AdSpot, Category, Subcategory, Start, End)
    VALUES ('#Trim(ListGetAt(variables.FullAdRecord,1,","))#' ....'#Trim(ListGetAt(variables.FullAdRecord,5,",") )#','1/1/06','1/1/07')
    ;
    </cfquery>
    </cfif>

    </cfloop>

    But all this seems to do is duplicate the same 10 records each time I run it--no new records. This is the most success I've had so far. Any suggestions?
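
    In case it helps to picture it, what I think I'm after is something along these lines: check whether the row is already in AdStats first, and only insert when the check comes back empty. (Column names and the datasource are the ones from my insert above; the CheckExisting query name is just made up, and this is untested.)

    <cfloop index="FullAdRecord" list="#UploadedCSV#" delimiters="%">
        <!--- look for an existing row with the same five fields --->
        <cfquery name="CheckExisting" datasource="webads">
        SELECT Publisher FROM AdStats
        WHERE Publisher = '#Trim(ListGetAt(FullAdRecord,1,","))#'
          AND Advertiser = '#Trim(ListGetAt(FullAdRecord,2,","))#'
          AND AdSpot = '#Trim(ListGetAt(FullAdRecord,3,","))#'
          AND Category = '#Trim(ListGetAt(FullAdRecord,4,","))#'
          AND Subcategory = '#Trim(ListGetAt(FullAdRecord,5,","))#'
        </cfquery>

        <!--- nothing matched, so treat it as a new ad and insert it with the placeholder dates --->
        <cfif CheckExisting.RecordCount EQ 0>
            <cfquery name="InsertNew" datasource="webads">
            INSERT INTO AdStats (Publisher, Advertiser, AdSpot, Category, Subcategory, Start, End)
            VALUES ('#Trim(ListGetAt(FullAdRecord,1,","))#', '#Trim(ListGetAt(FullAdRecord,2,","))#', '#Trim(ListGetAt(FullAdRecord,3,","))#', '#Trim(ListGetAt(FullAdRecord,4,","))#', '#Trim(ListGetAt(FullAdRecord,5,","))#', '1/1/06', '1/1/07')
            </cfquery>
        </cfif>
    </cfloop>

    Is that the right general shape, or is there a cleaner way?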

  2. #2
    Join Date
    Aug 2005
    Location
    Louisville, KY
    Posts
    28
    You may be making this too difficult. If you create a table that all this information is pulled into, set the table up not to allow duplicates, and then set up an import routine to pull the file in for you, it should be as simple as clicking a button daily, and anything new will be pulled in. Unless I missed something, it really should be that simple.
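
    For example, once the day's CSV has been imported into its own table, the daily "button" could just run an Access append query along these lines, so only rows that aren't already in AdStats get added. (AdStatsImport is only an example name for the imported table, the column names are taken from your post, and I'm assuming Start and End are Date/Time fields, hence the # date delimiters; if they're text, quote them as in your original insert.)

    INSERT INTO AdStats (Publisher, Advertiser, AdSpot, Category, Subcategory, Start, End)
    SELECT i.Publisher, i.Advertiser, i.AdSpot, i.Category, i.Subcategory, #1/1/06#, #1/1/07#
    FROM AdStatsImport AS i
    WHERE NOT EXISTS
        (SELECT * FROM AdStats AS a
         WHERE a.Publisher = i.Publisher
           AND a.Advertiser = i.Advertiser
           AND a.AdSpot = i.AdSpot
           AND a.Category = i.Category
           AND a.Subcategory = i.Subcategory);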

  3. #3
    Join Date
    Oct 2006
    Location
    Everett, WA
    Posts
    4
    Hmm, that would be simple; now, how do I do that? I have to allow duplicate values within a single column, but I need to reject any row that matches another row completely. Can a table be set up to do this?

  4. #4
    Join Date
    Aug 2005
    Location
    Louisville, KY
    Posts
    28
    I just copied this straight from the help menu, so sorry for the dryness (there's also a SQL equivalent sketched after the steps):

    To prevent duplicate values from being entered into a combination of fields, create a multiple-field index using the fields you want to prohibit duplicate values for, and leave the Indexes window open when you have finished defining the index:

    1. Open the table in Design view.
    2. Click Indexes on the toolbar.
    3. In the first blank row in the Index Name column, type a name for the index. You can name the index after one of the index fields, or use another name.
    4. In the Field Name column, click the arrow and select the first field for the index.
    5. In the next row in the Field Name column, select the second field for the index. (Leave the Index Name column blank in that row.) Repeat this step until you have selected all the fields you want to include in this index. Note: the default sort order is Ascending; select Descending in the Sort Order column of the Indexes window to sort the corresponding field's data in descending order.
    6. In the upper portion of the Indexes window, click the new index name.
    7. In the lower portion of the Indexes window, click the Unique property box, and then click Yes.
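
    If it's easier to do in one shot, the same unique, multiple-field index can also be created with a data-definition query in Access instead of the Indexes window; something like this (the index name NoDupAds is just an example, and the field names are taken from your earlier post, so adjust to match your table):

    CREATE UNIQUE INDEX NoDupAds
    ON AdStats (Publisher, Advertiser, AdSpot, Category, Subcategory);

    Either way, once the unique index exists, Access will refuse any insert that would duplicate that combination of fields.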

  5. #5
    Join Date
    Oct 2006
    Location
    Everett, WA
    Posts
    4
    Thank you for your help. I did try this, and am now getting the following error message:

    "ODBC Error Code = 23000 (Integrity constraint violation)


    [Microsoft][ODBC Microsoft Access Driver] The changes you requested to the table were not successful because they would create duplicate values in the index, primary key, or relationship. Change the data in the field or fields that contain duplicate data, remove the index, or redefine the index to permit duplicate entries and try again."

    This seems to get me closer; obviously I don't want to create duplicate values. But there is one row in the CSV file I am trying to upload that is not in the database. Why is that row not being inserted, and why am I getting this error message?

  6. #6
    Join Date
    Aug 2005
    Location
    Louisville, KY
    Posts
    28
    I'm assuming you are making the changes to the table you already had in place. If that is the case, you already have some duplicate values in it, and that is why you are receiving the error. Try finding the duplicates and deleting them. Hopefully that will do the trick when you try to make these changes again.
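
    A quick way to find them is a totals query that groups on the whole combination of fields and keeps only the groups that appear more than once (column names again taken from your earlier post):

    SELECT Publisher, Advertiser, AdSpot, Category, Subcategory, Count(*) AS HowMany
    FROM AdStats
    GROUP BY Publisher, Advertiser, AdSpot, Category, Subcategory
    HAVING Count(*) > 1;

    Each combination that shows up in that result has more than one row in the table.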

  7. #7
    Join Date
    Oct 2006
    Location
    Everett, WA
    Posts
    4
    That was of some help, thank you. The problem now is that unless the new record is at the top of the CSV file, the page breaks as soon as it hits a row that is already in the table, so it never gets as far as the new entry. I cannot set up the CSV export to put new records first. Am I still missing something?
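
    I'm guessing what I need is for the loop to skip over the rows Access rejects and keep going instead of dying on the first one. Something like this, maybe (untested; it just swallows the database error that the duplicate insert throws)?

    <cfloop index="FullAdRecord" list="#UploadedCSV#" delimiters="%">
        <cftry>
            <cfquery name="InsertNew" datasource="webads">
            INSERT INTO AdStats (Publisher, Advertiser, AdSpot, Category, Subcategory, Start, End)
            VALUES ('#Trim(ListGetAt(FullAdRecord,1,","))#', '#Trim(ListGetAt(FullAdRecord,2,","))#', '#Trim(ListGetAt(FullAdRecord,3,","))#', '#Trim(ListGetAt(FullAdRecord,4,","))#', '#Trim(ListGetAt(FullAdRecord,5,","))#', '1/1/06', '1/1/07')
            </cfquery>
            <!--- a duplicate row violates the unique index and raises a database error; catch it, skip the row, and move on --->
            <cfcatch type="Database"></cfcatch>
        </cftry>
    </cfloop>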
