  1. #1
    Join Date
    Oct 2004
    Melbourne, Australia

    Unanswered: Importing Excel Data into Access

    Hello again, colleagues all!

    I have a problem trying to import data from Excel into Access. The data in Excel refers to a hierarchy of four database tables: company, multiple locations per company, multiple addresses per location, and multiple communications (i.e. phone, fax, email, etc.) per location. My technique is to load the Excel data into the database and then run a VBA program which normalises the data, checks it for inconsistencies and distributes it to the various tables.

    I have tried both linking the Excel spreadsheet and importing it into a generated or existing table in Access. In all cases I get one type conversion error or another. I have particular problems with two fields. POSTCODE contains a sequence of four numeric digits, the local convention. PHONE consists either of an 8-digit local number preceded by an area code in brackets, or of a plain 8-digit (local) or 10-digit (interstate) number. Left to itself, Access would make both of these fields numeric, but I have changed the cell format in all fields to text before linking or loading.
    If I link the original Excel file, the POSTCODE is OK and so are the phone numbers with bracketed area codes. However, the #Num! error flag appears in the fields with straight sequences of digits.
    If I import the data from the Excel spreadsheet into a new table (i.e. Access determines the data types), I get a table with exactly the right types (all text except an autonumber ID) and the bracketed area code numbers appear correctly. However, the other phone numbers appear as numbers in scientific notation. Finally, if I try to import the data into an existing table with exactly the same schema as the generated one, the load fails completely. Access reports an error, but does not say what caused it.
    When importing data, Access creates a table (Sheet1$_ImportErrors1) in which to log errors. The only entry I get consists of the three fields:
    Error | Type Conversion
    Field | POSTCODE
    Row | 320.
    The only thing wrong with this is that there are no POSTCODE errors at all, and the other errors I have described occur both before and after row 320.
    As the late great Yul Brynner was known to remark "Is a puzzlement!" and as the equally late great Julius Sumner Miller used to ask "Why is it so ?"
    Any thoughts, colleagues ?
    By the way, the applications are Access 2003 (in 2000 mode) and Excel 2000.

  2. #2
    Join Date
    Jul 2003
    One option would be to use TransferText. With that, you can use an import schema, specifying the data type for each column. Go to your help file and search for TransferText.
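    For reference, a minimal sketch of what that call can look like (the spec name, table name, and file path below are placeholders, not from this thread):

    ```vba
    ' Import a delimited text file using a saved import specification.
    ' The specification ("ContactsImportSpec" here) is created in the import
    ' wizard via the Advanced button and saved; it is the spec that pins each
    ' column to a data type (e.g. POSTCODE as Text), so Access stops guessing.
    Sub ImportWithSpec()
        DoCmd.TransferText acImportDelim, "ContactsImportSpec", _
            "tblStaging", "C:\Data\contacts.csv", True  ' True = file has field names
    End Sub
    ```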
    Inspiration Through Fermentation

  3. #3
    Join Date
    Oct 2004
    Melbourne, Australia

    Solution to Loading Excel Data Problem

    I followed up the TransferText suggestion in the Access Help files. It led me to material that I had already received via a slightly different enquiry. Unfortunately, the list of options in the schema wizard only includes non-MS files, presumably on the assumption that MS formats are adequately covered by the method which I reported as failing.
    However, the advice I received from RedNeckGeek (I find it odd to associate this whimsical name with someone who is obviously a very sophisticated Access practitioner!) did point me to a solution which is a hybrid of the schema method and the standard load which I had attempted before. I converted the Excel file into a CSV file (so that it lost its Excel formats) and loaded it into a pre-existing table which already contained the correct schema. Apart from the tedium of having to edit out lots of quotes and commas in the actual data (received from an external source, I should add), it worked just fine.
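    For anyone taking the same route: the Excel-to-CSV step can also be automated from VBA rather than done by hand in Excel. A sketch, with illustrative file paths only:

    ```vba
    ' Open the workbook via Excel automation and save it as CSV, discarding
    ' the Excel cell formats that cause the type-guessing trouble.
    Sub SaveAsCsv()
        Dim xl As Object, wb As Object
        Set xl = CreateObject("Excel.Application")
        Set wb = xl.Workbooks.Open("C:\Data\contacts.xls")
        xl.DisplayAlerts = False             ' suppress the "lose formatting" prompt
        wb.SaveAs "C:\Data\contacts.csv", 6  ' 6 = xlCSV
        wb.Close False
        xl.Quit
        Set wb = Nothing
        Set xl = Nothing
    End Sub
    ```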

  4. #4
    Join Date
    Jul 2003

    For what it's worth....

    Not sure if this is ultimately what you are trying to do or not. If it is, this may be useful.

    I get data from our legacy system every day as a comma-delimited text file. Since the files contain fields that allow users to put in characters that throw off a delimited file (like " and ,), I run this script against them.

    I run this as an ActiveX script from SQL Server, so I quickly adapted it to VBA for your purposes, but didn't test it thoroughly.

    Basically, it reads each file one line at a time, parses each line into fields, strips the "bad" characters from each field, then writes the data to a temp file. Once it's done with one file, it overwrites the original with the temp file. When this is done, I import the files into my database with no problem.

    Function CleanCSVFiles()
        Dim objFso As Object
        Dim fle1 As Object
        Dim fle2 As Object
        Dim strPath As String
        Dim strFile As String
        Dim strFldr As String
        Dim strLine1 As String
        Dim intFile As Integer
        Dim strField As String
        Dim lngPosDelim1 As Long
        Dim lngPosDelim2 As Long
        Dim strSearch As String

        strFldr = "\\Agsql\dw\Data\FTP\Processing\TempFile.txt" 'put your own path to the temp file here
        intFile = 1
        Do Until intFile = 3 'process both files
            Select Case intFile
                Case 1
                    strFile = "opentonsd" '1st file I'm checking
                Case 2
                    strFile = "invdetd" '2nd file I'm checking
            End Select
            strSearch = Chr(34) & "," & Chr(34) 'my delimiters are ","
            strPath = "\\Agsql\dw\Data\FTP\Processing\" & strFile 'path to file I'm checking
            Set objFso = CreateObject("Scripting.FileSystemObject")
            If objFso.FileExists(strFldr) Then
                objFso.DeleteFile strFldr 'if the temp file is found, delete it
            End If
            Set fle1 = objFso.OpenTextFile(strPath) 'open check file
            Set fle2 = objFso.CreateTextFile(strFldr) 'create the temp file
            Do While Not fle1.AtEndOfStream
                strLine1 = fle1.ReadLine 'pull 1 line
                If Len(strLine1) > 0 Then
                    lngPosDelim1 = 2 'start just past the opening quote
                    Do 'walk the fields in this line
                        lngPosDelim2 = InStr(lngPosDelim1, strLine1, strSearch)
                        If lngPosDelim2 = 0 Then
                            lngPosDelim2 = Len(strLine1) 'last field: stop at the closing quote
                        End If
                        strField = Mid(strLine1, lngPosDelim1, lngPosDelim2 - lngPosDelim1) 'get 1 field from line
                        strField = Replace(strField, Chr(34), "") 'get rid of extra " characters
                        strField = Replace(strField, ",", "") 'get rid of extra commas
                        fle2.Write """"
                        fle2.Write strField 'put scrubbed field in tempfile
                        fle2.Write """"
                        If lngPosDelim2 >= Len(strLine1) Then
                            Exit Do 'no more fields on this line
                        End If
                        fle2.Write "," 'add new delimiter
                        lngPosDelim1 = lngPosDelim2 + 3 'step past the "," delimiter
                    Loop
                    fle2.WriteLine 'complete the line in the temp file
                End If
            Loop
            fle1.Close
            fle2.Close
            objFso.DeleteFile strPath, True 'this deletes the original file
            objFso.MoveFile strFldr, strPath 'this moves and renames the temp file, replacing the original
            intFile = intFile + 1
        Loop
        Set fle1 = Nothing
        Set fle2 = Nothing
        Set objFso = Nothing
    End Function
    BTW - You only see the GEEK side of me here. Y'all don't wanna see the REDNECK
    Inspiration Through Fermentation

  5. #5
    Join Date
    Feb 2004
    Chicago, IL
    That 'feature' in Excel has always given me problems. As a result, whenever I have the ability to choose the format I get data in, I always avoid Excel spreadsheets.

    When I do get an Excel spreadsheet, I will usually convert it to a CSV file. Go to File > Save As, then change Save As Type to CSV. It will give you some messages about losing formatting and functionality. That's what you want to lose, those 'features' in Excel.

    After I save it as a CSV file, I will change the extension to TXT and open in Notepad to check the results. From there you can use the TransferText method which allows you to determine the field type.

    As far as the file you are receiving goes, can it be changed to a text file? If not, can the person generating the file put an apostrophe in front of each piece of data? If so, Excel will not try to determine the data type for you. You can actually test this by putting an ' in front of one of the values giving you trouble. It should work then.

  6. #6
    Join Date
    Feb 2004
    Hi guys

    On that damned Excel "feature" causing the #Num! issues: there was a response to a question I logged here about the same thing. One of the guys kindly responded with an Excel macro that runs beneath the spreadsheet and forces text formatting on specific columns of data. This removes the problem.

    I'll have a look at work Monday, see where it is, and try to get it into this reply ASAP.
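    Until that macro turns up, the core of it is usually a one-liner per column. A sketch, with example sheet name and column letters (adjust to wherever your POSTCODE and PHONE columns sit):

    ```vba
    ' Force the Text number format ("@") on the columns Access keeps
    ' mis-typing, before the data is entered or pasted into them.
    Sub ForceTextColumns()
        Worksheets("Sheet1").Range("C:C,F:F").NumberFormat = "@" 'e.g. POSTCODE, PHONE
    End Sub
    ```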


  7. #7
    Join Date
    Apr 2004
    outside the rim
