  1. #1
    Join Date
    Jan 2006
    Posts
    47

    Unanswered: Access Database File Size

    I am importing large CSV files into Access and am seeing a significant increase in file size (80-120%) over the CSV file size. Ideally the files would all be in the same database, but the total is pushing the 2 GB limit (currently 1.9 GB) and will only get bigger. To work around this I have split the data across 6 still rather large database files.

    My questions are as follows:

    1. Is there any way to keep these file sizes down? (I have tried adjusting field sizes, but it hasn't helped.)

    2. If I just link to the CSV files, will I see a drop in processing time? (I am already waiting significant amounts of time for queries to run; a sketch of what I mean by importing versus linking follows these questions.)

    3. Any other suggestions to improve size and/or performance? (I continually run Compact and Repair.)
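
    To clarify question 2, this is roughly the difference I mean between importing and linking, written as VBA. The file path, table names and the header-row assumption are just placeholders, not my actual setup:

    Code:
    Public Sub ImportOrLinkCsv()
        ' Importing copies the CSV into a local table inside the .mdb (what I do now)
        DoCmd.TransferText TransferType:=acImportDelim, TableName:="tblSalesImported", _
                           FileName:="C:\Data\sales.csv", HasFieldNames:=True

        ' Linking leaves the data in the CSV file and only stores a pointer to it
        DoCmd.TransferText TransferType:=acLinkDelim, TableName:="tblSalesLinked", _
                           FileName:="C:\Data\sales.csv", HasFieldNames:=True
    End Sub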

    As a note, I have to use Access for these files: it is the only tool available to me, and it is also the only database I am familiar with.

    Thank you!

  2. #2
    Join Date
    Dec 2002
    Location
    Préverenges, Switzerland
    Posts
    3,740
    ignoring your questions for a moment
    .CSV is flatfile.
    are you keeping this stuff as flatfile in access?
    if yes, at the cost of a slightly longer import process you should be able to get a magical improvement in your reporting times and your db size by moving the data into a decently normalised form, sensible indexing, etc.

    back to your questions:

    1. sizewise you can't do better in flatfile world than .CSV (apart from zipped .CSV, but that is cheating)

    2. decently normalised/indexed database is fastest, everything else is slow.

    ...maybe my preamble was an answer to 3.
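
    to make the normalise/index idea concrete, here is a rough VBA sketch (tblImport, tblCustomer and the field names are invented stand-ins for your own data; treat it as an illustration of the shape of the thing, not something to paste in as-is):

    Code:
    Public Sub NormaliseAndIndex()
        Dim db As DAO.Database
        Set db = CurrentDb

        ' pull the text values that repeat on every row out into a small lookup table
        db.Execute "SELECT DISTINCT CustomerName, Region INTO tblCustomer FROM tblImport", dbFailOnError

        ' unique index on the lookup table: no duplicates allowed here
        db.Execute "CREATE UNIQUE INDEX idxCustName ON tblCustomer (CustomerName)", dbFailOnError

        ' plain (non-unique) index on the columns the big table is searched/joined on:
        ' duplicates are allowed, but queries on those columns get much faster
        db.Execute "CREATE INDEX idxTxnDate ON tblImport (TransactionDate)", dbFailOnError
    End Sub

    the remaining step, replacing the repeated text in tblImport with a small numeric key that points at tblCustomer, is what actually shrinks the file.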

    izy
    currently using SS 2008R2

  3. #3
    Join Date
    Nov 2004
    Location
    Harrogate, N.Yorks, UK
    Posts
    83
    Tools\Options\General tab: ticking 'Compact on Close' will save you having to compact the database manually...
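
    If I remember rightly, the same setting can also be flipped from VBA ('Auto Compact' being the option name behind that check box), which is handy if you have six databases to set:

    Code:
    Public Sub TurnOnCompactOnClose()
        ' equivalent to ticking Compact on Close under Tools\Options\General
        ' ("Auto Compact" is, as far as I recall, the internal option name)
        Application.SetOption "Auto Compact", True
    End Sub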

  4. #4
    Join Date
    Jan 2006
    Posts
    47

    How do you Normalise/Index

    Thank you for your response.

    I am currently importing the data and was worried about the file size. However, from what you are saying below, importing the data will give me better performance, so I will continue to do that.

    I am familiar with setting primary keys, but that isn't always possible with the data I am using, as there is rarely a field or set of fields in which duplicates won't occur.

    Otherwise I am not too familiar with normalisation or indexing. How does one do that? When would you index but allow duplicates?

    Any clarification would be greatly appreciated.
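
    For example (with a made-up field name), is this the sort of thing you mean? My understanding is that a unique key like the first statement below would fail on my data because of the duplicates, while a plain index like the second would be allowed:

    Code:
    Public Sub IndexWithDuplicates()
        ' a unique index / primary key fails as soon as OrderNumber repeats
        CurrentDb.Execute "CREATE UNIQUE INDEX pkOrder ON tblImport (OrderNumber)", dbFailOnError

        ' a plain index on the same field allows duplicates but should still
        ' speed up searches and joins on it
        CurrentDb.Execute "CREATE INDEX idxOrder ON tblImport (OrderNumber)", dbFailOnError
    End Sub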


    Quote Originally Posted by izyrider
    ignoring your questions for a moment
    .CSV is flatfile.
    are you keeping this stuff as flatfile in access?
    if yes, at the cost of a slightly longer import process you should be able to get a magical improvement in your reporting times and your db size by moving the data into a decently normalised form, sensible indexing, etc.

    back to your questions:

    1. sizewise you can't do better in flatfile world than .CSV (apart from zipped .CSV, but that is cheating)

    2. decently normalised/indexed database is fastest, everything else is slow.

    ...maybe my preamble was an answer to 3.

    izy
