  1. #1
    Join Date
    Aug 2003
    Posts
    2

    Unanswered: Need advice on huge, ever-increasing data

    Hi guys,

    I need some advice on my problem. Here's my scenario:

    I have a personal archive table (all transaction records spanning 5 years) which is currently about 5GB in size (using Access 97, split into several mdbs).

    Imagine 60k records archived daily on my PC (FAT32), and it is still growing... my god...

    However, this table is in frequent use, as I need to dig out a person's transaction history, and that can take 15 minutes or more.

    Oh, and I heard that the file system (FAT32 vs NTFS) affects performance as well. Is this true?

    Can someone advise me on a better and faster way of storing/retrieving this data?

    Many Many Thanks!

  2. #2
    Join Date
    Dec 2002
    Location
    Glasgow, UK
    Posts
    100
    You should think about using MySQL as a backend for your database - it's free and relatively simple to use - but I'd only recommend it on an NT-based OS, i.e. WinNT, Win2K & WinXP Pro.
    Access XP & WinXP Pro
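    To illustrate why a proper SQL backend helps here: the big win for a "pull one person's history" query is an index on the person column, so the lookup doesn't scan the whole 5GB archive. A minimal sketch below, using Python's standard-library sqlite3 purely as a stand-in for MySQL or any other SQL backend - the table name (archive) and columns (person_id, txn_date, amount) are made up for illustration, not taken from the poster's schema.

    ```python
    # Sketch only: sqlite3 stands in for a MySQL/Access backend, and the
    # table/column names (archive, person_id, txn_date, amount) are made up.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # A flat transaction-archive table, like the one described above.
    cur.execute("""
        CREATE TABLE archive (
            person_id INTEGER,
            txn_date  TEXT,
            amount    REAL
        )
    """)

    # The index is what turns a full-table scan into a quick keyed lookup
    # when you pull one person's history.
    cur.execute("CREATE INDEX idx_archive_person ON archive (person_id)")

    cur.executemany(
        "INSERT INTO archive VALUES (?, ?, ?)",
        [(1, "2003-08-01", 10.0), (2, "2003-08-01", 20.0), (1, "2003-08-02", 5.0)],
    )

    # One person's transaction history, served by the index.
    history = cur.execute(
        "SELECT txn_date, amount FROM archive WHERE person_id = ? ORDER BY txn_date",
        (1,),
    ).fetchall()
    print(history)  # [('2003-08-01', 10.0), ('2003-08-02', 5.0)]
    ```

    The same CREATE INDEX idea applies whether the backend is MySQL, SQLite, or Access itself (an indexed field on the archive table).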

  3. #3
    Join Date
    Jan 2003
    Location
    Rockville, MD
    Posts
    179

    Re: Need advice on huge, ever-increasing data

    Two things that may be obvious, but worth asking:
    1) Do you compact the database regularly?
    2) Are the data normalized?

  4. #4
    Join Date
    May 2002
    Location
    London
    Posts
    87
    Oh, and have you checked the field sizes? Particularly Text fields, which can be set to 50 or 255 when a lot less would do.

  5. #5
    Join Date
    Aug 2003
    Posts
    2
    Hi all,

    Thanks for replying,

    Hi xander,
    I will try changing the OS to Win2K (NTFS) first, then I will look into MySQL. Thanks!

    Hi JTRockville,
    Yes, I always compact after the daily backup. I don't think normalization is a good idea in my case. Thanks for reminding me anyway!

    Hi Risky,
    As I imported the data through ODBC (link the table, then use a query to back it up), all column sizes are specific. Thanks for asking!


    By the way, guys, for my scenario, how can MySQL help me or better facilitate my daily backup and daily history check?

    Many thanks to all for your ideas!

  6. #6
    Join Date
    May 2002
    Location
    London
    Posts
    87
    Originally posted by ken39
    Hi JTRockville,
    Yes, I always compact after the daily backup. I don't think normalization is a good idea in my case. Thanks for reminding me anyway!
    I'm curious why not. It might be excessive to perfectly normalise the data, but if you have any 'name' fields in there, pulling them out to a lookup table is going to help a whole lot.

    Hi Risky,
    As I imported the data through ODBC (link the table, then use a query to back it up), all column sizes are specific. Thanks for asking!
    I have worked with databases where the data is imported via ODBC, and it may well be that the field sizes are excessive if that is the case in the source databases (don't assume the other developer was perfect!).

    Have a look at the max(len(x)) values for the text fields and see if there's anything worth hitting.
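    A quick way to run that max-length check over every text column - sketched here with Python's sqlite3 standing in for the real backend, and with made-up table/column names (note Access/JET spells the function Len() rather than LENGTH()):

    ```python
    # Sketch: sqlite3 stands in for the real backend; table/column names are made up.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE archive (cust_name TEXT, cust_country TEXT)")
    cur.executemany(
        "INSERT INTO archive VALUES (?, ?)",
        [("Smith", "UK"), ("Oyelaran-Adeyemi", "Nigeria")],
    )

    # max(length(x)) per text field: if a column declared at 255 never holds
    # more than, say, 40 characters, it is a candidate for shrinking.
    lengths = {}
    for col in ("cust_name", "cust_country"):
        (longest,) = cur.execute(f"SELECT MAX(LENGTH({col})) FROM archive").fetchone()
        lengths[col] = longest
    print(lengths)  # {'cust_name': 16, 'cust_country': 7}
    ```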


    Seriously, I spent a lot of time working with a 60k+ trade-records-per-day database (imported via ODBC, in part, fwiw), and pulling out the 'static' data (in my case 'Customer Name', 'Cust Country', etc., and 'Stock Name') made a huge difference to the history database sizes and vastly sped up historic querying.
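    The lookup-table extraction described above looks something like this - sketched with Python's sqlite3 as a stand-in for the real backend, and with made-up table/column names. Each repeated name is stored once and referenced by a small integer key, which is where the size and speed savings come from:

    ```python
    # Sketch: sqlite3 stands in for the real backend; all names are made up.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Flat history table repeating the customer name on every row.
    cur.execute("CREATE TABLE flat (cust_name TEXT, txn_date TEXT, amount REAL)")
    cur.executemany(
        "INSERT INTO flat VALUES (?, ?, ?)",
        [("Acme Ltd", "2003-08-01", 10.0),
         ("Acme Ltd", "2003-08-02", 5.0),
         ("Bravo plc", "2003-08-01", 20.0)],
    )

    # Lookup table: each name stored once, keyed by a small integer.
    cur.execute(
        "CREATE TABLE customers (cust_id INTEGER PRIMARY KEY, cust_name TEXT UNIQUE)"
    )
    cur.execute("INSERT INTO customers (cust_name) SELECT DISTINCT cust_name FROM flat")

    # History table now carries the key instead of the repeated text.
    cur.execute("""
        CREATE TABLE history AS
        SELECT c.cust_id, f.txn_date, f.amount
        FROM flat f JOIN customers c ON c.cust_name = f.cust_name
    """)

    # The original view of the data comes back via a join.
    rows = cur.execute("""
        SELECT c.cust_name, h.txn_date, h.amount
        FROM history h JOIN customers c ON c.cust_id = h.cust_id
        ORDER BY c.cust_name, h.txn_date
    """).fetchall()
    print(rows)
    ```

    In Access terms this is the same as a make-table query that builds the lookup, plus a query that replaces the text column with the lookup key.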

    Mind you, tidying up 5 years is going to be pretty painful, but there will be benefits in the end.
