  1. #1

Unanswered: Question - 175GB database, with a few tables with 26 million rows

    Good afternoon,

Is there a FAQ regarding large databases? I'm currently fighting with my database as it grows in size, and with the data-load process we use. In short, I have a few tables that each have 27 million rows and 63 columns; some have 11 indexes. The database is around 170GB, and the data/device files reside on a SAN. This topic has probably been beaten to death, so I figured I'd start with a general overview and/or FAQ.

Anyway - I ran update statistics last night and it still hadn't finished after 14 hours. I expect it to take a long time given the table sizes, but I'm wondering if there are things I can do to speed up maintenance tasks such as index creation, update statistics, and database dumps. Should I consider looking into table partitioning?
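For illustration, a hedged sketch of the kind of commands involved, using Sybase ASE syntax - the table name, partition names, and date column here are hypothetical, and the sampling and partitioning clauses require a recent ASE version (15.x or later), so check your server's docs first:

```sql
-- Sampled statistics can cut runtime substantially on a 27M-row table
-- (sampling clause assumes ASE 15.x+; big_table is a hypothetical name):
update statistics big_table with sampling = 10 percent

-- Or limit the run to indexed columns, which the optimizer uses most:
update index statistics big_table

-- Range partitioning (ASE 15+, semantic-partitioning option) lets
-- maintenance and loads work on one partition at a time:
alter table big_table partition by range (load_date)
    (p_old values <= ('Dec 31 2023'),
     p_new values <= ('Dec 31 2024'))
```

This is a sketch of the general technique, not a tested recipe; exact clause names vary by ASE release.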

    Figured I would ask. Have a good day.

  2. #2
You can denormalize to improve performance (e.g., split historical data from actively used data).
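As a hedged illustration of the historical/active split suggested above (table and column names are hypothetical; `select into` requires the `select into/bulkcopy` database option to be enabled):

```sql
-- Copy rows older than two years into an archive table, so the active
-- table and its many indexes stay small:
select * into orders_archive
from orders
where order_date < dateadd(yy, -2, getdate())

-- Then remove the archived rows from the active table:
delete orders
where order_date < dateadd(yy, -2, getdate())
```

In practice you would batch the delete to keep the transaction log manageable on a table this size.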

If you're using this database as a data warehouse, Sybase Adaptive Server IQ will probably help (and amaze) you.

  3. #3
    Hi saf,

I have a database which is going to cross 64 GB, and then I will not be able to dump it over 32 stripes...

I wonder how you manage to take backups of your database...

    thank you,
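For reference, ASE's `dump database` writes a single backup across multiple stripe devices in one command; a hedged sketch (database name and paths are hypothetical, and the `compress::` prefix, which shrinks each stripe file, is available from ASE 12.5 onward):

```sql
-- One dump spread over three stripes; Backup Server writes them in parallel.
-- compress::1 = lowest compression level, fastest; levels go up to 9.
dump database bigdb
    to "compress::1::/dumps/bigdb.stripe1"
    stripe on "compress::1::/dumps/bigdb.stripe2"
    stripe on "compress::1::/dumps/bigdb.stripe3"
```

Compressing each stripe is one common way to stay under per-file size limits when the database outgrows the 32-stripe maximum.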
