  1. #1
    Join Date
    Aug 2003

    Unanswered: Performance issue I can't figure out. What do you think?

    We have an internal site with about 300 users that uses ASP / MSSQL to collect information and lets customer care search through it. Nothing major; we are a call center, so that gives you an idea of the business type. Our DB has about 600-700 thousand total records split across 3 entities.

    About a month ago we started noticing a much slower system, and I can't put my finger on it. We have applied SP3 and re-worked a few stored procedures, but nothing has helped. At first I thought it was a memory leak, because one day it was at 400 MB and two weeks later it was at 1.7 GB, but max memory was not set at the time. We've run traces and still can't figure it out. CPU usage is at 50% or more all day. The actual site is run from a different server, and the only application on this server is SQL.


    2 x 2.1 GHz P4s
    2 GB RAM
    RAID 5, 100 GB
    2 NICs into an HP ProCurve switch

    Give me some ideas, guys; I can't figure this one out. Do we have a beefy enough box for 300 users, or should we try something else? Some of our users are complaining about timeouts between SQL and the page.

    Thanks for your help.

  2. #2
    Join Date
    Jan 2003
    Provided Answers: 11
    The box sounds beefy enough for most applications. You said you had three entities; does that mean three tables? If so, you could be hurling around an awful lot of extra data per request. For the quickest fix, run Profiler against the database and see which queries are generating the most CPU. These are most likely the bad boys ruining it for the rest of your users. If these big queries can be sped up with indexes or re-writes, then you can breathe easier for a while. At least until you get upwards of 2 million rows per table, anyway.
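
    If you save the trace to a file, you can also query it directly to rank statements by CPU. A rough sketch (the trace file path and TOP count are just examples):

        -- Read a saved Profiler trace file and list the heaviest statements by CPU
        SELECT TOP 20
               TextData, CPU, Reads, Writes, Duration
        FROM   ::fn_trace_gettable('C:\traces\slowday.trc', DEFAULT)
        WHERE  EventClass IN (10, 12)   -- RPC:Completed, SQL:BatchCompleted
        ORDER BY CPU DESC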

    Once you have a saved trace of queries, you can feed that to the indexing wizard, which should tell you if new indexes will help at all. Hope this helps.
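
    The wizard's recommendations come out as plain CREATE INDEX statements, so you can review them before running anything. Something along these lines (table and column names are made up for illustration):

        -- Hypothetical example of the kind of index the wizard might suggest
        CREATE NONCLUSTERED INDEX IX_Calls_CustomerID
            ON dbo.Calls (CustomerID)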

  3. #3
    Join Date
    Jun 2003
    When you mention that the application is used for "collecting information" and for customers to "search" through it, plus your business environment is a "call center", all of that indicates that you are running an OLTP and a pseudo data warehouse function at the same time.

    In a perfect world, it's best to separate these 2 functions to improve performance. The main reason is that for the data warehouse side, you'll need a lot of INDEXES to speed up the SEARCH. However, the Online Transaction Processing (OLTP) side should avoid INDEXES as much as possible, because more indexes mean each update/insert/delete to a table also generates more modifications to those indexes.

    First of all, many will agree that you should start a Trace/Profiler capture while the performance is sluggish, then run the Index Tuning Wizard against the trace file to find out whether a new index is needed or an existing index should be dropped/recreated or defragmented.
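
    If you would rather not leave the Profiler GUI running on the box, the same capture can be scripted as a server-side trace. A minimal sketch, assuming the output goes to C:\traces\slowday (SQL Server appends .trc); the path and file size are examples only:

        -- Create a server-side trace writing to a file
        DECLARE @TraceID int, @maxfilesize bigint, @on bit
        SET @maxfilesize = 50
        SET @on = 1

        EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\traces\slowday', @maxfilesize, NULL

        -- Capture SQL:BatchCompleted (event 12) with the columns we care about
        EXEC sp_trace_setevent @TraceID, 12, 1, @on    -- TextData
        EXEC sp_trace_setevent @TraceID, 12, 13, @on   -- Duration
        EXEC sp_trace_setevent @TraceID, 12, 18, @on   -- CPU

        EXEC sp_trace_setstatus @TraceID, 1            -- start the trace
        -- later: sp_trace_setstatus @TraceID, 0 to stop, then 2 to close it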

    Since you said the DB has 3 entities with a total of 600-700 thousand records combined, make sure the tables' statistics are up to date. Don't rely on "Auto update statistics" even if it is checked in the database's Properties, because for a very large table (a few hundred thousand rows) it may take a while before the table's statistics are updated. In the meantime the SQL engine is using the old statistics, and poorer performance is the result.
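
    Refreshing statistics by hand is a quick thing to try. A small sketch (the table name is just an example):

        -- Refresh statistics for one table with a full scan
        UPDATE STATISTICS dbo.Calls WITH FULLSCAN

        -- Or refresh statistics for every table in the current database
        EXEC sp_updatestats
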
    There are many ways to improve the performance, like separating DATA & LOG onto different controllers & disks, a different type of RAID, defragging or rebuilding the indexes (or even dropping DUPLICATED indexes), etc... However, based on my experience & industry surveys, the performance killer is more often than not the APPLICATION (coding) itself.
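
    For the index side of that, you can check and fix fragmentation without any extra tools. A sketch with made-up table/index names:

        -- Check fragmentation (look at Scan Density and Logical Scan Fragmentation)
        DBCC SHOWCONTIG ('dbo.Calls')

        -- Online defrag of one index (0 = current database)
        DBCC INDEXDEFRAG (0, 'dbo.Calls', 'IX_Calls_CustomerID')

        -- Or a full rebuild of all indexes on the table (takes locks, do it off-hours)
        DBCC DBREINDEX ('dbo.Calls')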


