  1. #1
    Join Date
    Apr 2011
    Posts
    5

    Performance problem

    Hi,

    I have a strange problem with an application querying a SQL server database.

    1. The application is written in VB6, runs as a service, and queries/updates a SQL Server database. For historical reasons it uses ADO over an ODBC connection to a SQL Server 2000 database (because of the size of the application, changing this is not currently an option). The client connection uses TCP/IP.

    2. As part of its processing, it runs a query which returns around 1,000,000 rows. The query itself runs fine, but performance is poor: it takes around 15 minutes to run and return a client-based cursor.

    3. If I use ISQL and run *exactly the same query*, it takes 4 minutes to return all rows to the text portion of the screen.

    4. I wrote a standalone VB application (also running as a service) which runs exactly the same query with the same connection properties and client-based cursor, and it runs the query to completion in 17 seconds. To be sure I've got all the rows, I can save them out to a file in a further 20 seconds. In addition, to simulate the main application's memory usage, I added a feature that allocates large chunks of memory, with little effect. Performance suffers only slightly when the SQL Server is heavily loaded (a few seconds added to the query time).

    I'm quite experienced at this sort of thing (20 years+), but it's got me stumped! Any help would be appreciated.
    Last edited by MA|Rollins; 04-18-11 at 07:26.

  2. #2
    Join Date
    Jan 2003
    Location
    Massachusetts
    Posts
    5,799
    Provided Answers: 11
    It could be that certain SET options are set for ADO connections but not for ISQL. This can lead to different query plans. It is not easy to see in SQL 2000, but you can detect evidence of it: start up Profiler and run the query both from the application (in test, if you can) and from ISQL. If the Reads column in Profiler is radically different for the two executions, then there is a significant difference between the query plans.

    Any chance of posting the query itself?
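    If you want to compare the SET options directly, each connection can report what it is actually running with; differences often show up in the ANSI_* settings or ARITHABORT. A quick way to check (run from both sides, then diff the output by eye):

```sql
-- Run once from the application's connection and once from ISQL,
-- then compare the two option lists line by line.
DBCC USEROPTIONS
```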

  3. #3
    Thanks for your reply.

    I see what you are saying and this is exactly what I thought at first.

    This is why I wrote a test VB program to run just the query in question and do nothing else. I have since run other queries that exist in the main application within the test program and get the same result - everything runs much more quickly in the test program.

    ISQL reports an execution time of 16 seconds, which is comparable to the test program. The additional 4 minutes or so is spent pulling the results back and populating the text area.

    This leads me to believe that either the transport of results back to the main application is being throttled somehow, or the application is not fetching results in large enough chunks. But why, and if so, can I control this?

  4. #4
    Have you checked what the duplex and speed are set to on the network card of the affected machine? Here, I am assuming you are running the tests from a different machine than the one that the application is running on.

  5. #5
    I haven't - we don't have direct access to the machines in question, as they are maintained by a different team (big-company politics, you know how it is!).

    However, we are using the same machine for the test app, ISQL, and the production app, and all three point to the production SQL Server.

  6. #6
    Well, the first thing is to identify the actual bottleneck. First thing I would do is to use profiler to confirm that the server is treating both applications' queries the same. If it is, then see if the affected machine is bottlenecking on network performance for whatever reason. If that is not an issue, then there must be something with the way the application is receiving data (a million row client side cursor? Really?).

  7. #7
    In my defence, I didn't write the original code!

    I'll give a static cursor a try (we require a record count as soon as the query has completed), then I'll look at Profiler.
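    Concretely, the change I'm planning to try looks something like this (connection string and query are placeholders, and which settings actually matter will depend on the provider):

```vb
Dim cn As New ADODB.Connection
Dim rs As New ADODB.Recordset

cn.Open "DSN=..."                ' placeholder connection string

rs.CursorLocation = adUseClient  ' client-side, as the app uses today
rs.CacheSize = 1000              ' rows fetched per round trip (default is 1)
rs.Open "SELECT ...", cn, adOpenStatic, adLockReadOnly
```

    (With a client-side cursor, ADO forces the cursor type to static anyway, so the interesting comparisons are CacheSize and, separately, adUseServer to rule out the client cursor engine.)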

  8. #8
    @@rowcount and a stored procedure may be in your future, then. ;-)

  9. #9
    That may be a good idea.

    Part of the problem is that this bit of code forms the core of a data import/replication engine (hand-crafted 10 years ago), with SQL updates being sent around the world to 800+ users, so if I make a change I need to be 101% certain it's going to work, or risk the consequences.

    And because it takes updates from 80+ files, I wouldn't want to write a stored proc for each one when the required queries already exist in the application. Instead, I would need to send the query string to the server to be exec'd by a generic stored proc which also returns the row count.
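    Something like this, perhaps (names are illustrative, and on SQL 2000 sp_executesql caps the string at NVARCHAR(4000)):

```sql
-- Illustrative sketch: execute an arbitrary query string and hand back
-- the row count of the statement that was run.
CREATE PROCEDURE dbo.usp_ExecWithRowCount
    @sql      NVARCHAR(4000),
    @rowcount INT OUTPUT
AS
BEGIN
    EXEC sp_executesql @sql
    SET @rowcount = @@ROWCOUNT
END
```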

    Thanks for your help.

    Mark.
