  1. #1
    Join Date
    Oct 2003
    Posts
    268

    Unanswered: Exception 911 Error: 911, Severity: 16, State: 1 DTS Designer

    I'm running a DTS package. It is transferring data VERY slowly, and the local disk RAID is pegged in Performance Monitor. There are no entries in the SQL Server Error Log, but if I run SQL Profiler, I see many entries like:

    Exception 911 Error: 911, Severity: 16, State: 1 DTS Designer

    I suspect something is wrong but have no clue how to go about debugging this. Any ideas?

    When I do: SELECT description FROM sysmessages WHERE error=911, I get:

    Could not locate entry in sysdatabases for database '%.*ls'. No entry found with that name. Make sure that the name is entered correctly.
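
    For what it's worth, I suppose the first sanity check is whether every database name the package's connections reference actually resolves on both servers; something along these lines ('Repository' is just a placeholder name):

    Code:
    -- Run against both the source and the destination server.
    -- DB_ID() returns NULL when there is no sysdatabases entry for the name.
    IF DB_ID('Repository') IS NULL
    	PRINT 'No sysdatabases entry for database Repository'
    
    -- List what the server actually has, to spot typos or case mismatches:
    SELECT name FROM master.dbo.sysdatabases ORDER BY name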

  2. #2
    Join Date
    Jul 2003
    Location
    San Antonio, TX
    Posts
    3,662
    What are you DTS-ing? Are you doing Lookup operations in your transformation task?
    "The data in a record depends on the Key to the record, the Whole Key, and
    nothing but the Key, so help me Codd."

  3. #3
    Join Date
    Oct 2003
    Posts
    268
    No lookups. The DTS package just copies two tables from one server to another. Very simple.

    I have twenty worker systems that gather data and one repository server that periodically collects the data from the workers and processes it. Usually it is much faster to have the workers log the data locally and transfer it in bulk.

  4. #4
    Join Date
    Jul 2003
    Location
    San Antonio, TX
    Posts
    3,662
    Well, I think I already made the point in earlier posts that I am all for BCP/BULK INSERT for things like this. Can you give us a little more info (whether the tables already exist and you do an append, whether you have some staging going on, etc.)?
    "The data in a record depends on the Key to the record, the Whole Key, and
    nothing but the Key, so help me Codd."

  5. #5
    Join Date
    Oct 2003
    Posts
    268
    Thanks for the feedback thus far. BCP/BULK INSERT work with flat files, but I need to get the data from one database to another. I could use bcp to dump from one database to a flat file and bcp (or BULK INSERT) that same flat file back into the target database, but that seems a roundabout way to do it. If nothing else, it adds extra steps where errors can occur. I would hope SQL Server is smart enough to do this more efficiently.
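
    Just to illustrate the detour I mean, a rough sketch (all server, database, and path names below are made up):

    Code:
    -- 1) Export on the worker in native mode over a trusted connection
    --    (normally run from a command prompt; shown here via xp_cmdshell):
    EXEC master.dbo.xp_cmdshell 'bcp WorkerDB.dbo.BounceMessages out C:\xfer\BounceMessages.bcp -n -T'
    
    -- 2) Load on the repository server from the worker's file share:
    BULK INSERT BounceMessages
    FROM '\\WORKER01\xfer\BounceMessages.bcp'
    WITH (DATAFILETYPE = 'native', TABLOCK)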

    The tables already exist. They start out empty; I collect data from several workers at once depending on the volume of data (ideally I'd pull from all 20 workers), then I process that combined data. The processing is very expensive as well, and I'm desperately trying to find ways to optimize it. Once the data is processed, the worker tables are truncated.

    Processing the data consists of:
    - running Analysis Services cube update
    - running a stored proc that calculates and stores certain necessary aggregated, deduped data. This takes 12-24 hours depending on how much data is being processed.

  6. #6
    Join Date
    Jul 2003
    Location
    San Antonio, TX
    Posts
    3,662
    If you post details on all your components (data structures and data massaging routines) that currently exist, I am pretty confident we can optimize this 24-hour process down to something like a normal maintenance window (a 2-to-6-hour nightly process; it can't be longer!)
    "The data in a record depends on the Key to the record, the Whole Key, and
    nothing but the Key, so help me Codd."

  7. #7
    Join Date
    Oct 2003
    Posts
    268
    Quote Originally Posted by rdjabarov
    If you post details on all your components (data structures and data massaging routines) that currently exist, I am pretty confident we can optimize this 24-hour process down to something like a normal maintenance window (a 2-to-6-hour nightly process; it can't be longer!)
    OK, you asked for it

    I will be very grateful if you can give feedback on this. I've posted the processing stored proc below.

    Quick table overview:
    BounceMessages/BounceAddresses - The two worker tables. Data is pulled from the worker systems; it is in a somewhat verbose, non-deduped format designed for the Analysis Services cube.
    Removables - Deduped table of email addresses.
    SoftBounces - Data is put in here if it has bounce category 2, and "graduated" to the Removables table if it persists over a certain date span.

    The % values in the comments are what Query Analyzer reported when I ran the proc on a large set of data. This was an actual "Execution Plan", not an "Estimated Execution Plan".

    I also have many statements that log timestamps to a status table. The %'s from Query Analyzer don't even come close to matching the time values in my status table:

    For a workload of:
    33 million BounceMessages
    78 million BounceAddresses

    149 minutes - Set Logical Domain IDs (2.91%+1.54% in plan)
    228 minutes - Insert Soft Bounces (23.79% in plan)
    100 minutes - Graduate Soft Bounces (27.91% in plan)
    54 minutes - Add Removables (36.20% in plan)
    182 minutes - Prune Soft Bounces (7.65% in plan)

    This proc doesn't include the cube update or the truncates. The cube update takes about 1.5 hours, and the truncates take negligible time.

    I have some basic thoughts on reordering this for speed but would like to hear from someone with better ideas. If anyone reads through all this and gives feedback, I will greatly appreciate it.

    Code:
    CREATE PROCEDURE processNewRepositoryData
    	(@processHistoryID INTEGER) AS
    
    UPDATE ProcessBounceDataHistory
    	SET NumBounceMessages = (SELECT rows FROM sysindexes WHERE id = OBJECT_ID('BounceMessages') AND indid < 2)
    	, NumBounceAddresses = (SELECT rows FROM sysindexes WHERE id = OBJECT_ID('BounceAddresses') AND indid < 2)
    WHERE ProcessHistoryID = @processHistoryID
    
    -- 2.91%
    -- Set All LogicalDomainGroupID values.
    UPDATE BounceAddresses SET LogicalDomainGroupID =
    	(SELECT LogicalDomainGroupID FROM LiteralDomains
    	WHERE LiteralDomains.LiteralDomain = BounceAddresses.EmailDomain)
    
    -- 1.54%
    UPDATE BounceAddresses SET LogicalDomainGroupID = 1 WHERE LogicalDomainGroupID IS NULL
    
    UPDATE ProcessBounceDataHistory SET AfterSetLogicalDomainGroupID = getdate() WHERE ProcessHistoryID = @processHistoryID
    
    -- 23.79%
    -- Copy all soft bounces into SoftBounces
    INSERT INTO SoftBounces (EmailUser, EmailDomain, EntryDate)
    SELECT EmailUser, EmailDomain, EntryDate
    FROM BounceAddresses INNER JOIN BounceMessages ON (BounceAddresses.BounceMessageID = BounceMessages.BounceMessageID)
    WHERE BounceMessages.BounceCategoryID = 2
    
    UPDATE ProcessBounceDataHistory SET AfterInsertSoftBounces = getdate() WHERE ProcessHistoryID = @processHistoryID
    
    -- 27.91%
    -- Graduate prolonged soft bounce data
    INSERT INTO Removables (EmailUser, EmailDomain, BounceCategoryID)
    SELECT	EmailUser,
    	EmailDomain,
    	7 AS BounceCategoryID
    FROM SoftBounces
    WHERE NOT EXISTS
    	(SELECT * FROM Removables
    		WHERE Removables.EmailUser = SoftBounces.EmailUser
    		AND Removables.EmailDomain = SoftBounces.EmailDomain)
    GROUP BY EmailUser, EmailDomain
    HAVING datediff(day, MIN(EntryDate), MAX(EntryDate))
    	> (SELECT TOP 1 SoftBounceDaysToPermanentRemoval FROM ConfigurationSingleton)
    
    UPDATE ProcessBounceDataHistory SET AfterGraduateSoftBounces = getdate() WHERE ProcessHistoryID = @processHistoryID
    
    -- 36.20%
    -- Move data from verbose BounceMessages/BounceAddresses into Removables
    -- INSERT INTO Removables (EmailUser, EmailDomain, BounceCategoryID, EntryDate)
    -- SELECT
    -- 	EmailUser, EmailDomain
    -- 	, MIN(BounceMessages.BounceCategoryID) BounceCategoryID
    -- 	, MIN(EntryDate) EntryDate
    -- FROM BounceAddresses
    -- INNER JOIN BounceMessages
    -- 	ON (BounceAddresses.BounceMessageID = BounceMessages.BounceMessageID)
    -- WHERE BounceMessages.BounceCategoryID IN (SELECT BounceCategories.BounceCategoryID FROM BounceCategories WHERE DoNotMail = 1)
    -- AND NOT EXISTS
    -- 	(SELECT * FROM Removables
    -- 		WHERE Removables.EmailUser = BounceAddresses.EmailUser
    -- 		AND Removables.EmailDomain = BounceAddresses.EmailDomain)
    -- GROUP BY EmailUser, EmailDomain
    INSERT INTO Removables (EmailUser, EmailDomain, BounceCategoryID, EntryDate)
    SELECT
    	BounceAddresses.EmailUser, BounceAddresses.EmailDomain
    	, MIN(BounceMessages.BounceCategoryID) BounceCategoryID
    	, MIN(BounceMessages.EntryDate) EntryDate
    FROM BounceAddresses
    INNER JOIN BounceMessages
    	ON (BounceAddresses.BounceMessageID = BounceMessages.BounceMessageID
    	AND BounceMessages.BounceCategoryID IN (SELECT BounceCategories.BounceCategoryID FROM BounceCategories WHERE DoNotMail = 1))
    LEFT OUTER JOIN Removables ON (Removables.EmailUser = BounceAddresses.EmailUser
    		AND Removables.EmailDomain = BounceAddresses.EmailDomain)
    WHERE Removables.EmailUser IS NULL
    GROUP BY BounceAddresses.EmailUser, BounceAddresses.EmailDomain
    
    
    UPDATE ProcessBounceDataHistory SET AfterAddRemovables = getdate() WHERE ProcessHistoryID = @processHistoryID
    
    -- Delete SoftBounce records that are either too old or already correspond with a "removed" email address.
    DECLARE @softBounceExpirationDateTime DATETIME
    SET @softBounceExpirationDateTime = dateadd(day, -(SELECT TOP 1 SoftBounceDaysToExpiration FROM ConfigurationSingleton), getdate())
    
    -- 7.65%
    DELETE FROM SoftBounces
    WHERE EntryDate < @softBounceExpirationDateTime
    OR EXISTS (SELECT * FROM Removables WHERE SoftBounces.EmailUser = Removables.EmailUser AND SoftBounces.EmailDomain = Removables.EmailDomain)
    
    UPDATE ProcessBounceDataHistory SET AfterPruneSoftBounces = getdate() WHERE ProcessHistoryID = @processHistoryID

  8. #8
    Join Date
    Jul 2003
    Location
    San Antonio, TX
    Posts
    3,662
    OK, first thing: subqueries! Where you have one that doesn't reference the outer query, take it out (especially if it returns just one row), store the result in a variable (or variables), then reference the variable(s) from the former outer query:

    declare @rowsBounceMessages int, @rowsBounceAddresses int
    set @rowsBounceMessages = (SELECT rows FROM sysindexes WHERE id = OBJECT_ID('BounceMessages') AND indid < 2)
    set @rowsBounceAddresses = (SELECT rows FROM sysindexes WHERE id = OBJECT_ID('BounceAddresses') AND indid < 2)
    UPDATE ProcessBounceDataHistory
    SET NumBounceMessages = @rowsBounceMessages
    , NumBounceAddresses = @rowsBounceAddresses
    WHERE ProcessHistoryID = @processHistoryID

    The following is one of the most expensive queries:

    SELECT EmailUser,
    EmailDomain,
    7 AS BounceCategoryID
    FROM SoftBounces
    WHERE NOT EXISTS
    (SELECT * FROM Removables
    WHERE Removables.EmailUser = SoftBounces.EmailUser
    AND Removables.EmailDomain = SoftBounces.EmailDomain)
    GROUP BY EmailUser, EmailDomain
    HAVING datediff(day, MIN(EntryDate), MAX(EntryDate))
    > (SELECT TOP 1 SoftBounceDaysToPermanentRemoval FROM ConfigurationSingleton)


    You can rewrite it based on the fact that Removables is already deduped:

    declare @maxSoftBounceDaysToPermanentRemoval int
    set @maxSoftBounceDaysToPermanentRemoval = (SELECT max(SoftBounceDaysToPermanentRemoval) FROM ConfigurationSingleton)
    SELECT s.EmailUser,
    s.EmailDomain,
    7
    FROM SoftBounces s
    left outer join Removables r
    on s.EmailUser = r.EmailUser
    AND s.EmailDomain = r.EmailDomain
    where r.EmailUser is null and r.EmailDomain is null
    GROUP BY s.EmailUser, s.EmailDomain
    HAVING datediff(day, MIN(s.EntryDate), MAX(s.EntryDate)) > @maxSoftBounceDaysToPermanentRemoval
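
    Also, make sure the join columns are actually indexed on both sides. If they aren't, something along these lines would support either form of the query (index names here are just placeholders; check what you already have first):

    Code:
    -- EntryDate is included so the MIN/MAX in the HAVING can be served from the index
    CREATE INDEX IX_SoftBounces_Email ON SoftBounces (EmailUser, EmailDomain, EntryDate)
    CREATE INDEX IX_Removables_Email ON Removables (EmailUser, EmailDomain)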

    I am sure there is more, but that's a start, hey?!
    "The data in a record depends on the Key to the record, the Whole Key, and
    nothing but the Key, so help me Codd."

  9. #9
    Join Date
    Oct 2003
    Posts
    268
    Quote Originally Posted by rdjabarov
    OK, first thing: subqueries!
    The "UPDATE ProcessBounceDataHistory" statements are all simple status statements. They aren't in an inner loop or anything and execute in no time (less than a second) and aren't worth optimizing further.

    You suggest pulling "(SELECT TOP 1 SoftBounceDaysToPermanentRemoval FROM ConfigurationSingleton)" out of the subquery and into a separate variable. Hopefully SQL Server is smart enough to do that automatically; the estimated costs for both approaches are identical, and the execution plans are as well. It's harmless but doesn't appear to offer much gain.

    Secondly, you changed a WHERE NOT EXISTS to a LEFT OUTER JOIN with a WHERE ... IS NULL. They both achieve the same effect, and I would have thought they result in the same behavior under the hood. I changed it to the OUTER JOIN approach and the estimated execution cost dropped in half, but the real-world execution time slightly increased (i.e., got worse).

    Thank you for the time and feedback; it is very much appreciated. Did you get my private messages? Not that I'm complaining about free feedback.

  10. #10
    Join Date
    Jul 2003
    Location
    San Antonio, TX
    Posts
    3,662
    No I didn't...Was there one?
    "The data in a record depends on the Key to the record, the Whole Key, and
    nothing but the Key, so help me Codd."

  11. #11
    Join Date
    Feb 2004
    Location
    In front of the computer
    Posts
    15,579
    Provided Answers: 54
    Quote Originally Posted by rdjabarov
    No I didn't...Was there one?
    I'd have to guess that there was a private message, based on the question. You aren't exactly a speed demon on the replies sometimes... Maybe you should go "kick the pipe" and see what falls out!

    -PatP

  12. #12
    Join Date
    Oct 2003
    Posts
    268
    Quote Originally Posted by rdjabarov
    No I didn't...Was there one?
    OK, I sent you another private message. Let me know whether or not you got it.

  13. #13
    Join Date
    May 2002
    Location
    Sydney
    Posts
    45
    I am sure you haven't done it this way, but have you ruled out the possibility that this is an architectural problem?

    If you created the DTS package on your own PC, even though it references two remote servers, all the data would come via your machine (even though your machine would not be mentioned at all in the DTS package), producing very slow data throughput.

    Although it's a schoolboy error, it's been made before.
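
    The simple way around it is to launch the package on one of the servers instead of from your workstation, e.g. with DTSRun (server and package names below are made up):

    Code:
    -- Run on the destination server itself (here via xp_cmdshell, or schedule
    -- it as a SQL Agent job) so the rows don't route through your PC:
    EXEC master.dbo.xp_cmdshell 'DTSRun /S REPOSITORY01 /E /N "CopyWorkerTables"'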

    Jim.

  14. #14
    Join Date
    Oct 2003
    Posts
    268
    Quote Originally Posted by jimskit
    If you created the DTS package on your own PC, even though it references two remote servers, all the data would come via your machine (even though your machine would not be mentioned at all in the DTS package), producing very slow data throughput.
    I don't think that can be the case here. When the DTS job is running, I can see that the bottleneck is on the source disk or the target disk.

    And sometimes, when I notice REALLY slow transfer speeds, the bottleneck is on the destination disk and I can see the following Exception events in SQL Profiler:

    Exception 911 Error: 911, Severity: 16, State: 1 DTS Designer

    Thanks for the feedback!
