Results 1 to 10 of 10

Thread: DTS Issues

  1. #1
    Join Date
    Aug 2007
    Posts
    7

    Unanswered: DTS Issues

    Hi,

    I am using Data Transformation Services to transform my data from one
    data model to another.

    My issue is that executing some of the steps takes a very long time (hours). I
    am doing some ActiveX processing, but nothing too funky. I end up having to
    stop the server, with no errors logged.

    Any clues or ideas?

    Thanks

    Jack

  2. #2
    Join Date
    Jan 2007
    Location
    UK
    Posts
    11,434
    Provided Answers: 10
    How much data are you transferring?
    Is it a straight copy of data from one place to another?

    You may wish to describe each step in your DTS process, including any SQL and ActiveX processing that is used - otherwise we can't give you much help!

    It may also be worth telling us what you want your DTS to actually be doing, as there may be a more efficient way to do this in the first place.

    Kind regards,
    George
    Home | Blog

  3. #3
    Join Date
    Aug 2007
    Posts
    7
    There is about 200 MB of data across 40 tables; all the data is being transformed from one data model to another.

    The tasks balk on 4 tables:
    table a - 90,000+ records, lots of data (about 30% of the db), 3 tables joined into 1
    table b - 1,228 records, nothing much, 3 tables joined into 1
    table c - 40,000+ records, nothing much as well, 2 tables joined into 1
    table d - 25,000+ records, nothing much (this table is the same as table b), 3 tables merged into 1

    Although the one table with the bulk of the data (40%), 225,000+ records, transforms properly with no issues.

    I am testing for null on every field that could be null.

    The data is being transformed from MSSQL to MSSQL.
    All the transformation tests work correctly and parse correctly.
    I have created separate packages for the remaining tables, with still the same problems.

    Let me know if you need other details

    Jack
    Last edited by herders; 08-06-07 at 13:22.

  4. #4
    Join Date
    Aug 2007
    Posts
    5
    What version of SQL Server?

    You need to identify which steps are taking so much time, via
    package logging, or logging from your ActiveX, or profiling.

    Usually this is caused by SQL inefficiency, locking, and/or transactional recording. Are you using NOLOCK hints where possible?

    Your db logging level can slow things down too - Recovery Model 'full' vs. 'simple' adds a lot of weight. You can use bcp to avoid this.
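    For example - a sketch only, with made-up table and database names - a NOLOCK hint on the source query and a recovery model switch look like this:

    ```sql
    -- Hypothetical source query: WITH (NOLOCK) avoids taking shared locks
    -- during the export (reads uncommitted data, so use with care)
    SELECT o.order_id, o.order_date, c.customer_name
    FROM dbo.Orders o WITH (NOLOCK)
    JOIN dbo.Customers c WITH (NOLOCK) ON c.customer_id = o.customer_id

    -- Reduce transaction-log overhead during the bulk load
    -- (remember to switch back and take a full backup afterwards)
    ALTER DATABASE TargetDb SET RECOVERY SIMPLE
    -- ... run the DTS package / bcp load ...
    ALTER DATABASE TargetDb SET RECOVERY FULL
    ```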

  5. #5
    Join Date
    Aug 2007
    Posts
    7
    I am using MSSQL 2000 SP3.

    I am using the package logging, and it shows nothing other than "started at ?????" - it never completes.

    I will add the ActiveX logging. Not sure what you mean by profiling.

    Can I transform the data with bcp?

    Thanks

    Jack

  6. #6
    Join Date
    Aug 2007
    Posts
    5
    Right - the DTS logging doesn't seem to checkpoint until completion, so it's often unreliable.

    So try this - it's probably easier than ActiveX logging. If you're still trying to identify the step, then log to the database: create an Audit table, and before each step log the current time and step # using a straight insert/update. This way you control the logging.
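    A minimal sketch of that audit table approach (the table and step names here are made up):

    ```sql
    -- One-off setup: a simple audit table
    CREATE TABLE dbo.DTS_Audit (
        step_no   int          NOT NULL,
        step_name varchar(100) NOT NULL,
        logged_at datetime     NOT NULL DEFAULT (GETDATE())
    )

    -- Execute SQL task placed immediately before each DTS step:
    INSERT INTO dbo.DTS_Audit (step_no, step_name)
    VALUES (3, 'Transform table c')
    ```

    When the package hangs, the last row in DTS_Audit tells you which step was running.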

    It also sounds like you haven't looked at the running processes while your DTS is running - see Enterprise Manager under Management -> Current Activity -> Locks/Object and Locks/Process ID. That will probably show you some locking you didn't expect and tell you exactly what your problem is.

    You can also use the SQL Profiler tool to watch, batch by batch, what the db is doing when all else fails; but the previous steps should be enough.

    You're into the hard part - getting it to work well! Good luck -

  7. #7
    Join Date
    Aug 2007
    Posts
    7
    I will give those a try,

    The ActiveX logging seems to be reliable; I am testing on the first step and the last step, and the log file says it's progressing.
    I may just be a little impatient - I've been working on this for 5 straight days and am getting tired, 90% of the way through. I appreciate the tips.

    Thanks

    Jack

  8. #8
    Join Date
    Jan 2007
    Location
    UK
    Posts
    11,434
    Provided Answers: 10
    When you check for null, what do you do - exclude it from your export, or put a dummy value in (e.g. "n/a")?
    I bet this will be causing performance issues either way if you are implementing the check using ActiveX.

    The good news is that this can be easily (not to mention efficiently) remedied in SQL. The bad news is that it may require you to re-write some of your DTS.
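    For instance, the null check can be pushed into the source query itself instead of ActiveX (the table and column names below are made up for illustration):

    ```sql
    -- Substitute a dummy value for NULLs in the source query,
    -- so the transformation itself can be a straight copy
    SELECT ISNULL(c.phone, 'n/a') AS phone,
           ISNULL(c.fax,   'n/a') AS fax,
           COALESCE(c.mobile, c.phone, 'n/a') AS best_contact
    FROM dbo.Contacts c
    ```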
    George
    Home | Blog

  9. #9
    Join Date
    Aug 2007
    Posts
    7
    After all was said and done, it was a lookup query that was bogging down the package.
    I put the lookup into its own package to run afterwards.
    Everything works great.

    Thanks for the assistance
    Jack

  10. #10
    Join Date
    Aug 2007
    Posts
    7

    More DTS Issues

    Right before I am due to deploy the application, the DTS is crapping out.

    I have some rogue characters that I need to remove before the data gets transformed.
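    If the rogue characters are known, one way (a sketch - the source table name and the character codes are assumptions) is to strip them in the source query with REPLACE before DTS ever sees them:

    ```sql
    -- Strip carriage returns (CHAR(13)) and line feeds (CHAR(10)),
    -- and turn tabs (CHAR(9)) into spaces, in the text column
    SELECT REPLACE(REPLACE(REPLACE(t.l_text,
               CHAR(13), ''),
               CHAR(10), ''),
               CHAR(9),  ' ') AS l_text
    FROM dbo.SourceTable t
    ```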

    The problem is I have lost the field.

    The field is DTSSource("l_text") and it is empty when I execute. I am doing the ActiveX file logging and the file is empty as well. But when I test the transformation, everything works great.

    Any assistance would be appreciated.

    Jack
