  1. #1

    Unanswered: Performance monitoring

    Hi,

I want to collect performance measures regarding the import of data
    and the growth of the resulting extract tables.

    I use - say - 15 tables from an ERP system (like JD Edwards)
    to build a - say - sales warehouse and an MS OLAP cube.

For every incoming table I have a DTS package which is
    logged to msdb.sysdtspackagelog.
    Every package is named
    [Build]_[Subsystem]_[Table_name],
    e.g. JDEdwards_Sales_F0005.
    The destination table is named accordingly, e.g. extr_F0005.

Now:
    With a separate DTS package I transport the
    records from the msdb database into my build database
    - say - JDEdwardsExtract.
    Name: extr_performance_monitor
    (possibly filtered on build name, because there are several builds in
    my system)
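
    For reference, a minimal sketch of what that transport step could amount to in plain T-SQL, assuming extr_performance_monitor simply mirrors a few of the usual sysdtspackagelog columns (the column list and the LIKE filter are my own illustration, not the actual package):

        -- hypothetical transport step: copy new log rows for this build only
        INSERT INTO JDEdwardsExtract.dbo.extr_performance_monitor
            (name, starttime, endtime, elapsedtime, errorcode)
        SELECT l.name, l.starttime, l.endtime, l.elapsedtime, l.errorcode
        FROM msdb.dbo.sysdtspackagelog AS l
        WHERE l.name LIKE 'JDEdwards_%'          -- filter on build name
          AND NOT EXISTS (SELECT 1
                          FROM JDEdwardsExtract.dbo.extr_performance_monitor AS m
                          WHERE m.name = l.name
                            AND m.starttime = l.starttime)   -- don't copy rows twice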

This result is quite good and easy to handle for
    seeing elapsed time per day.
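
    A per-day query against that table could look roughly like this (a sketch only; it assumes the starttime and elapsedtime columns were copied over from the DTS log):

        SELECT name,
               CONVERT(char(10), starttime, 120) AS run_day,   -- yyyy-mm-dd
               SUM(elapsedtime)                  AS total_elapsed
        FROM JDEdwardsExtract.dbo.extr_performance_monitor
        WHERE name LIKE 'JDEdwards_%'
        GROUP BY name, CONVERT(char(10), starttime, 120)
        ORDER BY run_day, name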

But the DTS log won't tell me how many records the DTS package
    had to copy (and there is at least one package with no records at all, the cube update).

    Now the count(*) comes in.

In the DTS package that copies
    sysdtspackagelog into extr_performance_monitor
    I added the columns
    extr_table_name,
    extr_table_rowcount,
    extr_table_timestamp.

With
    select name, 'extr_' + replace(name, '[Build]_[Subsystem]_', '') as extr_table_name
    from extr_performance_monitor
    I cut the original DTS package name down to the extr_... name.

But I don't have a clue how to fill in the count(*).

    Example:
dts_package_name, ..., extr_table_name, extr_table_rowcount, extr_table_timestamp

    [Build]_[SS]_[F0005], ..., extr_F0005, 0, sysdate

I'm thinking about a package which runs after the last
    data import (and cube refresh) is done
    (but on the same day).
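
    A minimal sketch of what such a package could execute, assuming the extr_ tables live in JDEdwardsExtract and extr_table_name is already filled by the replace() above (the cursor plus dynamic count(*) is just one possible approach, not something the DTS log provides):

        DECLARE @tbl sysname, @sql nvarchar(400), @cnt int

        DECLARE tbl_cur CURSOR FOR
            SELECT DISTINCT extr_table_name
            FROM JDEdwardsExtract.dbo.extr_performance_monitor
            WHERE extr_table_rowcount IS NULL
              AND extr_table_name IN (SELECT name
                                      FROM JDEdwardsExtract.dbo.sysobjects
                                      WHERE type = 'U')   -- skip packages with no extract table (e.g. the cube update)

        OPEN tbl_cur
        FETCH NEXT FROM tbl_cur INTO @tbl
        WHILE @@FETCH_STATUS = 0
        BEGIN
            -- count the rows of this extract table and write the result back
            SET @sql = N'SELECT @c = COUNT(*) FROM JDEdwardsExtract.dbo.' + QUOTENAME(@tbl)
            EXEC sp_executesql @sql, N'@c int OUTPUT', @c = @cnt OUTPUT

            UPDATE JDEdwardsExtract.dbo.extr_performance_monitor
               SET extr_table_rowcount = @cnt,
                   extr_table_timestamp = GETDATE()   -- when the count was taken
             WHERE extr_table_name = @tbl
               AND extr_table_rowcount IS NULL

            FETCH NEXT FROM tbl_cur INTO @tbl
        END
        CLOSE tbl_cur
        DEALLOCATE tbl_cur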

So the result could be (a query sketch follows this list):

    Table_name (as dimension category)
    Time to perform
    Number of records in import table
    Records per second.
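
    Something along these lines, again assuming elapsedtime came over from the DTS log, and guarding against a zero elapsed time:

        SELECT extr_table_name,
               elapsedtime          AS time_to_perform,
               extr_table_rowcount  AS records_imported,
               CASE WHEN elapsedtime > 0
                    THEN extr_table_rowcount * 1.0 / elapsedtime
                    ELSE 0
               END                  AS records_per_second
        FROM JDEdwardsExtract.dbo.extr_performance_monitor
        ORDER BY extr_table_name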

    The next step could be to look for required space.
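
    sp_spaceused can deliver that per table; a tiny sketch (the table name is just the example from above, and the capture table is hypothetical):

        -- run in the JDEdwardsExtract database; reports rows, reserved, data, index_size, unused
        EXEC sp_spaceused 'extr_F0005'

        -- or capture the result set for the 12-month graph (extr_space_monitor is a made-up table)
        INSERT INTO extr_space_monitor (name, rows, reserved, data, index_size, unused)
        EXEC sp_spaceused 'extr_F0005'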

The result should be a graph - say - over 12 months
    where you can easily see the amount of data processed,
    the time consumed, (the table space used),
    and - very important -
    from which you could extrapolate your hardware requirements.

Looking forward to any hints.

    Thanks in advance.

    Michael

  2. #2
    Michael,

    How "married" are you to the DTS methodology?

    What are the formats of the files?

    Are you in the middle of building this, or maintaining it?

    If I had a choice, I'd use stored procedures...

    I guess you could use ActiveX in DTS

    Where are all the DTS mavens?

    Stored procedures are just more flexible and faster..

    OK a statement like that deserves a big

    MOO *

    * My own opinion
    Brett
    8-)

    It's a Great Day for America everybody!

    I'm Good Once as I ever was

    The physical order of data in a database has no meaning.
