  1. #1
    Join Date
    Jan 2003
    Posts
    1,605

    Unanswered: How to move package from one database to another?

    Hi,
    I can move data between two DB2 v8.2 FP11/Linux systems with the db2move utility. How do I move packages from one database to another? I only need to move packages between the same version and the same operating system. Is this possible?
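
    For reference, the data move I already do looks roughly like this (the database names are just placeholders):

        db2move SOURCEDB export   (run against the source database; produces IXF files and db2move.lst)
        db2move TARGETDB import   (run against the target database; loads the exported files)
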
    Thanks,
    Grofaty

  2. #2
    Join Date
    Jul 2006
    Location
    Pune , India
    Posts
    433
    I am puzzled about what requirement would need the same packages!
    Won't just recreating the procedures on the new database help?
    I have read somewhere about the GET/PUT ROUTINE commands and .sar files, but I am not sure what exactly they do.
    Rahul Singh
    Certified DB2 9 DBA / Application Developer

  3. #3
    Join Date
    Jun 2006
    Posts
    471
    Take a db2look output and cut out the statements needed for these procedures,
    then re-execute those statements against the target database, roughly as sketched below.
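
    A minimal sketch, assuming the procedures are in one schema (SOURCEDB, TARGETDB, MYSCHEMA and the file name are placeholders):

        db2look -d SOURCEDB -e -z MYSCHEMA -o ddl.sql
        (edit ddl.sql and keep only the CREATE PROCEDURE statements you need)
        db2 connect to TARGETDB
        db2 -tvf ddl.sql   (use -td@ instead of -t if the file uses @ as the statement terminator)
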
    Best Regards, Guy Przytula
    DB2 UDB LUW certified V6/7/8

  4. #4
    Join Date
    Jan 2003
    Posts
    1,605
    Hi,
    I have two systems, test and production. The main purpose is to spare CPU power on the production environment. Why should I redo something that is already done - why not just "copy/paste" the package from one system to another?

    I got this question from a DB2/VSE/zSeries administrator. He is not happy about recreating packages on the production environment one more time, because it wastes CPU cycles. It is interesting that this functionality is available on DB2/VSE but not on DB2/LUW.

    So the main idea is: the package is already compiled, so why compile it one more time?

    Any idea is very welcome.

    Thanks a lot,
    Grofaty

  5. #5
    Join Date
    Jan 2003
    Posts
    1,605
    Quote Originally Posted by rahul_s80
    I have read somewhere about the GET/PUT ROUTINE commands and .sar files, but I am not sure what exactly they do.
    If I understand correctly, GET ROUTINE and PUT ROUTINE are for moving stored procedures from one DB2 database to another, i.e. "moving" the data from syscat.procedures.

    But I need to move packages. Some info about packages is stored in syscat.packages (roughly what the query sketched below lists)!
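
    For reference, a rough way to see those packages in the catalog (column names from memory, so treat this as a sketch):

        SELECT pkgschema, pkgname, boundby, valid
        FROM syscat.packages
        ORDER BY pkgschema, pkgname
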

    Any idea how to move them from one db to another?

    Thanks,
    Grofaty

  6. #6
    Join Date
    Jan 2007
    Location
    Jena, Germany
    Posts
    2,721
    From my understanding (I never used GET/PUT ROUTINE), those two DB2 commands will do exactly what you want, i.e. extract the full definition of a routine from the DB2 catalog and restore it in the target database. It is not an option for the commands to just copy data from syscat.routines without the corresponding package because that would make the catalog inconsistent.
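
    From the command reference, the syntax looks roughly like this (I have not used it myself; the schema, routine and file names are placeholders):

        GET ROUTINE INTO myproc.sar FROM PROCEDURE myschema.myproc
        PUT ROUTINE FROM myproc.sar
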

    Are we talking about LANGUAGE SQL procedures or other, external procedures (where you have to worry about the shared library)?
    Knut Stolze
    IBM DB2 Analytics Accelerator
    IBM Germany Research & Development

  7. #7
    Join Date
    Jan 2003
    Posts
    1,605
    stolze,
    forget about stored procedures. I don't have a stored procedure; I was just trying to find out what the PUT/GET ROUTINE commands do, following rahul_s80's suggestion. I think these commands do not solve the problem I have.

    My problem is:
    I have Visual Generator applications (COBOL programs) that run under CICS on VSE/zSeries, connecting to DB2/VSE/zSeries. Because they use static SQL techniques (the applications do not use dynamic SQL to access the DB2 database), the development tool creates SQL packages on DB2/VSE. These SQL packages can be copied (I don't know the exact terminology on DB2/VSE/zSeries; I think it is unload/load) from one DB2/VSE/zSeries machine to another. So far so good. But now we would like to move the data from DB2/VSE/zSeries to DB2/Linux/zSeries, so the final result would be the Visual Generator applications running on CICS/VSE/zSeries connecting to DB2/Linux/zSeries.

    We know we must create the SQL packages with the development tool on the test DB2/Linux/zSeries system. But then we would like to copy the SQL packages from the test DB2/Linux/zSeries to the production DB2/Linux/zSeries environment. We know we could also create the SQL packages on the production DB2/Linux environment from the development tool, but we would like to just use a copy functionality to spare CPU cycles (compiling consumes CPU power): why compile an SQL package on the production environment if it is already compiled on the test system?

    I don't know if this functionality is supported on DB2/Linux, but it is supported on DB2/VSE/zSeries. So the million-dollar question is: is there support for copying an SQL package from one DB2/Linux box to another?

    Thanks,
    Grofaty
    Last edited by grofaty; 09-26-07 at 03:33.

  8. #8
    Join Date
    May 2003
    Location
    USA
    Posts
    5,737
    It is not available. Although it is available on DB2 for VSE and DB2 for z/OS, you cannot copy packages from VSE to z/OS, so even if it were available on LUW you could not copy them from VSE to LUW.

    The idea that binding the packages will use a lot of CPU time is typically a mainframe issue, where there are many users on the system and a chargeback is made for CPU cycles. Very few LUW users have such a CPU-time chargeback system.

    So far it looks like you and your system administrator have wasted valuable time trying to implement something that does not exist, and is probably not needed.
    M. A. Feldman
    IBM Certified DBA on DB2 for Linux, UNIX, and Windows
    IBM Certified DBA on DB2 for z/OS and OS/390

  9. #9
    Join Date
    Jan 2003
    Posts
    1,605
    Quote Originally Posted by Marcus_A
    ...so even if it were available on LUW you could not copy them from VSE to LUW.
    Hi,
    I am not trying to copy packages from DB2/VSE to DB2/Linux. I am trying to copy packages from DB2/Linux/zSeries/test_environment to DB2/Linux/zSeries/production_environment. So does DB2/Linux have some kind of functionality to copy a package from one DB2/Linux computer to another?
    Thanks,
    Grofaty

  10. #10
    Join Date
    Jan 2007
    Location
    Jena, Germany
    Posts
    2,721
    Quote Originally Posted by grofaty
    forget about stored procedures. I don't have a stored procedure; I was just trying to find out what the PUT/GET ROUTINE commands do, following rahul_s80's suggestion. I think these commands do not solve the problem I have.
    Thanks for the clarification. I missed that.

    Quote Originally Posted by grofaty
    I have Visual Generator applications (COBOL programs) that run under CICS on VSE/zSeries, connecting to DB2/VSE/zSeries. Because they use static SQL techniques (the applications do not use dynamic SQL to access the DB2 database), the development tool creates SQL packages on DB2/VSE. These SQL packages can be copied (I don't know the exact terminology on DB2/VSE/zSeries; I think it is unload/load) from one DB2/VSE/zSeries machine to another. So far so good. But now we would like to move the data from DB2/VSE/zSeries to DB2/Linux/zSeries, so the final result would be the Visual Generator applications running on CICS/VSE/zSeries connecting to DB2/Linux/zSeries.

    We know we must create the SQL packages with the development tool on the test DB2/Linux/zSeries system. But then we would like to copy the SQL packages from the test DB2/Linux/zSeries to the production DB2/Linux/zSeries environment. We know we could also create the SQL packages on the production DB2/Linux environment from the development tool, but we would like to just use a copy functionality to spare CPU cycles (compiling consumes CPU power): why compile an SQL package on the production environment if it is already compiled on the test system?
    As Marcus_A said, this is not (yet) possible.

    Very often, you don't even want to do what you have in mind. If the test and production systems have different hardware (more CPUs, more RAM, more and faster hard disks) or different statistics, you want the SQL compiler to recompile the statements when you bind the package against the target system; that way, you could possibly get a better plan (roughly as sketched below). I understand that this is sometimes not desired, in order to keep a stable environment.
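
    A minimal sketch of that bind step on LUW (the database and bind file names are placeholders):

        db2 connect to PRODDB
        db2 bind myapp.bnd
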
    Knut Stolze
    IBM DB2 Analytics Accelerator
    IBM Germany Research & Development

  11. #11
    Join Date
    Jan 2003
    Posts
    1,605
    Quote Originally Posted by stolze
    As Marcus_A said, this is not (yet) possible.
    Hi,
    that is the response I was looking for. Thanks a lot.

    Quote Originally Posted by stolze
    If test and production systems have different hardware... you want to have the SQL compiler recompile the statements when you bind the package to the target system. That way, you could possibly get a better plan.
    Stolze, that is a very good point! But if the Linux guests run under z/VM on the same zSeries hardware, and test and production have the same amount of resources, then recompiling the SQL package one more time on the production system is a waste of CPU time.

    Stolze, thanks a lot for your post. You helped me a lot. Thanks to all of you, too.

    Regards,
    Grofaty

  12. #12
    Join Date
    Jan 2007
    Location
    Jena, Germany
    Posts
    2,721
    Quote Originally Posted by grofaty
    Stolze, that is a very good point! But if the Linux guests run under z/VM on the same zSeries hardware, and test and production have the same amount of resources, then recompiling the SQL package one more time on the production system is a waste of CPU time.
    Sure, there are always situations where you either don't need the re-compilation of plans because of identical environments or where you don't want the re-compilation to ensure stable and tested execution processes.
    Knut Stolze
    IBM DB2 Analytics Accelerator
    IBM Germany Research & Development

  13. #13
    Join Date
    May 2003
    Location
    USA
    Posts
    5,737
    Quote Originally Posted by grofaty
    Hi,
    that is the response I was looking for. Thanks a lot.

    Stolze, that is a very good point! But if the Linux guests run under z/VM on the same zSeries hardware, and test and production have the same amount of resources, then recompiling the SQL package one more time on the production system is a waste of CPU time.

    Stolze, thanks a lot for your post. You helped me a lot. Thanks to all of you, too.

    Regards,
    Grofaty
    I have never seen a stored procedure compile/package bind that took more than 1-2 seconds of elapsed time (unless it had errors). The advantages of a fresh compile and bind on each new database far outweigh the extra CPU cost.

    I would not even consider using GET and PUT ROUTINE. I always do a fresh CREATE of the SP on each new platform. There are too many things that can come back and bite you if you don't, especially if you happen to have a C function inside an SP and the instance names are different on the source and target systems.

    System reliability and performance are more important than the extra cost of CPU time for a one-time package bind when moving to a new system.

    Some mainframe people are old and stodgy. They don't like change and they get upset if things are different from their big-iron systems. They complain about DB2 on LUW, not always because the complaints are legitimate, but because they are trying to protect their expertise in outdated technology.
    M. A. Feldman
    IBM Certified DBA on DB2 for Linux, UNIX, and Windows
    IBM Certified DBA on DB2 for z/OS and OS/390

  14. #14
    Join Date
    Jan 2003
    Posts
    1,605
    Quote Originally Posted by Marcus_A
    I have never seen a stored procedure compile/package-bind that took more than 1-2 seconds elapsed time (unless it had errors). The advantages of a fresh compile and bind on each new database far outweighs the extra CPU cost.
    Hi,
    As I said, I don't have a stored procedure, just an application using static SQL...

    But you pointed out a nice idea: what is the CPU consumption nowadays? I know this was a problem on old systems, but has anyone measured the compile time of static SQL recently? I doubt it... Thanks for pointing that out.

    Regards,
    Grofaty

  15. #15
    Join Date
    May 2003
    Location
    USA
    Posts
    5,737
    Quote Originally Posted by grofaty
    Hi,
    As I said, I don't have a stored procedure, just an application using static SQL...

    But you pointed out a nice idea: what is the CPU consumption nowadays? I know this was a problem on old systems, but has anyone measured the compile time of static SQL recently? I doubt it... Thanks for pointing that out.

    Regards,
    Grofaty
    The point about the SP is that it uses static SQL just like COBOL with embedded SQL.

    In my old mainframe days, we sometimes moved only the COBOL object code and the DBRM, but we always did a bind on the target system. I think you have to move the DBRM to the target system in case an automatic rebind is necessary. The rough LUW counterpart of that bind step is sketched below.
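
    A rough LUW analogue, if it helps (the file and package names are placeholders): either bind the bind file against the target database, or rebind an existing package there.

        db2 bind myapp.bnd
        db2 rebind package myschema.mypkg
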
    M. A. Feldman
    IBM Certified DBA on DB2 for Linux, UNIX, and Windows
    IBM Certified DBA on DB2 for z/OS and OS/390
