Let's say I have a program that creates several files while it runs, and I want to move those files to a remote server without ever storing them locally, because I don't have enough disk space. I'm thinking about tar pipes with rsync or netcat. I have tried this approach before with commands whose output is a single stream (like tar or pax), something like this:

cd /sourcedir; tar -cvf - . | nc -q 10 -l -p 7878

then on the target server (here remoteserver is the source machine running the listener above):

cd /targetdir; nc remoteserver 7878 | tar -xvf -

This has worked for me. I have even used rsync and ssh tunneling to make the transfer as secure as possible.
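For example, the same tar pipe can run over ssh, which also avoids setting up a listener; this is just a sketch, and user, targethost, and the paths are placeholders:

cd /sourcedir; tar -cvf - . | ssh user@targethost 'cd /targetdir && tar -xvf -'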

Now I want to use the db2move command in DB2 (which creates several files) to produce all the output files needed to migrate our DB2 data warehouse of about 1 TB from Linux on the mainframe to an LPAR on AIX. I don't have that 1 TB of additional space on the mainframe, so I need to use the UNIX pipe feature to move the N files that db2move produces to AIX over the network.
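For a single table I can picture how to do it with a named pipe, assuming DB2's export utility can write to a FIFO on UNIX (that part is my assumption); MYDB, MYSCHEMA.MYTABLE, and port 7878 are placeholders:

mkfifo /tmp/mytable.ixf
nc -q 10 -l -p 7878 < /tmp/mytable.ixf &
db2 connect to MYDB
db2 "export to /tmp/mytable.ixf of ixf select * from MYSCHEMA.MYTABLE"
rm /tmp/mytable.ixf

then on the AIX side:

cd /targetdir; nc sourcehost 7878 > mytable.ixf

But db2move chooses its own output file names at runtime (db2move.lst, tabNNN.ixf, and so on), so I don't see how to pre-create named pipes for all of its files, and that is where I'm stuck.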

Have any of you ever done something like this before?

Any help would be appreciated.

Thanks in advance.