I am trying to import, not export.
Moreover, when I am importing for a particular user, the DBF file attached to this user is growing beyond 4 GB. Hence I suspect the problem lies with the OS settings, so I need to know the correct settings that can handle large files.
The document says that file sizes greater than 2 GB have always been a problem on most operating systems. Here is another one that talks about "Can one beat the Unix 2 Gig limit?" and deals with it the same way ... And here is a forum page where people discuss HP-UX and Oracle file size limits; I didn't read it word for word, but you might.
I've never had such a problem myself, but I would guess that splitting the export file might solve yours. Are you sure it won't help? Did you actually try it? Would it cost too much to do so?
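If you do try splitting, the classic exp/imp utilities support it directly via the FILESIZE parameter, which caps each dump piece and rolls over to the next file in the list. A minimal sketch; the username, file names, and the 2000M cap are illustrative, not taken from the original post:

```sh
# Export, splitting the dump into pieces that stay under the 2 GB OS limit
# (scott/tiger and the file names are placeholder values)
exp scott/tiger OWNER=scott FILE=exp1.dmp,exp2.dmp,exp3.dmp FILESIZE=2000M

# Import reads the same pieces back; list them in the same order
imp scott/tiger FROMUSER=scott TOUSER=scott FILE=exp1.dmp,exp2.dmp,exp3.dmp FILESIZE=2000M
```

Each file is filled to FILESIZE before the next one is opened, so no single dump file ever crosses the OS limit.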
If your DBF is growing beyond the max file size of the operating system, then simply add one or more datafiles to the tablespace that is being blown out. Create your datafiles with autoextend on, but with a maxsize under the OS limit. You can have more than one datafile in a tablespace.
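The approach above can be sketched in SQL. The tablespace name, datafile path, and sizes here are hypothetical examples, with maxsize deliberately kept under a 2 GB OS limit:

```sql
-- Add a second datafile to the tablespace that is filling up;
-- it grows in 100M steps but is never allowed past 2000M.
ALTER TABLESPACE users
  ADD DATAFILE '/u02/oradata/orcl/users02.dbf' SIZE 500M
  AUTOEXTEND ON NEXT 100M MAXSIZE 2000M;
```

Oracle will then spread new extents across the datafiles, so no single file has to exceed the OS limit.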
You do not need a parachute to skydive. You only need a parachute to skydive twice.
Most versions of Unix now have an option to let you create files larger than 2 GB. Do not use this approach; you are much better off limiting your DBF files to 2 GB. The reason is the ability to use direct I/O: it seems Oracle cannot use it if you are using large files.