Every SELECT statement that calls the CONTAINS function consumes 30-40 MB of memory. Oracle releases the memory when the connection is closed, but our application keeps a pool of open connections to speed things up.
So the application exhausts the available memory very quickly :-((
We use Oracle9i Release 2 on Windows 2000.
I'm not sure whether you are saying it is the application or the server that is using all that memory and failing. Oracle will cache data it has worked on until it either needs the space for new work or, in some cases, is sure the cache is no longer required.
It sounds like your application is opening a cursor and not closing it (i.e. keeping the result set open client side, which forces Oracle to maintain it server side) - which would explain why the memory only becomes available again once the connection is closed.
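To illustrate the cursor-lifecycle point above, here is a minimal sketch of the pattern to follow when connections are pooled: close each cursor as soon as its rows are consumed, rather than relying on connection close. This uses Python's sqlite3 purely as a stand-in for whatever client library the application actually uses (with Oracle you might use a library such as cx_Oracle, whose cursors expose the same close() pattern); the table and data are hypothetical.

```python
import sqlite3

# Stand-in for one connection taken from the application's pool.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER, body TEXT)")
conn.execute("INSERT INTO docs VALUES (1, 'oracle text demo')")

# Bad pattern: cursor left open -> the server keeps the result set's
# resources until the connection itself closes, which a pool never does.
# Good pattern: close the cursor in a finally block once rows are fetched.
cur = conn.cursor()
try:
    cur.execute("SELECT id FROM docs WHERE body LIKE ?", ("%oracle%",))
    rows = cur.fetchall()
finally:
    cur.close()  # releases the result set; the pooled connection stays open

print(rows)  # -> [(1,)]
```

The key design point is that the cursor's lifetime is scoped to one query, while the connection's lifetime is scoped to the pool.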
You can get an idea of the maximum memory Oracle will use by setting PRE_PAGE_SGA=TRUE in your init.ora file. When the database starts, Oracle will then allocate the full permitted amount of memory up front (the default is to allocate it only as needed). The figure can vary slightly depending on whether you are running in dedicated or shared server mode.
Pre-paging the SGA does have a small performance cost at startup, so you may want to set PRE_PAGE_SGA just long enough to ascertain what is happening and then switch it off again.
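For concreteness, an init.ora fragment for the experiment described above might look like the following - a sketch only, and the SGA sizes shown are hypothetical placeholders, not recommendations:

```
# init.ora -- sketch; size values below are hypothetical
PRE_PAGE_SGA = TRUE        # allocate/touch the full SGA at instance startup

# 9i SGA components you would size against the machine's physical RAM:
SHARED_POOL_SIZE = 100M
DB_CACHE_SIZE    = 200M
```

With PRE_PAGE_SGA enabled, the instance's memory footprint right after startup shows you roughly the maximum the SGA will ever claim, which you can compare against what the OS can provide.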
I had a quick look in the Oracle docs and on Google; there can be a number of reasons for the problem, but at first glance the most likely is that Oracle is configured to use more memory than the OS can (or, on *nix, is allowed to) provide.
Hopefully this helps (a repost of a Metalink response).