Okay, I admit it: I crib.
When pretending to be a good DBA I use IBM benchmark results and SAP documentation as crib sheets. Always good & reliable info.

Until now:
I was studying this document that Google found for me
http://www.evermeet.cx/public/presen...age_Layout.pdf
and on page 16 I learned:

the best way to limit the memory consumption of a database is to set instance_memory to a certain value and set database_memory to automatic.
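For reference, translated to the command line that advice comes down to something like this (SAMPLE is just a placeholder database name and the page count is only an example; adjust both for your own system):
Code:
# cap the instance at a fixed number of 4K pages (example value)
db2 UPDATE DBM CFG USING INSTANCE_MEMORY 1000000
# let STMM manage the database memory within that cap
db2 UPDATE DB CFG FOR SAMPLE USING DATABASE_MEMORY AUTOMATIC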

So I used that (on my test server). An easy way to configure parameters is using the good old db2cc. Here the confusion starts: according to this tool the maximum value for instance_memory is 524288 4K pages. The IBM documentation allows larger numbers, but db2cc does not let you enter them.
I was playing (test server, remember?) and left instance_memory at a value of 524288. This would become active after an instance restart... and I forgot about it.
During a massive test conversion on that server I encountered a disk-full condition and had to bring DB2 down by force and reboot. So DB2 had to do some crash recovery, who cares?
BUT: db2 connect to the database gave a
Code:
SQL1084C  Shared memory segments cannot be allocated.  SQLSTATE=57019
It appeared that instance_memory still had that low value of 524288 while database_memory had a much larger value (the way STMM had left it), so no go.
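If you want to check whether you are in the same boat, something like this shows both values without needing a connection (SAMPLE again is just a placeholder):
Code:
# instance-level memory limit (4K pages or AUTOMATIC)
db2 GET DBM CFG | grep -i instance_memory
# database-level setting as STMM left it on disk
db2 GET DB CFG FOR SAMPLE | grep -i database_memory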
Luckily I could solve it like this
Code:
db2 UPDATE DBM CFG USING INSTANCE_MEMORY automatic
But whenever you're in a crash recovery FOR REAL in your production environment, you have problems enough; you do not need this extra one! So I limit my databases by setting a fixed value in database_memory and leave instance_memory on automatic, always! Despite the advice found in the SAP documents.
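In command form, the setup I stick to now looks roughly like this (SAMPLE and the page count are placeholders; pick a value that fits your box):
Code:
# fixed cap per database, in 4K pages (example value)
db2 UPDATE DB CFG FOR SAMPLE USING DATABASE_MEMORY 524288
# let the instance limit follow the databases
db2 UPDATE DBM CFG USING INSTANCE_MEMORY AUTOMATIC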