and so on ..... all the way down to the account level. Parameters set
at the lowest level override the values set at a higher level. No matter
where the parameters are set, the requirement is to be able to see
the resolved values at the ACCOUNT - SEC_SUB_TYPE level.
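The "lowest level wins" resolution described above can be sketched in a few lines. This is just an illustration, not the actual schema: the level names beyond those mentioned in the post (firm, business_group, office, account) and the parameter names are made up.

```python
# Illustrative sketch of "lowest set level wins" parameter resolution.
# Level names and parameter names are hypothetical examples.

LEVELS = ["firm", "business_group", "office", "account"]  # highest to lowest

def resolve(param, settings):
    """Return the value from the lowest level that sets `param`.

    `settings` maps level name -> {param: value} for one account's chain;
    walking from highest to lowest lets lower levels overwrite higher ones.
    """
    value = None
    for level in LEVELS:
        level_params = settings.get(level, {})
        if param in level_params:
            value = level_params[param]  # a lower level overrides
    return value

settings = {
    "firm": {"commission_rate": 0.10, "max_position": 1000},
    "office": {"commission_rate": 0.08},
    "account": {"max_position": 500},
}
print(resolve("commission_rate", settings))  # 0.08 (office overrides firm)
print(resolve("max_position", settings))     # 500 (account overrides firm)
```

However the parameters end up being stored, this is the lookup semantics every design option has to reproduce.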
My first thought was to create one table for every level, meaning Firm
- Business_Group, Firm - Product_Group, etc. ..... Then I thought of
completely de-normalizing the hierarchy and carrying all
combinations in one big table. Both seem a little extreme, although I
must confess I am leaning towards the latter. I realize that the
maintenance would be higher because of the repeating elements, but the
parameters will change no more than once every couple of months.
Also, the number of records on both sides is fairly small.
The matrix I am thinking about would be at most a Cartesian
product between office and Sec_Sub_type. There are about 30 parameters
that need to be set in this manner.
Any thoughts on this? On one side I want to normalize this, because
that's what makes sense to me, but on the other side, I am thinking:
why do it and create 30-some tables in the process? Since everything
has to be resolved down to the lowest level, why not just create one
big de-normalized table?
This is an important design consideration for me, since there will be
other parameters that may follow a similar pattern in the future, and
performance is very important.
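One point worth noting: with normalized per-level tables, the resolved ACCOUNT-level view can still be produced in a single query, so the choice between many tables and one big table is mostly about storage and maintenance, not about what queries can return. A minimal sqlite3 sketch, with entirely hypothetical table and column names and only a firm/office/account chain for brevity:

```python
# Sketch: normalized per-level tables resolved with COALESCE.
# All names (firm_param, office_param, 'commission', 'NYC', ...) are
# illustrative assumptions, not the poster's schema.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE firm_param   (param TEXT PRIMARY KEY, value REAL);
CREATE TABLE office_param (office TEXT, param TEXT, value REAL,
                           PRIMARY KEY (office, param));
CREATE TABLE account_param(account TEXT, param TEXT, value REAL,
                           PRIMARY KEY (account, param));
CREATE TABLE account      (account TEXT PRIMARY KEY, office TEXT);

INSERT INTO firm_param    VALUES ('commission', 0.10);
INSERT INTO office_param  VALUES ('NYC', 'commission', 0.08);
INSERT INTO account_param VALUES ('A1', 'commission', 0.05);
INSERT INTO account       VALUES ('A1', 'NYC'), ('A2', 'NYC'), ('A3', 'BOS');
""")

# COALESCE picks the first non-NULL value, so the lowest level that
# sets the parameter wins, falling back upward to office, then firm.
rows = con.execute("""
SELECT a.account,
       COALESCE(ap.value, op.value, fp.value) AS commission
FROM account a
LEFT JOIN account_param ap ON ap.account = a.account AND ap.param = 'commission'
LEFT JOIN office_param  op ON op.office  = a.office  AND op.param  = 'commission'
LEFT JOIN firm_param    fp ON fp.param = 'commission'
ORDER BY a.account
""").fetchall()
print(rows)  # [('A1', 0.05), ('A2', 0.08), ('A3', 0.1)]
```

Wrapping a query like this in a view would give the required ACCOUNT-level picture on top of normalized storage; the big-table design instead materializes these rows up front.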
You should probably think about how the datasets are likely to be accessed, and try to optimize for the actions that are performed most often. Will the datasets be accessed as a whole (meaning you need all the information for each dataset), or will people more likely access only the lowest level of information (the lowest parts of the hierarchy)?
If you need to display all, or mostly all, of the information in a high percentage of accesses, then denormalisation will probably be the better solution rather than normalisation, and the other way round in the opposite case.