Which uses more space, a table or a column, and by how much (bytes, percentage, etc.)?
Is denormalization a good thing if it reduces the complexity of a design? And if it improves the semantics of a design?
Is it normal for a schema to contain 80+ tables? By amalgamating related columns I could shrink the number of tables down to around the 35-40 mark. Is this a good idea, even though it will add extra columns to certain tables?
Denormalization: I'd say that - although it is good to have normalized tables - sometimes it is better to leave (some of) them denormalized if this improves performance (a simple example: executing a report against one huge denormalized table can be faster than joining several huge normalized tables. Of course, you have to fill that table, probably using those same joins, but this can be done in the background - say daily or weekly - rather than during the user's reporting session).
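To make that concrete, here is a minimal sketch of the idea using Python's sqlite3 module (all table and column names are made up for illustration): the join is paid once when the denormalized reporting table is (re)built, e.g. in a nightly batch, so report queries read a single flat table.

```python
import sqlite3

# Hypothetical normalized schema: orders reference customers.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id),
                         amount REAL);
    INSERT INTO customers VALUES (1, 'Acme', 'EU'), (2, 'Globex', 'US');
    INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);

    -- Denormalized reporting table: the join runs once, at load time,
    -- not on every report query. Rebuild it as often as freshness requires.
    CREATE TABLE report_orders AS
        SELECT o.id AS order_id, c.name, c.region, o.amount
        FROM orders o JOIN customers c ON c.id = o.customer_id;
""")

# Reports now read one flat table -- no joins at query time.
rows = con.execute(
    "SELECT region, SUM(amount) FROM report_orders "
    "GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('EU', 150.0), ('US', 75.0)]
```

The trade-off is the usual one: faster, simpler reads in exchange for redundant storage and a refresh job you now have to maintain.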
As for your third question ... the number of tables can't be judged that way. Some applications require only a small number of tables (for example, the scott/tiger demo schema), while - on the other hand - complex problems require many more tables (another example: my current project, which involves 3 "independent" teams with ~50 developers, has more than 200 tables).
It is up to you whether you "shrink" the number of your tables or not. If you think the current state isn't good, change it (if you can). However, keep in mind that "if it isn't broken, don't fix it" (meaning you might make things worse instead of better).
P.S. I checked the total number of tables when I got back to work ... 549 as of today. Gosh!