As the saying goes, "when is this assignment due?"
Normalization usually reduces the overall size of the database (advantage), greatly simplifies updates (advantage), ensures data integrity (huge advantage), and may mean slightly slower SELECT queries (disadvantage).
Denormalization is the exact opposite.
Typical candidates for denormalization? Data warehouses or data marts featuring summarized, temporal data, where codes and keys have been translated into human-readable descriptions.
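As a sketch of that kind of reporting table, here is a minimal example using SQLite via Python's sqlite3 module; the orders/status_codes schema and table names are invented for illustration, not taken from any real system:

```python
import sqlite3

# Hypothetical normalized schema: orders store a status *code*;
# a lookup table translates codes into descriptions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE status_codes (code TEXT PRIMARY KEY, description TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY,
                     status TEXT REFERENCES status_codes(code));
INSERT INTO status_codes VALUES ('S', 'Shipped'), ('P', 'Pending');
INSERT INTO orders VALUES (1, 'S'), (2, 'P'), (3, 'S');

-- Denormalized reporting copy: the code is resolved once, at load time,
-- so report queries need no join (at the cost of duplicated descriptions).
CREATE TABLE orders_report AS
SELECT o.id, s.description AS status
FROM orders o JOIN status_codes s ON o.status = s.code;
""")
print(cur.execute("SELECT status FROM orders_report ORDER BY id").fetchall())
# → [('Shipped',), ('Pending',), ('Shipped',)]
```

Each report row now carries the full human-readable description, duplicated per order, which is exactly the trade the warehouse makes: redundancy in exchange for simpler, faster reads.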
Denormalisation is the act of keeping a single table containing all the information about something. It results in data redundancy, and when an update misses some of the duplicated copies, in inaccurate data — in other words, data in an inconsistent state.
Data normalization is the act of breaking down column data and placing it in tables where each column is functionally dependent on only one primary key. This process reduces data storage costs by eliminating redundancy, and it ensures that any column in the "normalized" database depends on only one primary key.
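A minimal sketch of that process, using Python's sqlite3 module with an invented flat_orders table: the customer's city is repeated on every order row, so it is split out into a table where it depends only on the customer key.

```python
import sqlite3

# Flat (unnormalized) table: city is repeated on every order row,
# even though it depends on the customer, not the order.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE flat_orders (order_id INTEGER, customer TEXT, city TEXT);
INSERT INTO flat_orders VALUES
    (1, 'Ada', 'London'), (2, 'Ada', 'London'), (3, 'Bob', 'Paris');

-- Normalized: city is stored once per customer, not once per order.
CREATE TABLE customers AS SELECT DISTINCT customer, city FROM flat_orders;
CREATE TABLE orders    AS SELECT order_id, customer FROM flat_orders;
""")
# Updating Ada's city now touches one row instead of every order row.
cur.execute("UPDATE customers SET city = 'Leeds' WHERE customer = 'Ada'")
print(cur.rowcount)  # → 1
```

This is the "greatly simplifies updates" advantage from the first answer in concrete form: one fact, one place, one row to change.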
My take is that denormalization is a process or decision that occurs AFTER normalization, where for practical or performance reasons you make a conscious decision to break a normalization rule.
So to begin with, your data structure is unnormalized. Then you normalize it. Then you may make a decision to denormalize some part of it.
Of course, all these processes could take place within a very short time of each other, or even may occur pretty much simultaneously.
You may even decide on reflection that something has not been fully normalized, but justify this state for practical or performance reasons.
You could also think of denormalizing as flattening out the data structure, typically for exporting data to another system. This does not necessarily mean keeping tables in that state; it may just be the process that produces the export file.
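A sketch of that flattening-for-export idea, again with made-up table names: join the normalized tables back into one wide row set and write it out as CSV, without storing any denormalized table at all.

```python
import sqlite3
import csv
import io

# Two normalized tables; the flat structure exists only in the export.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
INSERT INTO customers VALUES (1, 'Ada'), (2, 'Bob');
INSERT INTO orders VALUES (10, 1, 9.5), (11, 2, 4.0);
""")

# Flatten via a join: one wide row per order, customer name inlined.
rows = cur.execute("""
    SELECT o.id, c.name, o.total
    FROM orders o JOIN customers c ON c.id = o.customer_id
    ORDER BY o.id
""").fetchall()

# Write the flattened rows to a CSV "export file" (an in-memory buffer here).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["order_id", "customer", "total"])
writer.writerows(rows)
print(buf.getvalue())
```

The database itself stays normalized; only the export file is flat.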