I have an application that needs to handle huge amounts of data. I receive roughly 100,000 records per hour and need to store this information for several months. I would like to use MySQL as my DB.
I have devised a set of archiving tables (EveryHour, EveryDay, EveryWeek and EveryMonth). I populate each table from the previous level after crunching the data. I am also considering using multiple tables within the same level (as in EveryDay1, EveryDay2, etc.). A MetaData table would then record how the data is spread over these multiple tables, based on a start index and end index.
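To make the scheme concrete, here is a rough sketch of the MetaData lookup table and one of the split day-level tables. All table and column names beyond those mentioned above are hypothetical placeholders, not a fixed design:

```sql
-- Hypothetical sketch of the MetaData table described above.
-- It maps each archive table to the index range it holds.
CREATE TABLE MetaData (
    table_name  VARCHAR(64) NOT NULL,  -- e.g. 'EveryDay1', 'EveryDay2'
    start_index BIGINT      NOT NULL,  -- first record index stored in that table
    end_index   BIGINT      NOT NULL,  -- last record index stored in that table
    PRIMARY KEY (table_name)
);

-- Each EveryDayN table holds one slice of the day-level archive.
-- The value column is a placeholder for the actual crunched metrics.
CREATE TABLE EveryDay1 (
    record_index BIGINT   NOT NULL,
    recorded_at  DATETIME NOT NULL,
    value        DOUBLE   NOT NULL,
    PRIMARY KEY (record_index)
);
```

A query for a given index range would first consult MetaData to find which EveryDayN table(s) cover that range, then query those tables directly.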
Somehow I feel there must be a more optimal way of doing this with a different DB design.
I don't suppose MySQL offers table partitioning? For example, Oracle lets you have a single table that is partitioned by some criteria, which enables easier data management (old partitions can be taken off-line, for example).
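For what it's worth, MySQL does have native partitioning (introduced in MySQL 5.1), and the kind of setup described above might be sketched like this. The table and column names, and the chosen ranges, are hypothetical:

```sql
-- Sketch of a range-partitioned table in MySQL 5.1+.
-- Rows are assigned to partitions by their date.
CREATE TABLE records (
    recorded_at DATETIME NOT NULL,
    value       DOUBLE   NOT NULL
)
PARTITION BY RANGE (TO_DAYS(recorded_at)) (
    PARTITION p2008_01 VALUES LESS THAN (TO_DAYS('2008-02-01')),
    PARTITION p2008_02 VALUES LESS THAN (TO_DAYS('2008-03-01')),
    PARTITION pmax     VALUES LESS THAN MAXVALUE
);

-- Old data can then be removed cheaply by dropping a whole partition:
-- ALTER TABLE records DROP PARTITION p2008_01;
```

This would replace the manual EveryDay1/EveryDay2 split and the MetaData bookkeeping: the server itself routes queries to the relevant partitions.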