I have a database whose tables will grow to gigabytes in size. The problem is that the size keeps increasing, and we cannot purge the database before backing it up, because our web reports rely on the old data.
The data files will soon hit the operating system's file-size limit. What should I do to keep things running? We are currently running MySQL on Windows 2000 Server.
If you could describe the nature of this data and the nature of the reports you produce on it, someone could provide specific guidance.
If this data is time related (values stored over time where the date/time is important) and you must keep all of it available, one choice would be to summarize old data. For example, if you store multiple samples per day but your reports always process the data in the same way (min, max, average, count, and so on), then pre-aggregate the data and store one summary row per day instead of every individual sample for that day.
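As a rough sketch of that approach (all table and column names here are hypothetical, so adjust them to your schema):

```sql
-- Hypothetical: `samples` holds the raw readings,
-- `daily_summary` stores one pre-aggregated row per day.
CREATE TABLE daily_summary (
    sample_date DATE NOT NULL PRIMARY KEY,
    min_value   DOUBLE,
    max_value   DOUBLE,
    avg_value   DOUBLE,
    sample_cnt  INT
);

-- Roll up everything older than a chosen cutoff date.
INSERT INTO daily_summary
SELECT DATE(sampled_at), MIN(value), MAX(value), AVG(value), COUNT(*)
FROM samples
WHERE sampled_at < '2006-01-01'
GROUP BY DATE(sampled_at);

-- After verifying the summary, the raw rows can be deleted:
-- DELETE FROM samples WHERE sampled_at < '2006-01-01';
```

Reports that previously aggregated the raw rows would then read from `daily_summary` for old dates, which shrinks storage dramatically while keeping the report results identical.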
You could also break up the data by time period (one database per year, for example) and store it in separate databases. This would require that your reports query each database whose data falls within the report's date range.
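A minimal sketch of the per-year split, assuming a main database `reports` with a `samples` table (names are illustrative):

```sql
-- Create an archive database for one year and copy that
-- year's rows into it, keeping the same table definition.
CREATE DATABASE archive_2004;
CREATE TABLE archive_2004.samples LIKE reports.samples;

INSERT INTO archive_2004.samples
SELECT * FROM reports.samples
WHERE sampled_at BETWEEN '2004-01-01 00:00:00'
                     AND '2004-12-31 23:59:59';

-- A report spanning several years queries each archive, e.g.:
SELECT COUNT(*) FROM archive_2004.samples
UNION ALL
SELECT COUNT(*) FROM archive_2005.samples;
```

The cost of this approach is in the reporting layer: every query that can cross a year boundary has to know which databases to touch.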
If the data and reports are organized by some sub-value and are always filtered on that sub-value, such as a client's name, you could create separate databases for each letter of the alphabet (A, B, C...) or even for each client's name or ID.
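The per-client variant looks much the same; here is a sketch with hypothetical names (`main.orders`, `client_id`):

```sql
-- One database per client ID; reports for that client
-- only ever open this one database.
CREATE DATABASE client_1042;
CREATE TABLE client_1042.orders LIKE main.orders;

INSERT INTO client_1042.orders
SELECT * FROM main.orders
WHERE client_id = 1042;
```

This only pays off if reports really are always filtered by that value; a report that crosses clients would have to visit many databases.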
You could also use InnoDB tables and configure a shared tablespace made up of multiple files (each file staying under the file system's 2 GB or 4 GB limit). If you spread these files over multiple disks you might even gain some performance. Alternatively, take a look at partitioning, described in chapter 18 of the MySQL manual. Note that this is a new feature that requires one of the non-production-ready versions of MySQL, but you can experiment with it now so that you can use it in the (hopefully rather near) future when it becomes available in a stable version.
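The multi-file InnoDB tablespace is configured with `innodb_data_file_path` in the server's option file; a sketch for Windows, with example paths on three drives:

```ini
# my.ini fragment: the shared InnoDB tablespace is split across
# several files, each kept below the OS file-size limit.
# Paths and sizes are examples only.
[mysqld]
innodb_data_home_dir =
innodb_data_file_path = C:/mysqldata/ibdata1:2000M;D:/mysqldata/ibdata2:2000M;E:/mysqldata/ibdata3:2000M:autoextend
```

And the partitioning feature mentioned above would let a single logical table span several physical pieces, along these lines (syntax from the 5.1 manual; table layout is hypothetical):

```sql
CREATE TABLE samples (
    sampled_at DATETIME NOT NULL,
    value      DOUBLE
)
PARTITION BY RANGE (YEAR(sampled_at)) (
    PARTITION p2004 VALUES LESS THAN (2005),
    PARTITION p2005 VALUES LESS THAN (2006),
    PARTITION pmax  VALUES LESS THAN MAXVALUE
);
```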