I am evaluating MySQL for a large database project, and I am hopeful it is up to the task, as I would like to "win" the battle with the Oracle heads in the organisation.
I have generated about 2GB of test data ready to load; the final database will be in the 30GB range.
I am familiar with MySQL, but only with smaller (web back-end) databases; I have never used it at this scale before (Oracle and SQL Server, yes). The system will primarily be used for data warehousing/reporting, i.e. not OLTP.
The application will involve the following:
Vouchers (c. 100M records at 100 bytes each, growing by 3M records per month).
Transactions (c. 200M records at 100 bytes each, growing by 5M records per month).
I will need to summarise the transactions into monthly sets, and also validate transactions against the relevant vouchers (i.e. update each transaction with a voucherID reference).
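To make the two jobs concrete, this is roughly the SQL I have in mind (table and column names here are illustrative, not my actual schema):

```sql
-- Validation: link each transaction to its voucher.
UPDATE transactions t
JOIN vouchers v ON v.voucher_code = t.voucher_code
SET t.voucher_id = v.id
WHERE t.voucher_id IS NULL;

-- Summarisation: roll transactions up into monthly sets.
INSERT INTO transactions_monthly (yr, mon, txn_count, txn_total)
SELECT YEAR(txn_date), MONTH(txn_date), COUNT(*), SUM(amount)
FROM transactions
GROUP BY YEAR(txn_date), MONTH(txn_date);
```

The update join in particular worries me at 200M rows, which is part of why I am asking about table layout below.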
Are there any particular tips for structuring the data for optimal performance in MySQL? The regular processing will be:
* Daily load of a CSV dump of c. 250k transactions into the transactions table
* Periodic loading of voucher batch files containing 50-300k records into the vouchers table
* Processing of transactions - flagging used vouchers
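For the daily CSV load I was planning on `LOAD DATA INFILE`, something along these lines (file path and column list are assumptions for illustration):

```sql
-- Bulk-load the daily transaction dump.
LOAD DATA INFILE '/data/transactions_daily.csv'
INTO TABLE transactions
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(txn_date, voucher_code, amount);

-- Then flag vouchers that have now been used.
UPDATE vouchers v
JOIN transactions t ON t.voucher_id = v.id
SET v.used = 1;
```

If there is a faster or more robust pattern for loads of this size in MySQL, I would like to hear it.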
Is it best to split the vouchers and transactions into separate tables per batch (vouchers) and per month (transactions), with a control (index) table for each?
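To illustrate the manual split I mean (names and the example month are purely illustrative):

```sql
-- One physical table per month of transactions.
CREATE TABLE transactions_2024_01 LIKE transactions;

-- Control table mapping each month to its physical table.
CREATE TABLE transactions_control (
  table_name  VARCHAR(64) NOT NULL PRIMARY KEY,
  month_start DATE        NOT NULL,
  month_end   DATE        NOT NULL,
  row_count   INT UNSIGNED
);
```

Or is this kind of hand-rolled splitting something MySQL handles better natively, making the control tables unnecessary?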