I can't offer any tips, but I am interested in what you're trying to store.
What does the database hold, and what's in the 400GB table? I assume it's time-series data, but it still seems like a lot. What type of reports do you intend to run? Will they cover one day or one item, or are you hoping to run a report across the whole database?
It's telephone calls. We insert anywhere from 1.5 million to 6 million rows each day, so three months' worth of data is roughly 130GB. We don't run any reports on this table directly. Instead, we build satellite tables that hold only the information needed for whichever report is being called up. Those satellite tables are populated from the much larger calls table by cron jobs firing off PHP scripts, so there is a lot of back-end processing going on. It's this back-end processing that concerns me, as there are several such jobs that work directly with the larger table.
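To illustrate the satellite-table idea, here is a minimal sketch in Python with SQLite standing in for the real database and PHP scripts. The table and column names (`calls`, `daily_call_counts`, `call_date`, `duration`) are hypothetical; the point is that the cron-driven rollup rebuilds a small summary table from the big table, and reports only ever query the small one.

```python
import sqlite3

# Hypothetical schema: a large `calls` table and a small `daily_call_counts`
# satellite table that reports query instead of the big table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE calls (id INTEGER PRIMARY KEY, call_date TEXT, duration INTEGER)"
)
conn.execute(
    "CREATE TABLE daily_call_counts "
    "(call_date TEXT PRIMARY KEY, n_calls INTEGER, total_duration INTEGER)"
)

# Sample rows standing in for one batch of inserts.
conn.executemany(
    "INSERT INTO calls (call_date, duration) VALUES (?, ?)",
    [("2024-01-01", 60), ("2024-01-01", 120), ("2024-01-02", 30)],
)

# The cron-style rollup: rebuild the satellite table from the big table.
conn.execute("DELETE FROM daily_call_counts")
conn.execute(
    "INSERT INTO daily_call_counts "
    "SELECT call_date, COUNT(*), SUM(duration) FROM calls GROUP BY call_date"
)
conn.commit()

print(conn.execute(
    "SELECT * FROM daily_call_counts ORDER BY call_date"
).fetchall())
# [('2024-01-01', 2, 180), ('2024-01-02', 1, 30)]
```

The rollup still scans the big table, which is exactly the back-end load described above; incremental rollups (only aggregating rows since the last run) are the usual way to reduce that.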
I have the same problem: about 60,000 rows are logged per minute. Inserting is fine, but when data is requested for reporting, it takes too long.
A friend of mine advised using distributed databases: a master database plus one or more summary databases. The master DB holds only the latest data, and the others store the older data. Data in the master database is moved to the other databases periodically, or once it exceeds a size limit. That way the master DB stays small and searching it is much faster.
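The move step described above can be sketched as a copy-then-delete inside one transaction. This is a toy version using SQLite's `ATTACH` to stand in for a second database server; the table name `log`, the timestamp column `ts`, and the cutoff value are all assumptions for illustration.

```python
import sqlite3

# "Master" database holding the latest data.
master = sqlite3.connect(":memory:")
master.execute("CREATE TABLE log (id INTEGER PRIMARY KEY, ts TEXT, value REAL)")
master.executemany(
    "INSERT INTO log (ts, value) VALUES (?, ?)",
    [("2024-01-01", 1.0), ("2024-02-01", 2.0), ("2024-03-01", 3.0)],
)

# Attach a second database standing in for the archive/summary server.
master.execute("ATTACH DATABASE ':memory:' AS archive")
master.execute(
    "CREATE TABLE archive.log (id INTEGER PRIMARY KEY, ts TEXT, value REAL)"
)

# Periodic job: move everything older than the cutoff out of the master.
cutoff = "2024-02-15"  # hypothetical retention boundary
master.execute("INSERT INTO archive.log SELECT * FROM log WHERE ts < ?", (cutoff,))
master.execute("DELETE FROM log WHERE ts < ?", (cutoff,))
master.commit()

print(master.execute("SELECT COUNT(*) FROM log").fetchone()[0])          # 1
print(master.execute("SELECT COUNT(*) FROM archive.log").fetchone()[0])  # 2
```

On a real server you'd more likely use partitioning or a replicated archive instance rather than `ATTACH`, but the copy-then-delete shape of the job is the same.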