Unanswered: DB Design For Realtime Data logging
I have an application (VB) that collects realtime data across an E-Net network. Through testing I have found the fastest way to get the data into the table is a bulk insert from a CSV file.
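For reference, here is a minimal sketch (in Python rather than VB, with hypothetical file and column names) of the kind of batching I mean: collect one second's worth of samples, write them to a CSV file, then hand that file to the database's bulk-insert facility.

```python
import csv
from datetime import datetime, timezone

def write_batch(path, samples):
    """Write one second's (data_point_id, value) samples to a CSV file.

    Each row is Timestamp, DataPointID, Value; the ID column is assumed
    to be an identity/auto-number column filled in by the database, so
    it is not written here.
    """
    stamp = datetime.now(timezone.utc).isoformat()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for point_id, value in samples:
            writer.writerow([stamp, point_id, value])

# Example: two sampled points for this second
write_batch("batch.csv", [(101, 3.14), (102, 2.71)])
```

The app would then issue the bulk-insert command against "batch.csv" once per batch instead of one INSERT per point.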
I have several hundred data points that I want to capture every second. I was thinking of one table with one row (ID, Timestamp, DataPointID, Value) per point per second. That works out to about 17 million records a day.
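As a sanity check on that volume (assuming roughly 200 points, each sampled once per second):

```python
points_per_second = 200          # "several hundred" data points
seconds_per_day = 24 * 60 * 60   # 86,400 seconds in a day

rows_per_day = points_per_second * seconds_per_day
print(rows_per_day)  # 17280000 -- roughly the 17 million quoted
```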
Is that too much? Should I set up one table per day, week, or year? Or should I have a daily table plus an archive table for all data older than a day?
The data points being logged can change as the operators need, so I do not think one column per data point is feasible.
I was hoping to get some feedback on this from someone who is collecting realtime data.
Hope is the feeling that the feeling you have will not last very long.