  1. #1
    Join Date
    Oct 2003

Unanswered: DB Design for Real-Time Data Logging

I have an application (VB) that collects real-time data across an E-Net network. Through testing I have found that the fastest way to get the data into the table is a bulk insert from a CSV file.

I have several hundred data points that I want to capture every second. I was thinking of one table with one row (ID, Timestamp, DataPointID, Value) per point per second. That is about 17 million records a day.
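The poster doesn't name the database, so as a neutral stand-in here is the narrow "one row per reading" layout sketched in SQLite, along with the arithmetic behind the 17-million-rows-a-day figure (roughly 200 points sampled once per second):

```python
import sqlite3

# Hypothetical sketch of the narrow one-row-per-reading table;
# the actual DBMS isn't stated in the thread, SQLite stands in here.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Readings (
        ID          INTEGER PRIMARY KEY AUTOINCREMENT,
        Timestamp   TEXT    NOT NULL,
        DataPointID INTEGER NOT NULL,
        Value       REAL    NOT NULL
    )
""")

# Rough volume check: ~200 points, one sample per second each.
points_per_second = 200
rows_per_day = points_per_second * 86_400   # 86,400 seconds in a day
print(rows_per_day)  # 17,280,000 -- about 17 million rows a day
```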

Is that too much? Should I set up one table per day, week, or year? Should I have a daily table and an archive table for all data older than a day?
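One way to read the "daily table plus archive" idea is a nightly job that moves everything older than the cutoff into an archive table. A minimal sketch, again assuming SQLite and hypothetical table names `Readings_Today` and `Readings_Archive`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Readings_Today (
        ID INTEGER PRIMARY KEY, Timestamp TEXT, DataPointID INTEGER, Value REAL);
    CREATE TABLE Readings_Archive (
        ID INTEGER PRIMARY KEY, Timestamp TEXT, DataPointID INTEGER, Value REAL);
""")

# Seed two rows: one from yesterday, one from today.
conn.executemany(
    "INSERT INTO Readings_Today (Timestamp, DataPointID, Value) VALUES (?, ?, ?)",
    [("2003-10-01 23:59:59", 1, 42.0), ("2003-10-02 08:00:00", 1, 43.5)],
)

def roll_over(cutoff: str) -> None:
    """Move rows older than `cutoff` into the archive, in one transaction."""
    with conn:
        conn.execute(
            "INSERT INTO Readings_Archive (Timestamp, DataPointID, Value) "
            "SELECT Timestamp, DataPointID, Value FROM Readings_Today "
            "WHERE Timestamp < ?", (cutoff,))
        conn.execute("DELETE FROM Readings_Today WHERE Timestamp < ?", (cutoff,))

roll_over("2003-10-02 00:00:00")
print(conn.execute("SELECT COUNT(*) FROM Readings_Archive").fetchone()[0])  # 1
```

Keeping the live table to one day of rows keeps its indexes small for the per-second bulk inserts, while the archive grows without affecting insert speed.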

The data points being logged can change as the operators need. Because of this, I do not think one column per data point is feasible.

I was hoping to get some feedback on this from someone who is collecting real-time data.

Hope is the feeling that the feeling you have will not last very long.

  2. #2
    Join Date
    Jul 2002
    Village, MD
It depends on the business logic of your application. How long do you have to keep data available for updating? (After that time you could move the data to a warehouse.) Do you have to keep all of this data after updating, for analysis? You could split the data into parts by type, if any. Sometimes it is useful to have a header and details for 'each data point'. Provide more details and hopefully you will get more advice.
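The "header and details" suggestion can be sketched as a small header table describing each configured point, with the per-second samples as detail rows referencing it. Table and column names below are illustrative, not from the thread; SQLite stands in for the unnamed DBMS. This also addresses the original poster's concern: operators adding or dropping a point is a data change, not a schema change.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Header: one row per sensor/tag the operators configure.
    CREATE TABLE DataPoint (
        DataPointID INTEGER PRIMARY KEY,
        Name        TEXT NOT NULL,
        Units       TEXT
    );
    -- Details: one row per sample, referencing the header.
    CREATE TABLE Reading (
        Timestamp   TEXT    NOT NULL,
        DataPointID INTEGER NOT NULL REFERENCES DataPoint(DataPointID),
        Value       REAL    NOT NULL
    );
""")

conn.execute("INSERT INTO DataPoint VALUES (1, 'BoilerTemp', 'degC')")
conn.execute("INSERT INTO Reading VALUES ('2003-10-02 08:00:00', 1, 97.3)")

# Join details back to their header to label the raw values.
row = conn.execute("""
    SELECT d.Name, d.Units, r.Value
    FROM Reading r JOIN DataPoint d ON d.DataPointID = r.DataPointID
""").fetchone()
print(row)  # ('BoilerTemp', 'degC', 97.3)
```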
