  1. #1
    Join Date
    Aug 2011
    Posts
    3

    Unanswered: How to handle millions of records.

    Hi

    I'm programming a website in which (possibly tens of) thousands of records have to be stored for every user.

    With 1000+ users the table size might quickly run up to millions of records.

    Will this affect query times drastically, and is it better to use a separate table for every user? Or will loading times not be a problem using just one table with millions of records (obviously using indexing)?

    Thanks

  2. #2
    Join Date
    Sep 2009
    Location
    San Sebastian, Spain
    Posts
    880
    You need to look more closely at your application and how it will interact with the data in the database. For example, is the data going to be read-only, in other words will no changes happen to the data once it is written? Also, will users access only the current year's data, or all of it?

    On the flip side, how stable is the software? Could table changes happen in the future? Imagine trying to apply a schema change across thousands of tables.

    If you are going to be managing large amounts of data, have a look at partitioning, which helps with this. But you need to determine a good partitioning policy to get the most from the feature, and for that you need to fully understand your application and what is happening with the data.
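    As a rough sketch of what a yearly partitioning policy could look like (the table and column names here are just made up for illustration, and MySQL requires the partitioning column to be part of every unique key, hence the composite primary key):

    ```sql
    -- Hypothetical tickets table, RANGE-partitioned by year so that
    -- "current year" queries only have to touch one partition.
    CREATE TABLE tickets (
        ticket_id  BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
        party_id   INT UNSIGNED    NOT NULL,
        name       VARCHAR(100)    NOT NULL,
        email      VARCHAR(255)    NOT NULL,
        created_at DATE            NOT NULL,
        PRIMARY KEY (ticket_id, created_at)
    )
    PARTITION BY RANGE (YEAR(created_at)) (
        PARTITION p2010 VALUES LESS THAN (2011),
        PARTITION p2011 VALUES LESS THAN (2012),
        PARTITION pmax  VALUES LESS THAN MAXVALUE
    );
    ```

    A query that filters on created_at can then be pruned down to the relevant partition(s) instead of scanning the whole table. Whether this is worth it depends on the access pattern, which is why you need to understand the application first.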
    Ronan Cashell
    Certified Oracle DBA/Certified MySQL Expert (DBA & Cluster DBA)
    http://www.it-iss.com
    Follow me on Twitter

  3. #3
    Join Date
    Aug 2011
    Posts
    3
    First of all thanks for your reply.

    An example: a website where party organisers can add their party. People who subscribe on that website (entering a name and e-mail) get a mail with a ticket. The organisers can download the e-mail addresses and names of the people who got a ticket (per party, or for all the parties they've run that year or in previous years).

    Because it's a website, adding records (with name and e-mail) has to go smoothly, even if it's the 10 millionth record added, and downloading the names/e-mails per party should be fast as well.

    Only it's not actually a ticket website, and there are 2000+ "party organisers" and roughly a thousand "tickets" per party.

    It's not very likely that the tables will have to be changed.

    So I'm wondering what's best: one table for all the tickets, or a different table per organiser?

    Thanks in advance

    PS I'm using php and Mysql
    Last edited by ramus; 08-16-11 at 14:36.

  4. #4
    Join Date
    Mar 2004
    Posts
    480
    Look up DATABASE NORMALIZATION so you understand the concept. You would want one table for all of the tickets rather than a table per organiser.

    With properly normalized data, good indexes, and a correctly tuned server, MySQL can easily handle very large amounts of data: terabytes, hundreds of millions of rows, and more.
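    For your example, a normalized single-table design might look something like this (table and column names are only illustrative, not something you have to use):

    ```sql
    -- One row per organiser, per party, per ticket -- never a table per organiser.
    CREATE TABLE organisers (
        organiser_id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        name         VARCHAR(100) NOT NULL
    );

    CREATE TABLE parties (
        party_id     INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        organiser_id INT UNSIGNED NOT NULL,
        held_on      DATE         NOT NULL,
        FOREIGN KEY (organiser_id) REFERENCES organisers (organiser_id),
        KEY idx_parties_organiser (organiser_id)
    );

    CREATE TABLE tickets (
        ticket_id BIGINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        party_id  INT UNSIGNED    NOT NULL,
        name      VARCHAR(100)    NOT NULL,
        email     VARCHAR(255)    NOT NULL,
        FOREIGN KEY (party_id) REFERENCES parties (party_id),
        KEY idx_tickets_party (party_id)
    );

    -- "Download" all names/e-mails for one organiser's parties in a given year:
    SELECT t.name, t.email
    FROM   tickets t
    JOIN   parties p ON p.party_id = t.party_id
    WHERE  p.organiser_id = 42
      AND  p.held_on BETWEEN '2011-01-01' AND '2011-12-31';
    ```

    With the index on party_id (and organiser_id), that lookup stays fast even when the tickets table holds millions of rows, because MySQL only walks the matching index entries rather than scanning the whole table.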

  5. #5
    Join Date
    Aug 2011
    Posts
    3
    I know about data normalization; I just wasn't sure MySQL would be able to process such vast amounts of data.

    Thanks for the help!

  6. #6
    Join Date
    Mar 2004
    Posts
    480
    Quote Originally Posted by ramus View Post
    I know about data normalization; I just wasn't sure MySQL would be able to process such vast amounts of data.
    So I'm wondering what's best: one table for all the tickets, or a different table per organiser?
    Those two statements seem to be at odds with one another.
