  1. #1
    Join Date
    May 2012
    Posts
    2

    Unanswered: Databases VS Tables

    Hi

    I'm working on quite a complex project and would appreciate any feedback or guidance on achieving the most efficient database setup.

    The Project

    I'm developing a hosted visitor tracking and analysis service for affiliate marketers. The service will allow affiliate marketers to view and analyse their visitor metrics and behaviour in great detail, track their marketing campaigns and analyse overall campaign performance.

    The Data

    Each user will require access to 10-20 tables unique to their own user ID. Within these tables, I'll be storing the information relevant to their account. This information will be fairly extensive and I'd expect some tables to contain over 100,000 rows of visitor data.

    The Dilemma

    I'm unsure of the best way to store the data for efficiency. The two scenarios I have envisaged are:

    1 - Create a new database for each client that contains the 10-20 tables needed to store and analyse their data.

    2 - Add 10-20 new tables to a 'main' database, each appended with the client's unique user ID (users_123456, clicks_123456, campaigns_123456, etc.)
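
    For illustration, scenario 2 would mean running something like this for every new client who signs up (MySQL-flavoured, column definitions invented for the example):

        -- one of the 10-20 per-client tables; the whole set is
        -- repeated, with a new suffix, for every client
        CREATE TABLE clicks_123456 (
          click_id   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
          visited_at DATETIME     NOT NULL,
          url        VARCHAR(255) NOT NULL
        );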

    Let's say, for the sake of argument, that I have 1,000 clients using the service.

    This would mean in scenario 1:
    1,000 databases
    20 tables per database (20,000 total)
    100,000-200,000 rows of data per database (100-200 million total)

    Scenario 2:
    1 database
    20,000 tables
    100-200 million rows of data


    I'd greatly appreciate any feedback or thoughts on the best practice for this situation. Which scenario would be the most efficient in terms of running queries and allowing the users to view their data as quickly as possible?

    Many Thanks

    Ste

  2. #2
    Join Date
    Apr 2002
    Location
    Toronto, Canada
    Posts
    20,002
    are those the only choices? i vote neither

    my solution:

    1 database
    10-20 tables
    unique client id to distinguish rows
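
    a rough sketch of what i mean (mysql-flavoured, table and column names invented for the example):

        -- one shared table instead of one table per client
        CREATE TABLE clicks (
          click_id   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
          client_id  INT UNSIGNED NOT NULL,
          visited_at DATETIME     NOT NULL,
          url        VARCHAR(255) NOT NULL
        );

        -- every query simply filters on the client
        SELECT visited_at
             , url
          FROM clicks
         WHERE client_id = 123456;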
    rudy.ca | @rudydotca
    Buy my SitePoint book: Simply SQL

  3. #3
    Join Date
    May 2012
    Posts
    2
    Hi r937, and thanks for your feedback.

    Nothing is set in stone at the moment - the suggestion you have made could also be implemented.

    I'm certainly no expert in database efficiency, so I'll happily take on board more qualified opinions.

    Would your method be significantly faster on average than the methods I outlined, or are there other major advantages to your solution?

    Wouldn't searching through 200 million rows in a single table by client id be hard going for the server - particularly if another thousand users were doing the same?

    Cheers

    Ste

  4. #4
    Join Date
    Apr 2002
    Location
    Toronto, Canada
    Posts
    20,002
    Quote Originally Posted by stebop
    Wouldn't searching through 200 million rows in a single table by client id be hard going for the server - particularly if another thousand users were doing the same?
    if they were doing sequential reads, sure

    that's why we index our tables and optimize our queries
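
    for example, with the shared clicks table from post #2, an index that leads with the client id means the server only reads that one client's slice of the table, never all 200 million rows (index name and second column invented for the example):

        -- composite index: ordered by client first, then by visit time
        CREATE INDEX ix_clicks_client ON clicks ( client_id , visited_at );

        -- this is now an index range scan, not a full table scan
        SELECT visited_at
             , url
          FROM clicks
         WHERE client_id = 123456
           AND visited_at >= '2012-05-01';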
    rudy.ca | @rudydotca
    Buy my SitePoint book: Simply SQL
