  1. #1
    Join Date
    Jan 2006

    How to handle a large DB and simultaneous accesses?


    I have to develop a company search engine (similar to the Yellow Pages). We're using PostgreSQL at the company, and
    the initial DB is 2 GB, as it contains companies from all over the world with a fair amount of information.

    What reading do you suggest for developing the search engine core, so that result pages show up instantly
    regardless of load and DB size? The DB is 2 GB now but should grow to about 10 GB within two years, and we expect
    250,000 unique visitors per month by the end of the year.

    Are there special techniques? Maybe there's a way to cache search results? We're using PHP5 + phpAccelerator.
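    One way to sketch the caching idea at the database level is a dedicated cache table keyed by a hash of the search terms, checked before running the real search. All table and column names below are illustrative, not something the thread specifies:

    ```sql
    -- Hypothetical result-cache table: keyed by a hash of the normalized
    -- search string, storing the serialized result page and a timestamp.
    CREATE TABLE search_cache (
        query_hash  char(32) PRIMARY KEY,   -- e.g. md5 of the search string
        results     text NOT NULL,          -- serialized result page
        cached_at   timestamp NOT NULL DEFAULT now()
    );

    -- On each search: look up the hash first, and fall back to the real
    -- query only on a miss, or when the entry is older than, say, an hour.
    SELECT results
    FROM   search_cache
    WHERE  query_hash = md5(lower('acme portugal'))
    AND    cached_at > now() - interval '1 hour';
    ```

    The same idea can live on the PHP side instead (a shared-memory or file cache keyed the same way), which avoids a round trip to the database entirely on a hit.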


  2. #2
    Join Date
    Jun 2004
    Arizona, USA
    Define "Instantly"...

    The way you've worded the question, you've defined a situation that is completely unattainable by any database/server/browser combination possible.

    When defining a system, you need to provide attainable specifications. In other words, define a maximum allowable response time.

    For starters, start with a proper database design. Ensure that your database is normalized, and denormalize it only if necessary. Make sure that your tables are adequately indexed.
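    As a minimal sketch of what "adequately indexed" might mean for this kind of company lookup (the schema here is hypothetical, assumed for illustration only):

    ```sql
    -- Hypothetical companies table; names and columns are illustrative.
    CREATE TABLE companies (
        id       serial PRIMARY KEY,
        name     varchar(200) NOT NULL,
        country  char(2) NOT NULL,        -- ISO country code
        city     varchar(100)
    );

    -- Index the columns the search engine filters on.
    CREATE INDEX companies_country_idx ON companies (country);

    -- For case-insensitive name searches, an expression index lets
    -- "WHERE lower(name) = ..." use the index instead of scanning the table.
    CREATE INDEX companies_name_lower_idx ON companies (lower(name));

    -- Verify the planner actually uses the indexes:
    EXPLAIN SELECT id, name FROM companies
    WHERE  country = 'PT' AND lower(name) = lower('Acme');
    ```

    Running EXPLAIN on the real queries under a realistic data volume is the quickest way to find out which indexes the workload actually needs.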
    "Lisa, in this house, we obey the laws of thermodynamics!" - Homer Simpson
    "I have my standards. They may be low, but I have them!" - Bette Midler
    "It's a book about a Spanish guy named Manual. You should read it." - Dilbert
