Results 1 to 5 of 5
  1. #1
    Join Date
    Sep 2003
    Location
    Mumbai, INDIA
    Posts
    8

    Normalization for Web Application

    We are in the process of redesigning our application's database structure, with the primary objective of improving performance, i.e. keeping the fetching of records to the presentation layer fast even as the number of users increases.

    Currently, we are using a flat table where about 5-6 columns have repeated values and could be converted into master tables, going by the normalization rules. But once that is done, will it improve query time, considering that the queries will become more complex with joins?

    It is hosted on Windows 2000 and accesses a shared SQL Server 2000 database.

    The number of records will keep increasing, at roughly 1,000 records per month.
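    For illustration, here is a minimal runnable sketch of the split being described, using SQLite via Python as a stand-in for SQL Server 2000; the table and column names (`orders_flat`, `category`, etc.) are hypothetical, not from the original application:

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    cur = con.cursor()

    # Flat table: the category name is repeated on every row.
    cur.execute("CREATE TABLE orders_flat (id INTEGER PRIMARY KEY, product TEXT, category TEXT)")
    cur.executemany("INSERT INTO orders_flat (product, category) VALUES (?, ?)",
                    [("pen", "stationery"), ("pencil", "stationery"), ("mouse", "hardware")])

    # Normalized: repeated values move into a master table; the detail
    # table keeps only a small integer key.
    cur.execute("CREATE TABLE category (id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
    cur.execute("INSERT INTO category (name) SELECT DISTINCT category FROM orders_flat")
    cur.execute("""CREATE TABLE orders (id INTEGER PRIMARY KEY, product TEXT,
                   category_id INTEGER REFERENCES category(id))""")
    cur.execute("""INSERT INTO orders (id, product, category_id)
                   SELECT f.id, f.product, c.id
                   FROM orders_flat f JOIN category c ON c.name = f.category""")

    # The join reproduces the original flat view, so nothing visible is lost.
    rows = cur.execute("""SELECT o.product, c.name FROM orders o
                          JOIN category c ON c.id = o.category_id
                          ORDER BY o.id""").fetchall()
    print(rows)  # [('pen', 'stationery'), ('pencil', 'stationery'), ('mouse', 'hardware')]
    ```
    
    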

    Thanks in advance for your opinion.

  2. #2
    Join Date
    Nov 2003
    Location
    Bangalore, INDIA
    Posts
    333


    Hi,

    Data normalization is the process of decomposing column data into tables where every column is functionally dependent on that table's primary key alone. This reduces storage costs by eliminating redundancy, and it confines each fact to a single place, which keeps the database consistent. So it is always better to have a consistent, normalized database.
    Once you have written your query, you can always fine-tune it, so there should not be any drastic performance issue.
    Sathish

  3. #3
    Join Date
    Sep 2002
    Location
    UK
    Posts
    5,171

    Re: Normalization for Web Application

    Joins are done in the server, and as such have no impact on the web front end. Databases are good at joining data efficiently, so I would not consider denormalisation in this case.
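    The point that the join is resolved entirely inside the database engine can be sketched as follows; this uses SQLite in Python as an illustration (table names are hypothetical), but the shape is the same against SQL Server: the web tier issues one query and simply iterates over finished rows.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE category (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE product  (id INTEGER PRIMARY KEY, name TEXT,
                               category_id INTEGER REFERENCES category(id));
        INSERT INTO category VALUES (1, 'stationery'), (2, 'hardware');
        INSERT INTO product  VALUES (1, 'pen', 1), (2, 'mouse', 2);
    """)

    # The join happens inside the database engine; the application code
    # (the "web front end" here) only sees the final result set.
    page = con.execute("""SELECT p.name, c.name
                          FROM product p JOIN category c ON c.id = p.category_id
                          ORDER BY p.id""").fetchall()
    print(page)  # [('pen', 'stationery'), ('mouse', 'hardware')]
    ```
    
    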

  4. #4
    Join Date
    Jan 2003
    Location
    Duncan BC Canada
    Posts
    80

    Re: Normalization for Web Application

    Originally posted by nshukla: Currently, we are using a flat table where about 5-6 columns have repeated values and could be converted into master tables, going by the normalization rules. But once that is done, will it improve query time, considering that the queries will become more complex with joins?
    That really depends on the queries and the data. If the WHERE clause you use now has to look through duplicate data, you will gain quite a bit by normalizing.
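    As a sketch of that gain (SQLite in Python, hypothetical `ticket`/`status` tables): instead of scanning repeated status strings in a flat table, the filter resolves the string once in the small master table and then matches rows through an indexed integer key.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE status (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
        CREATE TABLE ticket (id INTEGER PRIMARY KEY,
                             status_id INTEGER REFERENCES status(id));
        CREATE INDEX ix_ticket_status ON ticket(status_id);
        INSERT INTO status VALUES (1, 'open'), (2, 'closed');
        INSERT INTO ticket (status_id) VALUES (1), (2), (1), (1);
    """)

    # The WHERE clause touches the master table once; the detail table is
    # matched on the indexed integer key rather than duplicated strings.
    n = con.execute("""SELECT COUNT(*) FROM ticket t
                       JOIN status s ON s.id = t.status_id
                       WHERE s.name = 'open'""").fetchone()[0]
    print(n)  # 3
    ```
    
    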
    Bradley

  5. #5
    Join Date
    Nov 2003
    Location
    Bangalore,India
    Posts
    51

    Re: Normalization for Web Application

    Okay... in my experience, SQL Server 2000 acts a bit shaky beyond about 100,000 records, even with a great design. If it is a web application, what I suggest is keeping a duplicate (archive) table for store/application administrators. Keep a minimal set, probably 50,000 tuples, in the master table that serves the entire planet, and as products become obsolete, move them to the duplicate table. That's the way to get high performance. We have to make a couple of compromises, but that's the way life is anyway.
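    The move-obsolete-rows-out pattern described above can be sketched like this (SQLite in Python for illustration; the `product`/`product_archive` names and the `obsolete` flag are hypothetical). Doing the copy and delete in one transaction means a failure cannot lose rows.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE product         (id INTEGER PRIMARY KEY, name TEXT, obsolete INTEGER);
        CREATE TABLE product_archive (id INTEGER PRIMARY KEY, name TEXT, obsolete INTEGER);
        INSERT INTO product VALUES (1, 'floppy drive', 1), (2, 'mouse', 0), (3, 'zip drive', 1);
    """)

    # Move obsolete rows to the archive so the "live" table stays small;
    # the connection context manager wraps both statements in one transaction.
    with con:
        con.execute("INSERT INTO product_archive SELECT * FROM product WHERE obsolete = 1")
        con.execute("DELETE FROM product WHERE obsolete = 1")

    live = con.execute("SELECT name FROM product").fetchall()
    archived = con.execute("SELECT name FROM product_archive ORDER BY id").fetchall()
    print(live, archived)  # [('mouse',)] [('floppy drive',), ('zip drive',)]
    ```
    
    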

    Originally posted by nshukla
    We are in the process of redesigning our application's database structure, with the primary objective of improving performance, i.e. keeping the fetching of records to the presentation layer fast even as the number of users increases.

    Currently, we are using a flat table where about 5-6 columns have repeated values and could be converted into master tables, going by the normalization rules. But once that is done, will it improve query time, considering that the queries will become more complex with joins?

    It is hosted on Windows 2000 and accesses a shared SQL Server 2000 database.

    The number of records will keep increasing, at roughly 1,000 records per month.

    Thanks in advance for your opinion.
