  1. #1
    Join Date
    Feb 2010
    Posts
    3

    Unanswered: Which datatype to use in this scenario

    Hi, readers

    Let me explain clearly what I need.
    I have a table named People with these fields:

    ID | firstname | lastname | etc. | qualities


    All the fields are self-explanatory; I only need to explain Qualities.

    I need to store an array of strings in this column and am confused about which data type to use. Currently I am using a text data type and inserting the values as delimited text.

    I want to confirm whether this is a good idea, or whether SQL Server has a more suitable data type.

    I have heard of the xml data type, but frankly I am not sure how to use it to achieve the desired result.

    I think SQL Server also allows us to program custom data types, but again, frankly, I have never used one.

    Please advise me what to do in this scenario. Is the delimited-text idea good, or should I go in another direction?

    One more thing: later I would like to bind the table to a CheckedListBox control in a VB.NET application.

    My delimited-text idea has worked well enough so far, but I need a cleaner technique.

    Thanks in advance.

    Regards,
    Sadiq Rajani.

  2. #2
    Join Date
    Feb 2004
    Location
    One Flump in One Place
    Posts
    14,912
    One of the principles of First Normal Form is that all data within an attribute (column) is atomic, which typically means no delimited strings. Every person has many Qualities, so you very simply need either two tables with a one-to-many relationship, or three tables with a many-to-many relationship (depending on your specification and requirements).
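    As a minimal sketch of the three-table, many-to-many option (all table and column names here are illustrative assumptions, not taken from the thread):
    Code:
    -- People: one row per person
    CREATE TABLE People (
        PersonID    int IDENTITY(1,1) PRIMARY KEY,
        FirstName   varchar(50) NOT NULL,
        LastName    varchar(50) NOT NULL
    );

    -- Qualities: one row per distinct quality
    CREATE TABLE Qualities (
        QualityID   int IDENTITY(1,1) PRIMARY KEY,
        QualityName varchar(50) NOT NULL UNIQUE
    );

    -- PersonQualities: one row per (person, quality) pair -- the many-to-many link
    CREATE TABLE PersonQualities (
        PersonID    int NOT NULL REFERENCES People (PersonID),
        QualityID   int NOT NULL REFERENCES Qualities (QualityID),
        PRIMARY KEY (PersonID, QualityID)
    );
    The two-table variant simply folds the quality text into the link table and drops the Qualities table.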
    Testimonial:
    pootle flump
    ur codings are working excelent.

  3. #3
    Join Date
    Feb 2010
    Posts
    3

    Which datatype to use in this scenario

    In the near future I will create a real-world application based on this collection idea. It will easily have a minimum of 100,000 records, each with roughly 1 to 20-30 related records, so the database will grow rapidly.

    Also, it will be a lot of pain to filter these records based on multiple criteria. So if it is... then what is the alternative to this?

    Thanks
    Regards,
    Sadiq Rajani

  4. #4
    Join Date
    Feb 2004
    Location
    One Flump in One Place
    Posts
    14,912
    Quote Originally Posted by msadiqrajani
    Also, it will be a lot of pain to filter these records based on multiple criteria. So if it is... then what is the alternative to this?
    Nope - it will make filtering easier. You try filtering on CSV strings - you can't use any indexes on those.

    For some reason (I don't know why) people get hung up on the number of rows their application might one day have. This is barely a consideration when you are only talking about 30 million or so rows in a table of this sort. Suitably indexed, it will work fine. Once your data is in a B-tree index you are no longer talking about the number of rows but the number of levels in the B-tree, and for a narrow, 30-million-row table that is very few (I'll guess four, including the leaf).

    Design the thing properly according to good relational design. It will run fine. Feel free to have us check your DDL before implementing it.
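    As a hedged sketch of the kind of index that keeps those lookups shallow (assuming the illustrative PersonQualities table sketched in post #2):
    Code:
    -- The composite primary key already clusters the link table on
    -- (PersonID, QualityID); this extra index supports searches that
    -- start from the quality side, e.g. "who has quality X?".
    CREATE INDEX IX_PersonQualities_QualityID
        ON PersonQualities (QualityID, PersonID);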

  5. #5
    Join Date
    Feb 2004
    Location
    One Flump in One Place
    Posts
    14,912
    The point I am trying to make is that you will store the same amount of data either way. Having these values separated out into rows is an optimisation, not a bottleneck. Shoving them into one column would only be an optimisation if the goal of optimising a database were met by having as few rows as possible. HINT - it is not.

  6. #6
    Join Date
    Feb 2010
    Posts
    3
    First of all, when I wrote


    "Also, it will be a lot of pain to filter these records based on multiple criteria. So if it is... then what is the alternative to this?"


    I meant filtering with my method. Sorry for my wording mistake.

    I understood your every word and am currently working on your idea.
    In future I will use whichever approach is fast and optimised.

    Thanks.

  7. #7
    Join Date
    Jan 2003
    Location
    Massachusetts
    Posts
    5,800
    Provided Answers: 11
    To find the depth of the index (at least in SQL 2005+), you can run this:

    Code:
    -- index_depth = number of B-tree levels, including the leaf level
    select index_depth
    from sys.dm_db_index_physical_stats(db_id(), object_id('tablename'), -1, null, default)
    On a table of over 3 million rows, I only came up with a depth of 3.

  8. #8
    Join Date
    Feb 2004
    Location
    One Flump in One Place
    Posts
    14,912
    Quote Originally Posted by msadiqrajani
    Also, it will be a lot of pain to filter these records based on multiple criteria. So if it is... then what is the alternative to this?
    It will be an even bigger pain to filter on multiple criteria if you use the CSV.

    If you have time, try both designs populated with sufficient dummy data and write the queries for both scenarios. I can give you a hint: for the normalised design you would use COUNT() as part of the query....
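    As a minimal sketch of that COUNT() technique (relational division), assuming the illustrative tables from post #2 and hypothetical quality names:
    Code:
    -- Find every person who has ALL three of the listed qualities
    SELECT p.PersonID, p.FirstName, p.LastName
    FROM People AS p
    JOIN PersonQualities AS pq ON pq.PersonID = p.PersonID
    JOIN Qualities AS q ON q.QualityID = pq.QualityID
    WHERE q.QualityName IN ('honest', 'punctual', 'patient')
    GROUP BY p.PersonID, p.FirstName, p.LastName
    HAVING COUNT(DISTINCT q.QualityID) = 3;
    Each matched quality contributes one joined row per person, so a person qualifies only when the distinct-quality count equals the number of criteria. The equivalent filter on a delimited string would need LIKE patterns and could not use an index.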
