  1. #1
    Join Date
    Nov 2015
    Posts
    1

    Database Table Design

    Hi Guys,

    This is my first post in this forum.

    We are designing a database for a project. We have:

    1) Customer Master Table
    2) Item Master Table
    a) Item Sub-Master Table

    Each customer will have a minimum of 3,000 items.
    Each item will have a minimum of 30 sub-items.

    That is, each customer contributes 90,000 rows (3,000 × 30) to the transaction table.


    With 500 customers, that comes to 45,000,000 rows.
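
    Roughly, the shape we have in mind is sketched below. This is only a sketch: the column names and data types are placeholders, and the real keys and constraints are not finalised yet.

        CREATE TABLE Customers (
            customer_id   INTEGER NOT NULL PRIMARY KEY,
            customer_name VARCHAR(100) NOT NULL
        );

        CREATE TABLE Items (
            item_id     INTEGER NOT NULL PRIMARY KEY,
            customer_id INTEGER NOT NULL REFERENCES Customers (customer_id),
            item_name   VARCHAR(100) NOT NULL
        );

        CREATE TABLE SubItems (
            sub_item_id   INTEGER NOT NULL PRIMARY KEY,
            item_id       INTEGER NOT NULL REFERENCES Items (item_id),
            sub_item_name VARCHAR(100) NOT NULL
        );

        -- the big one: roughly 90,000 rows per customer, updated frequently
        CREATE TABLE Transactions (
            customer_id INTEGER NOT NULL REFERENCES Customers (customer_id),
            sub_item_id INTEGER NOT NULL REFERENCES SubItems (sub_item_id),
            txn_qty     INTEGER NOT NULL,
            PRIMARY KEY (customer_id, sub_item_id)
        );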

    We thought of keeping each customer's data in a text file (JSON format) instead.

    Kindly suggest the best way to handle the transaction DB. The table will be updated frequently.

  2. #2
    Join Date
    Jan 2013
    Posts
    354
    Provided Answers: 1

    too vague to be used; try again

    >> This is my first post in this forum. <<
    Please follow basic Netiquette and post the DDL we need to answer this. Follow industry and ANSI/ISO standards in your data. You should follow ISO-11179 rules for naming data elements. You should follow ISO-8601 rules for displaying temporal data (https://xkcd.com/1179/). We need to know the data types, keys and constraints on the table. Avoid dialect in favor of ANSI/ISO Standard SQL.

    And you need to read and download the PDF for:
    https://www.simple-talk.com/books/sq...l-code-smells/

    >> We are designing a database for a project. We have
    1) Customer Master Table
    2) Item Master Table
    a) Item Sub-Master Table <<
    If this is homework, we have to report you to your school for plagiarism.

    I hope not! The term “master” comes from network databases, not RDBMS. We have referenced and referencing tables. The concept of a “sub-item” makes no sense, and you failed to post sample data.
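
    For example (the names are guesses, since you posted no DDL), Customers would be the referenced table and Items the referencing table:

        CREATE TABLE Items
        (item_id INTEGER NOT NULL PRIMARY KEY,
         customer_id INTEGER NOT NULL
            REFERENCES Customers (customer_id),  -- referencing -> referenced
         item_name VARCHAR(35) NOT NULL);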

    For a quick education on RDBMS and basic data modeling, read this eight part series: http://www.sqlservercentral.com/stairway/72899/
    or see the video version:
    https://www.youtube.com/watch?v=95mS0v4d1M8

    Pay attention to hierarchical encoding schemes to replace your item-sub item model.
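
    As a sketch of what such an encoding can look like (the code format here is invented for illustration): give each sub-item a code that carries its parent item's code as a prefix, so one table holds both levels and no separate "sub master" is needed.

        -- items look like '1234'; their sub-items look like '1234-01'
        CREATE TABLE Items
        (item_code VARCHAR(7) NOT NULL PRIMARY KEY
            CHECK (item_code LIKE '____' OR item_code LIKE '____-__'),
         item_name VARCHAR(35) NOT NULL);

        -- all of item 1234, sub-items included, with one predicate:
        SELECT item_code, item_name
          FROM Items
         WHERE item_code LIKE '1234%';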

    >> We thought of keeping each customer's data in a text file (JSON format) <<

    NO! JSON text files are not a tool for this job; they throw away the keys, constraints and concurrent-update safety an RDBMS gives you.

  3. #3
    Join Date
    Jan 2016
    Posts
    24
    Provided Answers: 1

    Steps to better transaction log throughput

    Transaction size is unlikely to be the primary cause of slowness here. There are legitimate reasons to break work up into smaller transactions, even within a single table, and equally legitimate reasons to import a whole set of tables in a single transaction. This post walks through the trade-offs:

    http://www.sqlskills.com/blogs/kimbe...og-throughput/
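
    To make the "break it up" side concrete, here is a minimal T-SQL sketch (the table and column names are invented for illustration). Each pass through the loop commits on its own, so no single transaction pins the log for the whole job:

        DECLARE @done INT = 0;

        WHILE @done = 0
        BEGIN
            -- each batch is its own auto-committed transaction
            UPDATE TOP (5000) Transactions
               SET txn_status = 'PROCESSED'
             WHERE txn_status = 'PENDING';

            IF @@ROWCOUNT = 0 SET @done = 1;  -- nothing left to update
        END;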

  4. #4
    Join Date
    Dec 2007
    Location
    Richmond, VA
    Posts
    1,328
    Provided Answers: 5
    Naikd,
    The numbers you are postulating aren't all that out of line for today's database systems. What are your concerns? That SQL Server wouldn't be able to handle a table with 45,000,000 rows? Yes, you have to be sure to index the table properly, and you may want to look into partitioning to spread the data out over multiple files (sketched below). I wouldn't be overly concerned with the size. I'd be most concerned with your design and how you plan to manage the data.
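
    For instance, something along these lines (the boundary values and object names are made up; pick them from your real customer_id range):

        -- map ranges of customer_id values to partitions
        CREATE PARTITION FUNCTION pf_customer (INT)
        AS RANGE RIGHT FOR VALUES (101, 201, 301, 401);

        -- this sketch puts every partition on PRIMARY; in practice you
        -- would spread them over several filegroups/files
        CREATE PARTITION SCHEME ps_customer
        AS PARTITION pf_customer ALL TO ([PRIMARY]);

        CREATE TABLE Transactions
        (customer_id INT NOT NULL,
         sub_item_id INT NOT NULL,
         txn_qty     INT NOT NULL,
         PRIMARY KEY (customer_id, sub_item_id))
        ON ps_customer (customer_id);
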
    Dave
