  1. #1

    Unanswered: Large Data Set - Need Computed Columns

    Hello

    I am quite new to PostgreSQL. I have a relatively large data set of perhaps 900,000 new rows per day.

    I intend to combine existing columns into a computed column using simple math.

    However, I am concerned about running calculations over this amount of data; it would slow things down, right?

    My question is:

    Can I make a computed column where the math is performed as the data is being imported, so it runs only once and the results are stored in the column? That way I am not constantly recalculating.

    I certainly want to know my options with computed columns: advantages, disadvantages, and how best to use them if I have to.
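
    To make it concrete, here is a rough sketch of the kind of thing I mean (table and column names are made up, and I am assuming the calculation is just multiplying two columns; from what I can tell, stored generated columns need PostgreSQL 12 or later):

    [code]
    -- Made-up table; assumes the computation is simply price * quantity.
    -- A STORED generated column (PostgreSQL 12+) is computed once when the row
    -- is inserted or updated and then saved, so queries just read the value.
    CREATE TABLE daily_data (
        id       bigserial PRIMARY KEY,
        price    numeric   NOT NULL,
        quantity integer   NOT NULL,
        total    numeric   GENERATED ALWAYS AS (price * quantity) STORED
    );
    [/code]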

    Other option:

    If computed columns are not an option, perhaps I can run a script to perform the calculations within the .csv file and then import the final results into PostgreSQL.
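
    Or, rather than editing the .csv itself, maybe the calculation could happen inside PostgreSQL during the import. A rough sketch of what I have in mind (again with invented table names and file path, and assuming a plain "total" column rather than a generated one):

    [code]
    -- Invented names: load the raw file into a staging table, then compute
    -- the value once while merging it into the final table.
    COPY staging_daily (price, quantity)
    FROM '/path/to/daily.csv' WITH (FORMAT csv, HEADER true);

    INSERT INTO daily_totals (price, quantity, total)
    SELECT price, quantity, price * quantity
    FROM   staging_daily;
    [/code]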

    Looking forward to hearing more on this, thanks!
    Andrew

  2. #2
    A simple computation using two or more columns from the same row won't make a big difference.
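
    For example, an expression like the one below (column names invented) is evaluated once per row while the query runs and is usually cheap compared to reading the rows in the first place:

    [code]
    -- Invented column names; a simple per-row expression computed on the fly.
    SELECT price,
           quantity,
           price * quantity AS total
    FROM   daily_data;
    [/code]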

    But you need to supply more details about what exactly you want to do. Show us the table definition (as CREATE TABLE statements) and the calculations you want to perform.
    I will not read nor answer questions where the SQL code is messy and not formatted properly using [code] tags: http://www.dbforums.com/misc.php?do=bbcode#code

    Tips for good questions:

    http://tkyte.blogspot.de/2005/06/how...questions.html
    http://wiki.postgresql.org/wiki/SlowQueryQuestions
    http://catb.org/esr/faqs/smart-questions.html
