  1. #1
    Join Date
    Aug 2011
    Posts
    42

    Unanswered: huge number of records for a session in the Postgres log

    I have come across a very peculiar situation.

    We have a PostgreSQL 9.0 installation. It was set up last year,
    but we only started implementing on it recently,
    and hence the need to develop a log parser application.

    During our preliminary parsing, what we discovered is simply beyond my understanding.

    It seems that during a certain period last year, in November, the server created a session entry that holds more than
    fifty thousand records for a SINGLE SESSION (4ebccaa2.20c). Yes, that is a 5 followed by four zeros.


    It has never recurred since, and luckily we had the CSV log option enabled during that period.

    Where should I report such findings?

    I have uploaded that Part of Log at http://dl.dropbox.com/u/71964910/pg_log_with_lot_of_session.zip

    arvind

  2. #2
    Join Date
    Nov 2003
    Posts
    2,935
    Provided Answers: 12
    What exactly is a "session entry"? That isn't standard Postgres terminology.
    I will not read nor answer questions where the SQL code is messy and not formatted properly using [code] tags: http://www.dbforums.com/misc.php?do=bbcode#code

    Tips for good questions:

    http://tkyte.blogspot.de/2005/06/how...questions.html
    http://wiki.postgresql.org/wiki/SlowQueryQuestions
    http://catb.org/esr/faqs/smart-questions.html

  3. #3
    Join Date
    Aug 2011
    Posts
    42
    Quote Originally Posted by shammat View Post
    What exactly is a "session entry"? That isn't standard Postgres terminology
    The CSV log is organized into sessions: each record carries its own session ID, session line number, and session start time. There are also records marking session start and end.
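
    For reference, the PostgreSQL documentation describes a table layout for importing the CSV log back into the database; the sketch below follows the 9.0 docs, and the file path in the COPY command is only an example:

    CREATE TABLE postgres_log
    (
      log_time timestamp(3) with time zone,
      user_name text,
      database_name text,
      process_id integer,
      connection_from text,
      session_id text,
      session_line_num bigint,
      command_tag text,
      session_start_time timestamp with time zone,
      virtual_transaction_id text,
      transaction_id bigint,
      error_severity text,
      sql_state_code text,
      message text,
      detail text,
      hint text,
      internal_query text,
      internal_query_pos integer,
      context text,
      query text,
      query_pos integer,
      location text,
      application_name text,
      PRIMARY KEY (session_id, session_line_num)
    );

    -- load one CSV log file into the table (path is only an example)
    COPY postgres_log FROM '/path/to/data/pg_log/postgresql.csv' WITH csv;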

  4. #4
    Join Date
    Nov 2003
    Posts
    2,935
    Provided Answers: 12
    I have no idea what you are talking about.

    What about showing us the SQL statements and the corresponding table definitions (as CREATE TABLE)?
    How did you import/export/create/edit that CSV file?
    Where did this CSV file come from?

  5. #5
    Join Date
    Aug 2011
    Posts
    42
    A CSV log is created by enabling

    log_destination = 'stderr,csvlog'

    in the PostgreSQL configuration file. It is simply another kind of log file, in CSV format with a .csv extension, usually found in the data/pg_log folder.
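
    To give the full picture, the relevant settings look roughly like this (a sketch; the file name pattern shown is the default, and the logging collector has to be on for csvlog output to be produced):

    # postgresql.conf -- CSV logging (sketch)
    logging_collector = on                           # required for csvlog output
    log_destination = 'stderr,csvlog'                # plain-text and CSV logs
    log_directory = 'pg_log'                         # relative to the data directory
    log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log'  # the CSV file gets a .csv suffix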

    We have developed an application in C# that parses the CSV log file and tells us about errors, SQL commands, and other server activity.

    You can have a look at the zip file; it can be opened in Notepad.

    For more information, refer to PostgreSQL: Documentation: 9.1: Error Reporting and Logging.

  6. #6
    Join Date
    Nov 2003
    Posts
    2,935
    Provided Answers: 12
    Well, the log contains one line for each statement the client has executed. Apparently that session executed 50,000 INSERT statements.

    What is the problem?
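
    If the CSV log is loaded into a table like the postgres_log sketch earlier in the thread (an assumption about your setup), a query along these lines would show how the log lines are distributed across sessions and make that one session stand out:

    -- count log lines per session; the 4ebccaa2.20c session should dominate
    SELECT session_id, min(session_start_time) AS started, count(*) AS log_lines
    FROM postgres_log
    GROUP BY session_id
    ORDER BY count(*) DESC
    LIMIT 10;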

  7. #7
    Join Date
    Aug 2011
    Posts
    42
    Yes, everything would have been OK if those statements had been issued by a client or an external application. The problem is that these are server-generated statements, executed on the server, by the server.

  8. #8
    Join Date
    Nov 2003
    Posts
    2,935
    Provided Answers: 12
    Quote Originally Posted by arvindps View Post
    Yes, everything would have been OK if those statements had been issued by a client or an external application. The problem is that these are server-generated statements, executed on the server, by the server.
    PostgreSQL itself does not "generate" INSERT statements on its own. That session also shows that "CREATE TABLE" statements were executed, and that is definitely not something PostgreSQL would do by itself. So it has to be something in your applications that was doing it.

    So what is that "pem.probe_schedule" table that is being updated, or the "pemdata.oc_database" table that receives the majority of the INSERT statements? Which application is using them?

    To me this looks like some kind of initial database load was run there (set up tables, load data).
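
    Assuming the log has been imported into the postgres_log table sketched earlier, a breakdown of that session would show what dominates it and which statements touch those tables:

    -- breakdown of the suspect session by statement type
    SELECT command_tag, count(*) AS lines
    FROM postgres_log
    WHERE session_id = '4ebccaa2.20c'
    GROUP BY command_tag
    ORDER BY count(*) DESC;

    -- sample a few logged statements that mention the pemdata schema
    SELECT session_line_num, substr(message, 1, 120) AS message_start
    FROM postgres_log
    WHERE session_id = '4ebccaa2.20c'
      AND message LIKE '%pemdata.%'
    ORDER BY session_line_num
    LIMIT 20;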

  9. #9
    Join Date
    Aug 2011
    Posts
    42
    No, it is not related to any of our applications.

    If you Google for "pemdata", this is what I find:

    Creating user probe and alert in PEM 2.1 | Database Technologies


    but I cannot make anything of it, or of what went wrong.
