
Thread: Elegant way ?

  1. #1
    Join Date
    Aug 2001
    Location
    UK
    Posts
    4,650

    Unanswered: Elegant way ?

    My aim is to print out all SQLSTATE message lines (paragraphs) from the command output of SQL statements. I am not interested in a couple of the SQLSTATE values.

    Currently I have the SQL output in sql.out and a list of SQLSTATEs to be ignored in ignore_state.txt, and I use the following command:

    cat sql.out | grep -p SQLSTATE | grep -vp -fignore_state.txt

    There should be a more elegant way to do this, I hope ...

    Any suggestions ?



    As an example :

    sql.out

    create table t2(i int not null,j int)
    DB20000I The SQL command completed successfully.

    create index ix2 on t2(i)
    DB20000I The SQL command completed successfully.

    alter table t2 add primary key(i)
    SQL0598W Existing index "SSANNASI.IX2" is used as the index for the primary
    key or a unique key. SQLSTATE=01550

    alter table t3 references t3(j)
    DB21034E The command was processed as an SQL statement because it was not a
    valid Command Line Processor command. During SQL processing it returned:
    SQL0104N An unexpected token "references" was found following "alter table t3
    ". Expected tokens may include: "FOREIGN KEY". SQLSTATE=42601

    and ignore_state.txt :

    SQLSTATE=01550
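    For the sample above, the intention is that only the paragraph whose SQLSTATE is not in the ignore list survives (the DB20000I paragraphs carry no SQLSTATE at all, and the 01550 paragraph is excluded), so the output should be roughly:

    alter table t3 references t3(j)
    DB21034E The command was processed as an SQL statement because it was not a
    valid Command Line Processor command. During SQL processing it returned:
    SQL0104N An unexpected token "references" was found following "alter table t3
    ". Expected tokens may include: "FOREIGN KEY". SQLSTATE=42601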

  2. #2
    Join Date
    Jun 2002
    Location
    UK
    Posts
    525
    You can remove cat from your pipeline and have the first grep take its input directly from your file...

    grep -p "SQLSTATE" sql.out | grep -vpf ignore_state.txt

  3. #3
    Join Date
    Jun 2002
    Location
    UK
    Posts
    525
    Originally posted by Damian Ibbotson
    You can remove cat from your pipeline and have the first grep take its input directly from your file...

    grep -p "SQLSTATE" sql.out | grep -vpf ignore_state.txt
    Sorry, I meant...

    grep -p "SQLSTATE" sql.out | grep -vp -f ignore_state.txt

    However, if you only intend to exclude one or two SQLSTATEs, you could do away with your 'ignore' file and do something like this...

    awk 'BEGIN {RS=""} /SQLSTATE/ && ! /01550|XXXXX|NNNNN/' sql.out

    RS="" sets the record separator to a blank line, and the rest is just a pattern match and an inverted pattern match, where 01550, XXXXX and NNNNN are the SQLSTATEs that you want to ignore.
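    If the ignore list does need to stay in a file, the same paragraph trick can pick it up as well; a rough sketch along these lines (assuming ignore_state.txt holds one SQLSTATE=NNNNN string per line, as in your example):

    awk 'BEGIN { RS=""
                 while ((getline line < "ignore_state.txt") > 0)
                     if (line != "") ignore[line]
               }
         /SQLSTATE/ { skip = 0
                      for (s in ignore) if (index($0, s)) skip = 1
                      if (!skip) print $0 "\n"
                    }' sql.out

    That keeps the single-pass awk approach while still driving the exclusions from the file.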



  4. #4
    Join Date
    Aug 2001
    Location
    UK
    Posts
    4,650
    Thank you, Damian, for your comments ...

    I will normally have many SQLSTATEs, and the list might grow ... So the ignore-state file will be the best approach ...

    I have documented the following as the limitations of the script (both are grep's limitations):

    1. Lines are limited to 2048 bytes.
    2. Paragraphs (under the -p flag) are currently limited to a length of
    5000 characters.

    Is there a way out ?

    Thanks

    Sathyaram


    Originally posted by Damian Ibbotson
    Sorry, I meant...

    grep -p "SQLSTATE" sql.out | grep -vp -f ignore_state.txt

    However, if you only intend to exclude one or two SQLSTATEs, you could do away with your 'ignore' file and do something like this...

    awk 'BEGIN {RS=""} /SQLSTATE/ && ! /01550|XXXXX|NNNNN/' sql.out

    RS="" sets the record separator to a blank line, and the rest is just a pattern match and an inverted pattern match, where 01550, XXXXX and NNNNN are the SQLSTATEs that you want to ignore.

  5. #5
    Join Date
    Jun 2002
    Location
    UK
    Posts
    525
    I think most awks also have a 2K record limit. GNU awk apparently doesn't, but I don't know much about it.

    I've attached a rather verbose awk script that attempts to get round the paragraph size constraint by reading each line into an array element. This way, I think that the size will be limited by the memory of your machine.

    A perl guru (or maybe even a sed guru) might come up with something simpler but hey, I had fun writing it!

    Usage:

    awk -v ignoreFile=ignore.txt -f badSQL.awk < sql.out
    Attached Files: badSQL.awk
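    (The attachment itself isn't reproduced here. Purely as a sketch of the idea described above, namely buffering each line of a paragraph into an array element and taking the ignore list from the file named by -v ignoreFile=..., something along these lines should behave similarly, though it is not the attached badSQL.awk:)

    # Illustrative sketch only, not the attached badSQL.awk.
    # Buffer each line of a paragraph into an array, then print the paragraph
    # if it mentions SQLSTATE and none of the states listed in ignoreFile appear in it.
    BEGIN {
        while ((getline line < ignoreFile) > 0)
            if (line != "") ignore[line]
    }
    /./  { buf[++n] = $0; next }   # non-blank line: add it to the current paragraph
         { emit() }                # blank line: paragraph finished
    END  { emit() }                # flush the last paragraph

    function emit(   i, text, s) {
        if (n == 0) return
        text = buf[1]
        for (i = 2; i <= n; i++)
            text = text "\n" buf[i]
        n = 0
        if (text !~ /SQLSTATE/) return
        for (s in ignore)
            if (index(text, s)) return   # paragraph carries an ignored SQLSTATE
        print text "\n"
    }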

  6. #6
    Join Date
    Aug 2001
    Location
    UK
    Posts
    4,650
    Thanks for your time ...

    I'll try this out ...

    Cheers

    Sathyaram

    Originally posted by Damian Ibbotson
    I think most awks also have a 2K record limit. GNU awk apparently doesn't, but I don't know much about it.

    I've attached a rather verbose awk script that attempts to get round the paragraph size constraint by reading each line into an array element. This way, I think that the size will be limited by the memory of your machine.

    A perl guru (or maybe even a sed guru) might come up with something simpler but hey, I had fun writing it!

    Usage:

    awk -v ignoreFile=ignore.txt -f badSQL.awk < sql.out
    Last edited by sathyaram_s; 05-10-12 at 09:16.
