  1. #1
    Join Date
    Jan 2003

    Unanswered: Big table & index build failures

    I recently recreated a table with 40M rows of data.
    Now I'm trying to do the idiotic thing of building an index on the data.
    Unfortunately, 29-30 hours into it Informix decides "Gee, this transaction has been running for a very long time. I better abort it!".

    logical log usage is low and doesn't seem to be going berserk..

    I've tried fiddling with the LTX* settings and TXTIMEOUT..
    here's the most recent setting:
    LTXHWM 70 # Long transaction high water mark percentage
    LTXEHWM 80 # Long transaction high water mark (exclusive)
    TXTIMEOUT 0x12c00 # Transaction timeout (in sec)
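For what it's worth, a quick conversion of that TXTIMEOUT value (my own arithmetic, not from the original post): 0x12c00 is 76800 seconds, about 21.3 hours, which is notably shorter than the 29-30 hours the build ran before the abort, so that setting alone doesn't line up with the failure window.

```python
# Back-of-the-envelope check on the TXTIMEOUT value quoted above
# (my own arithmetic, not part of the original post).

txtimeout_sec = 0x12c00          # TXTIMEOUT as set in the onconfig
hours = txtimeout_sec / 3600     # convert seconds to hours

print(txtimeout_sec)             # 76800
print(round(hours, 1))           # 21.3
```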

    any suggestions?

    Informix Dynamic Server Version 7.30.UC3 -- On-Line -- Up 1 days 23:50:33 -- 1748960 Kbytes

    Running on a 10-CPU Solaris 2.6 box with many gigs of memory and plenty of disk space.

    Upgrading is not an option... (In fact, on our test machine 9.3 simply crashes when trying to upgrade from 7.3.)

    If you want other onconfig settings, let me know.

  2. #2
    Join Date
    Mar 2002


    We run this job at our site about 8-10 times a month, so I guarantee it isn't impossible; our hardware is quite similar to yours:
    Solaris 2.6, 12 CPUs, 12 GB RAM
    You have to:
    - Increase the LRU length
    - Define more DBSPACETEMP
    - Set the memory buffers quite high
    - Run UPDATE STATISTICS LOW for the table
    - Set PDQPRIORITY to more than 0 before starting the index build
    - Change the database mode to no logging while creating the index
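The steps above can be sketched as a dbaccess session (my own sketch; the table, column, and index names are placeholders, not from the thread):

```sql
-- Sketch only: bigtab, col1, and bigtab_ix1 are made-up names.

-- Gather low-level statistics first so the build has fresh row counts.
UPDATE STATISTICS LOW FOR TABLE bigtab;

-- Enable parallel sorting for this session before the build starts.
SET PDQPRIORITY 90;

-- With the database switched to non-logging mode, the build does not
-- flood the logical logs, which is what trips the long-transaction
-- rollback in the first place.
CREATE INDEX bigtab_ix1 ON bigtab (col1);
```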

    Our environment sets DBSPACETEMP, runs UPDATE STATISTICS, and uses an LRU length of 127.

    index key size : 48 bytes
    time usage : 6-7 hrs

  3. #3
    Join Date
    Jan 2003
    Thanks for the reply. Currently I have the buffers pretty high -
    BUFFERS 500000
    MAX_PDQPRIORITY is set to 90
    We have about 10GB of temp space..
    (The index on the old copy of the table is only 5.5GB)
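Rough arithmetic on those figures (my own estimate, not from the thread): 5.5 GB over 40M rows works out to roughly 150 bytes per index entry, and a sort-based index build can need temp space on the same order as the finished index, so 10 GB of DBSPACETEMP is plausibly tight but workable.

```python
# Back-of-the-envelope check on the numbers quoted above
# (my own arithmetic, not from the original post).

rows = 40_000_000
index_bytes = 5.5 * 1024**3            # ~5.5 GB existing index

bytes_per_entry = index_bytes / rows   # rough per-entry cost
print(round(bytes_per_entry))          # ~148
```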

    We DO have logging turned on, and I am very hesitant to turn it off since this is a live box. On the other hand, I don't think I could turn it off even if I wanted to, since you need a level-0 backup of the db.. there are a number of corrupt pages in the database and ontape dies..

    (We moved some stuff around, I forgot to calculate the offset into my dd numbers, and we lost the last page of 11 chunks. So I wrote an app to unload the data and now I'm reloading it back in.) Luckily the data is mostly static.
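The dd mistake described above is easy to reproduce: when a chunk sits at a non-zero offset inside its device, both skip= (input side) and seek= (output side) have to carry that offset, or the copy starts at byte 0 and the tail of the chunk falls off the end. A toy illustration with ordinary files standing in for raw devices (all names here are made up):

```shell
#!/bin/sh
# Toy reproduction of the offset mistake, using plain files in place of
# raw devices. Every file name here is invented for the example.

bs=4096        # pretend page size
off=1          # chunk starts 1 block into the "device"
len=2          # chunk is 2 blocks long

# build a fake source device: 1-block header, then 2 blocks of chunk data
dd if=/dev/zero of=src.dev bs=$bs count=3 2>/dev/null
printf 'CHUNK-PAGE-DATA' | dd of=src.dev bs=$bs seek=$off conv=notrunc 2>/dev/null

# destination device of the same size
dd if=/dev/zero of=dst.dev bs=$bs count=3 2>/dev/null

# WRONG: omitting skip=/seek= would copy from byte 0 and shift the chunk
# dd if=src.dev of=dst.dev bs=$bs count=$len conv=notrunc

# RIGHT: carry the offset on both the input and the output side
dd if=src.dev of=dst.dev bs=$bs skip=$off seek=$off count=$len conv=notrunc 2>/dev/null

# the chunk regions should now be byte-identical
dd if=src.dev bs=$bs skip=$off count=$len 2>/dev/null > src.chunk
dd if=dst.dev bs=$bs skip=$off count=$len 2>/dev/null > dst.chunk
cmp src.chunk dst.chunk && echo "chunk copied intact"
```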

    Restoring from backup isn't an option, unfortunately, as our backup system gleefully went and overwrote the last good backup (before we noticed it wasn't backing up. Cool, huh?).
