  1. #1
    Join Date
    Dec 2003

    Unanswered: Speed up Perl execution times?

    For the time being, I am stuck with a very slow server (K6-2 233 MHz, 160 MB RAM) that is used as a small public webserver.

    The dynamic site that I have created uses about 183 lines of Perl, making about 6 (average) to 14 (max) MySQL queries per page load.

    A full page load can take up to 2 seconds on the LAN. At the moment I cannot offload SQL to another computer, but the MySQL queries seem to take very little time.

    What's taking the time is Perl. Is there any way to speed up the execution? I have optimized the code and the SQL queries as best I can and gotten it down this far. Is there anything else I can do to get better load times?

  2. #2
    Join Date
    Oct 2003
    I'd suspect a very slow LAN, or maybe a 10 Mbit/sec network card in the old machine. But you really can't tell where the slowdown occurs until you can measure it objectively. You don't tell us what web server you're using, what kind of statistics you're collecting, how big the queries are, and so on.

    But obviously something is very wrong, because 183 lines of Perl against a MySQL database should execute almost instantaneously. To start, execute the Perl program from a command line and see how long it takes to run there.

  3. #3
    Join Date
    Dec 2003
    Thanks for the reply, but it's a 100Mbit switched LAN.

    I did mention some of the server specs before. It's not what I'd like to be running as a server, but I'm stuck with it for now:
    K6-2 233 MHz
    160 MB EDO RAM
    30 GB 7200 RPM hard drive
    100 Mbit NIC
    Gentoo Linux 2.4.20-gentoo-r6
    Apache 2.0.47
    MySQL 4.0.14-r2

    Here are the `time` statistics from running the scripts:
    For the Header:
    real 0m1.088s
    user 0m0.945s
    sys 0m0.085s

    And the Footer:
    real 0m1.180s
    user 0m0.965s
    sys 0m0.110s

    The queries are tiny, and so are the results. The longest one would be:
    SELECT name,url,id FROM tree WHERE owner = '$myid';

    The worst case would be about 10 rows returned.
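Incidentally, interpolating `$myid` directly into the SQL string works, but with DBI a placeholder version of that query is safer and lets the driver handle quoting. A minimal sketch (the connection details and `$myid` value here are made up; only the `tree` query itself comes from the post above):

```perl
use strict;
use warnings;
use DBI;

# Hypothetical connection settings; substitute your own database,
# host, and credentials.
my $dbh = DBI->connect('DBI:mysql:database=mysite;host=localhost',
                       'dbuser', 'dbpass', { RaiseError => 1 });

my $myid = 42;    # placeholder owner id

# The '?' placeholder replaces the interpolated '$myid'.
my $sth = $dbh->prepare('SELECT name, url, id FROM tree WHERE owner = ?');
$sth->execute($myid);

while (my ($name, $url, $id) = $sth->fetchrow_array) {
    print "$name => $url (id $id)\n";
}
$dbh->disconnect;
```

Connecting once per script run (rather than once per query) also trims a little overhead when several queries share one page load.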

    These scripts are being run through SSI, but I don't believe that is the cause of their tardiness (given the command-line run times). The content of a page is: [header SSI exec] content [footer SSI exec]. This is the easiest for me to maintain.
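For anyone unfamiliar with that setup, the page skeleton described above amounts to something like this in a server-parsed page (the paths are hypothetical):

```html
<!--#exec cgi="/cgi-bin/header.pl" -->
<p>...static page content...</p>
<!--#exec cgi="/cgi-bin/footer.pl" -->
```

Under Apache this requires mod_include with `Options +Includes` and the file being handled as server-parsed (e.g. a `.shtml` extension), and it means the Perl interpreter starts fresh for each `#exec` on every request.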

    They dynamically build the navigation bars, breadcrumbs, news, and related-links panels on the site. I thought of generating them as static pages, but it would be a lot of work to switch the system over, and a lot to maintain. The execution shouldn't take this long, and I was hoping to keep running it this way for easy updates.
    Last edited by Chireru; 12-30-03 at 13:45.
