  1. #1
    Join Date
    Jan 2004
    Posts
    17

    PHP - Execution time is more than 45 minutes

    Hi all,
    I am working on a project (PHP 4.x / MySQL 4.x) where the database size is more than 30 GB.
    There is one module in which I have to import data from a CSV file and, after manipulating the data, store it in the database.

    Roughly 5,000 SQL statements are fired during this procedure, and the total time taken is more than 40 minutes.

    How can I reduce the execution time?
    I have tried to run this PHP script in the background, but somehow it has not succeeded.

    What would be an optimised solution to reduce the execution time?

    Awaiting your replies.
    Thanks in advance.

  2. #2
    Join Date
    Apr 2006
    Location
    Denver, Co. USA
    Posts
    242
    To achieve the fastest possible insert into a database from a CSV file, take a look at the LOAD DATA INFILE 'file_name' or LOAD DATA LOCAL INFILE 'file_name' statements. Depending on what processing you are doing on the data, you could either insert directly or create an intermediate file and load the data from that file.
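
    As a rough sketch, a bulk load from PHP 4 could look like the following. The credentials, table name, column list and file path are only placeholders - adjust them to your own schema (and note that LOCAL has to be allowed on both the client and the server):

    Code:
    <?php
    // Connect with the mysql_* API that ships with PHP 4.
    $link = mysql_connect('localhost', 'db_user', 'db_pass') or die(mysql_error());
    mysql_select_db('my_database', $link) or die(mysql_error());

    // One LOAD DATA statement replaces thousands of individual INSERTs.
    $sql = "LOAD DATA LOCAL INFILE '/tmp/import.csv'
            INTO TABLE import_table
            FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
            LINES TERMINATED BY '\\n'
            (col_a, col_b, col_c)";
    mysql_query($sql, $link) or die(mysql_error());
    ?>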

    If the long execution time is due to the processing being done in PHP, you would need to describe what you are doing and post your code to get more specific help.

    Edit: Also take a look at the MySQL manual sections starting at this link for things that affect the speed of execution - http://dev.mysql.com/doc/refman/4.1/...ert-speed.html
    Last edited by dbmab; 08-24-06 at 12:40.

  3. #3
    Join Date
    Jan 2004
    Posts
    17

    Optimizing Update Statements

    Hi, thanks for the answer.

    Most of them are UPDATE statements.

  4. #4
    Join Date
    Apr 2006
    Location
    Denver, Co. USA
    Posts
    242
    See the other sections in the manual in that same chapter. Section 7.2.14 covers UPDATEs (which are essentially a SELECT plus a write). A user contributed an interesting method for when the rows to be updated share a common select condition - he ran a SELECT to get the IDs, then updated just those rows using the IDs instead of scanning through the whole database on each and every update.
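
    A minimal sketch of that idea in PHP 4 (the orders table, its indexed id column and the WHERE conditions are invented for illustration, not taken from your schema):

    Code:
    <?php
    // 1) Collect the IDs of the rows that actually need changing.
    $ids = array();
    $result = mysql_query("SELECT id FROM orders WHERE batch = 'todays_import'") or die(mysql_error());
    while ($row = mysql_fetch_assoc($result)) {
        $ids[] = (int) $row['id'];
    }

    // 2) Update only those rows instead of re-scanning the table for every UPDATE.
    if (count($ids) > 0) {
        $sql = "UPDATE orders SET status = 'processed' WHERE id IN (" . implode(',', $ids) . ")";
        mysql_query($sql) or die(mysql_error());
    }
    ?>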

    Depending on what is changing in the data, you could also do something like the above with LOAD DATA LOCAL INFILE - select the rows you know are getting updated, use PHP to update the data and write a CSV file, then use the LOAD DATA ... command to replace the changed rows in the database.
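
    A sketch of that variant (again with invented names; it assumes the table has a primary key on id, which REPLACE uses to overwrite the old rows, and that $changed_rows was built earlier by your PHP processing step):

    Code:
    <?php
    // Write the already-manipulated rows to a temporary CSV file.
    $fp = fopen('/tmp/changed_rows.csv', 'w');
    foreach ($changed_rows as $row) {
        fwrite($fp, $row['id'] . ',' . $row['col_a'] . ',' . $row['col_b'] . "\n");
    }
    fclose($fp);

    // REPLACE overwrites existing rows that share the same primary key value.
    $sql = "LOAD DATA LOCAL INFILE '/tmp/changed_rows.csv'
            REPLACE INTO TABLE import_table
            FIELDS TERMINATED BY ','
            LINES TERMINATED BY '\\n'
            (id, col_a, col_b)";
    mysql_query($sql) or die(mysql_error());
    ?>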

    Since an UPDATE involves a SELECT, see chapter 7.2.3 in the manual as well. Having the correct indexes is the one thing that will have the greatest effect on how quickly a row can be found and updated.
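
    For example, if the UPDATEs filter on a column that currently has no index (the table and column names below are again only placeholders), a one-off ALTER TABLE can turn a full table scan into an index lookup:

    Code:
    <?php
    // Run once, not on every import; building an index on a large table takes time.
    mysql_query("ALTER TABLE import_table ADD INDEX idx_batch (batch)")
        or die(mysql_error());
    ?>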

    Edit: If you post more specific information about what you are doing, including your table definitions and your PHP code, someone here can check whether there is a problem and suggest specific changes that will help.
    Last edited by dbmab; 08-25-06 at 04:44.

  5. #5
    Join Date
    Apr 2006
    Location
    Denver, Co. USA
    Posts
    242
    In case you check this thread: I see you have posted another thread which mentions a 10-minute execution time. That reminded me that there is a performance problem with MySQL servers that has to do with DNS name lookups. I also remember posts where this problem occurs even if you specify an IP address in the connection.

    See the following link for a description - http://hackmysql.com/dns

    The "fix" for this performance problem is to use the --skip-name-resolve startup option.

  6. #6
    Join Date
    Jan 2004
    Posts
    17
    Thanks.
    I have optimised all the SQL queries and all the processing; even so, it still takes more than 10 minutes, and the time also depends on the other processes running on the server.

    So I have finally concluded that I should run it in the background.
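
    For anyone finding this thread later, a minimal sketch of running such a script outside the web server (the file name import.php and the log path are just placeholders):

    Code:
    <?php
    // import.php - launched from the shell, e.g.
    //   nohup php import.php > /tmp/import.log 2>&1 &
    // so it is not tied to a browser request or the web server's timeout.

    set_time_limit(0);        // remove PHP's execution time limit for this run
    ignore_user_abort(true);  // keep going even if the caller disconnects

    // ... CSV import and UPDATE logic goes here ...
    ?>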
