  1. #1
    Join Date
    Apr 2006
    Location
    Edmond, Oklahoma
    Posts
    3

    Unanswered: Is there a limit to MaxLocksPerFile?

    I know this is a repeat question (a Google search turned up an earlier thread) and I apologize in advance. However, searching this forum isn't bringing it up, so it must be an old post.

    I have a database with a very large number of records that have to be imported and formatted, around 13 million. One problem I am having, somewhat off topic, is that I am hitting the 2 GB file size limit in Access. Ignoring that, I am also getting errors about being out of memory. I have reduced how often I get this message by increasing MaxLocksPerFile to 5,000,000, which should be adequate: I am importing data from roughly 39 reports one at a time, and none of them should individually be larger than 5,000,000 records. I am still getting the error, and I was wondering if anybody had figured out an upper limit for MaxLocksPerFile. I have been searching the internet with no luck.
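
    For reference, a minimal sketch of where this setting lives (assuming Jet 4.0 / DAO 3.6): the registry value is the machine-wide default, and DBEngine.SetOption overrides it for the current session only. The 5,000,000 value below just mirrors the figure above.

    Code:
    ' Raise MaxLocksPerFile for the current session only (Jet 4.0 / DAO 3.6).
    ' The machine-wide default (normally 9500) lives in the registry at:
    '   HKLM\SOFTWARE\Microsoft\Jet\4.0\Engines\Jet 4.0\MaxLocksPerFile
    Public Sub RaiseLockLimitForThisSession()
        ' dbMaxLocksPerFile is a built-in DAO constant; the override lasts
        ' until the database engine shuts down.
        DBEngine.SetOption dbMaxLocksPerFile, 5000000
    End Sub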

    In the end I will probably use a MySQL backend and an Access frontend to avoid the file-lock and size issues; however, I am more comfortable with Access and would like to find a way to overcome this issue if at all possible.

    Thanks,
    Jason

  2. #2
    Join Date
    Jul 2005
    Posts
    50
    How much RAM do you have on your PC and how large is your Paging File?

    Are you checking your Performance with Task Manager while you are performing this operation?
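
    If it helps, here's a rough sketch for reading those same numbers from inside Access (GlobalMemoryStatus reports signed 32-bit values, so anything over 2 GB will not display correctly):

    Code:
    ' Quick check of physical memory and page file from VBA (Win32 API).
    Private Type MEMORYSTATUS
        dwLength As Long
        dwMemoryLoad As Long
        dwTotalPhys As Long
        dwAvailPhys As Long
        dwTotalPageFile As Long
        dwAvailPageFile As Long
        dwTotalVirtual As Long
        dwAvailVirtual As Long
    End Type

    Private Declare Sub GlobalMemoryStatus Lib "kernel32" (lpBuffer As MEMORYSTATUS)

    Public Sub ShowMemoryStatus()
        Dim ms As MEMORYSTATUS
        ms.dwLength = Len(ms)
        GlobalMemoryStatus ms
        Debug.Print "Memory load (%): " & ms.dwMemoryLoad
        Debug.Print "Avail physical (MB): " & ms.dwAvailPhys \ 1048576
        Debug.Print "Avail page file (MB): " & ms.dwAvailPageFile \ 1048576
    End Sub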

  3. #3
    Join Date
    Jul 2005
    Posts
    50
    "Out of Memory" message usually means more memory was required to run the process than is available, or a 64K segment boundary was encountered.

    Some common causes for "Out of Memory" errors:

    Situation: You have too many applications, documents, or source files open.
    Fix: Close any unnecessary applications, documents, or source files that are open.

    Situation: You have a module or procedure that's too large.
    Fix: Break large modules or procedures into smaller ones. This doesn't save memory, but it can prevent hitting 64K segment boundaries.

    Situation: You are running Microsoft Windows in standard mode.
    Fix: Restart Microsoft Windows in enhanced mode.

    Situation: You are running Microsoft Windows in enhanced mode, but have run out of virtual memory.
    Fix: Increase virtual memory by freeing some disk space, or at least ensure that some space is available. Also, create a sufficiently large fixed-size paging file on a defragmented hard drive.

    Situation: You have too many terminate-and-stay-resident programs running.
    Fix: Eliminate terminate-and-stay-resident programs.

    Situation: You have many device drivers loaded.
    Fix: Eliminate unnecessary device drivers.

    Situation: You have run out of space for Public variables.
    Fix: Reduce the number of Public variables.

  4. #4
    Join Date
    Apr 2006
    Location
    Edmond, Oklahoma
    Posts
    3
    Thank you for the information. Unfortunately I am out of the office today so I will have to wait until Monday to try these fixes out.

    Thanks again,
    Jason

  5. #5
    Join Date
    Dec 2004
    Location
    Madison, WI
    Posts
    3,926

    Out of Memory

    mlbuie - that was an excellent explanation! Thank You.
    Expert Database Programming
    MSAccess since 1.0, SQL Server since 6.5, Visual Basic (5.0, 6.0)

  6. #6
    Join Date
    Jul 2005
    Posts
    50
    Quote Originally Posted by pkstormy
    mlbuie - that was an excellent explanation! Thank You.
    {blush} Thanks for the compliment!

  7. #7
    Join Date
    Jul 2005
    Posts
    50
    Quote Originally Posted by jparson
    Thank you for the information. Unfortunately I am out of the office today so I will have to wait until Monday to try these fixes out.

    Thanks again,
    Jason

    So give us an update ... what did you find out?

  8. #8
    Join Date
    Apr 2006
    Location
    Edmond, Oklahoma
    Posts
    3
    Sorry about that. I got busy and completely forgot to reply. I never did find a way around the MaxLocksPerFile limit, although I think the real issue was the amount of memory available on my machine. Even with MaxLocksPerFile set to 5,000,000, I would still get the message asking if I wanted to continue. In the end I have educated the end user about this, and she will probably not be pulling in that many records at one time anyway.
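
    One approach not tried here, sketched only as an idea: commit the import in chunks inside a DAO transaction so the lock count never climbs toward MaxLocksPerFile. The table and field names below are placeholders, not from this thread.

    Code:
    ' Sketch: commit in batches so Jet releases locks as the import runs.
    Public Sub ImportInBatches()
        Dim ws As DAO.Workspace, db As DAO.Database
        Dim rsIn As DAO.Recordset, rsOut As DAO.Recordset
        Dim n As Long

        Set ws = DBEngine.Workspaces(0)
        Set db = CurrentDb
        Set rsIn = db.OpenRecordset("SELECT * FROM tblStaging", dbOpenSnapshot)
        Set rsOut = db.OpenRecordset("tblImported", dbOpenDynaset)

        ws.BeginTrans
        Do Until rsIn.EOF
            rsOut.AddNew
            rsOut!SomeField = rsIn!SomeField
            rsOut.Update
            n = n + 1
            If n Mod 50000 = 0 Then    ' commit every 50,000 rows to release locks
                ws.CommitTrans
                ws.BeginTrans
            End If
            rsIn.MoveNext
        Loop
        ws.CommitTrans
        rsIn.Close
        rsOut.Close
    End Sub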

    While trying to figure this out I did some testing with MySQL as a backend. It worked great and gave no messages of any kind that would stop or slow the importing and processing of data. MySQL was slower on a per-record basis, but much more consistent than Access: at about 140 MB (857,876 records), Access slowed to the speed of MySQL.

    If this situation presents itself again and the end user has no option but to import a large amount of data, I will use MySQL as a backend to avoid these problems.

    Sorry, I know this doesn't give any exact findings for those of you who are having the same issue.
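
    For anyone going the same route, a minimal sketch of linking a MySQL table into the Access frontend over ODBC; the DSN and table names are placeholders, not from this thread, and the MySQL ODBC driver has to be installed and a DSN set up first.

    Code:
    ' Link a MySQL table into the current Access database via ODBC.
    ' "MyDSN" and "tblBigImport" are placeholders.
    Public Sub LinkMySqlTable()
        DoCmd.TransferDatabase acLink, "ODBC Database", _
            "ODBC;DSN=MyDSN;", acTable, "tblBigImport", "tblBigImport"
    End Sub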

  9. #9
    Join Date
    Jul 2005
    Posts
    50
    Good enough.
