Max Locks Per File

isladogs

MVP / VIP
Local time
Today, 10:23
Joined
Jan 14, 2017
Messages
18,186
Some of you will be familiar with this error:
File sharing lock count exceeded. Increase MaxLocksPerFile registry entry.

There are 2 ways of increasing the MaxLocksPerFile:
1. Permanently by changing the registry
2. Temporarily using VBA code

Both methods are explained in this link: https://support.microsoft.com/en-nz/help/815281/-file-sharing-lock-count-exceeded-error-message-during-large-transacti

In my case, I increased the locks for Access 2010 32-bit from the default 9500 to 15000 using the MaxLocksPerFile value under this registry key:
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\ACE]

However, when answering a question about this for Access 2016 (365), I checked the registry and found there is no equivalent section:
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\16.0\Access Connectivity Engine\Engines\ACE]

I found that the path is instead:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\ClickToRun\REGISTRY\MACHINE\Software\Microsoft\Office\16.0\Access Connectivity Engine\Engines\ACE
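
In case it helps anyone else checking which key their install actually uses, here's a rough VBA sketch (my own, not from the linked article) that reads MaxLocksPerFile from the candidate paths above. Note that 32-bit Office on 64-bit Windows may be redirected to the Wow6432Node branch, so treat the results with care:

Code:
Public Sub CheckMaxLocksKeys()
    ' Sketch only: report which of the candidate registry values exist.
    Dim wsh As Object
    Set wsh = CreateObject("WScript.Shell")
    Debug.Print ReadRegValue(wsh, "HKLM\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\ACE\MaxLocksPerFile")
    Debug.Print ReadRegValue(wsh, "HKLM\SOFTWARE\Microsoft\Office\16.0\Access Connectivity Engine\Engines\ACE\MaxLocksPerFile")
    Debug.Print ReadRegValue(wsh, "HKLM\SOFTWARE\Microsoft\Office\ClickToRun\REGISTRY\MACHINE\Software\Microsoft\Office\16.0\Access Connectivity Engine\Engines\ACE\MaxLocksPerFile")
End Sub

Private Function ReadRegValue(wsh As Object, regPath As String) As String
    ' RegRead raises an error if the key or value doesn't exist.
    On Error Resume Next
    ReadRegValue = regPath & " = " & wsh.RegRead(regPath)
    If Err.Number <> 0 Then ReadRegValue = regPath & " (not present)"
End Function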

I was also asked 2 other questions:
1. Is there a sensible upper limit for this value? In other words if this was set to say 1,000,000 would this cause other performance issues?
2. Is this one situation where 64-bit Access is better? In other words can it handle system resources any better than 32-bit?

As I couldn't answer either question, I'm hoping someone else can advise.
I'm fairly confident I know who this question is likely to appeal to... :)
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 05:23
Joined
Feb 28, 2001
Messages
26,996
Oddly enough, though I have Access installed on my machine (Ac2010 in my case), there is no MaxLocksPerFile ANYWHERE in my registry. Which means there is a default value somewhere in the bowels of Windows and I can't answer this question by playing with my own system. So I went online and poked around.

I have found a couple of references that suggest values in the 100K range as a target for a decent workaround.

Another reference says to look for the setting that says "Open Access Databases using record-level locking" and that setting this option bypasses the file locks issue.

A third reference suggests that there is both an Access setting for this and a Windows setting, and that Access (when running) provides a process-level override for the system setting. So there can be more than one factor in play. If you don't set the value, there is a possibility that the Access setting isn't the one in play.

Supposedly, this VBA command works if you want to experiment with 200K locks:

Code:
DBEngine.SetOption dbMaxLocksPerFile, 200000
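
For example, here's a rough sketch (table and field names are invented) of raising the limit just for one big action query and then putting it back. As far as I know, SetOption only affects the current session and never touches the registry:

Code:
Public Sub RunBigActionQuery()
    ' Sketch only: temporarily raise the per-session lock limit,
    ' run the large query, then restore the default.
    DBEngine.SetOption dbMaxLocksPerFile, 200000
    CurrentDb.Execute "UPDATE tblBigTable SET Processed = True", dbFailOnError
    DBEngine.SetOption dbMaxLocksPerFile, 9500   ' back to the default
End Sub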
 

isladogs

MVP / VIP
Local time
Today, 10:23
Joined
Jan 14, 2017
Messages
18,186
Hi Doc

Thanks for your reply.
I guessed that if anyone responded it would be you.

That's very odd that you don't have any MaxLocksPerFile registry keys.

I have both A2010 & A2016 installed on this computer & a full registry check shows I have 5 such keys:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Jet 3.x

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Jet 4.0

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\Jet 3.x

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\ACE

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\ClickToRun\REGISTRY\MACHINE\Software\Microsoft\Office\16.0\Access Connectivity Engine\Engines\ACE

The ACE keys clearly relate to the 2 versions of Access I have installed.
The Jet keys are presumably there to handle old MDB files, though why I've got 3 of those I've no idea.

I've not added any of these, so they must be written when installing or updating Office. The only change I've made is to up the values from 9500 to 15000.

If I get the error message again, I'll up it a bit further, or perhaps just do it for the database concerned using the VBA approach.

I suspect you & I found very similar sites using Google but none referenced the 2 questions I was asked:

1. Is there a sensible upper limit for this value? In other words if this was set to say 1,000,000 would this cause other performance issues?
2. Is this one situation where 64-bit Access is better? In other words can it handle system resources any better than 32-bit?

So my guess is that the answer is No to both questions.
Either that, or nobody has asked the questions before.

I've just posted the same question on the UA forum to see if anyone comes back with any more info...
http://www.utteraccess.com/forum/index.php?showtopic=2046119&st=0#entry2663183
 

Galaxiom

Super Moderator
Staff member
Local time
Today, 21:23
Joined
Jan 20, 2009
Messages
12,849
Is there a sensible upper limit for this value? In other words if this was set to say 1,000,000 would this cause other performance issues?

If there were no sensible limit, why would there be a place provided to record one?
 

isladogs

MVP / VIP
Local time
Today, 10:23
Joined
Jan 14, 2017
Messages
18,186
Hi Greg

Not sure I understand your reply correctly.
Are you suggesting any number could be used without side effects?
If so, why is it set so low by default? (9500)

Any thoughts on the 64-bit question?
 

Galaxiom

Super Moderator
Staff member
Local time
Today, 21:23
Joined
Jan 20, 2009
Messages
12,849
The existence of a limit setting suggests that there are detrimental consequences if it is set too high. However I have not found anything to suggest what those consequences might be.
 

isladogs

MVP / VIP
Local time
Today, 10:23
Joined
Jan 14, 2017
Messages
18,186
OK I see what you mean now....

Unless anyone can tell me otherwise, I'll suggest setting it to 15000 as I have done myself, then slowly increasing if problems still occur.
 

moke123

AWF VIP
Local time
Today, 06:23
Joined
Jan 11, 2013
Messages
3,849
From what I recall from a thread long ago, the max locks setting was only an issue with Novell servers, which I believe had a limit of 10,000. It may be that it is a legacy issue with no real reason to increase the default number.
 

isladogs

MVP / VIP
Local time
Today, 10:23
Joined
Jan 14, 2017
Messages
18,186
That's weird ... after 2 weeks without an answer, 2 answers in an hour - the other on UA.

All I can say is it definitely fixed my issue when I increased the max locks per file to 15000.
I'm still no wiser about the 2 questions I was asked, however:

1. Is there a sensible upper limit for this value? In other words if this was set to say 1,000,000 would this cause other performance issues?
2. Is this one situation where 64-bit Access is better? In other words can it handle system resources any better than 32-bit?

On a similar point, I recently had a System Resource Exceeded error message when copying 2.5 million records from one database to a table in another database. I'm using Access 2010.
I found a fix which involved adding a new registry key.
It worked for me and may be useful to others.

Taken from https://support.microsoft.com/en-gb/help/2726928/-system-resource-exceeded-error-message-when-you-perform-a-query-in-ac

1. Locate and then click one of the following registry subkeys:
On 32-bit versions of Access on 32-bit versions of Windows or 64-bit versions of Access on 64-bit versions of Windows:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines
On 32-bit versions of Access on 64-bit versions of Windows:
HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Office\14.0\Access Connectivity Engine\Engines

2. On the Edit menu, point to New, and then click DWORD (32-bit) Value.

3. Type WorkingSetSleep, and then press Enter.

4. In the Details pane, right-click WorkingSetSleep, and then click Modify.

5. In the Value data box, type 1, and then click OK.

6. Exit Registry Editor.
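
If you'd rather script the change than use Registry Editor, here's a rough sketch using WScript.Shell (my own addition, not from the Microsoft article). It writes to HKLM so it needs to be run elevated, and the path must match your install (Wow6432Node or Click-to-Run variants as mentioned above):

Code:
Public Sub SetWorkingSetSleep()
    ' Sketch only: creates/updates the WorkingSetSleep DWORD described above.
    ' Path shown is for 32-bit Access 2010 on 32-bit Windows.
    Dim wsh As Object
    Set wsh = CreateObject("WScript.Shell")
    wsh.RegWrite "HKLM\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\WorkingSetSleep", _
                 1, "REG_DWORD"
End Sub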
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 05:23
Joined
Feb 28, 2001
Messages
26,996
Interesting. WorkingSetSleep sounds like a behavior-tuning parameter that causes your process to pause while the system is adjusting your working set size. It is theoretically possible to not do this but I'm damned if I know why it WOULD NOT be set. On my virtual-oriented mainframe system (running OpenVMS), it was automatic that if your system needed working-set adjustment, you would enter an involuntary wait state until the page manager stuck its fingers in the pie.

If my interpretation is correct, then you were running while the system was diddling with your working set in the background and you got a "working set collision" (trying to modify memory pages while the system was working in that area) which caused a memory allocation request to fail, hence the System Resources Exceeded. (Though I might have expected a different system error code than that one.)

As to why I don't have the MaxLocksxxx parameters in my registry... if you never diddle with them, then defaults apply and they are adequate for most home projects. My home system is never called upon to deal with more than a couple of files at a time AND I am picky about immediately closing what I open. (That should not be a surprise to anyone on the forum who knows my viewpoints.)

The application that WOULD have hit this barrier is behind locked doors at the place from which I retired and I can't get there from here any more. My clearance has expired so now I'm just another old toot with no security clearance.
 

isladogs

MVP / VIP
Local time
Today, 10:23
Joined
Jan 14, 2017
Messages
18,186
Hi Doc

From one old toot to another....

Hard to know exactly what the registry hack did, but it worked, which was the main thing.
TBH I knew that trying to copy fields from 2.5 million records was going to put a strain on my old PC, but the hack meant it worked.

As for MaxLocksPerFile, I'd never touched those registry values till recently & certainly never added any of the 5 reg keys that I had.
But I believe the default key(s) would still be installed automatically with Access.

As I've had no answers to my 2 questions, I've told my questioner, no idea!

This was the response at UA - see post #2 - no comment!
http://www.utteraccess.com/forum/index.php?showtopic=2046119&hl=
The links provided are a 'real education' as well
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 05:23
Joined
Feb 28, 2001
Messages
26,996
Ridders, I am going to take a wild-eyed guess on this one. Some of it is speculative and I freely admit I am drawing on general operating system experience. Regarding the MaxLocksPerFile parameter:

1. Is there a sensible upper limit for this value? In other words if this was set to say 1,000,000 would this cause other performance issues?

I'd be lying if I said I knew my next comments for an absolute fact, but...

The MaxLocksPerFile parameter deals with a system feature called the Lock Manager. A lock is a data structure within WINDOWS (not necessarily just Access) that allows a program to create a software lock on an object and then control who can access the locked item (and, more importantly, HOW they are allowed to access it.)

When you lock a file, you establish information in the system data structure for that file so that other users attempting to use the file will have to ask you (and any other lock holder) for permission to use the file. Then, if you hold a lock and are waiting for it to happen, a user asking for access to the file triggers an event in your process. When you take the event, you know what process was asking and what they were asking to do. (And the time, of course.)

You can lock something but allow others to read it, for example. You can lock a file exclusively so nobody else can read or write in it. Or you can lock something simply because you wanted to track who was "touching" the object (sometimes called an "interest" lock.)

This is supported by data structures in the Windows pool and the lock management is honored by the device driver, which is after all the thing that you call to open a file. The following article shows you how to get to the list of open files, ALL of which are locked against deletion by other processes.

https://superuser.com/questions/117902/find-out-which-process-is-locking-a-file-or-folder-in-windows

In general, the locking structure is taken from one of the memory pools.

https://msdn.microsoft.com/en-us/library/windows/desktop/aa965226(v=vs.85).aspx

These pools dynamically change size as needed, but if it is the non-paged pool (and I think all file locks ARE in the NPP), then physical memory will constrain how big the NPP can grow. So performance-wise, setting MaxLocksPerFile to a very high number allows the pool to get very large. But NPP cannot be paged (by definition) so letting it get big will take away from physical memory. AND because the file system is a Windows function (as opposed to an Access function), the NPP comes from the SYSTEM memory, not from a process area.

Actually, that's not entirely true, because if you opened a LOT of files, you need a per-process list of files that will be automagically closed if your process exits without doing its own closure. So you will have a bunch of file-handle pointers somewhere in your local memory to support something called process rundown.

https://msdn.microsoft.com/en-us/library/windows/desktop/aa364225(v=vs.85).aspx

In the last paragraph of the above link, there is a comment that clarifies that a file access structure is partly private, partly shared by other Windows users/processes.

Therefore, I must say that the number of locks per file, because it uses a lot of physical memory, CAN reach a point where you will affect performance by reducing the number of free pages available for dynamic memory management and virtual memory activities.

The thing that barfs first is hard to predict. It MIGHT be the page file (virtual memory file) that is adjusted from the System's Advanced Performance dialog. It MIGHT be that you run out of physical memory because of loading down the NPP or the system's lower limit on free pages.

https://blogs.msdn.microsoft.com/ti...-windows-memory-management-revealed-part-two/

The above article explains Windows paging dynamics including the various page lists. What it doesn't show quite as well is that the NPP area consumes Free Pages over time, thus reducing the pages available for other work. The size of the file handle structures and file internal lock structures isn't that big, but a million of ANYTHING is a lot to manage, and you said your question was about 2.5 million records. I believe that you implicitly create at least a million file internal lock structures when you do that.

2. Is this one situation where 64-bit Access is better? In other words can it handle system resources any better than 32-bit?

Because the structures for file handles probably changed for 64-bit Windows due to addresses getting bigger, you might be able to have more of them, but the truth is that the same argument still applies. The more of these things you have allocated, the more physical memory you will use. The only difference is that if you have a system with not merely the Very Large Memory model but the Ultra Large Memory model, Win64 will have more memory to allocate to you before it barfs.

VLM is what lets you get to 8 GB or 16 GB of RAM. ULM is for time-sharing systems where it is not impossible to have 192 GB of RAM. Note also that many virtual hosting systems (VMWare, for example) can limit each individual virtual host to the 32-bit addressing models, where the Large Memory Model (LMM) limits each virtual environment to 4 GB of RAM. In THOSE environments, the MaxLocksPerFile parameter can be quite limiting.
 

isladogs

MVP / VIP
Local time
Today, 10:23
Joined
Jan 14, 2017
Messages
18,186
Hi Doc

Thanks for the time you spent considering these questions
It may all be speculative but on first reading it all sounded highly convincing ....

I'll read the links you provided tomorrow as it's late here & I won't take it in now.

Rather than try to précis your comments, I'm just going to refer the person who asked me the 2 questions to your post.

=======================================
Just one thing.
The 2.5 million records was a separate situation, which triggered the system resource exceeded error.
However, it didn't trigger the max locks per file error - that last happened to me a few weeks ago before I posted this thread.

For now I'm keeping my max locks values at 15000 (up from the default 9500). If the problem arises for me again, I'll try 20000...

Thanks again. I appreciate your time on this
 

static

Registered User.
Local time
Today, 10:23
Joined
Nov 2, 2015
Messages
823
I was also asked 2 other questions:
1. Is there a sensible upper limit for this value? In other words if this was set to say 1,000,000 would this cause other performance issues?
2. Is this one situation where 64-bit Access is better? In other words can it handle system resources any better than 32-bit?

As I couldn't answer either question, I'm hoping someone else can advise.
I'm fairly confident who this question is likely to appeal to... :)

Considering there are many ways of doing the same thing in Access, better questions would probably be 'why am I being told to change this setting?' and 'can I use different code that means I don't need to?'.
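
For what it's worth, here's a rough sketch of the 'different code' idea (table names are invented, and it assumes an ascending numeric primary key called ID): appending in batches means no single statement needs anywhere near the default lock limit.

Code:
Public Sub AppendInBatches()
    ' Sketch only: copy tblSource into tblTarget in ID-range batches so each
    ' INSERT holds far fewer locks than a single multi-million row statement.
    Const BATCH_SIZE As Long = 5000      ' tune to taste
    Dim db As DAO.Database
    Dim lo As Long, hi As Long
    Set db = CurrentDb
    lo = Nz(DMin("ID", "tblSource"), 0)
    hi = Nz(DMax("ID", "tblSource"), -1)
    Do While lo <= hi
        db.Execute "INSERT INTO tblTarget SELECT * FROM tblSource " & _
                   "WHERE ID BETWEEN " & lo & " AND " & (lo + BATCH_SIZE - 1), dbFailOnError
        lo = lo + BATCH_SIZE
    Loop
End Sub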
 

isladogs

MVP / VIP
Local time
Today, 10:23
Joined
Jan 14, 2017
Messages
18,186
Considering there are many ways of doing the same thing in Access, better questions would probably be 'why am I being told to change this setting?' and 'can I use different code that means I don't need to?'.

Good point as far as question 1 is concerned.
However, if you ever experience the error yourself, you'll be hard pressed to work out just why it occurred ON THAT OCCASION and what you'd need to alter.

That doesn't apply to question 2 though, which in my mind is the more interesting one. I've never yet heard of any benefit to 64-bit Access. This just could be the exception.
 

isladogs

MVP / VIP
Local time
Today, 10:23
Joined
Jan 14, 2017
Messages
18,186
Over 5 years on, I'm increasingly finding benefits to using 64-bit Access due to its ability to handle more system resources/memory - useful when handling large datasets.

I thought I'd ask ChatGPT:
1. Is there a sensible upper limit for MaxLocksPerFile value in Access? In other words if this was set to say 1,000,000 would this cause other performance issues?
2. Is this one situation where 64-bit Access is better? In other words can it handle system resources any better than 32-bit?
This was the response
  1. Yes, setting a very high value for MaxLocksPerFile can cause performance issues and potentially lead to database corruption. The exact upper limit will depend on various factors such as the hardware and size of the database, but Microsoft recommends a value of no more than 20,000 locks per file.
  2. In general, 64-bit Access may be able to handle larger databases and larger datasets than the 32-bit version, as it can access more system resources. However, whether it is "better" in this situation will depend on the specific requirements and limitations of the system being used.
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 05:23
Joined
Feb 28, 2001
Messages
26,996
Now if only MS would finally bite the bullet and transform internal pointers for the database, which would break the 2GB barrier. But of course, doing that would make ACE a competitor to SQL Server, so unlike the sound barrier, that's one barrier we will probably never break.
 

InstructionWhich7142

Registered User.
Local time
Today, 10:23
Joined
Feb 24, 2010
Messages
199
I went through this thread and some others. For reasons of ODBC compatibility with FoxPro I'm stuck on 32-bit, and Access 2016/Office 365 has only recently been made Large Address Aware, so you may find your issues have resolved themselves. If you're still on 2010, it turns out this simple utility lets you change it to 4GB without issue; I've been running like that for a year or two on 2010 without problems:

Code:
Large Address Aware.exe

You can check your memory usage with:

Code:
vmmap

Amusingly, I just found this as well.

 
