
What is the lock limit on a file?


lafalafa

Programmer
Sep 8, 2010
18
CA
I got an error stating that the open lock limit has been reached. Currently I am using code with the following logic:

PERFORM UNTIL X > 1980
    ADD 1 TO X
    READ NETD-FILE WITH LOCK KEY IS ...
    MOVE some values TO the fields of NETD-REC
    REWRITE NETD-REC WITH UNLOCK
END-PERFORM.

I got a lock-limit error.

Can you explain what the limit is when a file is opened in lock mode?


 
Using which release of which compiler on which platform?

Did you configure your environment or is this part of a centrally managed environment?
 
This is a run-time error. We are using a Tandem application server with the NonStop Kernel operating system, and we use different environments for different functions. Currently I am dealing with an online function whose code is written in COBOL; I already described the logic where I got the error. When I select a smaller number of deals, I do not get the error.
 
We understand the problem is a matter of volume. . .

The way limits are set often depends on which compiler is being used.

As you have not posted which compiler is being used, I suggest you speak with the technical support people who are responsible for the compiler. If all else fails, you could ask the compiler vendor.
 
This is the procedure containing the logic I mentioned. The procedure is called for every record that has to be updated; there are 2000 records in total.

***********************************************************
F310-UPDATE-NETD SECTION.
*-------------------------*
BEGIN-SECT.
*
    SET UPDATE-SETT TO TRUE

    MOVE DEALS-NETTED OF NETM-REC (WS-SUB)
      TO PRIMARY-ID OF NETD-REC

    MOVE "READ" TO MESG-OPERATION
    READ NETDFILE WITH LOCK KEY IS PRIMARY-ID OF NETD-REC END-READ

    IF NOT OK OF FILE-STATUS
        MOVE SPACES TO WS-ERROR-TEXT-S
        MOVE FILE-ERROR TO WS-MSG-NBR
        MOVE FILE-STATUS TO WS-ERROR-TEXT-S
        MOVE PRIMARY-ID OF NETD-REC TO WS-ERROR-TEXT-S (4:11)
        MOVE "NETD-REC" TO WS-ERROR-TEXT-S (17:)
        PERFORM U000-LOG-ERROR-ABEND
    END-IF

    IF DEAL-SELECT OF NETM-REC (WS-SUB) = "Y" OR "S"
        IF DEAL-STATUS OF NETD-REC = "AN"
            MOVE WS-PAY-REF TO PAYMENT-NBR OF NETD-REC
            MOVE "PN" TO DEAL-STATUS OF NETD-REC
        END-IF
        IF DEAL-STATUS OF NETD-REC = "AG"
            MOVE WS-PAY-REF TO PAYMENT-NBR OF NETD-REC
            MOVE "PG" TO DEAL-STATUS OF NETD-REC
            SET PAID-POST-NET TO TRUE
        END-IF
    ELSE
        SET DONT-UPDATE-SETT TO TRUE
        IF DEAL-STATUS OF NETD-REC = "AN"
            MOVE "AG" TO DEAL-STATUS OF NETD-REC
            MOVE "UPDATE" TO MESG-OPERATION
            ADD 1 TO WS-DEAL-COUNT

            REWRITE NETD-REC WITH UNLOCK END-REWRITE
*           also update the status for the other side of the deal
            IF BS-IND OF NETD-REC = "B"
                MOVE "S" TO BS-IND OF NETD-REC
            ELSE
                MOVE "B" TO BS-IND OF NETD-REC
            END-IF
            MOVE "READ" TO MESG-OPERATION
            READ NETDFILE WITH LOCK KEY IS PRIMARY-ID OF NETD-REC END-READ
            IF DEAL-STATUS OF NETD-REC = "AN"
                MOVE "AG" TO DEAL-STATUS OF NETD-REC
*               update the deal count on the NETT file
                MOVE GENERIC-KEY OF NETD-REC
                  TO PRIMARY-ID OF NETT-REC

                MOVE "READ" TO MESG-OPERATION
                READ NETTFILE WITH LOCK KEY IS PRIMARY-ID OF NETT-REC END-READ

                IF OK OF FILE-STATUS
                    SUBTRACT 1 FROM DEAL-COUNT OF NETT-REC

                    MOVE "UPDATE" TO MESG-OPERATION
                    REWRITE NETT-REC WITH UNLOCK END-REWRITE
                END-IF
            END-IF
        END-IF
    END-IF

    MOVE "UPDATE" TO MESG-OPERATION
    REWRITE NETD-REC WITH UNLOCK END-REWRITE
    .
END-SECT.
    EXIT.
***********************************************************

 
The compiler is a COBOL 85 compiler.
That is the level of COBOL you are using; there are many COBOL 85 compilers.

By "which compiler" I mean the product (e.g. Micro Focus, Realia, etc.).

Is this Linux, UNIX, Windows or something else?

Who supports the compiler for your organization? Hopefully, they have the product documentation and the selections made at installation.

I suspect that the code is not the issue, but rather exceeding the limit for the environment.
 
I don't know the level, but we are running the COBOL85 compiler on the Tandem D45 operating system. The OS is not Linux or UNIX; it is an entirely different one known as the "Himalaya K-series Guardian operating system". Please give the possible limits for the different levels of COBOL. The run succeeds when dealing with 1000 deals, but when I execute it for 2000 deals I get this issue.
 
Yes, today I found that it processes 1661 deals successfully, and when it comes to the 1662nd deal it abends. The file is a key-sequenced file, so I am thinking the lock limit may be 1661. I am eagerly waiting for your suggestion. Can you provide a way to resolve this problem, so that we can process more than 1661 records in the same program?
 
Someone supports the compiler on the Tandem system. . . You need to work with them to determine whether this is a parameter that can be changed and, if it can be changed, what problems might occur due to the increase.

You might also consider breaking the "deals" into "batches" making sure that each batch does not exceed the limit.
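The batching idea could look roughly like the sketch below. This is only an illustration: WS-BATCH-SIZE, WS-DEAL-TOTAL, WS-IN-BATCH and F310-UPDATE-NETD are hypothetical names, and it assumes the environment releases a process's record locks when the file is closed, which should be verified for the platform in question.

```cobol
      * Sketch: process the deals in batches that stay under the
      * lock limit. All data names here are illustrative.
           MOVE 1500 TO WS-BATCH-SIZE
           MOVE 0    TO WS-SUB
           PERFORM UNTIL WS-SUB >= WS-DEAL-TOTAL
               MOVE 0 TO WS-IN-BATCH
               PERFORM UNTIL WS-IN-BATCH >= WS-BATCH-SIZE
                          OR WS-SUB >= WS-DEAL-TOTAL
                   ADD 1 TO WS-SUB
                   ADD 1 TO WS-IN-BATCH
                   PERFORM F310-UPDATE-NETD
               END-PERFORM
      *        End the batch: close and reopen the file so the
      *        next batch starts with a fresh set of locks.
               CLOSE NETDFILE
               OPEN I-O NETDFILE
           END-PERFORM
```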
 
Thanks for your valuable answer. Today we found that the Guardian environment in which we are running the program stops the process when it exceeds the lock limit, stating: "unable to obtain i/o processing control block...the transaction or open lock unit limit has been reached".

In the manual we checked the corresponding explanation for the error: "All I/O process control blocks are in use, or a requester tried to acquire too many record locks or file locks". Is there any way to make those I/O process control blocks reusable within the same process?

Please help me.
 
It looks as if your operating system has a limit that you have reached. If there is no way around this, then as papadba has already suggested, try batching the process up in some way.

Have you tried closing/unlocking/re-opening the file when this error occurs?

Marc
 
Thanks for the response you have given.

Yes, my manager suggested closing and reopening the file after every 1000 records processed. We tried that as well but got the same error.

Currently the code takes a record lock on every record as it reads it and unlocks it after the rewrite. When it comes to the 1662nd record (that is, after 1661 locks) it abends. So instead of record locking, we modified the code so that it locks the file once, performs the read and rewrite operations on the records without record locks, and unlocks the file only after completing the updates on all the necessary records (2500 records). This resolved the problem and the run now succeeds.

But my doubt is that updating that many records takes considerable time, and for all that time the file is locked, so other users cannot perform similar updates on their own records in the same file.

Record locking is preferable, but with it we get the error; file locking may affect other transactions.

I am waiting for a response. Thank you.
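The file-lock variant described in the post above can be sketched roughly as follows. LOCKFILE and UNLOCKFILE are Tandem (HP NonStop) COBOL85 extensions, and all data names here are hypothetical; note that while the file is locked, other processes' accesses against it will be blocked or rejected.

```cobol
      * Sketch: lock the whole file once, update every record
      * without per-record locks, then release the file.
      * LOCKFILE/UNLOCKFILE are Tandem COBOL85 extensions;
      * data names are illustrative.
           OPEN I-O NETDFILE
           LOCKFILE NETDFILE

           PERFORM VARYING WS-SUB FROM 1 BY 1
                   UNTIL WS-SUB > WS-DEAL-TOTAL
               MOVE DEALS-NETTED OF NETM-REC (WS-SUB)
                 TO PRIMARY-ID OF NETD-REC
               READ NETDFILE KEY IS PRIMARY-ID OF NETD-REC END-READ
               MOVE WS-NEW-STATUS TO DEAL-STATUS OF NETD-REC
               REWRITE NETD-REC END-REWRITE
           END-PERFORM

           UNLOCKFILE NETDFILE
           CLOSE NETDFILE
```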
 
Write a small program that splits the transactions into separate files of 1500 or less each. Then run each set with the record lock.
 
You may want to invest some time to determine why the lock count is not reduced when a lock is freed.

Hopefully, the limit applies to the number of concurrent locks (rather than the maximum that can be handled in a single execution).
 
Thanks for your suggestion.

I will try to find the reason.

The error which I got is:

35 - Unable to obtain an I/O process control block, or the transaction or open lock unit limit has been reached.


Cause. All I/O process control blocks are in use, or a requester tried to acquire too many record locks or file locks. This message is returned for a privileged operating system call.

Effect. The procedure sets the error code and returns without performing the requested operation.

Recovery. Wait, then try again. Check the system for processes that are performing too many concurrent I/O operations, or rewrite the application to request fewer locks.

 
Hi, finally we resolved the problem by using file locking instead of record locking.

As file locking is not advisable, we are trying to achieve the same result with record locking.

After processing 1600 records, we invoke the same program again in a new process to handle the remaining records, because we found that the lock limit applies per process. So if the remaining records are processed in a new process, the problem should be resolved.

Is our thinking correct? Can you suggest anything?

Thanks
 