
Functions crashing and taking a thread?

Apr 5, 2005

On our LL instance (9.2.0.sp1) we seem to have a problem with the handling of threads. We've seen it happen with the functions (in threadstatus) ll.<number>.functionsMenu and ll.delete.

In the case of the functions menu, something on the client side goes wrong (probably in the JVM). As a result the request on the server side is never cancelled, nor does it time out. In our case this problem consumes a quarter of the threads, and in the end the service is so slow we have to restart it.

Does any of you have experience with this? Any solutions? Is it possible to kill thread processes by hand on the server?

Thanks in advance,

Koen
 
Have you done any performance benchmarks on threads, i.e. the number needed for your instance? Thread contention will always be there, as transactions in Livelink are always different. The only ways to alleviate this are a) using a load balancer, b) vertical scaling, and c) isolating LAPI traffic.
Greg has a utility on his website that limits the number of subitems in a delete transaction.
A performance benchmark ($$$) by OT is worthwhile to learn the finer semantics of your system. It is worth the money spent, as it will point out deficiencies in llserver, the database, everything you can think of.

Well, if I called the wrong number, why did you answer the phone?
James Thurber, New Yorker cartoon caption, June 5, 1937
 

In fact, we're currently setting up a new load-balanced environment. But in the meantime I'm wondering why processes get stuck and keep using threads.

The problem is reproducible from one workstation, with a newly installed as well as a rolled-back Java VM. The functionMenu process never dies and keeps taking a thread.

- Is this known LL behavior to you?
- Is there any way to kill a process to free up the thread, other than restarting Livelink?

Thanks

 
Please pose this as a question to Howard Pell or Hans Stoop in the OpenText forum. As far as I know they are SDK gurus, and they may have written something of that sort. You can have a go at it yourself if you care to look at the request handler ll.threadstatus. I think it shows you thread allocation and stuff like that.
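If you want to keep an eye on it without sitting in the admin pages, something like the sketch below could poll that handler over HTTP. To be clear, the base URL, the func name and the lack of authentication are all placeholder assumptions on my part; check your own instance for how the ll.threadstatus handler is actually exposed and how it expects you to log in.

import urllib.parse
import urllib.request

# ASSUMPTIONS: the CGI endpoint and the func name below are hypothetical
# placeholders; look up the real request handler name and URL on your own
# instance, and add whatever authentication your server requires.
BASE_URL = "http://llserver/livelink/livelink.exe"
PARAMS = {"func": "threadstatus"}

def fetch_thread_status():
    url = BASE_URL + "?" + urllib.parse.urlencode(PARAMS)
    with urllib.request.urlopen(url, timeout=30) as resp:
        return resp.read().decode("utf-8", errors="replace")

if __name__ == "__main__":
    # Prints the raw page; in practice you would scrape the thread table out
    # of the HTML to see which threads have been busy the longest.
    print(fetch_thread_status())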

Well, if I called the wrong number, why did you answer the phone?
James Thurber, New Yorker cartoon caption, June 5, 1937
 
We have seen the issue you identified before. It typically happens when there is a mismatch between Livelink and supported software. One example was running Livelink on a new version of Solaris. All kinds of threads started hanging. The OS, Web Server and database are the key things to check to make sure they are supported versions.

The next thing that typically causes this problem is custom modules or custom changes. If they are a problem you will typically see trace files being produced in your opentext/logs directory. These files start with "trace".

The other key thing to check is system resources. Livelink will do some crazy things if you run out of system resources. We saw a Unix instance where it ran out of file handles and all sorts of things started falling apart. I don't think that's your problem, since your problem seems isolated.

The good news is that it is definitely something in your environment, since lots of organizations are running this version without the problem you are having. You can turn on logging and check your thread.out files. They will show when threads hang and what caused it. It will be the last entry in the specific thread.out file. For example, if thread one hangs, the thread1.out file will have a final entry much earlier than thread2.out if that one is still working. The thread.out file lists the time, the function called and other key info to help narrow the problem down.
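A rough sketch of that check, in case it helps: the script below walks the thread*.out files and flags any log that stopped updating long before the others. The log directory path and the ten-minute threshold are assumptions on my part; point LOG_DIR at your own opentext/logs directory and pick a threshold that fits your traffic.

import glob
import os
import time

# ASSUMPTION: adjust this to your own opentext/logs directory.
LOG_DIR = "/opt/opentext/logs"
IDLE_THRESHOLD = 600  # seconds of silence before we flag a thread (arbitrary)

def last_activity(path):
    """Return (seconds since the file last changed, last log line)."""
    age = time.time() - os.path.getmtime(path)
    with open(path, "r", errors="replace") as f:
        lines = f.readlines()
    return age, (lines[-1].strip() if lines else "")

def report():
    for path in sorted(glob.glob(os.path.join(LOG_DIR, "thread*.out"))):
        age, last_line = last_activity(path)
        flag = "  <-- possibly hung" if age > IDLE_THRESHOLD else ""
        print(f"{os.path.basename(path)}: idle {age:.0f}s{flag}")
        print(f"    last entry: {last_line}")

if __name__ == "__main__":
    report()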
 