
browser connection timing out

Status
Not open for further replies.

tonykent (IS-IT--Management) · Jun 13, 2002
Happy New Year to one and all. I was working on this problem last year and still haven't managed to resolve it. Can anyone here suggest a possible solution?

I'm currently producing an automated build deployment tool for the developers I support. It will allow them to produce a release candidate and deploy it to a server simply by opening a browser, filling in a form and clicking 'go'. I'm using Perl CGI scripts to read the browser form and use those details to run the relevant SQL queries and build commands. When the build is ready for deployment, that is when my problem starts.

There are several thousand files to write to a server, and I find that the browser always times out when the writing is about half way through. Writing the files always proceeds to completion, but the connection between the browser and the running script is gone by then, so I can't run further commands, such as reporting on the success of the write, compiling a list of the build contents and closing the database session.

I don't know if I should have posted this elsewhere (perhaps a browser forum), but thought I'd start off with the forum I know best given that the main functionality is perl.

I have tried addressing it from the Firefox end by setting network.http.keep-alive=true and increasing the timeout period to 500 and then 1000, but this has not resolved the issue at all; the connection is still always lost in the middle of writing the files (incidentally, the total write time is usually 5-10 minutes).

Has anyone overcome a similar issue with an ingenious perl solution?
 
Hi

tonykent said:
setting network.http.keep-alive=true and increasing the timeout period to 500 and then 1000
Absolutely useless here. Keep-alive is used to serve more than one request over a single connection: the connection is kept open after a response finishes so that subsequent requests can reuse it.

You forgot to specify what response you get from the web server. I suppose it is 408 Request Timeout, which would unequivocally indicate that the timeout comes from the server. But anyway, I am quite sure this is the problem. Check your web server's log and configuration. ( You also forgot to mention which web server you use. )

For a simple solution, I would look at the underlying operating system's capabilities ( you also forgot to mention the operating system ) to ensure the build script is not interrupted when the parent Perl process is terminated by the web server on timeout.
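On Linux, one common way to use those OS capabilities is to fork the build into its own session before the web server can kill the CGI process. The sketch below is illustrative, not from the thread: `run_detached`, the `/bin/sh -c` invocation and the log path are all assumptions.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

# Detach a long-running command from the CGI process so the build
# survives when the web server terminates the script on timeout.
# run_detached() and its arguments are illustrative assumptions.
sub run_detached {
    my ($cmd, $logfile) = @_;
    defined( my $pid = fork() ) or die "fork failed: $!";
    return $pid if $pid;            # parent (the CGI) carries on immediately
    setsid();                       # child: new session, outside the server's process group
    open STDIN,  '<',  '/dev/null' or die "stdin: $!";
    open STDOUT, '>>', $logfile    or die "stdout: $!";
    open STDERR, '>&', \*STDOUT    or die "stderr: $!";
    exec '/bin/sh', '-c', $cmd;    # replace the child with the build command
    die "exec failed: $!";
}
```

The CGI script can then report "build started" straight away and let the detached process finish on its own, with its progress going to the log file.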

For a better solution, I would write a builder daemon so that:
[ul]
[li]the Perl script signals to the builder daemon that a new build should be done, then generates an almost empty HTML document[/li]
[li]the builder daemon performs the build and writes its output to a log file[/li]
[li]JavaScript in that almost empty HTML document checks every couple of seconds for news in the log file on the server and updates the page[/li]
[/ul]
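The first step above could be as simple as the CGI script dropping a request file into a spool directory the daemon watches. A minimal sketch, assuming a file-per-job spool layout; `queue_build` and the naming scheme are illustrative, not anything specified in the thread:

```perl
use strict;
use warnings;
use Fcntl qw(O_CREAT O_EXCL O_WRONLY);

# Record a build request in a spool directory watched by the builder
# daemon; returns the job id the status page can poll on.
# queue_build() and the spool layout are illustrative assumptions.
sub queue_build {
    my ($spool, %form) = @_;
    my $job_id = time() . "-$$";               # simple unique-enough id
    sysopen my $fh, "$spool/$job_id.req", O_CREAT | O_EXCL | O_WRONLY
        or die "cannot queue job: $!";
    print $fh "$_=$form{$_}\n" for sort keys %form;
    close $fh or die "close failed: $!";
    return $job_id;
}
```

The CGI script would print the almost empty HTML page containing the job id, and the daemon would pick up `*.req` files, run the build and append to a per-job log for the JavaScript poller to read.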


Feherke.
 
Hi feherke, thanks for the comments. I'm running this on Red Hat Enterprise linux; the web server appears to be Apache-Coyote/1.1.

From /var/log/error.log I see

Code:
[Tue Jan 03 15:13:29 2012] [warn] [client 192.168.1.**] Timeout waiting for output from CGI script /var/www/cgi-bin/createbaseline.cgi, referer: http://myserver/cgi-bin/find_conflicts.cgi
[Tue Jan 03 15:13:29 2012] [error] [client 192.168.1.**] (70007)The timeout specified has expired: ap_content_length_filter: apr_bucket_read() failed, referer: http://myserver/cgi-bin/testfind_conflicts.cgi
 
Hi

So far it seems the web server got bored while waiting for the Perl script, so browser configuration cannot solve it. The exact response code usually appears only in the access log. Thinking again, it is possible the response is something other than 408. Maybe 500. Anyway, what I wrote earlier seems to apply.
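For what it's worth, that "Timeout waiting for output from CGI script" warning is what Apache httpd (2.2-era) logs when its core Timeout directive expires with no output from the script. Raising it is the blunt workaround while a cleaner fix is put in place; the value below is illustrative:

```apache
# httpd.conf - the default Timeout is 300 seconds, while the write
# reportedly takes 5-10 minutes; 1200 is an illustrative value
Timeout 1200
```

Note this directive is server-wide, so raising it affects every request, not just the build script; the daemon approach avoids needing it at all.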



Feherke.
 
You will also have problems when people click the stop button, or the browser crashes, or their connection has a hiccup.

I suggest taking the form values, storing them somewhere and assigning a job ID (epoch? auto-increment in a db?) and a status. Then have a local job running (cron, daemon, whatever) that reads those values and does what needs to be done, while updating the status (i.e. started, processing, updating db, done, whatever). Make sure you have a web page, one that auto-updates, where the user can watch the deployment happening.
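The job-ID-plus-status idea can be sketched with two small helpers, using one status file per job that the worker overwrites as it progresses and the status page reads back on each refresh. The helper names and file layout are illustrative, not from the thread:

```perl
use strict;
use warnings;

# One status file per job: the worker calls set_status() as it moves
# through started/processing/done, and the status web page calls
# get_status() on each refresh. Names and layout are illustrative.
sub set_status {
    my ($dir, $job_id, $status) = @_;
    open my $fh, '>', "$dir/$job_id.status" or die "status write: $!";
    print $fh "$status\n";
    close $fh or die "close failed: $!";
}

sub get_status {
    my ($dir, $job_id) = @_;
    open my $fh, '<', "$dir/$job_id.status" or return 'unknown';
    chomp( my $status = <$fh> );
    return $status;
}
```

A database table with a status column would serve the same purpose and also gives you the auto-increment job ID for free.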

This also separates the user account that runs your website from the one that runs the commands. A lot of places like to keep these completely separate.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
[noevil]
Travis - Those who say it cannot be done are usually interrupted by someone else doing it; Give the wrong symptoms, get the wrong solutions;
 
Thanks to both of you for taking the time to comment.

Regarding the access.log, the final line in there just states:

Code:
192.168.1.43 - - [03/Jan/2012:15:10:57 +0000] "POST /cgi-bin/createbaseline.cgi HTTP/1.1" 200 1266 "[URL unfurl="true"]http://myserver/cgi-bin/testfind_conflicts.cgi"[/URL] "Mozilla/5.0 (Windows NT 5.1; rv:8.0.1) Gecko/20100101 Firefox/8.0.1"

You've given me plenty to think about :)
 