
Creating FTP Progress Bar

Status
Not open for further replies.

1DMF

Programmer
Jan 18, 2005
8,795
GB
Right, please correct me if I'm wrong...

When a form upload is submitted, it calls the script in the 'action' attribute.

I've read that the file is actually transferred via HTTP before the script then runs.

Is this correct?

If so, where is the data stored on the server before it is available through STDIN?

I'm trying to give the surfer a progress bar for their file upload, and I'm struggling to understand how you could do this: if, firstly, you can't tell the size of the file before the transfer starts, and the script isn't run until after the upload, well, then it's too late for a progress bar.

Can anyone shed some light on this please?

Cheers
1DMF



"In complete darkness we are all the same, only our knowledge and wisdom separates us, don't let your eyes deceive you."

"If a shortcut was meant to be easy, it wouldn't be a shortcut, it would be the way!"
 
In an HTTP request involving the uploading of a large file, headers from the client are sent as normal followed by the file's data, like:

Code:
POST /upload.cgi HTTP/1.1
Host: mydomain.com
User-Agent: Mozilla/4.0
Content-Type: multipart/form-data
Content-Length: 655312

<binary upload file data comes here>

(what follows is the server's response)
HTTP/1.1 200 OK
Content-Type: text/html
Content-Length: 3250

<html>
<head>
...

By that, I would imagine that the web server spends most of its time reading your uploaded file, storing it somewhere temporarily, and then calling the CGI script with it.

However, the CGI module has variables such as $CGI::POST_MAX and $CGI::DISABLE_UPLOADS which can limit the size of POSTed data or disable uploads entirely, so if a user uploads to a CGI script that has disabled them, or uploads far too much content, the script can automatically return an error and not even try to run.
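For reference, those guards can be set near the top of a script. A minimal sketch (the 10 MB cap is an arbitrary example value, but $CGI::POST_MAX, $CGI::DISABLE_UPLOADS and cgi_error() are documented CGI.pm features):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

# Refuse request bodies over ~10 MB before slurping them
# (the cap itself is an arbitrary example value).
$CGI::POST_MAX = 10 * 1024 * 1024;

# Uncomment to reject file uploads outright:
# $CGI::DISABLE_UPLOADS = 1;

my $q = CGI->new;
if ( my $err = $q->cgi_error ) {
    # e.g. "413 Request entity too large" when POST_MAX is exceeded
    print $q->header( -status => $err ), "Upload rejected: $err\n";
    exit;
}
```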

This might be because CGI scripts tend to read <STDIN> into a variable, so a 4 GB upload results in a variable containing 4 GB of data (the script could be coded to write the upload directly to a file instead, but I think a lot of them don't do that).

Anyway, I don't know of a way to show a progress bar with Perl. I've seen a few concept scripts though. One of them basically had the CGI script fork(), where one fork writes the upload to a file, the other fork checks the file size as it's being written to disk and writes out HTML code for a progress bar with a meta refresh every few seconds.
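That fork() concept might look roughly like this toy sketch (the temp path is invented, and a real version would print an HTML page with a meta refresh rather than plain text):

```perl
#!/usr/bin/perl
# Sketch only: parent streams STDIN (the upload body) to a temp file
# while a forked child polls that file's size for progress.
use strict;
use warnings;

my $upload = "/tmp/upload.$$";            # invented temp path
my $total  = $ENV{CONTENT_LENGTH} || 0;   # size promised by the client

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ( $pid == 0 ) {
    # Child: watch the file grow, once a second.
    while ($total) {
        my $sofar = ( -s $upload ) || 0;
        printf "Progress: %d%%\n", 100 * $sofar / $total;
        last if $sofar >= $total;
        sleep 1;
    }
    exit 0;
}

# Parent: copy the request body to disk in chunks.
open my $out, '>', $upload or die "open $upload: $!";
binmode $out;
while ( read( STDIN, my $buf, 8192 ) ) {
    print {$out} $buf;
}
close $out;
waitpid( $pid, 0 );
unlink $upload;
```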


Cuvou.com | My personal homepage
Code:
perl -e '$|=$i=1;print" oo\n<|>\n_|_";x:sleep$|;print"\b",$i++%2?"/":"_";goto x;'
 
Thanks for the reply, but I've found an uploader & progress bar with the help of feherke which is working great.

Uber Uploader

It has taken a lot of messing around with, as it is a mish-mash of jQuery AJAX, PHP & Perl,

but I've finally got it integrated and working with the members' extranet, and plugged into the master members module so I can validate user login.

So far so good; I'm gonna try an 85 MB file and see how it pans out :)


 
Nah, doesn't work; it bombs out at about 10 minutes / 15 MB.

I've changed all the server settings to 1800 seconds (30 minutes) and set all the script timeouts and the PHP timeouts to 1800 seconds; nothing seems to work.

So unless anyone knows of something I've missed, Uber Uploader is only good for files of 15 MB or less.

 
There may be an issue outside of the web configuration.

In my experience with web servers, usually the OS itself tends to kill off certain tasks after they run for long enough (particularly if the task is using a lot of CPU, and I imagine reading in megabytes of data is CPU-intensive. Especially if you use a while() loop, which lets Perl run as fast as possible. select() could slow it down a bit, but I digress).

For instance, on my virtual private server (which I have root access to) running a vanilla install of CentOS 5 (no cPanel or any management software installed, and a custom-built Apache with suexec compiled in), I made a custom backup script called kbackupd and it runs 24/7, kicking into action every 7 days or so to `scp` my sites to a backup server (I should use rsync, but, I haven't gotten around to changing it)...

My server allows my backup script to run 24/7 because 99% of the time it uses 0% CPU. I had another script running, though, that I used to poll Craigslist pages every so often and alert me if it found certain keywords in the ads. My server wouldn't allow this script to continue running for any length of time at all. If I kept a terminal open and ran this script without detaching it from the terminal, it could run for as long as my server would keep my SSH session alive (which turns out to be nearly forever as long as I'm not looking at an empty prompt).

So, I imagine it would be possible that the server itself kills off processes and not necessarily your Apache or PHP configuration. As for how to disable this behavior, I don't know; I saw another post on this site (or maybe another Linux forum) about how to disable this and there wasn't an answer there, either.

As for that select() thing I mentioned, all I know is that these 2 bits of code mean the difference between 100% CPU 100% of the time, or < 1% CPU 99% of the time:

Code:
while (1) {
   print ".\n";
} # uses 100% CPU

while (1) {
   print ".\n";
   select(undef,undef,undef,0.001);
} # even a 1 ms pause helps a lot; CPU usage < 5%

When I'd run batch Perl scripts that manipulate a lot of files by reading/writing them, my CPU usage on my PC would spike the whole time; this might be the case with CGI scripts reading in upload data. Adding a small pause might help; you can try it and tell us if it helps.

 
I did some digging and found the script I was talking about that used fork().

It was linked to from a Perlmonks thread about file upload progress bars. One reply linked to a page saying it addressed the problem of timeouts (where the server, Apache, or the browser gives up and disconnects). That page uses traceroute as its long-running process. It looks as though using fork() and closing STDOUT causes Apache to just let the child run in the background, no longer associated with Apache, which might help a bit.

Note that fork() doesn't spawn a new process on Win32; instead, it's emulated using interpreter threads. This may be a hindrance if you're on a Windows server.
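The detach trick that page describes might look something like this on a Unix server (the long-running command is just a placeholder):

```perl
#!/usr/bin/perl
# Respond to the browser immediately, then let a forked child keep
# working after Apache has finished with the request.
use strict;
use warnings;

print "Content-Type: text/html\n\n";
print "<p>Job started; check back later for results.</p>\n";

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ( $pid == 0 ) {
    # Child: close the handles tied to Apache so it stops waiting on us.
    close STDIN;
    close STDOUT;
    close STDERR;
    # Long-running work goes here, e.g.:
    # system("traceroute", "example.com");
    exit 0;
}

exit 0;    # parent exits; the orphaned child is reaped by init
```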

 
Hey,

For lack of being able to find a simple, concise, no-bullshit, straight-to-the-point file uploader with progress bar, I wrote one myself.

The basic gist is that, on the form's onSubmit, you start an ajax loop polling a CGI script for details on how the file upload is going. The browser won't change the page until the CGI script has finished processing your upload, so the ajax has all that time to keep polling for progress.

It might be useful.
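The CGI script the ajax loop polls could be as small as this sketch. The /tmp file names and the sid parameter are invented conventions here, assuming the upload handler writes incoming bytes to /tmp/upload.<sid> and the expected total to /tmp/upload.<sid>.total:

```perl
#!/usr/bin/perl
# Status endpoint for an ajax progress poller (sketch; paths invented).
use strict;
use warnings;
use CGI;

my $q = CGI->new;
my ($sid) = ( $q->param('sid') || '' ) =~ /^(\w+)$/;   # untaint session id

print $q->header('text/plain');
if ( !$sid ) {
    print "error: no session\n";
    exit;
}

my $sofar = ( -s "/tmp/upload.$sid" ) || 0;
my $total = 0;
if ( open my $fh, '<', "/tmp/upload.$sid.total" ) {
    chomp( $total = <$fh> // 0 );
    close $fh;
}
my $pct = $total ? int( 100 * $sofar / $total ) : 0;
print "$sofar/$total ($pct%)\n";
```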

 
Hi

Hmm... That is it? I thought there were some sophisticated tricks.

Well, you really cut off the bullshit there. [medal]

(Thinking that certain persons are selling such (or worse) solutions for money, I am sorry we can not give a bigger star than that.)

Feherke.
 
Well, I finally got the uber file uploader working by manually editing the IIS6 metabase.xml file to change the connection timeout and the CGI timeout; all works fine now... well,

new problem, but I'll start a new thread for that.

I understand the principle of checking the CGITemp file size, but how do you know how big the file is meant to be at the start?

Anyway, the uber uploader now works, so I guess, if it ain't broke!

 
Are you trying to verify you got the same size file as you started with? Can you MD5 the file from JavaScript and compare it to an MD5 on the server?
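The server half of that check is easy once the file is saved, using the core Digest::MD5 module (the JavaScript half would need a client-side MD5 library, which is a bigger ask). A minimal sketch:

```perl
#!/usr/bin/perl
# Compute an MD5 checksum of a saved upload, to compare against one
# computed client-side. Digest::MD5 ships with Perl.
use strict;
use warnings;
use Digest::MD5;

my $path = shift @ARGV or die "usage: $0 <file>\n";
open my $fh, '<', $path or die "open $path: $!";
binmode $fh;
my $hex = Digest::MD5->new->addfile($fh)->hexdigest;
close $fh;
print "$hex  $path\n";
```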

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
[noevil]
Travis - Those who say it cannot be done are usually interrupted by someone else doing it; Give the wrong symptoms, get the wrong solutions;
 
Kirsle, I wish you'd replied before I spent 2 days reading the code bloat that is Uber Uploader.

I've got it all working now, so can't be arsed to implement/try yours, but if I need another uploader, I'll certainly look at trying yours.

Uber Uploader tries to be too clever, using far too many switches and ini files, and why half PHP and half Perl?

Talk about making it far more complicated than it needs to be, whereas a quick glance at yours and it seems straightforward and adaptable to anyone's system.

Good job; sorry I can't give you any more stars!





 
On my uploader I got the original size of the incoming file by doing a -s on the filehandle

Code:
my $handle = $q->upload("file");
my $size = -s $handle;

On the half-PHP/half-Perl bit, I'd say the coder likes PHP for the most part but uses Perl because Perl has lower-level access to incoming file upload data than PHP does (or at least I heard such claims on one site or another while searching for file uploaders). I think they should just use Perl for everything, but that's going off on a completely new tangent. ;-)

 
Kirsle,

Does this actually work?

Code:
my $size = (-s $handle);

I don't know if you can implement a hook to get the data needed for the progress bar output, but in the CGI docs it looks like you can, using a callback function:

Code:
$q = CGI->new(\&hook [, $data [, $use_tempfile]]);

sub hook {
    my ($filename, $buffer, $bytes_read, $data) = @_;
    print "Read $bytes_read bytes of $filename\n";
}
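If that callback works as documented, it could be wired to a status file that a polling script reads. A sketch along those lines (the /tmp path is invented):

```perl
#!/usr/bin/perl
# Sketch: record upload progress from CGI.pm's upload hook into a
# status file an ajax poller could read. Path is invented.
use strict;
use warnings;
use CGI;

my $status_file = "/tmp/progress.$$";
my $total       = $ENV{CONTENT_LENGTH} || 0;

sub hook {
    my ( $filename, $buffer, $bytes_read, $data ) = @_;
    # $data is the extra argument passed to CGI->new -- our status file.
    open my $fh, '>', $data or return;
    print {$fh} "$bytes_read/$total\n";
    close $fh;
}

# Only build the CGI object under a real request; the hook fires
# repeatedly as CGI.pm reads the multipart body.
if ( $ENV{REQUEST_METHOD} ) {
    my $q = CGI->new( \&hook, $status_file, 1 );
    print $q->header('text/plain'), "done\n";
}
```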


------------------------------------------
- Kevin, perl coder unexceptional! [wiggle]
 
Doing -s on a filehandle works for me (Fedora 10 server with the Apache that comes with it). I saw that on a page I came across when searching for this. The number of bytes it gets from the upload matches the number given by `ls -l`.

I saw the CGI hook on a Perlmonks thread; I just didn't use it. I was poking at other solutions when I got the basic idea of using ajax to handle the uploader, and with that idea I made a quick proof of concept to test it.

(To show that -s works on regular filehandles too:)

Code:
[kirsle@firefly Tk-Spectrum-0.01]$ perl
foreach my $file (<*.*>) {
	print "Opening $file\n";
	open (FILE, $file);
	my $size = -s FILE;
	print "   Size: $size\n";
	close (FILE);
}
__END__
Opening Makefile.old
   Size: 23291
Opening Makefile.PL
   Size: 574
Opening META.yml
   Size: 306
Opening Tk-Spectrum-0.02.tar.gz
   Size: 17270
[kirsle@firefly Tk-Spectrum-0.01]$ ls -l
total 100
drwxrwxr-x 8 kirsle kirsle  4096 2009-03-31 17:10 blib
-rw-r--r-- 1 kirsle kirsle   249 2009-03-31 17:09 Changes
drwxr-xr-x 3 kirsle kirsle  4096 2008-03-27 11:26 lib
-rw-r--r-- 1 kirsle kirsle 23291 2009-03-31 17:10 Makefile
-rw-r--r-- 1 kirsle kirsle 23291 2009-03-31 17:09 Makefile.old
-rw-r--r-- 1 kirsle kirsle   574 2009-03-31 17:08 Makefile.PL
-rw-r--r-- 1 kirsle kirsle   150 2008-03-27 11:26 MANIFEST
-rw-rw-r-- 1 kirsle kirsle   306 2008-03-27 11:26 META.yml
-rw-rw-r-- 1 kirsle kirsle     0 2009-03-31 17:10 pm_to_blib
-rw-r--r-- 1 kirsle kirsle   535 2009-03-31 17:10 README
drwxr-xr-x 2 kirsle kirsle  4096 2008-03-27 11:26 t
-rw-rw-r-- 1 kirsle kirsle 17270 2009-03-31 17:10 Tk-Spectrum-0.02.tar.gz

 
Though I'm not a coder, I dabble in it as requirements come along. Threads like this are one of the reasons it's the first site on my list to check. Star for Kirsle.
 
I'm skeptical....

-s does work on filehandles, but the file has to be on the same computer. If your code can get the file size that way, it means the file has already been uploaded, so the progress bar is not showing real-time data as the file goes from the remote PC to the server. Correct me if I am wrong.

So far your code is not working for me. I am using Opera 9.62 and the script is uploaded to an Apache 2/Linux (Debian) server.

The file uploads but the progress bar doesn't do anything.

 
Hmm.

I would've figured that the filehandle given by CGI.pm comes with a length, which CGI got from the HTTP headers... you can read the filehandle until EOF, so it's not like a socket handle where the read continues forever until the other end of the socket closes.

I tested my script on a localhost server. Hitting upload would start Firefox loading the next page so the throbber animation begins, 1 second later the debug div would say "your session id wasn't found", and that would last for 5 or 10 seconds before it would start showing the progress, where each 1-second update would show another 5 to 10% of the file uploaded.

And then when the progress bar got near the end (90% or higher) it would stop updating and instead Firefox would load the page saying "thank you for your file". (I'd guess at that point, Firefox was getting the new page and so it unloaded the current page, halting the JavaScript running on it).

It could be that, since the server was on localhost, the 700 MB or so ISO image I uploaded was received within 5 or 10 seconds by Apache, but in that case the new page would've loaded right away instead of beginning the progress bar and taking that long to update it towards 100%.

Also, I think I should've picked a smaller upload when testing your site. ;) I tried uploading the JDK, and after a while the debug div went from that "session not found" error to an internal server error. ;-)

At any rate, it'll take some further investigation to see how to get the uploaded file's size at the beginning of the request. I may have to revise my strategy for a progress bar.

 
Well, it does work: not the progress bar, but the -s on the upload filehandle seems to work. I am surprised.

 
Oops... not so sure now. If I try to load a larger file (2.5 MB), it will not report the file size until after the file has been transferred to the server. The small test file was transferring so fast it appeared to work.

 
I was messing with the script, so when you tested it on my server I may have had it partially disabled. It should be back in working order now, although the progress bar doesn't do anything. I think your testing on localhost isn't a good test unless that's the way someone plans to run the script.

 