Tek-Tips is the largest IT community on the Internet today!


File caching before sending

Status
Not open for further replies.

Tyger

Programmer
Sep 19, 2001
21
GB
I am trying to write a software download area in Perl which checks that a user is logged in before delivering a file to them.

It is not enough to produce a page with a link on it since this link could be copied and posted on another website as a 'back door' to a supposedly protected file.

My solution has been to send the file data to the browser (effectively writing it to STDOUT) from the script, with a suitable header indicating that it is a download. This works fine for small files and is acceptable up to 30MB or so, but it appears that the server is caching the entire file before sending it to the user, as there is a lot of server-side hard-disk activity.

If the file is very large (150MB+), the request times out before caching completes. I have tried this on Windows and Linux and the outcome is the same. Is there any way of turning off caching of this type?

An HTML link to a large file works fine and several users can download simultaneously, so I wonder why sending the file from a script should be any different.

The heavy hard-disk activity begins as soon as the download page is opened, before the "Save as..." dialog appears, almost as if the server is reading ahead and preparing the required files before sending anything to the browser. Perhaps this is more to do with the way Perl deals with file handles, but I have had no luck in the Perl forum.
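For what it's worth, the symptom described (whole file read before anything reaches the browser) often comes from slurping the file into memory or from buffered output, rather than from server-side caching as such. Below is a minimal sketch of streaming the file in fixed-size chunks with `read`, with STDOUT unbuffered via `$|`. The path, filename, and `stream_file` helper are placeholders, not the poster's actual code:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Stream a file to a filehandle in fixed-size chunks instead of
# reading the whole file into memory first, so memory use stays
# constant regardless of file size. Returns the bytes sent.
sub stream_file {
    my ($path, $out) = @_;
    open my $fh, '<', $path or die "Cannot open $path: $!";
    binmode $fh;        # raw bytes, no CRLF translation
    binmode $out;
    my ($buf, $sent) = ('', 0);
    while (my $n = read($fh, $buf, 64 * 1024)) {
        print {$out} $buf;
        $sent += $n;
    }
    close $fh;
    return $sent;
}

# Hypothetical CGI usage: emit the download headers, then stream.
# my $path = '/protected/downloads/archive.zip';   # placeholder path
# my $size = -s $path;
# print "Content-Type: application/octet-stream\r\n";
# print "Content-Disposition: attachment; filename=\"archive.zip\"\r\n";
# print "Content-Length: $size\r\n\r\n";
# $| = 1;                     # unbuffer STDOUT so chunks flow immediately
# stream_file($path, \*STDOUT);
```

Note that even with unbuffered chunked output, the web server itself (e.g. Apache with certain modules) may still buffer the response; that part is server configuration rather than Perl.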

Thanks.
