
URLDownloadToFile Locking Up


Error7 (Programmer), Jul 5, 2002
I use the following function, Hypetia's I think, to download still images from multiple remote video servers:

'Module-level declaration (urlmon API, ANSI version):
Private Declare Function URLDownloadToFile Lib "urlmon" Alias "URLDownloadToFileA" _
    (ByVal pCaller As Long, ByVal szURL As String, ByVal szFileName As String, ByVal dwReserved As Long, ByVal lpfnCB As Long) As Long

Function Download(Url As String, FileName As String) As Boolean
    DoEvents
    Download = URLDownloadToFile(0&, Url, FileName, 0&, 0&) = 0  'S_OK (0) = success
End Function

The function works well most of the time, but occasionally it seems to get stuck inside the call and locks the program. I have tried a couple of timer routines to break out, but when the function locks up the timers stop firing as well, presumably because the blocked call also blocks the message loop that delivers Timer events.

Any suggestions?

[gray]Experience is something you don't get until just after you need it.[/gray]
 
This thread might help: thread222-1010889. Check periodically to see whether the download process is still running, and kill it if it goes over a time limit.
 
You might try calling this API with a progress callback, which permits cancellation if desired.

URLDownloadToFile Function


Easier, and more VB-friendly, might be to use the native VB form of this via the AsyncRead Method. This also provides for cancellation via the [tt]AsyncReadProgress[/tt] event.
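
A minimal sketch of that approach, assuming the download code lives in a UserControl (AsyncRead is not available on a plain Form); [tt]m_Abort[/tt], [tt]m_TargetFile[/tt], and [tt]Begin[/tt] are hypothetical names for illustration:

Private m_Abort As Boolean
Private m_TargetFile As String

Public Sub Begin(ByVal Url As String)
    m_Abort = False
    'vbAsyncTypeFile downloads to a temp file; AsyncProp.Value holds its path
    UserControl.AsyncRead Url, vbAsyncTypeFile, "ImageDL"
End Sub

Private Sub UserControl_AsyncReadProgress(AsyncProp As AsyncProperty)
    'Fires as data arrives - the place to honour a cancellation flag
    If m_Abort Then UserControl.CancelAsyncRead "ImageDL"
End Sub

Private Sub UserControl_AsyncReadComplete(AsyncProp As AsyncProperty)
    On Error Resume Next    'AsyncProp.Value raises an error if the read failed
    If AsyncProp.StatusCode = vbAsyncStatusCodeEndDownloadData Then
        FileCopy AsyncProp.Value, m_TargetFile
    End If
End Sub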
 
By the way, my post won't help at all unless you isolate the download functionality in a separate process.
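
Roughly, as a sketch: assume the Download function above has been moved into a hypothetical Downloader.exe launched with Shell (whose return value is the process ID), then terminate it if it overstays its welcome:

Private Declare Function OpenProcess Lib "kernel32" _
    (ByVal dwDesiredAccess As Long, ByVal bInheritHandle As Long, ByVal dwProcessId As Long) As Long
Private Declare Function TerminateProcess Lib "kernel32" (ByVal hProcess As Long, ByVal uExitCode As Long) As Long
Private Declare Function CloseHandle Lib "kernel32" (ByVal hObject As Long) As Long
Private Const PROCESS_TERMINATE As Long = &H1

Private Sub KillDownloader(ByVal Pid As Long)
    Dim hProc As Long
    hProc = OpenProcess(PROCESS_TERMINATE, 0&, Pid)
    If hProc <> 0 Then
        TerminateProcess hProc, 0&
        CloseHandle hProc
    End If
End Sub

'Usage: Pid = Shell("Downloader.exe " & Url, vbHide), start a Timer in the
'parent, and call KillDownloader Pid when the time limit expires.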
 
I checked before bedtime and there was no response to my posting. I was beginning to feel neglected, but you night owls have obviously been busy.

Thanks Bob and Dilettante. I'll check these out later today.

Alan

[gray]Experience is something you don't get until just after you need it.[/gray]
 
It looks like AsyncRead may be the solution. I spent most of the day getting to grips with it, so I have only just cobbled together a working version.

I will leave it running over the weekend. Hopefully when I come in on Monday it will have finished downloading about 1,600 images without locking up.

Thanks.

[gray]Experience is something you don't get until just after you need it.[/gray]
 
<you night owls have obviously been busy
I get the feeling that you don't live in California....
 
I get the feeling that you don't live in wet and windy Manchester (England) [bigglasses] Lucky You.

[gray]Experience is something you don't get until just after you need it.[/gray]
 
But, if I did, I could go see Wayne Rooney play for Man U, and I wouldn't get asthma from the smog.

(I lived in wet and windy Oxford when I was a lad. The January "paper round" was quite the experience; one can understand the popularity of Wellingtons...)
 
Paper rounds, eh? Don't see many of them any more. 48 years ago I did 7 mornings and 6 evenings for 12 shillings and 6 pence a week. Just over 1 dollar in today's money.

Happy days!

[gray]Experience is something you don't get until just after you need it.[/gray]
 
Ah, for me it was 37 years ago, and it was 7 mornings for 27/6.
 
Bob, it was that long ago that I'd forgotten how to write 12/6. Thanks for the reminder. (Still got some 10 bob notes lying around somewhere.)

[gray]Experience is something you don't get until just after you need it.[/gray]
 
For something of this scale you could run several downloads in parallel. It's more a question of whether you'd use it enough to justify the extra engineering.


It might go like:

Put your "requests" into a queue of some sort. Use a control array of the downloader UserControls and dispatch new requests as each completes - until the queue is empty.

Each UserControl instance could have its own Timer control, started when downloading begins. When triggered, set a module-level "Abort" Boolean variable that gets tested in the Progress event.

On "timeout abort" you might increment a TimeoutCount in the request object (UDT, Class, Recordset Record, etc.) and either put the aborted request back into the request queue or move it to a "dead" queue (i.e. when timed out 3 times) for reporting.


One complication is that AsyncRead activity is based on the WinInet API, which has a built-in limit of two concurrent connections to the same remote host. You'd want to manage that, or your third (fourth, etc.) simultaneous request to a given server would end up timing out or getting an error all the time.
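
A rough sketch of the dispatch side, assuming the downloader UserControl from the earlier sketch, extended (hypothetically) with a [tt]Done[/tt] event, a Timer that sets its Abort flag, and a [tt]Begin[/tt] taking the URL plus its running timeout count, in a control array [tt]ctlDL[/tt] sized to respect the two-connection limit:

Private m_Queue As Collection          'each item: Array(Url, TimeoutCount)

Private Sub Form_Load()
    Set m_Queue = New Collection
    'fill m_Queue with Array(Url, 0) entries here, then prime each slot:
    Dim i As Integer
    For i = ctlDL.LBound To ctlDL.UBound
        DispatchNext i
    Next
End Sub

Private Sub DispatchNext(ByVal Index As Integer)
    If m_Queue.Count = 0 Then Exit Sub          'queue drained
    Dim Req As Variant
    Req = m_Queue(1)
    m_Queue.Remove 1
    ctlDL(Index).Begin Req(0), Req(1)           'starts AsyncRead and its Timer
End Sub

Private Sub ctlDL_Done(Index As Integer, ByVal Url As String, _
        ByVal TimedOut As Boolean, ByVal TimeoutCount As Long)
    If TimedOut Then
        If TimeoutCount < 3 Then
            m_Queue.Add Array(Url, TimeoutCount)    'requeue for another try
        Else
            'timed out 3 times: move to a "dead" queue for reporting
        End If
    End If
    DispatchNext Index                          'keep this slot busy
End Sub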
 
Thanks dilettante, but I'm hoping that when I get into the office tomorrow the AsyncRead will have completed without crashing or locking up. If so, then I can add the bells and whistles and move on.

Alan

[gray]Experience is something you don't get until just after you need it.[/gray]
 
Well, the AsyncRead completed the task without a hiccup. All I have to do now is replace the URLDownloadToFile calls in my main app without breaking anything.

Thanks for all the help.

Alan

[gray]Experience is something you don't get until just after you need it.[/gray]
 
<Still got some 10 bob notes lying around somewhere

Well, in that case it would seem that you are not Mean Mr Mustard, for otherwise I would think that you would know exactly where at least one of them was. :)
 