I'm writing a program that has to retrieve information from hundreds of web pages and put it into a single file. Most of the URLs are more or less consecutive (for example, , , ...). Every once in a while a page will be missing, so I'll get an error and the program will stop. Can anyone help me? I've been trying to use and a loop that steadily increases, but I need to put in something to handle the non-existent pages. Thanks!
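
The usual fix is to wrap the download in a try/except so a missing page is skipped instead of stopping the whole run. Below is a minimal sketch of that idea, assuming Python 3 and urllib; the URL pattern, page range, and output filename are placeholders, so substitute whatever your program actually uses:

import urllib.request
import urllib.error

# Hypothetical URL pattern -- replace with your real consecutive URLs.
BASE_URL = "http://example.com/page{}.html"

with open("combined.txt", "w", encoding="utf-8") as out:
    for i in range(1, 501):  # adjust to however many pages you expect
        url = BASE_URL.format(i)
        try:
            with urllib.request.urlopen(url) as resp:
                out.write(resp.read().decode("utf-8", errors="replace"))
        except urllib.error.HTTPError as err:
            # A missing page typically comes back as HTTP 404; skip it and keep going.
            print("Skipping", url, "-", err.code)
        except urllib.error.URLError as err:
            # Network-level problems (DNS failure, connection refused, ...).
            print("Skipping", url, "-", err.reason)

The important part is catching urllib.error.HTTPError (and optionally URLError) inside the loop: the exception is handled for that one URL and the loop simply moves on to the next page number.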