
Data sweep of website


d0nny (IS-IT--Management) · Dec 18, 2005
Is there a way to automatically read a website whose data changes, so I can store it (or update my own data based on what I glean) and then use it on my pages?
Obviously the easiest answer would be an RSS feed, but the site doesn't offer one.
The site would need to be parsed in some manner so I only get the figures I need. Is there a way of doing this?
 
Yes, always assuming that you have the site owner's permission to screen-scrape.
Use file_get_contents() (or a similar file function) or cURL to capture the page you're after,
write a routine to parse the captured text for the information you want,
then check whether anything has changed and, if so, update your database.
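The steps above can be sketched roughly as follows. This is a minimal illustration, not a ready-made scraper: the URL, the regex, and the hash-file path are all placeholders, and the parsing pattern assumes the figures sit in plain `<td>` cells, which you'd adapt to the real site's markup.

```php
<?php
// Sketch of the fetch / parse / compare / update cycle described above.
// URL, regex, and file paths are hypothetical -- adjust for the real site.

$url = 'http://example.com/figures.html';   // placeholder target page

// 1. Capture the page. file_get_contents() needs allow_url_fopen enabled;
//    if it isn't, use the cURL extension instead.
$html = file_get_contents($url);
if ($html === false) {
    die("Could not fetch $url\n");
}

// 2. Parse out the figures you need. Here we assume each number sits in
//    its own table cell; a real scraper should match the actual markup.
preg_match_all('/<td[^>]*>\s*([\d.,]+)\s*<\/td>/', $html, $matches);
$figures = $matches[1];

// 3. Compare against the previous run and only touch the database when
//    something actually changed (a hash of the figures is enough here).
$hashFile = __DIR__ . '/last_sweep.md5';
$newHash  = md5(implode('|', $figures));
$oldHash  = is_file($hashFile) ? file_get_contents($hashFile) : '';

if ($newHash !== $oldHash) {
    file_put_contents($hashFile, $newHash);
    // ... update your database here, e.g. via PDO ...
    echo "Data changed; database updated.\n";
} else {
    echo "No change since last sweep.\n";
}
```

If the page needs cookies, POST data, or custom headers, swap step 1 for a cURL session (`curl_init()` / `curl_setopt()` / `curl_exec()`); the rest of the cycle stays the same.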

Make the whole thing happen periodically by setting up a cron job.
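For the scheduling step, a crontab entry along these lines would run the script once an hour (the PHP binary and script paths are hypothetical; edit the table with `crontab -e`):

```shell
# m  h  dom mon dow  command
0  *  *   *   *   /usr/bin/php /home/you/scripts/sweep.php >> /var/log/sweep.log 2>&1
```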
 