I am looking for architecture suggestions. I have many different domains that use PHP to include an API call to a third-party server. The results are XML, and I use SimpleXML to parse them.
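For context, each page currently does something like this (the endpoint and parameter are made up):

    <?php
    // Current setup, roughly: fetch and parse in one step.
    // simplexml_load_file() opens the URL itself, so a slow or broken
    // API stalls the whole page while it waits.
    $url = 'http://api.example.com/feed?domain=' . urlencode($domain);
    $xml = simplexml_load_file($url);
    if ($xml === false) {
        // The fetch or the parse failed; the page has nothing to show.
    }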
Generally the third-party API is very good. Sometimes, however, it is slow and causes pages not to load. Once it broke, and I had thousands of pages not loading until the API was fixed.
I have been trying to eliminate HTTP calls, so I am defining the variables first and then including the script that does the SimpleXML work. Everything I have read says there is no timeout option in SimpleXML.
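In other words, instead of passing parameters over HTTP, each page does roughly this (the paths and variable names are made up for illustration):

    <?php
    // Define the request parameters as plain variables...
    $apiMethod = 'getListings'; // hypothetical parameter
    $apiId     = 123;           // hypothetical parameter

    // ...then pull in the shared script from disk. It builds the API
    // URL from those variables, calls simplexml_load_file(), and
    // leaves the result in $xml.
    include '/var/www/shared/api_fetch.php'; // hypothetical path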
The alternative would be to use file_get_contents() and set a timeout in there somewhere. But that gets back to using HTTP and GET variables to define the parameters, which results in many thousands of HTTP calls.
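Something like this is what I mean (the URL and the 5-second timeout are arbitrary):

    <?php
    // file_get_contents() accepts a stream context, and the http
    // wrapper's 'timeout' option caps how long the read can take.
    $context = stream_context_create(['http' => ['timeout' => 5]]);
    $body = file_get_contents('http://api.example.com/feed?id=123', false, $context);
    if ($body !== false) {
        $xml = simplexml_load_string($body);
    } else {
        // Timed out or the request failed; show a fallback instead of hanging.
    }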
Is there a simple way around this? That is, a way to generate and parse the XML, insert a timeout so the page load will not hang, and make the calls through the directory structure rather than over HTTP?
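One idea I have not tested: since simplexml_load_file() reads remote files through PHP's stream layer, maybe a timeout set on the libxml stream context would apply. A sketch, assuming that is how it works:

    <?php
    // Untested sketch: attach a read timeout to the stream context that
    // libxml (and therefore simplexml_load_file) uses for remote files.
    libxml_set_streams_context(stream_context_create([
        'http' => ['timeout' => 5], // arbitrary 5-second cap
    ]));
    $xml = simplexml_load_file('http://api.example.com/feed'); // hypothetical URL
    if ($xml === false) {
        // Treat a timeout like a failed parse and fall back gracefully.
    }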
Thanks in advance for any tips!
Mike