Guest_imported
I have two webservers running the same versions of Apache and PHP. One of
them runs a specific script in 0.06 seconds, which I supposed was measured
on the server, but on the other the timing seems to depend on the client.
For instance, a 56k modem client sees the script run slower than an ADSL
client does.
My reasoning is: i) I suppose the time is measured on the server, during the
PHP processing of the script, and that Apache sends the output to the client
only after the PHP processing is done; ii) both machines run the same
versions of Apache and PHP, so there should be no difference in behaviour.
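For what it's worth, this is roughly how I am timing it (a minimal sketch;
the script's actual work is elided, and the getmicrotime() helper is just the
usual idiom for turning microtime()'s "msec sec" string into a float):

    <?php
    // Helper: convert microtime()'s "msec sec" string to seconds as a float.
    function getmicrotime() {
        list($usec, $sec) = explode(" ", microtime());
        return ((float)$usec + (float)$sec);
    }

    $start = getmicrotime();

    // ... the script's actual work (elided) ...

    $elapsed = getmicrotime() - $start;
    echo "Execution time: " . $elapsed . " s"; // ~0.06 s on the first server
    ?>

Since both the start and end timestamps are taken inside the PHP script, I
would expect this measurement to cover server-side processing only, which is
why the client's connection speed influencing it puzzles me so much.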
I badly need a clue. For two weeks now, this puzzle is all I have thought
about.
What kind of configuration setting could prevent my second server from being
affected by the client? (I know it is possible, because the first server is
not affected by the client. Unfortunately, the people who configured the
first server don't know the answer either.)
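Assuming the difference really lies in configuration rather than in the
script, one thing I have thought of trying is dumping suspect settings on
both machines and comparing them side by side. The setting names below are
only my guesses about what might matter, not a confirmed cause:

    <?php
    // Print settings that could plausibly differ between the two servers.
    // These names are guesses, not a confirmed cause of the discrepancy.
    $suspects = array("output_buffering", "implicit_flush",
                      "max_execution_time");
    foreach ($suspects as $name) {
        echo $name . " = " . ini_get($name) . "\n";
    }
    // Also check how many output buffers are active while the script runs.
    echo "output buffering nesting level = " . ob_get_level() . "\n";
    ?>

If anyone knows which setting (these or others) would make the script's
measured time depend on the client's connection, I would be grateful.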
Thanks