Hello,
I am new to Python, and I am trying to read an arbitrary URL into a Python script. The script runs on an Apache 1.3.27 server, executed through a handler and a corresponding 'Action' directive.
I have looked at urllib.urlopen, but for that the URL has to be static and known in advance, or so it would seem. I need to read one in on the fly so the script can parse the page and return the same page.
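Roughly, what I would like is something along these lines (just a sketch of the idea; the `rewrite` edit is a placeholder, and I am not sure urlopen will actually accept a URL decided at run time):

```python
import urllib  # Python 2 here; urlopen lives in urllib.request on Python 3

def rewrite(page):
    # placeholder for whatever change I end up making to the page
    return page.replace("foo", "bar")

def handle(url):
    # url would only be known at run time, not hard-coded
    page = urllib.urlopen(url).read()
    return rewrite(page)
```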
I am trying to do it with f = sys.stdin.read(); then, after running a few functions, I try to return the file to the browser by printing to stdout (or just using 'print'). In other words, I want to read in the file, change something, and print the file back to the browser.
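Concretely, my attempt boils down to the sketch below (the `change_page` function is just a stand-in for the real modifications):

```python
import sys

def change_page(page):
    # stand-in for the real changes I want to make to the page
    return page.replace("<title>", "<title>[edited] ")

if __name__ == "__main__":
    # read the page from stdin, emit a CGI header, print the result back
    page = sys.stdin.read()
    sys.stdout.write("Content-Type: text/html\n\n")
    sys.stdout.write(change_page(page))
```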
Any thoughts?
Thanks
HJ