[ back to toc ]

Go to a URL and grab the resulting page

Date: 2002/01/12 00:36

I have a script that I'm working on that uses a MySQL database for its
variables. That part isn't my problem. My problem is that after it gets
the variables, it plugs them into a URL. I'm stumped at my next step,
which is to have the script "invisibly" go to the URL and then save the
resulting page to my server. Is there an easy way to do this?

(unix server btw)

Thanks in advance!
You want the server-side script to download an HTML page just as if it
were a client? I assume this is what you meant.

In that case, use the libwww-perl modules (the LWP:: namespace) or the
lower-level Net:: modules. Consult the documentation of whichever package
you pick.
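A minimal sketch of the LWP approach, assuming libwww-perl is installed on the server; the URL is a placeholder, not one from the original question:

```perl
#!/usr/bin/perl
# Fetch a page from a Perl script, just as a browser client would.
# Uses LWP::Simple from the libwww-perl distribution.
use strict;
use warnings;
use LWP::Simple;

# Placeholder URL -- substitute the one your script builds
# from the MySQL variables.
my $url = 'http://www.example.com/some/page';

my $content = get($url);
die "Could not fetch $url\n" unless defined $content;

print $content;
```

`get()` returns the page body as a string (or `undef` on failure), so you can print it, parse it, or write it to a file.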

If this is not what you seek, or I misunderstood something, do not
hesitate to follow up on this question for clarification or more help.

Pretty much, it's a script that is run by a cron job, not via the browser.
It sends mail through this URL, and when the URL is invoked, a results
page is generated. I need to save that results page to a file on my
server. Does that clarify a bit?

Not really. But anyway: whenever you want to fetch a page from a Perl
program, you need the libwww-perl (LWP) modules, or the lower-level
Net::HTTP.
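For the cron-job case described above, `getstore()` from LWP::Simple fetches the URL and writes the result straight to a file in one call. A sketch, with a hypothetical URL and output path standing in for the real ones:

```perl
#!/usr/bin/perl
# Cron-run fetcher: invoke the URL (which triggers the mail send)
# and save the resulting page to a file on the server.
use strict;
use warnings;
use LWP::Simple qw(getstore is_success);

# Both values are placeholders for illustration.
my $url  = 'http://www.example.com/sendmail.cgi?user=someone';
my $file = '/home/user/results.html';

my $status = getstore($url, $file);
die "Fetch failed with HTTP status $status\n"
    unless is_success($status);
```

Because nothing is printed on success, this runs quietly under cron; only a failure produces output (which cron will then mail to you).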

