websites and cookies?

 
 
Tomasz Chmielewski
09-24-2008
I have a webmail interface that is protected by a username/password form
(the kind you find in any webmail, e.g. SquirrelMail, Yahoo or Google Mail).

The webmail is the only way to fetch mail; there is no POP3 or IMAP access.


Therefore, I'd like to write a Perl script which will "log in" to that
page automatically, parse certain pages (inbox, new mail), save them as
files, and so on.

As far as I understand, I need to "emulate" a web browser a bit:
1) log in to a website (fill in the proper form fields and send them to the web server)
2) get the cookie and browse other pages using that cookie


What do I need to implement it in a Perl script?

I'm not looking for a ready-made solution; I just need some terms to look up
(CGI::Cookie, perhaps, for handling cookies; but how do I fill in a form to
obtain the cookie? The links browser?).

Any links to Perl examples would be great too, of course.


--
Tomasz Chmielewski
http://wpkg.org
 
Sherm Pendley
09-24-2008
Tomasz Chmielewski <(E-Mail Removed)> writes:

> As far as I understand, I need to "emulate" a web browser a bit:
> 1) log in to a website (fill in the proper form fields and send them to the web server)
> 2) get the cookie and browse other pages using that cookie
>
> What do I need to implement it in a Perl script?


Have a look at WWW::Mechanize
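
A minimal sketch of how that might look, assuming a made-up login URL and
form field names (check the real login form in the webmail's HTML source
before relying on any of these):

#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

# Mechanize keeps its own cookie jar, so the session cookie set
# at login is sent automatically with every later request.
my $mech = WWW::Mechanize->new( autocheck => 1 );

# 1) Log in: fetch the login page and submit the form.
#    URL and field names below are placeholders, not real ones.
$mech->get('https://webmail.example.com/login');
$mech->submit_form(
    form_number => 1,
    fields      => {
        username => 'someuser',
        password => 'secret',
    },
);

# 2) Browse other pages; the cookie travels along automatically.
$mech->get('https://webmail.example.com/inbox');

# Save the raw inbox page to a file for later parsing.
open my $fh, '>', 'inbox.html' or die "open: $!";
print {$fh} $mech->content;
close $fh;

The form_number, field names, and URLs above are guesses; $mech->forms()
will show you what the real login form actually expects.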

sherm--

--
My blog: http://shermspace.blogspot.com
Cocoa programming in Perl: http://camelbones.sourceforge.net
 
Tad J McClellan
09-25-2008
Tomasz Chmielewski <(E-Mail Removed)> wrote:
> I have a webmail interface that is protected by a username/password form
> (the kind you find in any webmail, e.g. SquirrelMail, Yahoo or Google Mail).
>
> The webmail is the only way to fetch mail; there is no POP3 or IMAP access.
>
>
> Therefore, I'd like to write a Perl script which will "log in" to that
> page automatically, parse certain pages (inbox, new mail), save them as
> files, and so on.
>
> As far as I understand, I need to "emulate" a web browser a bit:



You can use the Web Scraping Proxy to write Perl code that
emulates a browser. Google for "Web Scraping Proxy".


> 1) log in to a website (fill in the proper form fields and send them to the web server)
> 2) get the cookie and browse other pages using that cookie
>
>
> What do I need to implement it in a Perl script?



WWW::Mechanize (or LWP::UserAgent).
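
For the lower-level route, here is a rough sketch with LWP::UserAgent and
HTTP::Cookies; again the URL and form field names are placeholders, not
taken from any real webmail:

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Cookies;

# Attach a cookie jar so the session cookie from the login
# response is stored and replayed on later requests.
my $ua = LWP::UserAgent->new( cookie_jar => HTTP::Cookies->new );

# 1) POST the login form (placeholder URL and field names).
my $login = $ua->post(
    'https://webmail.example.com/login',
    { username => 'someuser', password => 'secret' },
);
# Many login pages answer with a redirect rather than a 200.
die "login failed: ", $login->status_line
    unless $login->is_success or $login->is_redirect;

# 2) Fetch another page; the stored cookie is sent automatically.
my $inbox = $ua->get('https://webmail.example.com/inbox');
die "fetch failed: ", $inbox->status_line unless $inbox->is_success;

print $inbox->decoded_content;

WWW::Mechanize is a subclass of LWP::UserAgent, so the two approaches mix
well; Mechanize just adds the form-filling and link-following conveniences
on top.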


--
Tad McClellan
email: perl -le "print scalar reverse qq/moc.noitatibaher\100cmdat/"
 