http://pear.php.net/package/HTTP_Client

That package supports posting data and handles cookies and such, so you
can log in to a site and fetch pages from it afterwards. You can then
use something like HTMLSax to parse the result if you really need to.
I use Perl-style regular expressions myself for simple HTML parsing.

On Fri, 9 Jul 2004 09:40:27 -0700, Flint Doungchak
<flint@xxxxxxxxxxxxxxxx> wrote:
> Hello All,
>
> I'm doing a little research. I need to pull some dynamic content off
> some sites that don't offer XML data or RSS, so I can't go about it
> that way.
>
> I'm wondering, from a theoretical sense: is it possible to write a PHP
> application that could log into a site, supply the necessary POST
> information (i.e. search criteria, etc.), and then parse the results?
> I'd be going to the same site each time, so I could somewhat depend on
> the format. It's just that the site admin isn't interested at this
> point in developing an XML app I could tap.
>
> Practically speaking, I worry about things like sessions, SSL,
> supplying that POST and GET data, etc. Has anyone ever done anything
> like this? I've seen some static page and URL parsing on the list, but
> I'm talking about something a bit larger (almost like a PHP-based web
> browser thingy...). I'm just looking at possible solutions now -- any
> ideas? Any resources would be great; I really don't mind reading about
> other people's experiences...
>
> Thanks,
>
> -Flint

--
DB_DataObject_FormBuilder - The database at your fingertips
http://pear.php.net/package/DB_DataObject_FormBuilder

paperCrane --Justin Patrin--

--
PHP Windows Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
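
To make the login-then-scrape flow concrete, here's a rough, untested
sketch using HTTP_Client. The URLs, form field names ('username',
'password'), and the markup being matched are all made up for
illustration -- check the package docs for the exact API before relying
on this:

```php
<?php
// Sketch only: assumes PEAR's HTTP_Client package is installed
// (pear install HTTP_Client). All URLs and field names below are
// placeholders, not a real site.
require_once 'HTTP/Client.php';

$client = new HTTP_Client();

// Log in by POSTing the credentials; HTTP_Client keeps any session
// cookies the site sets and sends them on subsequent requests.
$client->post('http://www.example.com/login.php', array(
    'username' => 'myuser',
    'password' => 'mypass',
));

// Now fetch a page behind the login, e.g. a search results page.
$client->get('http://www.example.com/search.php?q=widgets');

// currentResponse() returns an array with 'code', 'headers', 'body'.
$response = $client->currentResponse();
if ($response['code'] == 200) {
    // Quick-and-dirty regex scrape; for anything non-trivial,
    // a real parser like HTMLSax is the safer choice.
    if (preg_match_all('/<td class="result">(.*?)<\/td>/s',
                       $response['body'], $matches)) {
        foreach ($matches[1] as $row) {
            echo trim($row), "\n";
        }
    }
}
?>
```

SSL should just work if your PHP build has OpenSSL support -- use
https:// URLs and the underlying HTTP_Request handles the rest.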