Path: usenet.cise.ufl.edu!usenet.eel.ufl.edu!www.nntp.primenet.com!nntp.primenet.com!feed1.news.erols.com!howland.erols.net!news-peer.gsl.net!news.gsl.net!news.mathworks.com!uunet!in2.uu.net!crusty.teleport.com!nntp0.teleport.com!usenet
From: Gisle Aas <[email protected]>
Newsgroups: comp.lang.perl.announce,comp.lang.perl.misc
Subject: libwww-perl-5.04
Followup-To: comp.lang.perl.misc
Date: 25 Oct 1996 00:55:37 GMT
Organization: Schibsted Nett AS
Lines: 50
Approved: [email protected] (comp.lang.perl.announce)
Message-ID: <[email protected]>
NNTP-Posting-Host: gadget.cscaper.com
X-Disclaimer: The "Approved" header verifies header information for article transmission and does not imply approval of content.
Xref: usenet.cise.ufl.edu comp.lang.perl.announce:40 comp.lang.perl.misc:6327
A new version of libwww-perl is available from
<URL:http://www.sn.no/libwww-perl/> as well as from
CPAN/authors/Gisle_Aas/
Libwww-perl is a collection of Perl modules that provides a simple
and consistent programming interface (API) to the World-Wide Web.
The main focus of the library is to provide classes and functions
that allow you to write WWW clients; libwww-perl is therefore said
to be a WWW client library. The library also contains modules of
more general use.
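As a quick illustration of the client side, fetching a document looks
roughly like the sketch below. This is not taken from the
distribution; the URL is only used as an example.

  use LWP::UserAgent;
  use HTTP::Request;

  # Create a user agent and issue a simple GET request.
  my $ua = LWP::UserAgent->new;
  my $request = HTTP::Request->new(GET => 'http://www.sn.no/libwww-perl/');
  my $response = $ua->request($request);

  if ($response->is_success) {
      print $response->content;
  } else {
      print "Request failed: ", $response->status_line, "\n";
  }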
Changes since 5.03 are:
o Added HTTP::Daemon. This is an HTTP/1.1 server class (see the
  server sketch after this list).
o HTTP::Message now supports the protocol() method. Used by HTTP::Daemon.
o HTTP::Response can now be constructed with headers and content as
  arguments.
o Typo corrections in the documentation.
o File::Listing::parse_dir now accepts "GMT" as a timezone.
o HTML::Parser will call the start() method with two new parameters,
  $attrseq and $origtext (see the parser sketch after this list).
o Integrated HTML::FormatPS patches from Jim Stern <[email protected]>.
o Class modules don't inherit from AutoLoader any more. They just
import the AUTOLOAD method.
o LWP::Protocol will untaint the scheme before loading the protocol
  module.
o Digest does not send "opaque" if it was not present in the request.
The "Extension" header is not returned any more.
o New method: $url->crack, which returns a list of the various
  elements of a URI::URL (see the sketch after this list).
o WWW::RobotRules did not use the agent() method when determining
  who we are. This affected robots.txt parsing in
  WWW::RobotRules::AnyDBM_File. The visit count did not increment
  for WWW::RobotRules::InCore.
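A minimal server loop with the new HTTP::Daemon class might look like
the sketch below. This is only an illustration of the
accept()/get_request()/send_response() cycle; see the HTTP::Daemon
documentation in the distribution for the details.

  use HTTP::Daemon;
  use HTTP::Response;
  use HTTP::Status;

  my $d = HTTP::Daemon->new or die "Can't start daemon: $!";
  print "Contact me at: <URL:", $d->url, ">\n";

  while (my $c = $d->accept) {
      while (my $r = $c->get_request) {
          if ($r->method eq 'GET') {
              # Answer every GET with a trivial response.
              $c->send_response(HTTP::Response->new(RC_OK, undef, undef, "Hello\n"));
          } else {
              $c->send_error(RC_NOT_IMPLEMENTED);
          }
      }
      $c->close;
  }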
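The extra HTML::Parser arguments can be picked up by overriding
start() in a subclass. A rough sketch (the class name TagDumper is
made up for illustration):

  package TagDumper;
  require HTML::Parser;
  @ISA = qw(HTML::Parser);

  # $attr is a hash reference of attributes, $attrseq lists the
  # attribute names in their original order, and $origtext is the
  # raw text of the tag.
  sub start {
      my($self, $tag, $attr, $attrseq, $origtext) = @_;
      print "start tag: $origtext\n";
  }

  package main;
  my $p = TagDumper->new;
  $p->parse('<A HREF="http://www.sn.no/libwww-perl/">libwww-perl</A>');
  $p->eof;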
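The new crack() method breaks a URI::URL into its component parts in
one call. A small sketch; the element order shown follows the usual
scheme/user/password/host/port/path/params/query/fragment breakdown,
so check the URI::URL documentation for the authoritative list:

  use URI::URL;

  my $url = URI::URL->new('http://www.sn.no:80/libwww-perl/?query');
  my($scheme, $user, $password, $host, $port,
     $epath, $eparams, $equery, $frag) = $url->crack;
  print "host=$host port=$port path=$epath\n";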
--
Gisle Aas <[email protected]>