Message-ID: <20060316081401.GD7141@tivano.de>
Date: Thu, 16 Mar 2006 09:14:01 +0100
From: Peter Conrad <conrad@...ano.de>
To: bugtraq@...urityfocus.com
Subject: Re: WebVulnCrawl searching excluded directories for hackable web servers


Hi,

On Wed, Mar 15, 2006 at 08:38:24AM -0500, Michael Scheidell wrote:
> A misguided person is using the robots.txt exclusion file to search for
> vulnerable web applications. What he plans on doing with this list of
> vulnerable web applications is up for debate.
> 
> What he is doing is a violation of the RFC's (governing robots.txt..
> Yes, hackers do that also)

Which RFC? If you mean http://www.robotstxt.org/wc/norobots-rfc.html,
that's not an RFC; it's an Internet Draft that expired in 1997.

> The robots.txt file is NOT AN ACCESS CONTROL LIST, and SHOULD NOT BE
> USED TO 'HIDE' DIRECTORIES. ALL DIRECTORIES SHOULD BE PROTECTED AGAINST
> directory listing.

Yup, definitely.
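
Just to illustrate the point (this is only a rough sketch of my own, not
the crawler in question, and www.example.com is a placeholder host): a
few lines of Python are enough to read a robots.txt and print every path
it asks crawlers to avoid. Anyone, polite or not, can do the same, which
is why listing a directory there advertises it rather than hides it.

    # Sketch only: enumerate the "Disallow" paths a site publishes in its
    # robots.txt. The host name below is a hypothetical placeholder.
    from urllib.request import urlopen

    HOST = "http://www.example.com"

    with urlopen(HOST + "/robots.txt") as resp:
        body = resp.read().decode("utf-8", errors="replace")

    for line in body.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments/whitespace
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:
                print("robots.txt points at:", HOST + path)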

> Either case, illegal under FEDERAL 1990 computer abuse and fraud act,
> 'attempted access beyond authorization'

As you already pointed out "the robots.txt file is NOT AN ACCESS CONTROL LIST".
In fact, with HTTP you can only tell if you're authorized to access a
document by attempting to access it and looking at the HTTP response code.
Unless you're clairvoyant, of course.
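
For what it's worth, here is a minimal sketch of that, with a made-up
URL (the host and path are placeholders, not any real site): request the
document and look at what comes back. The status code, not robots.txt,
is the only authorization signal HTTP gives you.

    # Sketch only: the URL below is hypothetical.
    from urllib.request import urlopen
    from urllib.error import HTTPError

    URL = "http://www.example.com/private/report.html"

    try:
        with urlopen(URL) as resp:
            print(URL, "->", resp.status, "(access allowed)")
    except HTTPError as e:
        # 401/403 mean "not authorized"; anything else is some other failure.
        verdict = "not authorized" if e.code in (401, 403) else "other error"
        print(URL, "->", e.code, "(" + verdict + ")")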

> Several other people also think this is illegal:

Well, then go ahead and sue him.


I really don't see what you're complaining about here. This guy seems
to be pretty open about what he's doing, so I doubt he will do anything
evil with the information he gains... in contrast to all the really bad
guys who are doing exactly the same thing *right now*, *without*
telling anyone about it.

If you have confidential information on your webserver, then secure it
instead of complaining about people who happen to stumble over it.
If you have potentially vulnerable code on your site, pay someone to
audit it and fix the bugs instead of complaining about someone who
simply requests a few URLs.


Bye,
	Peter
-- 
Peter Conrad                        Tel: +49 6102 / 80 99 072
[ t]ivano Software GmbH             Fax: +49 6102 / 80 99 071
Bahnhofstr. 18                      http://www.tivano.de/
63263 Neu-Isenburg

Germany

