Date: Wed, 12 Dec 2012 12:18:59 +0100
From: Christoph Gruber <>
Subject: Re: Google's robots.txt handling

On 12.12.2012 at 00:23 "Lehman, Jim" <> wrote:

> It is possible to use whitelisting in robots.txt: allow what you want Google to index and deny everything else. That way Google doesn't make you a Google-dork target, and someone browsing to your robots.txt file doesn't glean any sensitive files or folders. But this will not stop directory bruting to discover your publicly exposed sensitive data, which probably should not be exposed to the web in the first place.
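
[For illustration, a minimal robots.txt sketch of the whitelist approach described above; the paths are hypothetical, and note that "Allow" is a non-standard extension honored by Googlebot and most major crawlers, with the most specific matching rule taking precedence:]

User-agent: *
Allow: /public/
Allow: /index.html
Disallow: /

[Everything not explicitly allowed is disallowed, so the file itself no longer enumerates the sensitive paths an attacker would otherwise read out of a blacklist-style robots.txt.]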

Maybe I misunderstood something, but do you really think that "sensitive" data can be hidden in "secret" directories on publicly reachable web servers?
Christoph Gruber
By not reading this email you don't agree you're not in any way affiliated with any government, police, Anti-Piracy Group, RIAA, MPAA, or any other related group, and that means that you CANNOT read this email.
By reading you are not agreeing to these terms and you are violating code 431.322.12 of the Internet Privacy Act signed by Bill Clinton in 1995.
(which doesn't exist)

Full-Disclosure - We believe in it.
Hosted and sponsored by Secunia -
