Date: Fri Mar 24 14:17:19 2006
From: nocfed at gmail.com (nocfed)
Subject: Re: Re: Re: Links to Google's cache of 626 FrSIRT exploits

On 3/23/06, Dave Korn <davek_throwaway@...mail.com> wrote:
> nocfed wrote:
> > Really, do you ``hackers'' really not know how to at least read the
> > manpage for wget?
> >
> > There is no need for any script, only a few switches to wget.
> >
> > Hint: -e robots=off
>
>   Wow!  j00 R so 1337!  Hint:  -e clue=on
>
>   Seriously, I truly phj33r your 4w3s0Me!!!one!1 man-page reading skills,
> but how could you imagine that switch could possibly make the slightest
> difference?  robots.txt is enforced (or ignored) by the client.  If a server
> returns a 403 or doesn't, depending on what UserAgent you specified, then
> how could making the client ignore robots.txt somehow magically make the
> server not return a 403 when you try to fetch a page?
>
>   If you think that a switch that makes no difference to the data going over
> the wire could affect the response given to an otherwise identical protocol
> request sent back by the server, you must think they're using IP over ESP as
> a transport layer.  Which RFC was that again?
>
>   Or perhaps you just don't understand the first thing about the
> client-server model of system architecture.  In which case you're in no
> position to go around calling other people hackers in sarcastic quote
> marks[*].
>
>   Anyway, this is a great illustration of the dangers of posting smartarse
> replies without actually having TRIED what you claim will work.  Let me
> *prove* it: here's what happens if you try and wget the list of cached pages,
> first with no switches, then with -e but no -U, then with -U but no -e.
>
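
A minimal sketch of the point above, for reference: the cached-page URL is a
hypothetical stand-in, and the three invocations mirror the no-switches /
-e-only / -U-only comparison described. Since robots.txt is honored by the
client, -e robots=off changes nothing on the wire; only -U changes the
request the server bases its 403 on.

  URL='http://72.14.203.104/search?q=cache:...'   # hypothetical cached-page URL

  # 1. No switches: default wget User-Agent, server answers 403.
  wget "$URL"

  # 2. -e robots=off only stops wget from honoring robots.txt locally;
  #    the HTTP request sent is identical, so the server still answers 403.
  wget -e robots=off "$URL"

  # 3. A browser-like User-Agent changes what the server sees, and it is
  #    the server that decides whether to return 403.
  wget -U Mozilla "$URL"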

You have failed to understand the 'hint' part.  It was a hint at ONE
of the switches to use.

As you apparently have not read the manpage for wget, here is the full command.

wget -e robots=off -Hr -nd -np --domains=72.14.203.104 -U Mozilla
http://www.elsenot.com/frsirt-google.html
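
For reference, the same command with each switch annotated (descriptions
taken from the wget manual; the command itself is unchanged):

  # -e robots=off : don't honor robots.txt exclusions (a client-side setting)
  # -H -r         : recursive retrieval, allowed to span across hosts
  # -nd           : no directories; save all files flat in the current dir
  # -np           : no parent; never ascend above the starting page
  # --domains=... : but only span to Google's cache IP
  # -U Mozilla    : send a browser-like User-Agent string
  wget -e robots=off -Hr -nd -np --domains=72.14.203.104 -U Mozilla \
      http://www.elsenot.com/frsirt-google.html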

Now please go snarf down the interweb.
