Message-ID: <12215.1189788043@turing-police.cc.vt.edu>
Date: Fri, 14 Sep 2007 12:40:43 -0400
From: Valdis.Kletnieks@...edu
To: jf <jf@...glingpointers.net>
Cc: full-disclosure <full-disclosure@...ts.grok.org.uk>
Subject: Re: Pro US government hackerganda
On Fri, 14 Sep 2007 01:41:40 -0000, jf said:
> You're suffering from a logical fallacy. I worked in that arena (albeit in
> a different agency) in incident response for quite some time. While I find
> the number somewhat high, it's not unreasonable: if you broke into $lots
> of workstations and servers on a regular basis and downloaded everything
> with extensions like .pdf, .eml, .doc, et cetera, it wouldn't
> take long to reach very high numbers. This is exactly what has
> occurred, and it makes your assertion one of ignorance and presumption.
Right. The first point is that you'd have to break into *lots* of boxes to get
enough PDFs to reach 20 terabytes. That's asleep-at-the-wheel #1.
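To put a rough number on "lots": a back-of-envelope check, assuming (my figure, not the poster's) roughly a gigabyte of harvested documents per compromised host:

```python
# Back-of-envelope: how many compromised hosts does 20 TB of documents imply?
# The ~1 GB per-host yield is an illustrative assumption, not from the post.
TOTAL_BYTES = 20 * 1024**4       # 20 terabytes (binary)
PER_HOST_BYTES = 1024**3         # assumed ~1 GB of .pdf/.doc/.eml per host

hosts_needed = TOTAL_BYTES // PER_HOST_BYTES
print(f"hosts needed at ~1 GB each: {hosts_needed:,}")  # -> 20,480
```

Even at a generous gigabyte of documents per machine, that is on the order of twenty thousand compromised hosts.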
Then there's asleep-at-the-wheel #2 - moving 20 terabytes off the network
without the resulting traffic being noticed. It isn't rocket science to
keep aggregate traffic logs and say "Wow, that workstation usually only sends
a megabyte or two per day to the Outside World, mostly in the form of Google
queries and GET requests for cnn.com pages. Today it has sent 2 gigabytes
out - I wonder what changed".
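The aggregate-log idea above really is that simple: keep a per-host baseline of daily outbound bytes and flag large deviations. A minimal sketch (the function name and the 3-sigma threshold are my own illustrative choices, not any particular product's):

```python
# Minimal sketch of per-host egress anomaly detection from aggregate
# traffic logs. Names and thresholds are illustrative assumptions.
from statistics import mean, stdev

def flag_anomalous_egress(daily_bytes_out, today_bytes_out, sigma=3.0):
    """Flag a host whose outbound volume today is far above its baseline.

    daily_bytes_out: history of daily outbound byte counts for this host.
    today_bytes_out: today's outbound byte count.
    """
    baseline = mean(daily_bytes_out)
    spread = stdev(daily_bytes_out)
    return today_bytes_out > baseline + sigma * spread

# A workstation that normally sends 1-2 MB/day suddenly sends 2 GB:
history = [1_500_000, 1_200_000, 2_000_000, 1_800_000, 1_100_000]
print(flag_anomalous_egress(history, 2_000_000_000))  # -> True
```

Anything this crude would have flagged a 2-gigabyte day against a megabytes-per-day baseline; 20 terabytes leaving over any plausible window is orders of magnitude louder still.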
Maybe that network needs to contract upload detection out to the RIAA.....
_______________________________________________
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/