Date: Fri, 14 Sep 2007 13:18:33 -0400
From: "lostzero" <lostzero@...il.com>
Cc: "'full-disclosure'" <full-disclosure@...ts.grok.org.uk>
Subject: Re: Pro US government hackerganda

You're looking at it from the wrong angle.  The 20 terabytes didn't happen
overnight.  Without a starting time frame you have no idea how many "years"
it has been happening.  Not to mention they have workstations and servers
all over the world, which means no one agency or individual looks at all
the traffic from all the locations at the same time.  If your network
produces terabytes of traffic a day, 50-100 MB isn't that eye-catching.
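
To put rough numbers on that, here's a back-of-the-envelope sketch in
Python.  The campaign duration and host count are illustrative assumptions,
not figures from the thread:

    # Spread 20 TB across an assumed multi-year campaign and an assumed
    # number of compromised hosts; all figures are illustrative.
    TOTAL_BYTES = 20 * 1024**4        # 20 TB exfiltrated in total
    YEARS = 3                         # assumed campaign duration
    HOSTS = 500                       # assumed compromised machines worldwide

    days = YEARS * 365
    per_day = TOTAL_BYTES / days              # network-wide daily volume
    per_host_per_day = per_day / HOSTS        # what each machine sends

    print(f"network-wide per day: {per_day / 1024**3:.1f} GB")
    print(f"per host per day:     {per_host_per_day / 1024**2:.1f} MB")
    # -> about 18.7 GB/day overall and ~38 MB/host/day, inside the
    #    50-100 MB/day range called "not that eye-catching" above.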

-----Original Message-----
From: full-disclosure-bounces@...ts.grok.org.uk
[mailto:full-disclosure-bounces@...ts.grok.org.uk] On Behalf Of
Valdis.Kletnieks@...edu
Sent: Friday, September 14, 2007 12:41 PM
To: jf
Cc: full-disclosure
Subject: Re: [Full-disclosure] Pro US government hackerganda

On Fri, 14 Sep 2007 01:41:40 -0000, jf said:

> You're suffering from a logical fallacy.  I worked in that arena (albeit
> in a different agency) in incident response for quite some time, and while
> I find the number somewhat high, it's not unreasonable: if you broke into
> $lots of workstations and servers on a regular basis and downloaded
> everything that ended in extensions like .pdf, .eml, .doc, et cetera, it
> wouldn't take that long to get up to very high numbers.  This is exactly
> what has occurred, which makes your assertion one of ignorance and
> presumption.

Right.  The first point is that you'd have to break into *lots* of boxes
to get enough PDFs to amount to 20 terabytes.  That's asleep-at-the-wheel #1.
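
For a rough sense of what *lots* means here (both figures below are
assumptions for illustration, not data from the thread):

    # If each compromised workstation yields some gigabytes of .pdf/.eml/
    # .doc files, how many machines does 20 TB imply?
    TARGET_TB = 20
    GB_PER_HOST = 10    # assumed harvestable documents per machine

    hosts_needed = TARGET_TB * 1024 / GB_PER_HOST
    print(f"machines to compromise: {hosts_needed:.0f}")   # -> 2048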

Then there's asleep-at-the-wheel #2 - moving 20 terabytes off the network
without the resulting traffic being noticed.  It isn't rocket science to
keep aggregate traffic logs and say "Wow, that workstation usually only
sends a megabyte or two per day to the Outside World, mostly in the form
of Google queries and GET requests for cnn.com pages.  Today it's sent 2
gigabytes out - I wonder what changed".
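
A minimal sketch of that kind of baseline check, assuming per-host daily
byte counts are already available (the host name, history, and threshold
below are all made up for illustration; real counts would come from
something like NetFlow records aggregated by source address):

    from statistics import mean, stdev

    # Outbound bytes per day for each host, oldest first (made-up data).
    history = {
        "workstation-17": [1_200_000, 900_000, 2_100_000, 1_500_000, 1_100_000],
    }
    today = {
        "workstation-17": 2_000_000_000,   # ~2 GB vs. a 1-2 MB/day baseline
    }

    SIGMA = 5   # assumed alert threshold, in standard deviations

    for host, past in history.items():
        mu, sd = mean(past), stdev(past)
        sent = today[host]
        if sent > mu + SIGMA * sd:
            print(f"ALERT {host}: sent {sent / 1e9:.1f} GB today "
                  f"(baseline {mu / 1e6:.1f} MB/day) - I wonder what changed")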

Maybe that network needs to contract upload detection out to the RIAA.....

_______________________________________________
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/
