Message-ID: <1107652741.22051.18.camel@anubis>
From: james.mailing at gmail.com (James Eaton-Lee)
Subject: Multiple AV Vendors ignoring tar.gz archives

In the majority of businesses, firewalling and virus protection are done
at the border of the network for a reason: by eliminating as much
malicious traffic and as many viruses as possible in one place, at the
edge of the network, you effectively segregate 'good' and 'bad'
information, and provide business accountability into the bargain.

It is for precisely this reason that 99% of businesses have some sort of
firewall (even if it's only a NAT gateway) on the *EDGE* of their
network, at the internet point of presence, rather than having client
firewall software installed on every client PC. Although you'd be
hard-pressed to find a business without client antivirus software on
every machine (if only for the simple reason that you can't protect
floppy disks with a firewall), you can get away with not scanning
e-mails on a client machine. 

Add to this the simple fact that client anti-virus software is difficult
to manage and account for - client machines are, to an extent, unmanaged
and an 'unknown quantity', whereas servers can be *very* tightly locked
down, assuring a far higher degree of reliability from server-based
virus scanning, firewalling, and intrusion detection than from client
machines, which are vulnerable to a wide range of attacks and problems.
Server anti-virus software starts to become more and more attractive at
this point...

...add to this the behaviour of different e-mail clients: especially in
environments not based on a single platform, a business may run many
e-mail clients, and not all of them will interact with anti-virus
software in the same way. Linux clients may not be running antivirus
software at all, and yet, in the case of .tar.gz and .tar.bz2 archives,
they are the clients most likely to be exposed to these marginally more
'exotic' archive formats.

Bearing all of these factors in mind, and also factoring in the growing
reliance of SMEs on third-party and centralised antivirus scanning for
their mail (from external service providers via MX routing, and via
non-Exchange e-mail servers which incorporate antivirus scanning simply
by calling the antivirus software on the server itself), gateway
scanning becomes more and more significant; possibly more so than
desktop software for mail - mail being the most common vector of virus
transmission - so significant that this *is*, in fact, a serious
problem.
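
To illustrate what I mean by that last arrangement, here is a rough,
hypothetical sketch of the kind of filter a mail server might call for
each message. It assumes ClamAV's clamscan is installed and relies on
its documented exit codes (0 = clean, 1 = virus found); the script
itself is illustrative, not any vendor's actual integration:

#!/usr/bin/env python3
# Rough sketch of a gateway-style mail filter: the mail server pipes the
# raw message to this script, which hands it to the ClamAV command-line
# scanner.  Assumption: clamscan exits 0 when the file is clean and 1
# when a virus is found; everything else here is illustrative.
import subprocess
import sys
import tempfile

def message_is_clean(raw_message: bytes) -> bool:
    """Spool the message to a temp file and scan it with clamscan."""
    with tempfile.NamedTemporaryFile(suffix=".eml") as spool:
        spool.write(raw_message)
        spool.flush()
        result = subprocess.run(["clamscan", "--no-summary", spool.name],
                                capture_output=True)
        return result.returncode == 0

if __name__ == "__main__":
    if message_is_clean(sys.stdin.buffer.read()):
        sys.exit(0)   # accept: pass the message on for delivery
    sys.exit(1)       # reject: let the mail server bounce or quarantine it

The point is only that the mail server needs nothing more than the
scanner's exit status to make its accept/reject decision at the gateway.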

I also disagree with you that it's a non sequitur to say that anti-virus
writers would be slow to react to viruses transmitted inside archives
they could not scan. All that an antivirus package is - essentially - is
a daemon doing pattern matching on data coming into and out of a
machine; anti-virus definitions (patterns) are very easy to write and
require little work to push out to customers. Archive scanning, however,
is a *functional* difference in the way antivirus software works, and it
carries numerous implications: updating an antivirus scanner's scanning
engine is quite different from updating its definitions.
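
To make that distinction concrete, here is a toy sketch (in Python, with
made-up byte patterns rather than real signatures) of why a definition
update is just a data change:

# Toy illustration: the "definitions" are just data (byte patterns) that
# the engine matches against whatever passes through, so a new definition
# is a trivial data update.  The patterns below are invented.
SIGNATURES = {
    "Example.Worm.A": b"EXAMPLE-WORM-A-MARKER",
    "Example.Trojan.B": b"\x4d\x5a\x90\x00EXAMPLE-TROJAN-B",
}

def scan_bytes(data: bytes) -> list[str]:
    """Return the names of any signatures present in the data."""
    return [name for name, pattern in SIGNATURES.items() if pattern in data]

# Shipping a new definition is a one-line data change:
#   SIGNATURES["Some.New.Worm"] = b"..."
# Teaching the engine to look *inside* a .tar.gz, by contrast, means new code.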

Add to this the fact that implementing archive support in an antivirus
package isn't as simple as it might seem; although bzip2 is released
under a BSD-style license, gzip isn't - it's GPL, so any antivirus
vendor would have to write their gzip-handling code from scratch. There
could certainly be enough of a kerfuffle surrounding a virus using one
of these archive formats to cause a delay of a few hours or even days in
releasing antivirus updates that addressed the issue - and as we all
know, the major damage in any virus epidemic is done in a very short
space of time.
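
For comparison with the definition update above, here is what the
engine-side change looks like in toy form. Python's standard library
happens to ship tar/gzip/bzip2 handling, which a vendor's C engine would
have to licence or reimplement; the signature and helper are made up for
illustration:

# Toy version of the engine-side change: before any pattern matching can
# happen, the scanner must recognise and unpack the container.
import tarfile

SIGNATURES = {"Example.Worm.A": b"EXAMPLE-WORM-A-MARKER"}   # invented pattern

def scan_bytes(data: bytes) -> list[str]:
    return [name for name, pattern in SIGNATURES.items() if pattern in data]

def scan_tarball(path: str) -> list[str]:
    """Unpack a .tar.gz / .tar.bz2 and scan each member's bytes."""
    hits = []
    with tarfile.open(path, mode="r:*") as archive:   # "r:*" autodetects gzip/bzip2
        for member in archive.getmembers():
            if member.isfile():
                data = archive.extractfile(member).read()
                hits.extend(f"{member.name}: {sig}" for sig in scan_bytes(data))
    return hits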

To be honest, though, that isn't really the point. Antivirus vendors
necessarily have to write antivirus definitions reactively - but there's
nothing other than sheer laziness preventing them from *pro*actively
incorporating support for these types of archives into their software.

regards,

 - James.

On Sun, 2005-02-06 at 11:15 +1300, Nick FitzGerald wrote:
> Barrie Dempster wrote:
> 
> > By passing some archives through www.virustotal.com I discovered that
> > some AV companies ignore tar.gz's and possibly other archive formats
> > that aren't very common on windows systems (but supported by the
> > common archive tools).
> > 
> > If virus writers start using these formats AV companies could be slow
> > to react as in some cases they may have to write functionality into
> > their products that doesn't currently exist (support for scanning
> > inside said archives) this could delay signature updates.
> 
> That's a non sequitur.
> 
> If a virus was released that depended on, say tar.gz archives, and some
> AV products did not have tar.gz unpacking capabilities, there would be
> no hindrance to those companies releasing detection updates.  Afterall,
> what they detect are the unpacked contents of such archives, so
> detection of the actual malware is just as easily added and shipped
> regardless of whether the malware in question self-packs in .RAR, .ZIP,
> .TAR.GZ or .ZOO format (or does not pack itself in any archive format
> at all...).
> 
> Worse however, is the implication that missing unpacking abilities for
> some modestly common archive type is a terrible flaw in a scanner.
> 
> Well, OK -- in a gateway scanner it is likely to be a terrible flaw.
> Any vaguely competent gateway scanner should have basic knowledge of
> all archive formats and should have an option to quarantine all
> messages with archives in the formats it cannot unpack and inspect.
> Sadly, most gateway scanners are not designed this way.  It is the job
> of a gateway scanner to not let anything "dangerous" in and if you
> cannot tell what something is, prudence says you keep it out, or at
> least set it aside for more expert inspection.
> 
> However, once something gets to the desktop, it is only very mildly
> inconvenient that a scanner does not know how to unpack, say, tar.gz
> archives.  The point of a desktop scanner is to stop as much as
> possible that has got to where it shouldn't be.  Known virus scanning
> is a far from perfect method for achieving this, but as the only
> intelligent method of achieving it has been entirely disregarded by
> users, AV and OS developers, scanning is pretty much what we are left
> with.  Anyway, as we are assuming that the malware in question can be
> detected already, let's look at the consequence of a desktop scanner
> not knowing anything about tar.gz packing and the arrival of a piece
> of known malware in such an archive...
> 
> Let's assume that the user has a tar.gz handler and the user double-
> clicks on the dodgy Email attachment in question (the attachment that
> the shoddy gateway Email scanner should have stopped, even if it
> couldn't scan inside tar.gz files because this is hardly a just-minted
> compression format...).  What happens?  The on-access virus scanner
> says nothing as the tar.gz file hits the disk in some temp dir, as it
> doesn't know anything about tar.gz archives.  For the same reason the
> on-access scanner says nothing when the user's archive-handling
> program opens the tar.gz file from its temp dir.  As no code has so
> far been deposited on the machine in executable form, this is not any
> kind of failure on the part of the desktop scanner.  (Yes, some
> lily-livered, weak-kneed sops may _prefer_ the "reassurance" of the
> malware code inside such files being detected "as soon as possible"
> but that is not a strong (or even useful) criterion for judging a
> desktop scanner's quality.)
> 
> The user now sees, listed in the contents of the archive as displayed
> by their tar.gz archive handler the "card" or "picture" or "document"
> or whatever that the Email message promised, so double-clicks it.
> _Now_ their virus scanner gets excited.  The archive handling program
> extracts the file to a temp dir and the on-access scanner (if set to
> scan on writes and/or closes) detects the malware and pops an alert
> (and blocks further access to the file or automatically quarantines/
> deletes/etc as it is configured).  If the scanner is only set to scan
> on execute it will pop an alert (and block/quarantine/delete) a moment
> later when the archive handler tries to have the file executed.
> 
> There is no failure here.  The desktop scanner "protected" the user,
> as designed.
> 
> Yes, it is easy for testers to add tests such as "detects malware
> packed in .ZIP files", "detects malware packed in .RAR files",
> "detects malware packed in .TAR.GZ files" but the results of such
> tests tell you squat about the quality of the product.  (In fact,
> that's not true -- as it seems axiomatic that the larger and more
> complex a software project the more bugs it will have, it would seem
> that the more archive formats a scanner can handle the buggier the
> scanner will be, so maybe such tests do tell us something about the
> quality of the products -- the higher the score, the buggier the
> product will be...)
