Message-ID: <42065914.19012.4738501@localhost>
From: nick at virus-l.demon.co.uk (Nick FitzGerald)
Subject: Multiple AV Vendors ignoring tar.gz archives

James Eaton-Lee wrote:

> In the majority of businesses, firewalling and virus protection are done
> at the border of the network for a reason; when you eliminate as many
> viruses and malicious network traffic in one place at the edge of your
> network first, you effectively segregate 'good' and 'bad' information,
> and provide business accountability into the bargain. 
> 
> It is for precisely this reason that 99% of businesses have some sort of
> firewall (even if it's only a NAT gateway) on the *EDGE* of their
> network, at the internet point of presence, rather than having client
> firewall software installed on every client PC. Although you'd be
> hard-pressed to find a business without client antivirus software on
> every machine (if only for the simple reason that you can't protect
> floppy disks with a firewall), you can get away with not scanning
> e-mails on a client machine. 

Did you miss the part of my message where I wrote:

   Well, OK -- in a gateway scanner it is likely to be a terrible flaw.
   Any vaguely competent gateway scanner should have basic knowledge of
   all archive formats and should have an option to quarantine all
   messages with archives in the formats it cannot unpack and inspect.
   Sadly, most gateway scanners are not designed this way.  It is the
   job of a gateway scanner to not let anything "dangerous" in and if
   you cannot tell what something is, prudence says you keep it out, or
   at least set it aside for more expert inspection.  

Didn't that make you think I may have had an idea or two about the 
border/inside distinction?
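
For the avoidance of doubt, that rule is trivial to state.  Here is a 
minimal, purely hypothetical sketch in Python -- the extension sets and 
the verdict names are illustrative only, not any real gateway product's 
interface:

# Hypothetical sketch of the "quarantine what you cannot inspect" rule.
# The format sets and verdict strings are invented for illustration.

ARCHIVE_EXTENSIONS = {".zip", ".tar", ".tgz", ".gz", ".bz2", ".rar"}
UNPACKABLE = {".zip", ".tar"}            # formats this engine can open


def gateway_verdict(filename):
    """Decide what a border scanner should do with one attachment."""
    name = filename.lower()
    ext = next((e for e in ARCHIVE_EXTENSIONS if name.endswith(e)), None)
    if ext is None:
        return "scan"             # not an archive; scan the raw bytes
    if ext in UNPACKABLE:
        return "unpack-and-scan"  # look inside before letting it through
    return "quarantine"           # archive we cannot inspect: set it aside


if __name__ == "__main__":
    for f in ("report.doc", "invoice.zip", "payload.tar.bz2"):
        print(f, "->", gateway_verdict(f))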

> Add to this the simple fact that it is difficult to manage and account
> for client anti-virus software - client machines are to an extent
> unmanaged and an 'unknown quantity' ...

Actually, that's only true in a badly run outfit, but as that is most 
outfits, I'll let it pass as a pragmatic reality...

> ... whereas servers can be *very*
> tightly locked down, assuring a far higher degree of reliability from
> server-based virus-scanning tools, firewalling, and intrusion detection
> than client machines which are vulnerable to a wide range of attacks and
> problems. Server anti-virus software starts to become more and more
> attractive at this point..

Even if it ignores known archive formats that it cannot scan inside and 
simply passes such attachments on?  (See above...)

<<snip>>

Yes, yes, we all agree that scanning Email before it gets to the 
clueless users is a really spiffing idea.  That I never said otherwise 
makes me wonder why you think you are disagreeing with me on this, so 
I'll just put that down to your mis-reading (or not actually bothering 
to read at all??) what I wrote.

> I also disagree with you that it's a non sequitur to say that anti-virus
> writers would be slow to react to viruses transmitted inside archives
> they could not scan; ...

That is because you clearly did not actually understand what I wrote.  
I suggest you go read my message again...

> ... all that an antivirus package is - essentially - is
> a daemon doing pattern matching on all data coming into and out of
> machines; ...

At the very most abstract, that is quite true of today's known virus 
scanners (it ignores all manner of much more complex stuff that 
happens, but as you cannot understand what I write, I'll not bother 
trying to explain further).
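
At that level of abstraction the whole business collapses to something 
like the toy sketch below -- the byte patterns are invented placeholders, 
and a real engine adds unpacking, emulation, heuristics and a great deal 
more:

# Toy illustration of "a daemon doing pattern matching on data".
# The signature bytes here are invented stand-ins, not real detections.

SIGNATURES = {
    b"HYPOTHETICAL-MARKER-1": "Example.Worm.A",
    b"HYPOTHETICAL-MARKER-2": "Example.Trojan.B",
}


def scan_bytes(data):
    """Return a detection name if any known pattern appears in the data."""
    for pattern, name in SIGNATURES.items():
        if pattern in data:
            return name
    return None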

> ... anti-virus definitions (patterns) are very easy to write and

No.

Perhaps if you had looked at ClamAV you would think that, but then 
Clam is not exactly the paragon of good AV practice...

> require little work to push to customers of anti-virus writers; archive
> scanning, however, is a *functional* difference in the way in which
> antivirus software works, which carries numerous implications; updating
> an antivirus scanner's scanning engine is quite different to updating
> the definitions. 

I _know_ that, as you would understand had you actually tried to 
understand my message.

HOWEVER, what you have still missed is that it is entirely unnecessary 
IN A DESKTOP SCANNER to be able to scan inside most archive formats 
because any code delivered in archives must be unpacked, AT WHICH 
POINT THE AV WILL SEE IT, to be able to do anything.  (Of course, sub-
systems and components that handle certain "archive" formats purely in 
memory are the exceptions -- .CHM and .JAR spring to mind as likely 
examples, and there are probably a few others, but this does not extend 
to the whole gamut of file archiving/compressing formats).
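
That on-access point can be made concrete with a crude toy: real products 
hook file creation and file opening in the filesystem driver, but even the 
polling sketch below shows that once an archive's contents hit the disk as 
ordinary files the scanner sees them, whatever format they arrived in.  The 
pattern and the whole approach are stand-ins, not how any shipping scanner 
actually works:

# Crude, purely illustrative sketch of the on-access idea: the moment
# archive contents are written to disk as ordinary files they become
# visible to the scanner.  Real products hook the filesystem driver;
# this polling loop and the placeholder pattern are stand-ins only.

import os
import time

HYPOTHETICAL_PATTERN = b"HYPOTHETICAL-MARKER-1"   # invented, not a real signature


def watch_directory(path):
    seen = set()
    while True:
        for name in os.listdir(path):
            full = os.path.join(path, name)
            if full in seen or not os.path.isfile(full):
                continue
            seen.add(full)
            with open(full, "rb") as fh:
                if HYPOTHETICAL_PATTERN in fh.read():
                    print("on-access hit:", full)
        time.sleep(1)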

> Add to this the fact that implementing archive support in an antivirus
> package isn't as simple as it might seem; although bz2 is released under
> a BSD license, gzip isn't - it's GPL, and therefore any antivirus vendor
> would have to write their gzip code totally from scratch. There could
> certainly be enough of a curfuffle surrounding a virus using one of
> these archive formats to cause a delay of a few hours or even days in
> releasing updates for antivirus software which addressed the issue - and
> as we all know, the major damage in any virus epidemic is done in a very
> short space of time. 

In general, archive and compression handlers are written to slot into 
the recursive pseudo-filesystem harness of virus detection engines.  
Getting the algorithm right is seldom the most time-consuming or 
complex part of such an addition...
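
To make that concrete, here is a rough, hypothetical harness built on 
nothing but Python's standard library -- it has nothing to do with any 
actual engine's internals, but it shows how a format handler is just 
another "yield the members" plug-in feeding a recursive scan:

# Rough sketch of the "recursive pseudo-filesystem" idea: each archive
# handler yields (name, bytes) members, and the engine recurses on
# whatever comes out until it reaches plain data.  Only stdlib formats
# are shown; a real engine plugs many more handlers into one harness.

import gzip
import io
import tarfile
import zipfile


def unpack(name, data):
    """Yield (member_name, member_bytes) if 'data' is a supported archive."""
    if name.endswith(".gz"):
        yield name[:-3], gzip.decompress(data)
    elif zipfile.is_zipfile(io.BytesIO(data)):
        with zipfile.ZipFile(io.BytesIO(data)) as z:
            for m in z.namelist():
                yield m, z.read(m)
    elif name.endswith(".tar"):
        with tarfile.open(fileobj=io.BytesIO(data)) as t:
            for m in t.getmembers():
                if m.isfile():
                    yield m.name, t.extractfile(m).read()


def scan_recursive(name, data, scan_plain):
    members = list(unpack(name, data))
    if not members:                       # not an archive we understand
        return scan_plain(name, data)
    return any(scan_recursive(n, d, scan_plain) for n, d in members)


if __name__ == "__main__":
    payload = gzip.compress(b"harmless text with HYPOTHETICAL-MARKER inside")
    hit = scan_recursive("note.txt.gz", payload,
                         lambda n, d: b"HYPOTHETICAL-MARKER" in d)
    print("detected" if hit else "clean")

Adding, say, bz2 support to a sketch like that is one more branch in 
unpack(); the recursion limits, archive-bomb protection and QA a shipping 
engine needs around it are a different matter.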

BUT you have still missed the flaming obvious -- a desktop scanner does 
not have to detect malware inside an archive.  As such, the malware is 
neutered.  _IFF_ the user has suitable archive handling utilities to 
unpack the archive, _then_ the on-access virus scanner will be able to 
detect the malware file and block further access/warn the user/etc.  
So, while it will take several days to several weeks (depending on the 
amount of format reverse engineering that is needed, developer 
availability and the amount and quality of QA typically done for such 
things) for a vendor to add handling of a new archive format once they 
decide they should add such handling, adding detection of the "normal" 
binary form of the malware to the desktop scanner can progress 
unhindered by the fact that some new malware uses an archive format 
that is not supported by the AV engine.  (And, just in case you've 
forgotten, that is not desirable in gateway scanners, but as most of 
them simply pass "unhandled" archive formats now, you're not really 
that much worse off there either.)

> To be honest, though, that isn't really the point. antivirus vendors
> necessarily have to write antivirus definitions reactively - but there's
> nothing other than sheer laziness which is preventing them from
> *pro*actively incorporating support for these types of archives into
> their software.

One thing that "prevents" them from adding such support is the scanning 
overhead in the on-demand scanner when the user sets a scan of their 
whole hard drive going.  In general, most vendors seem to prefer to 
spare your CPU cycles rather than wasting them unpacking all manner of 
compressed, archived and "compound" file formats that are currently not 
known to "naturally" carry malware.


-- 
Nick FitzGerald
Computer Virus Consulting Ltd.
Ph/FAX: +64 3 3267092

