Date: Mon, 17 May 2010 20:34:27 +0100
From: "lsi" <>
Subject: Re: Windows' future (reprise)

On Sun, 16 May 2010 20:49:29 -0400, wrote:

> On Sun, 16 May 2010 23:49:00 BST, lsi said:
> > Malware is flooding at 243% (+/- error).  This is consuming the
> > oxygen in your machine.
> The basic error in your analysis is that although there may in fact be
> 243% more malware samples, that doesn't translate into 243% more oxygen
> consumption.

Yes, I agree that the oxygen is not being used at 243%.  

Last year I did get a bit excited and said some things like that 
("you'll need 200 of today's processors, just for malware filtering, 
by 2015"), and I do think that was wrong.  So this year I took pains 
not to say that; you'll note I only said the oxygen was being 
consumed, I didn't say at what rate.  

To go with your pizza example, say the CPU is the pizza.  Back in 
the 80s I had the whole pizza to myself (no AV).  Then I installed 
AV and I had slightly less pizza; the AV takes a small slice of 
pizza for itself.  

As the years have passed, the AV has been doing more and more work.  
That means its slice of pizza is growing, and the remainder, which 
is what I get, is shrinking.  

And that is ignoring all the other junk that modern systems run, 
which takes its bit of pizza too.  

What I don't know is *how much* extra pizza is being consumed.  As 
you say, 243% extra samples does not correspond to 243% less pizza 
for me.  I am not familiar with the innards of an AV scan engine, so 
this might be naive - but surely the AV will use more CPU as the 
number of signatures increases.
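For what it's worth, here is a toy sketch (my own, and certainly not 
how any real AV engine is built) of why more signatures need not mean 
proportionally more CPU: with a multi-pattern automaton in the 
Aho-Corasick style, the data is scanned in a single pass whatever the 
number of signatures loaded, instead of one pass per signature.

```python
# Toy multi-pattern matcher (Aho-Corasick style).  Naive matching
# checks every signature at every position, so cost grows with the
# signature count; this automaton walks the data once regardless.
from collections import deque

def build_automaton(signatures):
    """Build a trie over the signatures, then add failure links."""
    trie = [{}]          # per-node child map; node 0 is the root
    fail = [0]           # failure link for each node
    out = [set()]        # signatures that end at each node
    for sig in signatures:
        node = 0
        for ch in sig:
            if ch not in trie[node]:
                trie.append({}); fail.append(0); out.append(set())
                trie[node][ch] = len(trie) - 1
            node = trie[node][ch]
        out[node].add(sig)
    # Breadth-first pass to compute failure links.
    q = deque(trie[0].values())
    while q:
        node = q.popleft()
        for ch, nxt in trie[node].items():
            q.append(nxt)
            f = fail[node]
            while f and ch not in trie[f]:
                f = fail[f]
            fail[nxt] = trie[f][ch] if ch in trie[f] and trie[f][ch] != nxt else 0
            out[nxt] |= out[fail[nxt]]
    return trie, fail, out

def scan(data, automaton):
    """Single pass over the data; returns the set of signatures found."""
    trie, fail, out = automaton
    hits, node = set(), 0
    for ch in data:
        while node and ch not in trie[node]:
            node = fail[node]
        node = trie[node].get(ch, 0)
        hits |= out[node]
    return hits
```

So the per-byte cost is roughly constant; what does grow with the 
signature count is the memory for the automaton, and the time to 
rebuild it when signatures are updated.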

Therefore, assuming malware continues to increase in number, there 
must eventually come a time when my PC uses all of its CPU on 
malware filtering.  

Yes - maybe that is 20 years away, and I will have upgraded by then.  
But is it 20 years away?  And what if I can't upgrade?  What about in 
the meantime - am I going to tolerate my slow machine?  How slow is 
too slow?  Time is money.  Why would anyone willingly allow their 
machine to run slowly, and thus cost themselves money?  

As I said last year - as soon as Joe Average Business User figures 
out he can do stuff 25% faster, just by dumping his OS*, he will want 
to dump his OS.  

Note, 25% faster was a guess, but it would be easy enough to 
measure: install some old AV software and signature sets and clock 
how fast a set of tests runs, then install current AV and current 
signature sets and rerun the tests.  Then run the tests with the AV 
switched off.  
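A minimal harness for that kind of measurement might look like this 
(a hypothetical sketch; `sample_workload` is a stand-in - the real 
test set would be file copies, compiles and so on):

```python
# Hypothetical benchmark harness for the measurement described above:
# run the same workload under each AV configuration and compare times.
import time

def benchmark(workload, runs=3):
    """Return the best-of-N wall-clock time for a callable workload."""
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        workload()
        best = min(best, time.perf_counter() - start)
    return best

def sample_workload():
    # Stand-in for the real tests (file copies, compiles, etc.)
    sum(i * i for i in range(200_000))

# Run once per configuration - old AV, new AV, AV switched off -
# then compare, e.g. overhead_pct = 100 * (with_av - off) / off.
baseline = benchmark(sample_workload)
```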

* he doesn't realise what a pain it is, but it's not his problem... 
it's mine!  And everyone else's who is paid to keep stuff running.  
Although I see it as an opportunity rather than a problem.  Even Thor 
has his chance; he should get coding on that connector, then sell it 
to all his former competitors....  

> Consider a pizza cut into 8 pieces and somebody comes along and eats 6 of
> them.  Now consider an identical pizza cut 16 ways and somebody eats 12 slices.
> The rate of slice consumption has doubled, but the actual amount of pizza
> consumed hasn't changed.

> Similarly, the fact there's (say) 5 million new malware samples doesn't mean
> there's 5 million new holes in Windows this year.  What you have is 5 million
> new ways of poking the same 20 or 30 new holes.  This makes it a lot easier for
> the A/V companies. Although they may have 37 different samples, there's a very
> good chance they were produced using a Metasploit-like mindset - "pick an
> exploit, add a payload, launch".  And 37 samples that use the same exploit but
> have 37 different payloads need one detection rule (for the exploit), not 37.

Thank you for explaining this.  So what it will come down to is how 
efficient the AV is at reducing that big number (total threats) to a 
smaller number (total detection rules).  37:1 is a big ratio - is 
that likely, though?  Would you happen to know the ratio currently 
achieved by AV software?
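To put rough numbers on that question (a back-of-the-envelope of my 
own, using the 5 million samples and the 37:1 ratio mentioned in 
this thread, not real AV data):

```python
# Illustrative only: if samples grow fast but many samples collapse
# into one detection rule, the rule database grows much more slowly.
samples = 5_000_000   # hypothetical yearly sample count from the thread
ratio = 37            # samples per detection rule (the 37:1 example)
rules = samples // ratio
print(rules)          # -> 135135 new rules, not 5 million
```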


Stuart Udall
stuart net -

 * Origin: lsi: revolution through evolution (192:168/0.2)

Full-Disclosure - We believe in it.
Hosted and sponsored by Secunia -
