Message-ID: <4AA14DF8.30901@gmail.com>
Date: Fri, 04 Sep 2009 12:27:20 -0500
From: Rohit Patnaik <quanticle@...il.com>
To: full-disclosure@...ts.grok.org.uk
Subject: Re: windows future

And that's also ignoring the fact that you don't have to scan for things 
that you know you're not exposed/vulnerable to. For example, I don't 
take precautions against Feline Immunodeficiency Virus, because I know 
it can't infect humans. I also don't take precautions against Ebola or 
Smallpox because the chance I'd be exposed to them is vanishingly small.

In the same way, I don't worry about IIS threats - I'm not running an 
IIS server. I'm not worried about threats to Outlook - it's not my mail 
client.  I don't worry about boot sector viruses from the late 80s/early 
90s - they're far too rare to spend time on.  Likewise, I don't care 
about threats against which I've already applied vendor patches or 
service packs.  The total number of threats may be growing 
exponentially, but once you factor in the growing immunity of my 
computer system to said threats, the number of outstanding threats 
(things to which I'm not immune and which are capable of infecting my 
machine) drops to a much more manageable level.
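
As a back-of-the-envelope illustration (hypothetical threat names, not a
real threat feed), the "outstanding" set is just what's left after
subtracting the threats you aren't exposed to and the ones you've already
patched:

  # Illustrative sketch only: outstanding = everything, minus what can't
  # reach you, minus what you've already patched against.
  all_threats     = {"iis_overflow", "outlook_worm", "boot_sector_virus",
                     "old_cve_exploit", "fresh_browser_exploit"}
  not_exposed     = {"iis_overflow", "outlook_worm", "boot_sector_virus"}
  already_patched = {"old_cve_exploit"}

  outstanding = all_threats - not_exposed - already_patched
  print(outstanding)   # {'fresh_browser_exploit'} - the manageable remainder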

--Rohit Patnaik

Valdis.Kletnieks@...edu wrote:
> On Fri, 04 Sep 2009 15:46:19 BST, lsi said:
>
>   
>> - approximate date when number of NEW threats reached 1 Million: 2008
>>
>> - approximate date when number of NEW threats will reach 1 Billion: 2015
>>
>> - approximate date when number of NEW threats will reach 2 Billion: 2016
>>     
>
> This is assuming an exponential growth model, when there's no realistic
> reason to believe it to be so.  There are however good reasons to expect
> that the correct model is the "logistic curve" (slow growth at first,
> a steep middle section, then flattening out asymptotic to a horizontal line).
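>
> For illustration only (made-up parameters, not fitted to any real
> malware counts), a minimal sketch of how the two models diverge:
>
>   import math
>
>   def exponential(t, n0=1e6, r=0.7):
>       # unbounded: multiplies by the same factor every year
>       return n0 * math.exp(r * t)
>
>   def logistic(t, n0=1e6, r=0.7, k=5e7):
>       # same early growth, but flattens out toward the ceiling k
>       return k / (1 + (k / n0 - 1) * math.exp(-r * t))
>
>   for year in range(9):   # year 0 is roughly the 2008 point above
>       print(year, f"{exponential(year):.3g}", f"{logistic(year):.3g}")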
>
> For starters, new threats have to come from *somewhere*, and there's only
> a limited supply of dark-side code hackers, and a limited supply of people
> worth fleecing (sure, OLPC may distribute 100M laptops - but those are going to
> people who can't be monetized easily).  Where will the 1 billion
> new threats in the 2015-16 span come from? Who will create these, and who will
> make money from them?  At what point will some of the marginal players leave
> the game and find other avenues of making money?  Remember - if the threat
> pool is 100,000, and you have 1,000 threats, you have 1% of the market, and
> can probably live well off that 1% if monetized.  But if you have 1,000 threats
> in a pool of a billion, you're a marginal player and not likely to get rich
> fast doing that.
>
>   
>> - charts showing this: 
>> http://www.cyberdelix.net/files/malware_mutation_projection.pdf
>>
>> - will the AV companies be able to classify 1 billion new threats per 
>> year? that is 2.739 MILLION new threats per DAY (over 1900 new 
>> threats per minute).
>>
>> - will your computer cope with scanning every EXE, DLL, PIF etc 1 
>> billion times, every time you use them?
>>     
>
> You don't have to scan it a billion times. You need to scan it *once* for
> one billion attacks.  And proper pattern-matching should help a lot here - quite
> often, you'll have 2,934 exploit codes in the wild, all using the same attack
> code lifted from Metasploit or milw0rm or whatever.  So only one check is
> needed.  A bigger danger here is if we start seeing *single* threats that
> include a really good real-time polymorphism/obfuscator - *that* could really
> suck.
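>
> A rough sketch of that point (hypothetical signatures, nothing like a
> real AV engine): thousands of "distinct" threats that share the same
> lifted attack code collapse to a much smaller signature set, and the
> file only has to be read once to check all of them:
>
>   threat_db = {
>       "exploit_0001": b"\x90\x90\xeb\x1f",   # same shellcode stub...
>       "exploit_0002": b"\x90\x90\xeb\x1f",   # ...reused by another "threat"
>       "exploit_0003": b"\xcc\xcc\x41\x41",
>   }
>   signatures = set(threat_db.values())       # 3 threats, 2 unique patterns
>
>   def scan(path):
>       data = open(path, "rb").read()         # read the file once
>       # a real engine would use a multi-pattern matcher (e.g. Aho-Corasick)
>       # to test every signature in a single pass over the bytes
>       return [sig for sig in signatures if sig in data]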
>
>   
>> - aside from the theoretical limits imposed by hardware and software, 
>> there is one extra limit, imposed by users.  Users will not tolerate 
>> machines operating slowly, and will seek alternative platforms well 
>> before 100% CPU utilisation (either as a direct result of the size of 
>> the blacklist, or indirectly caused by swapping due to low RAM).  
>> This user limit might be lower than 20% CPU utilisation.  If users 
>> figure out that 20% of their time is being wasted, and rising fast, 
>> they will run for the exit.
>>     
>
> Interesting statistic - year before last, around 10% of all new computer
> purchases were replacements for malware-infested boxes.  Just buying a new
> one was easier/cheaper than trying to fix the old one for a lot of people.
>
> Second interesting statistic - the vast majority of that 10% ended up using
> the exact same operating system.
>
> So even when it's well past the 20% mark and the box is basically unusable,
> they *still* don't run for the exit.
>   

_______________________________________________
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/
