Message-ID: <00b201d0165a$857ec5d0$907c5170$@acm.org>
Date: Fri, 12 Dec 2014 14:25:22 -0800
From: "Dennis E. Hamilton" <dennis.hamilton@....org>
To: <discussions@...sword-hashing.net>
Subject: RE: [PHC] How important is salting really?

 --Responding to--
From: epixoip [mailto:epixoip@...dshell.nl] 
Sent: Friday, December 12, 2014 13:10
To: discussions@...sword-hashing.net
Subject: Re: [PHC] How important is salting really?

[ ... ]

It's not just about indexing by salt, though. You still have to maintain
a list of salts to hash each plaintext candidate with, and remove salts
from said list when a salt is eliminated. Regardless of how you do it,
it is the number of salts, not the number of hashes, that slows down a
cracking job. Unless you are working with very large lists on AMD GPUs,
but that's a whole other can of worms.
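
<orcnote>
   A rough sketch, for illustration only, of the per-salt bookkeeping
   described above; the sha256(salt || password) construction and all
   names here are stand-ins, not any particular cracker's code:

      import hashlib

      def crack(candidates, targets):
          # "targets" maps salt (bytes) -> set of not-yet-recovered digests.
          # Each plaintext candidate is hashed once per remaining salt, so
          # the work grows with the number of distinct salts, not hashes.
          recovered = {}
          for word in candidates:
              for salt in list(targets):           # copy keys; entries are deleted below
                  digest = hashlib.sha256(salt + word.encode()).digest()
                  if digest in targets[salt]:
                      recovered[(salt, digest)] = word
                      targets[salt].discard(digest)
                      if not targets[salt]:        # every hash under this salt is cracked:
                          del targets[salt]        # the salt drops off the list
          return recovered
</orcnote>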

<orcnote>
   I believe there is no need for the defender to maintain an index of
   used salts (although they are stored with the hash values somewhere).
   Whenever a new salt is generated, one can use a counter or any other
   systematically unique segment, combined with a random portion.  The
   focus then shifts to the mechanism by which the systematic portion is
   prevented from ever producing a duplicate and from running out within
   any foreseeable lifetime of the system.

   In this way, the adversary has to treat each salt||hash combination as
   unique, with no opportunity to exploit duplications.
</orcnote>
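
<orcnote>
   A minimal sketch of such a salt generator, assuming an 8-byte
   big-endian counter as the systematic portion and 16 random bytes as
   the random portion (the sizes and names are illustrative only, not a
   recommendation):

      import os
      import struct

      _counter = 0   # in practice a persistent, atomically incremented value

      def new_salt(random_len=16):
          # systematic portion (counter that never repeats) || random portion
          global _counter
          _counter += 1
          return struct.pack(">Q", _counter) + os.urandom(random_len)

   With a 64-bit counter the systematic portion cannot repeat or run out
   for any foreseeable lifetime, so every stored salt||hash pair stays
   distinct even if the random portions were to collide.
</orcnote>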
