Message-ID: <alpine.DEB.2.11.1502121402370.17286@debian>
Date: Thu, 12 Feb 2015 14:49:49 +0100 (CET)
From: Stefan.Lucks@...-weimar.de
To: "discussions@...sword-hashing.net" <discussions@...sword-hashing.net>
Subject: Re: [PHC] PHC status report

On Thu, 12 Feb 2015, Donghoon Chang wrote:

> Let me try to explain how NIST chose 14 candidates from 51 in scientific ways. (At that time, I was
> not involved in NIST.)  But, it is clear why 14 was chosen based on scientific facts, not based on
> any kind of elegance concept.

There were also security issues with some of the 14 candidates that were 
promoted into the 2nd round. NIST decided which security issues they 
considered serious, and which they didn't. Which is perfectly OK; NIST 
had to make that call.

CubeHash makes a nice example. The 1st-round submission of CubeHash was 
both absurdly slow and broken by a preimage attack. It was human judgement 
to keep CubeHash in, not neutral facts. The slowness came from an 
extremely conservative choice of the security parameters, which could be 
fixed. The preimage attack exploited the narrow pipe, and there appeared 
to be no plausible way to push that attack further.

To keep CubeHash in the game, in spite of a clear violation of the 
original security requirements, NIST even tweaked those requirements by 
re-interpreting them: the required resistance of approximately 2^512 
turned into "more than 2^480".

This is no complaint; CubeHash was a more interesting design than most of 
the candidates that were kicked out. NIST really did the right thing, 
IMHO. But claiming the SHA-3 process was purely based on scientific 
facts is plain wrong.

> Can you imagine that we would have used SHA-2 without any worry if the 
> compression function of SHA-2 were not collision resistant?

Doesn't it bother you that the compression function of Keccak/SHA-3 is not 
collision resistant?

NIST decided, for some class of candidates (including Keccak, the 
eventual winner), to ignore "pseudo-collisions" (collisions of the 
compression function), while the same kind of attack on other candidates 
was an immediate killer. This choice was based on judgement, not on 
scientifically measurable facts.
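
To make this concrete: below is a minimal sketch, in Python, of why the 
absorb step of a sponge (the part that plays the role of a compression 
function) is trivially not collision resistant. The permutation f, the 
state width, and all the names here are my own toy stand-ins, not actual 
Keccak code; the argument only needs the fact that the message block is 
XORed into the chaining state before a fixed public permutation.

import os

STATE_BYTES = 32  # toy width for the demo; Keccak-f[1600] has a 200-byte state

def f(state: bytes) -> bytes:
    # Stand-in for the public permutation.  Any fixed function will do
    # for this argument -- we never need to invert it or look inside it.
    return bytes((167 * b + 13) % 256 for b in state)

def absorb(state: bytes, block: bytes) -> bytes:
    # One sponge absorb step, s' = f(s XOR m): this is what plays the
    # role of a "compression function" in the sponge construction.
    return f(bytes(a ^ b for a, b in zip(state, block)))

# Choose any (s1, m1) and any second block m2, then solve for s2 such
# that s1 XOR m1 == s2 XOR m2.  Both inputs to f are identical, so the
# outputs collide with certainty -- a pseudo-collision found in O(1).
s1 = os.urandom(STATE_BYTES)
m1 = os.urandom(STATE_BYTES)
m2 = os.urandom(STATE_BYTES)
s2 = bytes(a ^ b ^ c for a, b, c in zip(s1, m1, m2))

assert (s1, m1) != (s2, m2)              # distinct inputs (unless m1 == m2)
assert absorb(s1, m1) == absorb(s2, m2)  # same output: a free collision

Of course, in the full hash an attacker never gets to choose the inner 
state, which is why one can reasonably argue that such pseudo-collisions 
are harmless here. But judged by the letter of a compression-function 
collision-resistance criterion, the sponge fails it; discounting that 
is exactly the judgement call NIST made.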

> In case of the remaining 14 candidates, there was no any security issue 
> on their underlying primitives and performance-wise there was no 
> extremely slowness.

Except for CubeHash, on both counts (security and performance). 
Interestingly, CubeHash is actually a very elegant and simple design ...

> Therefore, I don't think that the PHC panel's 9-candidate selecting 
> process is similar to the 14 2-round candidates selection procedure of 
> the SHA-3 project.

I disagree. Moreover, more than half of the 1st-round candidates for 
SHA-3 had been broken. That made it a much easier choice than in the case 
of the PHC.


------  I  love  the  taste  of  Cryptanalysis  in  the morning!  ------
uni-weimar.de/de/medien/professuren/mediensicherheit/people/stefan-lucks
--Stefan.Lucks (at) uni-weimar.de, Bauhaus-Universität Weimar, Germany--
