Message-ID: <CAOLP8p4ChZQ5XOEN7Y4j3-J4Yms0Np7y4FcWdx_tvt66Jt1ZVA@mail.gmail.com>
Date: Sat, 4 Jan 2014 06:31:25 -0500
From: Bill Cox <waywardgeek@...il.com>
To: discussions@...sword-hashing.net
Subject: Re: [PHC] Weakness in keystretch hashing, and a solution (I think)
On Sat, Jan 4, 2014 at 5:45 AM, Bill Cox <waywardgeek@...il.com> wrote:
> So, what if he selects pages cleverly? Can he do better? For example,
> keep the first half, and none of the second, since these are accessed more.
> In that case, he has to generate no pages for 25% of the second-half pages
> generated, but he won't have the previous page of that page when he does,
> so the penalty is having to generate both. So he has to generate on
> average 0.5 pages for a first-level miss, and I think 0.25 pages for
> second-level misses, and so on, again converging to on average one extra
> page, so he's again stuck with the same time*area.
>
>
I made a mistake here. If only the first 50% of pages are kept, then
generating new pages becomes far more expensive, because each page is
generated from the previous one: the first page you need in the second half
of memory requires regenerating every discarded page before it. Also, if
you spread the kept pages uniformly through memory but kept fewer than
half, you'd get a diverging series that adds up to much more than the
one-page average penalty we saw when keeping half the pages. This is quite
a bit better behavior than scrypt.
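The cost asymmetry can be sketched with a toy model (the page count and
both storage strategies here are illustrative assumptions, not the exact
scheme from the thread): if page i can only be regenerated from page i-1,
a missing page forces regeneration back to the nearest stored page, so
keeping the first half is far worse than keeping every other page.

```python
def access_cost(i, stored):
    """Number of pages an attacker must regenerate to read page i.

    Toy model (an assumption for illustration): page i is generated
    from page i-1, so a missing page forces regeneration of every
    page back to the nearest stored one.  Page 0 is always stored.
    """
    cost = 0
    while i not in stored:
        cost += 1
        i -= 1
    return cost

N = 1024  # illustrative page count
first_half = set(range(N // 2))    # keep pages 0 .. N/2-1, drop the rest
interleaved = set(range(0, N, 2))  # keep every other page (same storage)

# Average regeneration cost over one uniform access to each page:
avg_inter = sum(access_cost(i, interleaved) for i in range(N)) / N  # -> 0.5
avg_first = sum(access_cost(i, first_half) for i in range(N)) / N   # -> 128.25

print(avg_inter, avg_first)
```

Both strategies store half the pages, yet the interleaved attacker pays
half a page per access on average while the keep-first-half attacker pays
on the order of N/8 pages, because second-half misses cascade all the way
back to the stored region.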