Message-Id: <20230416172158.13133-1-david.keisarschm@mail.huji.ac.il>
Date: Sun, 16 Apr 2023 20:21:58 +0300
From: david.keisarschm@...l.huji.ac.il
To: linux-kernel@...r.kernel.org
Cc: Jason@...c4.com, linux-mm@...ck.org, akpm@...ux-foundation.org,
vbabka@...e.cz, 42.hyeyoo@...il.com, mingo@...hat.com,
hpa@...or.com, keescook@...omium.org,
David Keisar Schmidt <david.keisarschm@...l.huji.ac.il>,
ilay.bahat1@...il.com, aksecurity@...il.com
Subject: [PATCH v6 0/3] Replace invocations of prandom_u32() with get_random_u32() and siphash
From: David Keisar Schmidt <david.keisarschm@...l.huji.ac.il>
Hi,

The security improvements made to prandom_u32 in commits c51f8f88d705
(October 2020) and d4150779e60f (May 2022) do not cover the code paths
that use prandom_bytes_state() and prandom_u32_state().
Specifically, this weak randomization takes place in three cases:
1. mm/slab.c
2. mm/slab_common.c
3. arch/x86/mm/kaslr.c
The first two invocations (mm/slab.c, mm/slab_common.c) randomize the
slab allocator freelists. This is done to make sure attackers can't
obtain information on the heap state; a sketch of the change follows.
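
For illustration only, here is a minimal sketch (not the patch itself) of
what a freelist shuffle looks like once the old prandom_u32_state()-based
helper is replaced with get_random_u32_below(); the function name and the
explicit swap below are illustrative, not the actual mm/slab_common.c code:

#include <linux/random.h>

/*
 * Sketch: Fisher-Yates shuffle of a freelist index array, drawing the
 * random index from get_random_u32_below() instead of a seeded
 * prandom_u32_state().
 */
static void freelist_shuffle_sketch(unsigned int *list, unsigned int count)
{
        unsigned int i, j, tmp;

        for (i = 0; i < count; i++)
                list[i] = i;

        /*
         * Walk the array backwards, swapping each slot with a uniformly
         * chosen slot at or before it.
         */
        for (i = count - 1; i > 0; i--) {
                j = get_random_u32_below(i + 1);
                tmp = list[i];
                list[i] = list[j];
                list[j] = tmp;
        }
}
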
The last invocation, inside arch/x86/mm/kaslr.c,
randomizes the virtual address space of kernel memory regions.
Hence, we have added the necessary changes to make those randomizations
stronger, replacing the prandom_u32-based calls with get_random_u32()/
get_random_u32_below() in the slab allocator and with siphash in
arch/x86/mm/kaslr.c.
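
For the KASLR side, the idea is to derive the per-region pseudo-random
values from a SipHash keyed with fresh random data instead of from a
seeded prandom state. A rough sketch of that pattern follows; the exact
key setup and derivation in the patch may differ, and the function name
here is illustrative:

#include <linux/random.h>
#include <linux/siphash.h>

/*
 * Sketch: produce one 64-bit pseudo-random value per memory region from
 * a SipHash keyed with random bytes, replacing prandom_bytes_state() on
 * a seeded prandom state. Not race-safe; boot-time init code only runs
 * once.
 */
static u64 region_rand_sketch(unsigned int region_idx)
{
        static siphash_key_t key;
        static bool keyed;

        if (!keyed) {
                get_random_bytes(&key, sizeof(key));
                keyed = true;
        }

        /* Each region index hashes to an independent-looking value. */
        return siphash_1u64(region_idx, &key);
}
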
Changes since v5:
* Fixed coding style issues in mm/slab and mm/slab_common.
* Deleted irrelevant changes that were appended accidentally in
  arch/x86/mm/kaslr.

Changes since v4:
* Changed only the arch/x86/mm/kaslr patch. In particular, we replaced
  the use of prandom_bytes_state and prandom_seed_state with siphash
  inside arch/x86/mm/kaslr.c.

Changes since v3:
* Edited commit messages.

Changes since v2:
* Edited commit message.
* Replaced instances of get_random_u32 with get_random_u32_below in
  mm/slab.c and mm/slab_common.c.

Regards,
David Keisar Schmidt (3):
mm/slab: Replace invocation of weak PRNG
mm/slab_common: Replace invocation of weak PRNG
arch/x86/mm/kaslr: use siphash instead of prandom_bytes_state
 arch/x86/mm/kaslr.c | 21 +++++++++++++++------
 mm/slab.c           | 29 +++++++++--------------------
 mm/slab_common.c    | 11 +++--------
 3 files changed, 27 insertions(+), 34 deletions(-)
--
2.37.3