From: Masahiro Yamada <masahiroy@kernel.org>
stable inclusion
from stable-v5.10.223
commit c85e6b7d9ef83eadd24e0cb793f59167d7edbb3a
category: bugfix
bugzilla: https://gitee.com/openeuler/kernel/issues/IAN3RF
Reference: https://git.kernel.org/pub/scm/linux/kernel/git/stable/linux.git/commit/?id=...
--------------------------------
commit 24d3ba0a7b44c1617c27f5045eecc4f34752ab03 upstream.
The 32-bit ARM kernel stops working if the kernel grows to the point where veneers for __get_user_* are created.
AAPCS32 [1] states, "Register r12 (IP) may be used by a linker as a scratch register between a routine and any subroutine it calls. It can also be used within a routine to hold intermediate values between subroutine calls."
However, a bl instruction buried within inline asm is opaque to the compiler, which therefore has no way of knowing that ip may be clobbered across the call; hence, "ip" must be added to the clobber list.
This becomes critical when veneers for __get_user_* are created, because veneers have used the ip register since commit 02e541db0540 ("ARM: 8323/1: force linker to use PIC veneers").
[1]: https://github.com/ARM-software/abi-aa/blob/2023Q1/aapcs32/aapcs32.rst
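
As an illustration (not part of the patch; the helper name __out_of_line_helper and the wrapper are made up), here is a minimal sketch of the pattern being fixed: a call hidden inside inline asm has to declare "ip" as clobbered itself, because the compiler only knows what the clobber list tells it, and the linker may route the bl through a veneer that uses ip as scratch:

  /*
   * Hypothetical example, not kernel code: the compiler does not parse
   * the asm string, so it relies entirely on the declared clobbers.
   * If __out_of_line_helper lands out of bl range, the linker inserts
   * a PIC veneer that corrupts ip (r12), so ip must be listed here.
   */
  static inline int call_out_of_line(void *ptr)
  {
  	register void *r0 asm("r0") = ptr;
  	register int ret asm("r0");

  	asm volatile(
  		"bl	__out_of_line_helper"
  		: "=r" (ret)
  		: "0" (r0)
  		: "ip", "lr", "cc");

  	return ret;
  }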
Signed-off-by: Masahiro Yamada <masahiroy@kernel.org>
Reviewed-by: Ard Biesheuvel <ardb@kernel.org>
Signed-off-by: Russell King (Oracle) <rmk+kernel@armlinux.org.uk>
Cc: John Stultz <jstultz@google.com>
Signed-off-by: Greg Kroah-Hartman <gregkh@linuxfoundation.org>
Signed-off-by: Yuntao Liu <liuyuntao12@huawei.com>
---
 arch/arm/include/asm/uaccess.h | 14 ++------------
 1 file changed, 2 insertions(+), 12 deletions(-)
diff --git a/arch/arm/include/asm/uaccess.h b/arch/arm/include/asm/uaccess.h
index 42aa4a22803c..2dc534c373e7 100644
--- a/arch/arm/include/asm/uaccess.h
+++ b/arch/arm/include/asm/uaccess.h
@@ -145,16 +145,6 @@ extern int __get_user_64t_1(void *);
 extern int __get_user_64t_2(void *);
 extern int __get_user_64t_4(void *);
 
-#define __GUP_CLOBBER_1 "lr", "cc"
-#ifdef CONFIG_CPU_USE_DOMAINS
-#define __GUP_CLOBBER_2 "ip", "lr", "cc"
-#else
-#define __GUP_CLOBBER_2 "lr", "cc"
-#endif
-#define __GUP_CLOBBER_4 "lr", "cc"
-#define __GUP_CLOBBER_32t_8 "lr", "cc"
-#define __GUP_CLOBBER_8 "lr", "cc"
-
 #define __get_user_x(__r2, __p, __e, __l, __s)				\
 	   __asm__ __volatile__ (					\
 		__asmeq("%0", "r0") __asmeq("%1", "r2")			\
@@ -162,7 +152,7 @@ extern int __get_user_64t_4(void *);
 		"bl	__get_user_" #__s				\
 		: "=&r" (__e), "=r" (__r2)				\
 		: "0" (__p), "r" (__l)					\
-		: __GUP_CLOBBER_##__s)
+		: "ip", "lr", "cc")
 
 /* narrowing a double-word get into a single 32bit word register: */
 #ifdef __ARMEB__
@@ -184,7 +174,7 @@ extern int __get_user_64t_4(void *);
 		"bl	__get_user_64t_" #__s				\
 		: "=&r" (__e), "=r" (__r2)				\
 		: "0" (__p), "r" (__l)					\
-		: __GUP_CLOBBER_##__s)
+		: "ip", "lr", "cc")
 #else
 #define __get_user_x_64t __get_user_x
 #endif