Re: [RFC][PATCH 02/22] x86,mmx_32: Remove .fixup usage

From: Josh Poimboeuf
Date: Thu Nov 04 2021 - 16:22:39 EST


On Thu, Nov 04, 2021 at 05:47:31PM +0100, Peter Zijlstra wrote:
> This code puts an exception table entry on the PREFETCH instruction to
> overwrite it with a jmp.d8 when it triggers an exception. Except of
> course, our code is no longer writable, and patching code at runtime
> like this doesn't work on SMP anyway.
>
> Replace it with ALTERNATIVE.
>
> XXX: arguably we should just delete this code
>
> Signed-off-by: Peter Zijlstra (Intel) <peterz@xxxxxxxxxxxxx>
> ---
> arch/x86/lib/mmx_32.c | 83 ++++++++++++++++----------------------------------
> 1 file changed, 27 insertions(+), 56 deletions(-)
>
> --- a/arch/x86/lib/mmx_32.c
> +++ b/arch/x86/lib/mmx_32.c
> @@ -50,23 +50,17 @@ void *_mmx_memcpy(void *to, const void *
>  	kernel_fpu_begin_mask(KFPU_387);
>
>  	__asm__ __volatile__ (
> - "1: prefetch (%0)\n" /* This set is 28 bytes */
> - " prefetch 64(%0)\n"
> - " prefetch 128(%0)\n"
> - " prefetch 192(%0)\n"
> - " prefetch 256(%0)\n"
> - "2: \n"
> - ".section .fixup, \"ax\"\n"
> - "3: movw $0x1AEB, 1b\n" /* jmp on 26 bytes */
> - " jmp 2b\n"
> - ".previous\n"
> - _ASM_EXTABLE(1b, 3b)
> - : : "r" (from));
> +		ALTERNATIVE("",
> +			    "prefetch (%0)\n"
> +			    "prefetch 64(%0)\n"
> +			    "prefetch 128(%0)\n"
> +			    "prefetch 192(%0)\n"
> +			    "prefetch 256(%0)\n", X86_FEATURE_3DNOW)

I think this should instead be X86_FEATURE_3DNOWPREFETCH (which isn't
3DNow-specific and should really just be called X86_FEATURE_PREFETCH
anyway).
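
Something like this ought to do it (completely untested; it is just the
same ALTERNATIVE construct from the patch with the PREFETCH feature bit
swapped in):

	__asm__ __volatile__ (
		/* prefetches are only patched in when the CPU has them */
		ALTERNATIVE("",
			    "prefetch (%0)\n"
			    "prefetch 64(%0)\n"
			    "prefetch 128(%0)\n"
			    "prefetch 192(%0)\n"
			    "prefetch 256(%0)\n", X86_FEATURE_3DNOWPREFETCH)
		: : "r" (from));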

> + : : "r" (from));
>
>  	for ( ; i > 5; i--) {
>  		__asm__ __volatile__ (
> -		"1:  prefetch 320(%0)\n"
> -		"2:  movq (%0), %%mm0\n"
> +		"  movq (%0), %%mm0\n"

Not sure why this prefetch was removed? It could also be put behind an
alternative, just like the first one.
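
E.g. roughly like this (untested sketch, reusing the feature bit
suggested above; the rest of the movq sequence stays as-is):

	for ( ; i > 5; i--) {
		__asm__ __volatile__ (
			ALTERNATIVE("", "prefetch 320(%0)\n",
				    X86_FEATURE_3DNOWPREFETCH)
			"  movq (%0), %%mm0\n"
			/* ... rest of the movq sequence unchanged ... */
			: : "r" (from), "r" (to) : "memory");

		from += 64;
		to += 64;
	}

Then CPUs without the feature bit simply never execute a prefetch,
which is the same end state the old .fixup self-patching hacked its way
to.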

--
Josh