
Commit 8540571e authored by Christophe Leroy, committed by Michael Ellerman

powerpc/32: Fix again csum_partial_copy_generic()



Commit 7aef4136 ("powerpc32: rewrite csum_partial_copy_generic()
based on copy_tofrom_user()") introduced a bug when the destination
address is odd and len is smaller than the cache line size.

In that case the resulting csum value must not be rotated by one byte,
because the cache-aligned copy part is skipped, so no alignment is
performed.

Fixes: 7aef4136 ("powerpc32: rewrite csum_partial_copy_generic() based on copy_tofrom_user()")
Cc: stable@vger.kernel.org # v4.6+
Reported-by: Alessio Igor Bogani <alessio.bogani@elettra.eu>
Signed-off-by: Christophe Leroy <christophe.leroy@c-s.fr>
Tested-by: Alessio Igor Bogani <alessio.bogani@elettra.eu>
Signed-off-by: Michael Ellerman <mpe@ellerman.id.au>
parent caa58f80
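
Background for the fix (an illustration, not part of the commit): the one-byte
rotation in this routine relies on a standard property of the ones'-complement
(Internet) checksum, per RFC 1071: a sum accumulated with its byte lanes
shifted by one equals the byte-swap of the true sum, so a shift introduced by
re-aligning on an odd destination can be undone by rotating the result. A
minimal C sketch of that property follows; the helper names (fold16,
csum_even_lanes, csum_odd_lanes, rot8) are made up for illustration and this is
not the kernel code.

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Fold a 32-bit accumulator into a 16-bit ones'-complement sum. */
static uint16_t fold16(uint32_t sum)
{
	while (sum >> 16)
		sum = (sum & 0xffff) + (sum >> 16);
	return (uint16_t)sum;
}

/* 16-bit ones'-complement sum, bytes paired as the data lies in memory. */
static uint16_t csum_even_lanes(const uint8_t *p, size_t len)
{
	uint32_t sum = 0;
	for (size_t i = 0; i + 1 < len; i += 2)
		sum += (uint32_t)((p[i] << 8) | p[i + 1]);
	return fold16(sum);
}

/* Same data, accumulated with the byte lanes shifted by one, as happens
 * when the copy loop re-aligns itself on an odd destination address. */
static uint16_t csum_odd_lanes(const uint8_t *p, size_t len)
{
	uint32_t sum = 0;
	for (size_t i = 0; i + 1 < len; i += 2)
		sum += (uint32_t)((p[i + 1] << 8) | p[i]);
	return fold16(sum);
}

/* Rotate a 16-bit value by 8 bits: the fix-up applied when the destination
 * was odd and the lanes really were shifted. */
static uint16_t rot8(uint16_t v)
{
	return (uint16_t)((v << 8) | (v >> 8));
}

int main(void)
{
	uint8_t buf[] = { 0x12, 0x34, 0x56, 0x78, 0x9a, 0xbc };

	uint16_t even = csum_even_lanes(buf, sizeof(buf));
	uint16_t odd  = csum_odd_lanes(buf, sizeof(buf));

	printf("even lanes: %04x\n", even);                        /* 0369 */
	printf("odd lanes : %04x -> rot8 %04x\n", odd, rot8(odd)); /* 6903 -> 0369 */

	/* If the lanes were NOT shifted (short copy, aligned path skipped),
	 * rotating anyway corrupts the checksum: */
	printf("spurious rotation: %04x\n", rot8(even));           /* 6903 */
	return 0;
}

The "spurious rotation" in the last printf is the situation this commit fixes:
an odd destination combined with a length too short for the cache-aligned copy.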
+4 −3
@@ -127,18 +127,19 @@ _GLOBAL(csum_partial_copy_generic)
 	stw	r7,12(r1)
 	stw	r8,8(r1)
 
-	rlwinm	r0,r4,3,0x8
-	rlwnm	r6,r6,r0,0,31	/* odd destination address: rotate one byte */
-	cmplwi	cr7,r0,0	/* is destination address even ? */
 	addic	r12,r6,0
 	addi	r6,r4,-4
 	neg	r0,r4
 	addi	r4,r3,-4
 	andi.	r0,r0,CACHELINE_MASK	/* # bytes to start of cache line */
+	crset	4*cr7+eq
 	beq	58f
 
 	cmplw	0,r5,r0			/* is this more than total to do? */
 	blt	63f			/* if not much to do */
+	rlwinm	r7,r6,3,0x8
+	rlwnm	r12,r12,r7,0,31	/* odd destination address: rotate one byte */
+	cmplwi	cr7,r7,0	/* is destination address even ? */
 	andi.	r8,r0,3			/* get it word-aligned first */
 	mtctr	r8
 	beq+	61f
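
Restating the hunk as an illustration rather than the kernel code: the removed
lines decided the rotation up front from the destination parity alone, while
the new sequence defaults to "no rotation" (crset 4*cr7+eq) and only requests
it once the cache-aligned copy path is actually entered, i.e. when neither
"beq 58f" nor "blt 63f" is taken. A hypothetical C model of that decision,
assuming a 32-byte cache line; the function names are made up.

#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

#define CACHELINE 32	/* assumed here; the real value depends on the CPU */

/* Old decision: rotation derived from the destination parity alone, even
 * when the cache-aligned copy ends up being skipped. */
static bool old_needs_rotate(uintptr_t dst, size_t len)
{
	(void)len;		/* len never consulted for the rotation: the bug */
	return (dst & 1) != 0;
}

/* New decision: default to "no rotation" and only request it when the
 * cache-aligned path is actually taken. */
static bool new_needs_rotate(uintptr_t dst, size_t len)
{
	uintptr_t to_align = (-dst) & (CACHELINE - 1);	/* andi. r0,r0,CACHELINE_MASK */
	bool rotate = false;				/* crset 4*cr7+eq */

	if (to_align != 0 && len >= to_align)		/* neither beq 58f nor blt 63f */
		rotate = (dst & 1) != 0;		/* rlwinm r7,r6,3,0x8 ; cmplwi cr7,r7,0 */
	return rotate;
}

int main(void)
{
	/* Odd destination, short length: the case the commit fixes. */
	uintptr_t dst = 0x1001;
	size_t len = 10;

	printf("old: rotate=%d, new: rotate=%d\n",
	       old_needs_rotate(dst, len), new_needs_rotate(dst, len));
	return 0;
}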