path: root/arch/x86/crypto/aegis128-aesni-asm.S
author    Linus Torvalds <torvalds@linux-foundation.org>  2018-08-13 22:35:26 +0200
committer Linus Torvalds <torvalds@linux-foundation.org>  2018-08-13 22:35:26 +0200
commit   f24d6f2654d39355cdf8285e21409ed8d56d4284 (patch)
tree     e6d2c683e61f30147bf73eba8d9fbf2c05865f03 /arch/x86/crypto/aegis128-aesni-asm.S
parent   Merge branch 'x86-boot-for-linus' of git://git.kernel.org/pub/scm/linux/kerne... (diff)
parent   x86/entry/64: Add two more instruction suffixes (diff)
download linux-f24d6f2654d39355cdf8285e21409ed8d56d4284.tar.xz
download linux-f24d6f2654d39355cdf8285e21409ed8d56d4284.zip
Merge branch 'x86-asm-for-linus' of git://git.kernel.org/pub/scm/linux/kernel/git/tip/tip
Pull x86 asm updates from Thomas Gleixner:
 "The lowlevel and ASM code updates for x86:

  - Make stack trace unwinding more reliable

  - ASM instruction updates for better code generation

  - Various cleanups"

* 'x86-asm-for-linus' of git://git.kernel.org/pub/scm/linux/kernel/git/tip/tip:
  x86/entry/64: Add two more instruction suffixes
  x86/asm/64: Use 32-bit XOR to zero registers
  x86/build/vdso: Simplify 'cmd_vdso2c'
  x86/build/vdso: Remove unused vdso-syms.lds
  x86/stacktrace: Enable HAVE_RELIABLE_STACKTRACE for the ORC unwinder
  x86/unwind/orc: Detect the end of the stack
  x86/stacktrace: Do not fail for ORC with regs on stack
  x86/stacktrace: Clarify the reliable success paths
  x86/stacktrace: Remove STACKTRACE_DUMP_ONCE
  x86/stacktrace: Do not unwind after user regs
  x86/asm: Use CC_SET/CC_OUT in percpu_cmpxchg8b_double() to micro-optimize code generation
Diffstat (limited to 'arch/x86/crypto/aegis128-aesni-asm.S')
-rw-r--r--  arch/x86/crypto/aegis128-aesni-asm.S | 2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/arch/x86/crypto/aegis128-aesni-asm.S b/arch/x86/crypto/aegis128-aesni-asm.S
index 717bf0776421..5f7e43d4f64a 100644
--- a/arch/x86/crypto/aegis128-aesni-asm.S
+++ b/arch/x86/crypto/aegis128-aesni-asm.S
@@ -75,7 +75,7 @@
* %r9
*/
__load_partial:
- xor %r9, %r9
+ xor %r9d, %r9d
pxor MSG, MSG
mov LEN, %r8
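
The one-line change above applies the "x86/asm/64: Use 32-bit XOR to zero registers" patch from this pull to __load_partial: on x86-64, writing a 32-bit register zero-extends into the full 64-bit register, so xor %r9d, %r9d clears %r9 exactly like xor %r9, %r9 does. The 32-bit form is the canonical zeroing idiom (the 64-bit form is reportedly not recognized as such by some CPU front ends, and for the non-REX registers it also needs an extra REX.W prefix byte). Below is a minimal standalone sketch comparing the two forms; the file and label names are invented for illustration, it is not part of the kernel source, and it assumes GNU as with AT&T syntax:

# zero-idiom.S -- illustrative only, not from the patch; assemble with: as zero-idiom.S
	.text
	.globl	zero_r9_both_ways
zero_r9_both_ways:
	xor	%r9, %r9	# 64-bit form (old code): sets the REX.W bit and is
				# reportedly not treated as a zeroing idiom by all CPUs
	xor	%r9d, %r9d	# 32-bit form (new code): the write to %r9d zero-extends
				# into all 64 bits of %r9, so the result is identical,
				# and this is the recognized zeroing idiom
	ret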