path: root/arch/arm64
author    Andre Przywara <andre.przywara@arm.com>  2015-04-20 12:14:19 +0200
committer Will Deacon <will.deacon@arm.com>        2015-04-27 12:39:04 +0200
commit    878a84d5a8a18a4ab241d40cebb791d6aedf5605 (patch)
tree      6b1cdd047b75849c2e70ccc73643d4147500c26d /arch/arm64
parent    Linux 4.1-rc1 (diff)
arm64: add missing data types in smp_load_acquire/smp_store_release
Commit 8053871d0f7f ("smp: Fix smp_call_function_single_async() locking") introduced a call to smp_load_acquire() with a u16 argument, but so far we only handled the u32 and u64 types in that function. Fortunately this resulted in a compiler warning, pointing at an uninitialized use. Due to the implementation structure, though, the compiler misses the same bug in smp_store_release().

Add the u16 and u8 variants using ldarh/stlrh and ldarb/stlrb, respectively. Together with the compiletime_assert_atomic_type() check this should cover all cases now.

Acked-by: Will Deacon <will.deacon@arm.com>
Signed-off-by: Andre Przywara <andre.przywara@arm.com>
Signed-off-by: Will Deacon <will.deacon@arm.com>
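For context, the acquire/release pairing this patch enables for 16-bit types can be shown with a small userspace sketch. This is a hedged illustration, not kernel code: it hand-writes the same AArch64 ldarh/stlrh instructions around a hypothetical 16-bit flag (mirroring the u16 csd->flags access that exposed the gap) and builds on an AArch64 machine with "gcc -O2 -pthread sketch.c".

#include <pthread.h>
#include <stdint.h>
#include <stdio.h>

static uint16_t flag;   /* hypothetical 16-bit lock word */
static int payload;     /* data published before the flag is set */

static void *writer(void *arg)
{
	(void)arg;
	payload = 42;   /* plain store */
	/* store-release (stlrh): orders the payload store before flag = 1 */
	asm volatile("stlrh %w1, %0" : "=Q" (flag) : "r" (1) : "memory");
	return NULL;
}

int main(void)
{
	pthread_t t;
	uint16_t seen;

	pthread_create(&t, NULL, writer, NULL);

	do {    /* load-acquire (ldarh): later reads cannot be hoisted above it */
		asm volatile("ldarh %w0, %1"
			     : "=r" (seen) : "Q" (flag) : "memory");
	} while (!seen);

	printf("payload = %d\n", payload);      /* prints 42, never 0 */
	pthread_join(t, NULL);
	return 0;
}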
Diffstat (limited to 'arch/arm64')
-rw-r--r--  arch/arm64/include/asm/barrier.h | 16 ++++++++++++++++
1 file changed, 16 insertions(+), 0 deletions(-)
diff --git a/arch/arm64/include/asm/barrier.h b/arch/arm64/include/asm/barrier.h
index a5abb0062d6e..71f19c4dc0de 100644
--- a/arch/arm64/include/asm/barrier.h
+++ b/arch/arm64/include/asm/barrier.h
@@ -65,6 +65,14 @@ do { \
do { \
compiletime_assert_atomic_type(*p); \
switch (sizeof(*p)) { \
+ case 1: \
+ asm volatile ("stlrb %w1, %0" \
+ : "=Q" (*p) : "r" (v) : "memory"); \
+ break; \
+ case 2: \
+ asm volatile ("stlrh %w1, %0" \
+ : "=Q" (*p) : "r" (v) : "memory"); \
+ break; \
case 4: \
asm volatile ("stlr %w1, %0" \
: "=Q" (*p) : "r" (v) : "memory"); \
@@ -81,6 +89,14 @@ do { \
typeof(*p) ___p1; \
compiletime_assert_atomic_type(*p); \
switch (sizeof(*p)) { \
+ case 1: \
+ asm volatile ("ldarb %w0, %1" \
+ : "=r" (___p1) : "Q" (*p) : "memory"); \
+ break; \
+ case 2: \
+ asm volatile ("ldarh %w0, %1" \
+ : "=r" (___p1) : "Q" (*p) : "memory"); \
+ break; \
case 4: \
asm volatile ("ldar %w0, %1" \
: "=r" (___p1) : "Q" (*p) : "memory"); \