

Replying to dsteffen@…:

 +#define dispatch_atomic_barrier() \
 +    asm volatile("" : : : "memory")

This is only a compiler barrier; on ARM or PPC it won't stop the CPU itself from reordering memory operations. A similar function in the Haskell (GHC) runtime source[1], reproduced below, handles this per architecture:

 /*
  * We need to tell both the compiler AND the CPU about the barriers.
  * It's no good preventing the CPU from reordering the operations if
  * the compiler has already done so - hence the "memory" restriction
  * on each of the barriers below.
  */
 EXTERN_INLINE void
 write_barrier(void) {
 #if i386_HOST_ARCH || x86_64_HOST_ARCH
     asm volatile ("" : : : "memory");
 #elif powerpc_HOST_ARCH
     asm volatile ("lwsync" : : : "memory");
 #elif sparc_HOST_ARCH
     /* Sparc in TSO mode does not require store/store barriers. */
     asm volatile ("" : : : "memory");
 #elif arm_HOST_ARCH && defined(arm_HOST_ARCH_PRE_ARMv7)
     asm volatile ("" : : : "memory");
 #elif arm_HOST_ARCH && !defined(arm_HOST_ARCH_PRE_ARMv7)
     asm volatile ("dmb st" : : : "memory");
 #elif !defined(WITHSMP)
     return;
 #else
 #error memory barriers unimplemented on this architecture
 #endif
 }

[1] Source:

 http://hackage.haskell.org/trac/ghc/browser/includes/stg/SMP.h

