Re: [PATCH V7 9/9] MIPS: Loongson: Introduce and use LOONGSON_LLSC_WAR

On 24 June 2017 09:55:14 BST, Huacai Chen <chenhc@xxxxxxxxxx> wrote:
>Hi, James,
>
>smp_mb__before_llsc() cannot be used in all cases, e.g., in
>arch/mips/include/asm/spinlock.h and other similar cases which have a
>label before ll/lld. So I think it is better to keep it as is, for
>consistency.

I know. I didn't mean using smp_mb__before_llsc() directly; I just meant using something similar directly before the ll, something that would expand to nothing on non-Loongson kernels and still avoid the mass duplication of inline asm, which leads to divergence, bitrot, and maintenance problems.
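Roughly like this -- a sketch only, and the macro name (loongson_llsc_mb(), say) and where it lives are invented here, not taken from your patch:

	/* asm/barrier.h (or war.h); expands to nothing elsewhere */
	#if LOONGSON_LLSC_WAR
	# define loongson_llsc_mb()	\
		__asm__ __volatile__(__WEAK_LLSC_MB : : : "memory")
	#else
	# define loongson_llsc_mb()	do { } while (0)
	#endif

	/* then the common kernel_uses_llsc branch stays the only copy */
	} else if (kernel_uses_llsc) {					\
		int temp;						\
									\
		do {							\
			loongson_llsc_mb();				\
			__asm__ __volatile__(				\
			"	.set	"MIPS_ISA_LEVEL"	\n"	\
			"	ll	%0, %1	# atomic_" #op "\n"	\
			"	" #asm_op " %0, %2		\n"	\
			"	sc	%0, %1			\n"	\
			"	.set	mips0			\n"	\
			: "=&r" (temp), "+" GCC_OFF_SMALL_ASM() (v->counter) \
			: "Ir" (i));					\
		} while (unlikely(!temp));				\

The cases with a branch target right before the ll (spinlock.h and friends) would still need the barrier inside the asm, as you say, but everything with a C-level retry loop could then share one copy of the asm.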

cheers
James

>
>Huacai
>
>On Fri, Jun 23, 2017 at 10:54 PM, James Hogan <james.hogan@xxxxxxxxxx>
>wrote:
>> On Thu, Jun 22, 2017 at 11:06:56PM +0800, Huacai Chen wrote:
>>> diff --git a/arch/mips/include/asm/atomic.h b/arch/mips/include/asm/atomic.h
>>> index 0ab176b..e0002c58 100644
>>> --- a/arch/mips/include/asm/atomic.h
>>> +++ b/arch/mips/include/asm/atomic.h
>>> @@ -56,6 +56,22 @@ static __inline__ void atomic_##op(int i, atomic_t * v)	\
>>>  		"	.set	mips0				\n"	\
>>>  		: "=&r" (temp), "+" GCC_OFF_SMALL_ASM() (v->counter)	\
>>>  		: "Ir" (i));						\
>>> +	} else if (kernel_uses_llsc && LOONGSON_LLSC_WAR) {		\
>>> +		int temp;						\
>>> +									\
>>> +		do {							\
>>> +			__asm__ __volatile__(				\
>>> +			"	.set	"MIPS_ISA_LEVEL"	\n"	\
>>> +			__WEAK_LLSC_MB					\
>>> +			"	ll	%0, %1	# atomic_" #op "\n"	\
>>> +			"	" #asm_op " %0, %2		\n"	\
>>> +			"	sc	%0, %1			\n"	\
>>> +			"	.set	mips0			\n"	\
>>> +			: "=&r" (temp), "+" GCC_OFF_SMALL_ASM() (v->counter)	\
>>> +			: "Ir" (i));					\
>>> +		} while (unlikely(!temp));				\
>>
>> Can Loongson use the common versions of all these bits of assembly by
>> adding a LOONGSON_LLSC_WAR-dependent smp_mb__before_llsc()-like macro
>> before the asm?
>>
>> It would save a lot of duplication, avoid potential bitrot and
>> divergence, and make the patch much easier to review.
>>
>> Cheers
>> James


--
James Hogan



