Re: Re: Large rock problem

Hi Alex
After having problems with the large-rock branch of Squid, I moved to the other one, collapsed-fwd, and tested its latest revision, but I am still having some problems:


FATAL: Squid has attempted to read data from memory that is not present. This is an indication of of (pre-3.0) code that hasn't been updated to deal with sparse objects in memory. Squid should coredump.allowing to review the cause. Immediately preceding this message is a dump of the available data in the format [start,end). The [ means from the value, the ) means up to the value. I.e. [1,5) means that there are 4 bytes of data, at offsets 1,2,3,4.


and Squid stops after printing this error.
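
For reference, the [start,end) notation in that dump is half-open: it covers offsets start through end-1, so it describes end - start bytes. Here is a tiny standalone C++ snippet (not Squid code, just an illustration of the notation) that reproduces the [1,5) example from the message:

    // Illustrates the half-open [start,end) interval notation used in
    // the dump above: it spans offsets start..end-1, i.e. end - start bytes.
    #include <cstdint>
    #include <iostream>

    int main() {
        const int64_t start = 1, end = 5; // the [1,5) example from the error text
        std::cout << "[" << start << "," << end << ") holds "
                  << (end - start) << " bytes at offsets ";
        for (int64_t off = start; off < end; ++off)
            std::cout << off << (off + 1 < end ? "," : "\n");
        return 0;
    }

Running it prints "[1,5) holds 4 bytes at offsets 1,2,3,4", matching the example given in the FATAL message.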


Best Regards
Ayham


On 12/03/2013 06:41 PM, Alex Rousskov wrote:
> On 12/03/2013 09:19 AM, Ayham Abou Afach wrote:
>
>> Sorry Alex, I think I was using the wrong one:
>>      large-rock
>>
>> So I should first redo my test on the new one and then continue with
>> the post.
>
> Yes, please.
>
>
>> But why is the large-rock branch, which is referred to from the Large
>> Rock wiki, old?
>
> The Large Rock wiki page mentions both branches and instructs the reader
> to use the Collapsed Forwarding branch for testing. We did not propagate
> Large Rock-related changes on the Collapsed Forwarding branch back to
> the Large Rock branch because there were more important things to do.
>
>
> Hope this clarifies,
>
> Alex.
>




