Re: strict memory




Hi John:

Well, we run a lot of statistical analyses, and our code loads large
amounts of data into a vector for fast calculations. I'm not sure how
else to do these calculations quickly without loading the data into
memory. That's why we have to do it this way.

TIA

On Thu, Oct 16, 2008 at 1:00 PM, John R Pierce <pierce@xxxxxxxxxxxx> wrote:
> Mag Gam wrote:
>>
>> Hello All:
>>
>> Running 5.2 at our university. We have several students' processes
>> that take up too much memory. Our system has 64G of RAM, and some
>> processes take close to 32-48G of RAM. This is causing many problems
>> for others. I was wondering if there is a way to restrict memory usage
>> per process? If a process goes over 32G, simply kill it. Any thoughts
>> or ideas?
>>
>>
>
> In /etc/profile, use "ulimit -v NNNN" (in kilobytes) to limit the max
> virtual memory of all processes spawned by that shell.
>
>
> 32G per process on a 64G machine sounds like a bit much. Wouldn't a limit
> more like 4GB per user session be more appropriate on a multiuser system?
> _______________________________________________
> CentOS mailing list
> CentOS@xxxxxxxxxx
> http://lists.centos.org/mailman/listinfo/centos
>
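For anyone following the archive: the /etc/profile approach suggested above might look like the sketch below. The UID threshold (500, the first regular-user UID on CentOS 5 by default) and the 32 GB figure are assumptions for illustration; adjust both for your site.

```shell
#!/bin/bash
# Hypothetical snippet for /etc/profile: cap per-process virtual memory
# for regular users. 32 GB expressed in kilobytes = 32 * 1024 * 1024.
LIMIT_KB=33554432

# Assumption: on CentOS 5, regular (non-system) accounts start at UID 500.
if [ "$(id -u)" -ge 500 ]; then
    ulimit -S -v "$LIMIT_KB"   # soft limit: the user can raise it, up to the hard limit
    ulimit -H -v "$LIMIT_KB"   # hard limit: allocations beyond this fail (e.g. malloc returns NULL)
fi
```

Note that ulimit only affects processes spawned from shells that sourced /etc/profile; daemons and cron jobs are not covered, and the hard limit, once lowered, cannot be raised again by a non-root user in that session.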
