Re: obscure failures when RAM is low


On Mon, 1 Jun 2015, Daniel Pocock wrote:

> I agree the GCC error is not an autotools fault, and the libtool issue
> is not something I can roll my sleeves up and fix right now.
>
> That is why I was asking about a way for configure to do a basic sanity
> test on available memory.  Maybe my question wasn't clear enough.  I
> don't expect configure to magically know how much memory is needed, just
> a simple test where the developer can suggest some fixed value (e.g.
> 1 GB) and the configure script stops if less is available.

Running out of memory while running the compiler is not a common problem; I have not observed it for at least 17 years.

> I also fully understand this is not a bulletproof solution; other
> processes could still take memory after the build starts running and
> cause it to fail.

The notion of "memory" is a very complex topic. For example, a sufficiently large swap partition might allow the compiler to succeed (while taking more time). This is not something that autoconf can reasonably test.

The amount of "overcommit" on many GNU/Linux systems is often huge, yet they continue to work fine. Compare:

  cat /proc/meminfo

  cat /proc/vmstat
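For what it's worth, the kind of fixed-threshold check Daniel describes could be sketched in plain shell by parsing /proc/meminfo, with the caveats above in mind (swap and overcommit make any single number unreliable). This is only an illustration, not a real autoconf macro; the `min_kb` name and the 1 GB threshold are the example value from the thread:

```shell
# Hypothetical configure-time sanity check: warn if the kernel's
# estimate of available memory is below a developer-suggested minimum.
min_kb=1048576   # 1 GB expressed in kB, as /proc/meminfo reports

if [ -r /proc/meminfo ]; then
  # MemAvailable is the kernel's estimate of memory usable by a new
  # workload; fall back to MemFree on kernels older than 3.14.
  mem_kb=$(awk '/^MemAvailable:/ {print $2}' /proc/meminfo)
  [ -n "$mem_kb" ] || mem_kb=$(awk '/^MemFree:/ {print $2}' /proc/meminfo)

  if [ -n "$mem_kb" ] && [ "$mem_kb" -lt "$min_kb" ]; then
    echo "configure: WARNING: only ${mem_kb} kB available," \
         "${min_kb} kB suggested" >&2
  fi
fi
```

On systems without /proc/meminfo the check is silently skipped, which is about the best a portable configure script can do.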

Bob
--
Bob Friesenhahn
bfriesen@xxxxxxxxxxxxxxxxxxx, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,    http://www.GraphicsMagick.org/

_______________________________________________
Autoconf mailing list
Autoconf@xxxxxxx
https://lists.gnu.org/mailman/listinfo/autoconf



