On Sun, Feb 28, 2016 at 09:56:24PM +0000, Tom Hughes wrote:
> From past experience I know that they haven't shown a huge amount of
> interest in failures that are specific to 32 bit platforms, though in this
> case they do have somebody who has been trying to tame some of the wilder
> excesses of memory usage in the compilations.

You can try to look at which optimization pass requires the most memory
and perhaps disable it for 32-bit host builds on selected source files.
g++ -fmem-report -ftime-report can help in figuring that out.

E.g. if it is debug info generation, you might try to just decrease the
quality of the debug info through -fno-var-tracking-assignments. If it is
var-tracking itself, you might try to decrease --param max-vartrack-size=50000000
to a smaller value, or --param max-vartrack-expr-depth=12, or
--param max-vartrack-reverse-op-size=50.

Does the problematic source file contain one huge function (especially one
with lots of control flow)? You might want to try -O1 in that case (-O1 is
an option that generally tries to avoid most quadratic algorithms, in both
compile time and memory use; it is meant e.g. for huge generated functions
with lots of control flow). If the file contains many smaller functions
instead, you might try to split them out into smaller files, or compile the
file using LTO (-c -flto for each piece, and then -r -nostdlib -flto to
link them together relocatably).

	Jakub
--
devel mailing list
devel@xxxxxxxxxxxxxxxxxxxxxxx
http://lists.fedoraproject.org/admin/lists/devel@xxxxxxxxxxxxxxxxxxxxxxx
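The suggestions above can be sketched as concrete g++ invocations. This is a hedged sketch only: the file names (big.cpp, part1.cpp, part2.cpp, big.o) are placeholders, and the lowered --param values are merely examples of "smaller than the default", not tuned recommendations. The command lines are built as strings and echoed so the sketch runs without needing a compiler or the sources.

```shell
# 1. Diagnose: see which passes dominate memory and compile time.
DIAG="g++ -O2 -g -fmem-report -ftime-report -c big.cpp"
echo "$DIAG"

# 2. If var-tracking is the culprit, lower its limits
#    (defaults per the post: 50000000 / 12 / 50; values below are examples).
VT="g++ -O2 -g --param max-vartrack-size=10000000 --param max-vartrack-expr-depth=6 -c big.cpp"
echo "$VT"

# 3. If debug info generation is the culprit, reduce debug-info quality.
DBG="g++ -O2 -g -fno-var-tracking-assignments -c big.cpp"
echo "$DBG"

# 4. Or split the file into pieces and relocatably link them with LTO,
#    as suggested for files with many smaller functions.
LTO_COMPILE="g++ -O2 -flto -c part1.cpp part2.cpp"
LTO_LINK="g++ -r -nostdlib -flto -o big.o part1.o part2.o"
echo "$LTO_COMPILE"
echo "$LTO_LINK"
```

The resulting big.o from step 4 can then be fed to the original link step in place of the single large object file.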