"Shuo Liu" <delphet@xxxxxxxxxx> writes:
> TopMemoryContext: 11550912 total in 1377 blocks; 123560 free (833 chunks); 11427352 used

Whoa ... that is a whole lot more data than I'm used to seeing in
TopMemoryContext.  How many stats dump lines are there exactly (from
here to the crash report)?  If there are many (like many thousands),
that would explain why TopMemoryContext is so large, but would direct
our attention to what's generating so many.  If there are just a
hundred or so, which'd be typical, then we are looking at some kind of
leak of TopMemoryContext data --- though that still doesn't explain
the crash.

> The spatial database that the script is using is quite large (about 4
> GB). So I think making a self-contained test case would be the last
> resort.

FWIW, I doubt that the content of the database is the key point here;
you could probably generate a test case with relatively little data,
or maybe a lot of easily-created dummy data.  However, stripping it
down might require more insight into the nature of the bug than we
have at this point.

			regards, tom lane
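
For what it's worth, one quick way to count the stats dump lines is to grep the server log for the per-context pattern --- each memory context prints one "NNN total in NNN blocks" line.  This is just a sketch; the log path and sample contents below are made up for illustration:

```shell
# Create a tiny sample log fragment (hypothetical path) so the
# pipeline below has something to run against.
cat > /tmp/pg_stats_sample.log <<'EOF'
TopMemoryContext: 11550912 total in 1377 blocks; 123560 free (833 chunks); 11427352 used
  TopTransactionContext: 8192 total in 1 blocks; 7328 free (9 chunks); 864 used
  MessageContext: 8192 total in 1 blocks; 6976 free (1 chunks); 1216 used
EOF

# Count lines that look like memory-context stats entries.
grep -c 'total in .* blocks' /tmp/pg_stats_sample.log
```

Against a real server log you would point grep at the actual log file and look at whether the count is in the hundreds (typical) or the thousands.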