Bill Thoen <bthoen@xxxxxxxxxx> writes:
> My PostgreSQL is working great for small SQL queries even from my large
> table (18 million records). But when I ask it to retrieve anything that
> takes it more than 10 minutes to assemble, it crashes with this
> "Segmentation Fault" error. I get so little feedback and I'm still pretty
> unfamiliar with Postgresql that I don't even know where to begin.

Running the client under gdb and getting a stack trace would be a good
place to begin.

FWIW, when I deliberately try to read a query result that's too large for
client memory, I get reasonable behavior:

regression=# select x, y, repeat('xyzzy',200) from generate_series(1,10000) x, generate_series(1,100) y;
out of memory for query result
regression=#

If you're seeing a segfault in psql then it sounds like a PG bug.
If you're seeing a segfault in a homebrew program then I wonder whether
it's properly checking for an error return from libpq ...

			regards, tom lane
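For illustration, a minimal sketch of the kind of libpq error checking
referred to above; the connection string, table name, and program structure
are placeholders, not anything from the original report:

/* check_result.c -- sketch of checking libpq results instead of assuming
 * PQexec succeeded.  Build with: cc check_result.c -lpq */
#include <stdio.h>
#include <libpq-fe.h>

int
main(void)
{
    PGconn   *conn = PQconnectdb("dbname=mydb");    /* placeholder conninfo */
    PGresult *res;

    if (PQstatus(conn) != CONNECTION_OK)
    {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return 1;
    }

    res = PQexec(conn, "SELECT * FROM big_table");  /* placeholder query */

    /* PQexec can return NULL (e.g. out of memory) or a result whose status
     * is not PGRES_TUPLES_OK; touching the rows without checking is a
     * classic way to segfault on a very large result. */
    if (res == NULL || PQresultStatus(res) != PGRES_TUPLES_OK)
    {
        fprintf(stderr, "query failed: %s", PQerrorMessage(conn));
        if (res)
            PQclear(res);
        PQfinish(conn);
        return 1;
    }

    printf("got %d rows\n", PQntuples(res));

    PQclear(res);
    PQfinish(conn);
    return 0;
}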