Charles Martin wrote:

> SELECT max(length(docfilecontents::text)) FROM docfile;
> and after a very long time, got:
> ERROR: invalid memory alloc request size 1636085512
> SQL state: XX000

It would mean that at least one row has a "docfilecontents" close to
0.5GB in size. Or that the size fields in certain rows are corrupted,
although that's less plausible if you have no reason to suspect
hardware errors.

Does the following query work:

  SELECT max(octet_length(docfilecontents)) FROM docfile;

or maybe a histogram by size in hundreds of megabytes:

  SELECT octet_length(docfilecontents)/(1024*1024*100), count(*)
  FROM docfile
  GROUP BY octet_length(docfilecontents)/(1024*1024*100);

Note that the error message above does not say that there isn't enough
free memory; it says that it won't even try to allocate that much,
because 1636085512 is over the "varlena limit" of 1GB.

AFAICS I'm afraid that this table, as it is now, cannot be exported by
pg_dump even if you had enough free memory, because no individual row
in COPY can exceed 1GB in text format.

Best regards,
-- 
Daniel Vérité
PostgreSQL-powered mailer: http://www.manitou-mail.org
Twitter: @DanielVerite
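A rough way to locate the offending rows might be a query along these
lines. It is only a sketch: it assumes the table has a primary key
column named "id" (which may not match the actual schema) and that
docfilecontents is a bytea output in the default hex format, whose text
form takes roughly 2*octet_length(...) + 2 bytes:

  -- Hypothetical: "id" is an assumed primary key column name.
  -- Rows whose hex text rendering would exceed the 1GB varlena limit
  -- are the ones that break the ::text cast and COPY in text format.
  SELECT id,
         octet_length(docfilecontents) AS raw_bytes,
         2*octet_length(docfilecontents) + 2 AS approx_text_bytes
  FROM docfile
  WHERE 2*octet_length(docfilecontents) + 2 > 1024*1024*1024
  ORDER BY raw_bytes DESC;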