Re: Out of memory error during pg_upgrade in big DB with large objects


 



I tried (even though v15 is not feasible for us at the moment, as we have only tested v14).

It ended with the same out-of-memory failure, just more quickly (1 hour instead of 12 hours).


On 21/11/2022 18:30, Tom Lane wrote:
Massimo Ortensi <mortensi@xxxxxxxxxxxxxxx> writes:
I'm trying to upgrade a huge DB from postgres 10 to 14.
This cluster is 70+ TB, with one database having more than 2 billion
records in pg_largeobject
I'm trying pg_upgrade in hard link mode, but the dump of database schema
phase always fails with
pg_dump: error: query failed: out of memory for query result
pg_dump: error: query was: SELECT l.oid, (SELECT rolname FROM
pg_catalog.pg_roles WHERE oid = l.lomowner) AS rolname, (SELECT
pg_catalog.array_agg(acl ORDER BY row_n) FROM (SELECT acl, row_n FROM
FWIW, this query was rewritten pretty substantially in v15.
It's still going to produce a row per large object, but it
should be a lot narrower because most of the ACL-wrangling
now happens somewhere else.  I don't know if migrating to
v15 instead of v14 is an option for you, and I can't promise
that that'd be enough savings to fix it anyway.  But it's
something to think about.

			regards, tom lane
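[Editorial note, not part of the original thread: since the failing pg_dump query returns one row per large object, a quick way to gauge how big the result set it must buffer in client memory will be is to count the rows of pg_largeobject_metadata (one row per large object) in the affected database:]

```sql
-- Rough sanity check: the failing pg_dump query emits one row per large
-- object, so this count approximates the number of rows pg_dump must
-- hold in memory.  Run inside the affected database.
SELECT count(*) FROM pg_largeobject_metadata;
```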




