On 4/27/15 8:45 AM, Marc-André Goderre wrote:
Can I change the segment size to allow more memory?
Is it a good idea?
The function in question works only on the entire table, so I can't process just part of it.
Should I split the table into multiple tables and merge them after processing?
Please don't top-post.
I use the PostGIS and pgRouting extensions.
The error comes when I use the pgRouting function pgr_createTopology().
It appears pgRouting violates the 1GB per-chunk limit in the Postgres backend when processing large datasets (a single palloc is capped at MaxAllocSize, just under 1GB):
https://github.com/pgRouting/pgrouting/issues/291
Changing the segment size would just push the problem down the road. At
some point the same error will happen.
That issue URL has a comment about "Don't try and process all of Europe
at once, give it a bounding box", so that's one possible solution.
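
For example, a minimal sketch of that approach, assuming the rows_where
argument documented in the pgRouting 2.x signature of pgr_createTopology,
plus a hypothetical 'ways' edge table with a the_geom column in EPSG:4326
(the tolerance and coordinates are made up too):

    SELECT pgr_createTopology(
        'ways',     -- hypothetical edge table
        0.00001,    -- snapping tolerance, in the units of the SRID
        rows_where := 'the_geom && ST_MakeEnvelope(2.0, 48.5, 2.8, 49.0, 4326)'
    );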
Really the function should be changed so it doesn't try to palloc
more than 1GB in a single go... but OTOH there's only so far you can
probably go there too. I imagine the complexity of what the function is
trying to do grows geometrically with the size of the data set, so you
probably need to find some way to break your data into smaller pieces
and process each piece individually, as in the sketch below.
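
If the whole network has to end up in one topology, a sketch of the tiled
variant (again assuming rows_where is available in your version; the extent,
tile size, table name, and tolerance are all placeholders) could loop over a
grid and process one tile's worth of edges per call:

    DO $$
    DECLARE
        x integer;
        y integer;
    BEGIN
        -- Hypothetical 4x4 grid of 10-degree tiles; adjust the extent,
        -- tile size, and SRID to match your data.
        FOR x IN 0..3 LOOP
            FOR y IN 0..3 LOOP
                PERFORM pgr_createTopology(
                    'ways', 0.00001,
                    rows_where := format(
                        'the_geom && ST_MakeEnvelope(%s, %s, %s, %s, 4326)',
                        -10 + x * 10, 35 + y * 10,
                        -10 + (x + 1) * 10, 35 + (y + 1) * 10));
            END LOOP;
        END LOOP;
    END $$;

Successive calls add to the same vertices table, but whether snapping near
tile boundaries matches a single-pass run is worth verifying on a small
extract first.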
--
Jim Nasby, Data Architect, Blue Treble Consulting
Data in Trouble? Get it in Treble! http://BlueTreble.com