
large xml database

Hi,
I have a very big XML document, larger than 50GB, which I want to import into the database and transform into a relational schema.
When I split this document into smaller independent XML documents, I get ~11.1 million of them.
I have spent a lot of time trying to find the fastest way to transform all this data, but every time I give up because it takes too long. Sometimes it would take more than a month if not stopped.
I have tried to insert each line as varchar into the database and parse it using plperl regexes.
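For illustration, the plperl side of that attempt was roughly along these lines (the <id> element and the table/column names below are only placeholders, not the real schema):

-- rough sketch of the plperl regex approach; "entry_lines" and the
-- <id> element are placeholders for the real table and element names
CREATE OR REPLACE FUNCTION extract_id(line text) RETURNS text AS $$
    my ($line) = @_;
    # pull the text between <id> and </id>, e.g. <id>12345</id>
    return $1 if $line =~ m{<id>([^<]+)</id>};
    return undef;
$$ LANGUAGE plperl IMMUTABLE;

-- then applied over all stored lines, for example:
-- SELECT extract_id(line) FROM entry_lines WHERE line LIKE '%<id>%';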
I have also tried to store every document as XML and parse it, but that is also too slow.
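The XML variant looked roughly like this, using xpath() on an xml column (again, the table name and element path are just placeholders):

-- rough sketch of the xml-column approach; "xml_docs" and the
-- /entry/name path are placeholders
CREATE TABLE xml_docs (id serial PRIMARY KEY, doc xml);

SELECT (xpath('/entry/name/text()', doc))[1]::text AS name
FROM xml_docs;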
I have also tried to store every document as varchar, but that is also slow when using regexes to extract the data.

Many attempts have failed because 8GB of RAM and 10GB of swap were not enough. Sometimes I also get an error that more than 2^32 operations were performed, and the functions stop working.

I just wanted to ask if someone knows how to speed this up.

Thanks in advance.
--
---------------------------------------
Viktor Bojović
---------------------------------------
Wherever I go, Murphy goes with me
