
Re: Populating large DB from Perl script


 



Kynn Jones wrote:
I have a large database that needs to be built from
scratch roughly once every month. I use a Perl script to do this.

The tables are very large, so I avoid as much as possible using
in-memory data structures, and instead I rely heavily on temporary
flat files.

I have solved this general problem in various ways, all of them
unwieldy (in the latest version, the script generates the serial ids
and uses Perl's so-called "tied hashes" to retrieve them when needed).

TIA!

kj

I have done this exact same thing. I started with tied hashes, and even tried BerkeleyDB. They only helped up to a point, where they got so big (a couple gig, if I recall correctly) that they actually slowed things down. In the end I used a stored proc to do the lookup and insert. In the beginning it's not as fast, but by the time the db hits 20 gig it's still going strong, whereas my BerkeleyDB was becoming painfully slow. (I recently thought of trying an SQLite table; I've had good luck with them, they can get pretty big and still be very fast, but I never got around to trying it.)
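For what it's worth, a lookup-and-insert stored proc along those lines might look something like this minimal PL/pgSQL sketch. The `terms` table and `get_term_id` function are made-up names for illustration, not from either of our scripts, and it assumes a reasonably recent Postgres (INSERT ... RETURNING needs 8.2 or later; on older versions you'd use currval() on the sequence instead):

```sql
-- Hypothetical lookup table: maps a text key to a generated serial id.
CREATE TABLE terms (
    id   serial PRIMARY KEY,
    name text NOT NULL UNIQUE
);

-- Return the id for a name, inserting a new row if it isn't there yet.
CREATE OR REPLACE FUNCTION get_term_id(p_name text) RETURNS integer AS $$
DECLARE
    v_id integer;
BEGIN
    SELECT id INTO v_id FROM terms WHERE name = p_name;
    IF NOT FOUND THEN
        INSERT INTO terms (name) VALUES (p_name) RETURNING id INTO v_id;
    END IF;
    RETURN v_id;
END;
$$ LANGUAGE plpgsql;
```

The Perl loader then just calls `SELECT get_term_id(?)` instead of carrying the id map in memory. Note this sketch doesn't handle the race where two concurrent sessions insert the same name; for a single bulk-load script that's usually fine, but otherwise you'd want to catch the unique_violation and retry the SELECT.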

So... not really an answer (other than that I used a stored proc), but I'd be interested in alternatives too.

-Andy

---------------------------(end of broadcast)---------------------------
TIP 3: Have you checked our extensive FAQ?

              http://www.postgresql.org/docs/faq
