
question

Hello,

I recently downloaded PostgreSQL 9.4, and I have a client application written in Tcl that inserts into the database and fetches records.

Most of the time, the app will connect to the server to do inserts and fetches.

For occasional use, we want to remove the requirement of running a database server and just have the application retrieve data from a local file.

I know I can use pg_dump to export the tables. The questions are:

1) Is there an in-memory or file-based database instance I can create that is loaded from the dump file? That way the app code wouldn't have to change.

2) Does PostgreSQL support an embedded database?
3) Or is my best option to convert the dump to SQLite, import it, and have the app read that embedded database? (Rough sketch of what I mean just below this list.)
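For question 3, this is roughly what I had in mind on the application side, using Tcl's sqlite3 package; the file path, table, and column names below are just placeholders for my real schema:

    # Read the occasional-use data from a local SQLite file instead of the server.
    package require sqlite3

    # Open the local file-based database (placeholder path).
    sqlite3 localdb /path/to/local_copy.db

    # Fetch records much like the app does against the server today.
    localdb eval {SELECT id, name FROM my_table ORDER BY id} row {
        puts "$row(id) $row(name)"
    }

    localdb close

Does that sound like a sensible way to go, or is there a better-supported route?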

Finally, I am noticing that pg_dump takes a long time to create a dump of my table; right now the table has 77K rows. Are there any ways to create automated batch files that produce dumps overnight, and to make them run quickly?
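For the overnight dumps, I was picturing a small Tcl script that cron (or Task Scheduler) runs each night, along these lines; the host, user, database, and table names are placeholders, and I assume the password would come from .pgpass or PGPASSWORD:

    # Nightly dump of one table in custom format (-Fc is compressed).
    set stamp [clock format [clock seconds] -format %Y%m%d]
    set outfile "/backups/my_table_$stamp.dump"

    if {[catch {
        exec pg_dump -h localhost -U myuser -t my_table -Fc -f $outfile mydb
    } err]} {
        puts stderr "pg_dump failed: $err"
        exit 1
    }
    puts "wrote $outfile"

Is something like that the usual approach, or is there a faster way to dump a single table?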

Thanks for your input!
