Bulk Insert/Update Scenario

I am setting up a data processing pipeline whose end result is stored in Postgres. I have not been a heavy database user in general, and I have a question about how best to handle a bulk insert/update scenario with Postgres.

Here is my use case:
* I periodically get a file with thousands of entries (lines).
* I process each entry (line) from the file, and the data is split and stored across different Postgres tables. Some tables have foreign keys to other tables, and there is no straight mapping from an entry in the file to the Postgres tables.
* Some of the data will be updates to existing rows in the Postgres tables, while the rest will be inserts.
* I would like to ensure atomicity: either all rows are stored in all tables, or nothing is stored if Postgres reports a failure.
* I would also like to make sure there are no concurrency issues if two different processes attempt the above at the same time.
* Ideally, I would like to avoid issuing an individual upsert after processing every single entry (line) from the file (see the sketch after this list).
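
To make the question concrete, here is a minimal sketch of the kind of approach I have in mind. The table and column names (accounts, account_id, etc.) and the temp staging table are placeholders I made up, not a working design: multi-row insert into a staging table, then a single INSERT ... ON CONFLICT from staging into the real table, all inside one transaction.

    import psycopg2
    import psycopg2.extras

    def load_batch(rows, dsn):
        """Load one file's worth of processed rows in a single transaction."""
        conn = psycopg2.connect(dsn)
        try:
            with conn:  # psycopg2 commits on success, rolls back on exception
                with conn.cursor() as cur:
                    # Stage the batch with one multi-row insert instead of
                    # thousands of individual statements.
                    cur.execute(
                        "CREATE TEMP TABLE staging_accounts "
                        "(LIKE accounts) ON COMMIT DROP"
                    )
                    psycopg2.extras.execute_values(
                        cur,
                        "INSERT INTO staging_accounts "
                        "(account_id, name, balance) VALUES %s",
                        rows,
                    )
                    # Upsert from staging into the real table; this would be
                    # repeated per target table, parents before children so
                    # the foreign keys are satisfied. ON CONFLICT requires a
                    # unique constraint on account_id.
                    cur.execute("""
                        INSERT INTO accounts (account_id, name, balance)
                        SELECT account_id, name, balance FROM staging_accounts
                        ON CONFLICT (account_id) DO UPDATE
                        SET name = EXCLUDED.name, balance = EXCLUDED.balance
                    """)
        finally:
            conn.close()

My understanding is that ON CONFLICT lets two concurrent loaders upsert without explicit locking, but I am not sure about deadlock risk once several tables with foreign keys are involved. Is this the right general direction, or is there a better-established pattern (COPY, writable CTEs, advisory locks, etc.)?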


I thought this would be a fairly common use case. What is the best way to handle the above? What performance issues should I keep in mind, and what are the pitfalls? I tried looking for articles covering this use case; any pointers would be greatly appreciated.


By the way, the application is in Python running in Apache Spark and can use any Python libraries that would help simplify the above.
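
If it matters, the write would happen from the Spark side roughly like this (again a sketch only; df, the column names, and the DSN are placeholders), with one connection and one transaction per partition:

    def write_partition(rows_iter):
        # One connection and one transaction per Spark partition, reusing
        # the hypothetical load_batch() sketched above.
        batch = [(r.account_id, r.name, r.balance) for r in rows_iter]
        if batch:
            load_batch(batch, "dbname=app user=etl host=db.example.com")

    # df is the processed DataFrame for one input file.
    df.foreachPartition(write_partition)

One thing that worries me here is that per-partition transactions would break the all-or-nothing requirement for the file as a whole, which is part of why I am asking.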

Thanks in advance.
