Re: Loading 500m json files to database

On 3/23/20 4:24 AM, pinker wrote:
Hi, do you maybe have an idea how to make the loading process faster?

I have 500 million JSON files (one JSON document per file) that I need to load into the database.
My test set is "only" 1 million files.

What I have come up with so far is:

time for i in datafiles/*; do
   psql -c "\copy json_parts(json_data) FROM '$i'" &
done

which is the fastest so far, but it's not what I expected. Loading 1m files takes ~3h, so loading 500 times more is just unacceptable.

Some facts:
* the target DB is in the cloud, so there is no option for tricks like turning fsync off
* the Postgres version is 11
* I can spin up a huge Postgres instance if necessary in terms of CPU/RAM
* I have already tried hash partitioning (writing to 10 different tables instead of 1)


Any ideas?

Most advanced languages have a bulk copy implementation.  I've found this to be blindingly fast when the receiving table has no indices or constraints.  It's not clear how large your files are, but you might take this opportunity to normalize them: extract any id, datatype, etc. into table attributes.
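
For what it's worth, here is a minimal sketch of that idea in Python with psycopg2, batching many files into a single COPY stream instead of paying for one psql process and one transaction per file. The json_parts(json_data) table comes from your loop; the DSN, batch size, and escaping helper are illustrative assumptions, not a tested recipe.

import glob
import io

import psycopg2

BATCH = 10_000  # files per COPY round trip; purely a guess, tune it

def copy_escape(s: str) -> str:
    # COPY text format: escape backslash first, then the control
    # characters that would otherwise end a field or a row
    return (s.replace("\\", "\\\\")
             .replace("\n", "\\n")
             .replace("\r", "\\r")
             .replace("\t", "\\t"))

conn = psycopg2.connect("dbname=test")  # hypothetical DSN
with conn, conn.cursor() as cur:
    buf = io.StringIO()
    pending = 0
    for path in glob.iglob("datafiles/*"):
        with open(path, encoding="utf-8") as f:
            buf.write(copy_escape(f.read()))
        buf.write("\n")  # one file becomes one row
        pending += 1
        if pending == BATCH:
            buf.seek(0)
            cur.copy_expert("COPY json_parts (json_data) FROM STDIN", buf)
            buf, pending = io.StringIO(), 0
    if pending:  # flush whatever is left of the last batch
        buf.seek(0)
        cur.copy_expert("COPY json_parts (json_data) FROM STDIN", buf)
conn.close()

The point is that COPY amortizes connection, parse, and commit overhead across thousands of rows, whereas the shell loop pays all three for every single file.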
