Re: Insert with Jsonb column hangs

Thanks Adrian & Hector.

I will try the COPY approach & post the result here.

On Sat, Mar 9, 2024 at 9:57 PM hector vass <hector.vass@xxxxxxxxx> wrote:


On Sat, Mar 9, 2024 at 4:10 PM Adrian Klaver <adrian.klaver@xxxxxxxxxxx> wrote:
On 3/9/24 08:00, kuldeep singh wrote:
> Copy may not work in our scenario since we need to join data from
> multiple tables & then convert it to JSON using row_to_json. This
> JSON data eventually needs to be stored in a target table.

Per:

https://www.postgresql.org/docs/current/sql-copy.html

"
COPY { table_name [ ( column_name [, ...] ) ] | ( query ) }

<...>

query

     A SELECT, VALUES, INSERT, UPDATE, or DELETE command whose results
are to be copied. Note that parentheses are required around the query.

     For INSERT, UPDATE and DELETE queries a RETURNING clause must be
provided, and the target relation must not have a conditional rule, nor
an ALSO rule, nor an INSTEAD rule that expands to multiple statements.
"

>
> Would it be better to break the process into batches of, say, 10,000
> rows & insert each batch in its own transaction? Or is any other
> better solution available?
>
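
A minimal sketch of the batching idea (this assumes a hypothetical
integer primary key id on the source table; the cast is needed because
row_to_json returns json, not jsonb):

insert into target_table (doc)
select row_to_json(s)::jsonb
from source_table s
where s.id > 0 and s.id <= 10000;
-- commit, then repeat for id 10000..20000, and so on
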
> On Sat, Mar 9, 2024 at 9:01 PM hector vass <hector.vass@xxxxxxxxx> wrote:
>
>
>
>     On Sat, Mar 9, 2024 at 3:02 PM kuldeep singh <kuldeeparora89@xxxxxxxxx> wrote:
>
>         Hi,
>
>         We are inserting close to 1M records into a table with a single
>         JSONB column, but the query is getting stuck.
>
>         We are using INSERT INTO ... SELECT, so all the operations are
>         within the DB.
>
>         If we run the SELECT query on its own it returns the data for
>         all rows in 40 sec, but with the INSERT it gets stuck.
>
>         PG Version - 15.
>
>         What could be the problem here?
>
>         Regards,
>         KD
>
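
To see whether such an INSERT is actually blocked rather than merely
slow, the standard system views help (a minimal sketch; 12345 stands in
for the pid of the stuck backend):

select pid, state, wait_event_type, wait_event, left(query, 60) as query
from pg_stat_activity
where state <> 'idle';

select pg_blocking_pids(12345);
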
>
>     Inserting 1M rows, especially JSON that can be large, variable in
>     size, stored out of line, and indexed, is perhaps not the correct
>     way to do this. Insert performance will also depend on your tuning:
>     supporting transactions, many users, and bulk processing are three
>     sides of a compromise. You should perhaps consider that INSERT is
>     for inserting a few rows into live tables ... you might be better
>     off using COPY or \copy, or pg_dump if you are just trying to
>     replicate a large table.
>
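
For the replicate-a-large-table case, a minimal sketch of the pg_dump
route (database and table names are placeholders):

pg_dump mydb --data-only --table=mytable | psql mydb_replica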

--
Adrian Klaver
adrian.klaver@xxxxxxxxxxx

What Adrian Klaver said ^
I discovered that even this works...


create view myview as
    select row_to_json(mytable) from mytable;

create table newtable as
    select * from myview where 1 = 0;

copy (select * from myview) to program
    'psql mydb postgres -c ''copy newtable from stdin''';
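
Worth noting: copy ... to program runs the command on the database
server and needs superuser or the pg_execute_server_program role. A
client-side sketch of the same round trip with \copy in psql (the file
path is a placeholder):

\copy (select * from myview) to '/tmp/myview.dat'
\copy newtable from '/tmp/myview.dat'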



 
