Re: Alter the column data type of the large data volume table.

On Thu, 3 Dec 2020, Michael Lewis wrote:

On Wed, Dec 2, 2020 at 11:53 PM charles meng <xlyybz@xxxxxxxxx> wrote:

I have a table with 1.6 billion records. The data type of the primary key
column was incorrectly chosen as integer. I need to change the type of the
column to bigint. Are there any ideas for this?

You can add a new nullable bigint column with no default value, which is
very fast. Then you can gradually update rows in batches (if on PG11+,
perhaps use a DO script with a loop that commits every X rows) to set the
new column equal to the primary key. Lastly, in a single transaction,
update any remaining rows where the bigint column is still null, make the
new column the primary key, and drop the old one. This keeps each
transaction reasonably sized so it doesn't hold up other processes.
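The steps described above might be sketched roughly like this. All table,
column, and constraint names here are hypothetical, and the batch size would
need tuning for a real 1.6-billion-row table; this is an illustration of the
approach, not a tested migration script:

```sql
-- Assumed schema: my_table(id integer PRIMARY KEY, ...). Names hypothetical.

-- 1. Adding a nullable column with no default is a fast, catalog-only change.
ALTER TABLE my_table ADD COLUMN id_big bigint;

-- 2. Backfill in batches. On PG11+, a DO block run outside an explicit
--    transaction may issue COMMIT, so each batch commits separately.
DO $$
DECLARE
    batch_size integer := 100000;
    rows_done  bigint;
BEGIN
    LOOP
        UPDATE my_table
           SET id_big = id
         WHERE id IN (SELECT id
                        FROM my_table
                       WHERE id_big IS NULL
                       LIMIT batch_size);
        GET DIAGNOSTICS rows_done = ROW_COUNT;
        EXIT WHEN rows_done = 0;
        COMMIT;  -- release locks and let other sessions proceed
    END LOOP;
END $$;

-- 3. Final transaction: catch any rows inserted since, then swap the PK.
--    (The constraint name my_table_pkey is the conventional default; check
--    the actual name with \d my_table.)
BEGIN;
UPDATE my_table SET id_big = id WHERE id_big IS NULL;
ALTER TABLE my_table DROP CONSTRAINT my_table_pkey;
ALTER TABLE my_table DROP COLUMN id;
ALTER TABLE my_table RENAME COLUMN id_big TO id;
ALTER TABLE my_table ADD PRIMARY KEY (id);
COMMIT;
```

Note that step 3 still needs to build a new primary-key index, which takes
time on a table this size, but the long backfill in step 2 happens without a
long-held lock.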

Tell me, please, why

ALTER TABLE <tablename> ALTER COLUMN <columnname> SET DATA TYPE BIGINT

will not do the job?

I've found varchar columns in a couple of tables that were too small and
used the above to increase their size. It worked perfectly.
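For context, a widening like the one described would look something like
this (table and column names are hypothetical). One relevant difference,
as a hedged aside: in PostgreSQL 9.2 and later, increasing a varchar's
length limit is a catalog-only change, whereas changing integer to bigint
rewrites every row of the table under an ACCESS EXCLUSIVE lock, which is
why the two cases can behave very differently on a large table:

```sql
-- Hypothetical example of widening a varchar column:
ALTER TABLE customers ALTER COLUMN name SET DATA TYPE varchar(200);
```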

Regards,

Rich




