On Fri, Apr 13, 2012 at 12:31 PM, Steve Crawford <scrawford@xxxxxxxxxxxxxxxxxxxx> wrote:
On 04/13/2012 08:30 AM, Jeff Adams wrote:

That really depends on the details and your concerns. Is the database in constant insert/update/select use, or is it a big table used for analysis that can be taken offline for some period? Is the column you want to update a primary or foreign key? How much disk space do you have available? Is a large portion of the data static (historical logs)?
So I can? If so, how do I go about it? I should mention that, while I dabble in Postgres DBA activity, it is not my day job...
Some possible approaches:
1. Just let it run to completion if you can afford the maintenance time.
2. Add a new column of the appropriate type, copy the data into that column, then drop the old one and rename the new one. If you do the update all at once you will get severe table bloat, but you may be able to update the new column in batches so that vacuum can reclaim space between batches. This approach may be useful if you do not have enough maintenance time to make the change all at once (a sketch follows this list).
3. Dump the table data, truncate the table, and modify the column definition, then restore the data. This requires downtime but will probably be faster than an in-place modification. However, it is not something you can easily cancel part-way through, and it is not a friendly method if foreign keys are involved (see the second sketch below).
4. Rename the table and create a new table with the structure you want, then copy the old data back into the new table, perhaps in batches. This might be useful if you need to keep collecting data without interruption but can afford a delay in analyzing it.
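
For what it's worth, here is a minimal sketch of option 2 in SQL, assuming a hypothetical table "events" with primary key "id" and an integer column "val" being widened to bigint (substitute your own names and target type):

    -- Add the replacement column alongside the old one.
    ALTER TABLE events ADD COLUMN val_new bigint;

    -- Backfill in batches; rerun until it reports UPDATE 0.
    -- IS DISTINCT FROM also copes with NULLs in val.
    UPDATE events
    SET    val_new = val
    WHERE  id IN (SELECT id FROM events
                  WHERE  val_new IS DISTINCT FROM val
                  LIMIT  10000);

    -- When the backfill is complete, swap the columns.
    BEGIN;
    ALTER TABLE events DROP COLUMN val;
    ALTER TABLE events RENAME COLUMN val_new TO val;
    COMMIT;

Between batches, (auto)vacuum can reclaim the dead row versions each UPDATE leaves behind. If the table takes writes during the migration, you would also need a trigger or a final catch-up pass to cover rows changed after their batch was copied.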
If partitioning the table would be beneficial, this might be a good time to consider that as well.
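
And a rough sketch of option 3, run from psql (again with hypothetical names; \copy moves the data through the client, so it needs no server-side file access):

    -- Dump the existing rows to a local file.
    \copy events TO 'events.dat'

    -- Empty the table and change the column type while it is empty.
    BEGIN;
    TRUNCATE events;
    ALTER TABLE events ALTER COLUMN val TYPE bigint;
    COMMIT;

    -- Reload the saved rows into the altered table.
    \copy events FROM 'events.dat'

Option 4 is similar in spirit: ALTER TABLE events RENAME TO events_old, create a fresh "events" with the definition you want, then INSERT INTO events SELECT ... FROM events_old, possibly in batches.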
Cheers,
Steve
--
Jeffrey D. Adams
National Marine Fisheries Service
Office of Protected Resources
1315 East West Hwy, Building SSMC3
Silver Spring, MD 20910-3282
phone: (301) 427-8434
fax: (301) 713-0376