On 2012-03-12, Carson Gross <carsongross@xxxxxxxxx> wrote:
> We've got a postgres database with *a lot* of data in one table. On the
> order of 100 million rows at this point. Postgres is, of course, handling
> it with aplomb.
>
>     ALTER TABLE my_table ALTER COLUMN id TYPE bigint;
>
> However, given the size of this table, I have no idea how long something
> like this might take. In general I've had a tough time getting feedback
> from postgres on the progress of a query, how long something might take,
> etc.

I would estimate minutes to hours; it also depends on how many foreign keys
must be re-checked. (A query for listing them is sketched below.)

> So my question is: is there a way to understand roughly how long something
> like this might take? Our DB is out on crappy Amazon ec2 instances, so we
> don't exactly have screamers set up. Any tools I can use?

Use the cloud: set up a clone and do some testing.
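A minimal sketch of the foreign-key check, assuming the table is named
my_table as in the quoted ALTER (swap in your real table name): this lists
the foreign keys in other tables that reference it, each of which gets
re-validated as part of the type change.

    -- Foreign keys pointing at my_table; each one is re-checked when
    -- the column type changes ('my_table' is taken from the question).
    SELECT conname,
           conrelid::regclass AS referencing_table
    FROM pg_constraint
    WHERE confrelid = 'my_table'::regclass
      AND contype = 'f';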
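And a rough sketch of the clone-and-test run, again assuming my_table and a
psql session; pg_total_relation_size shows how much data the rewrite has to
copy, and \timing is a psql meta-command that reports elapsed time:

    -- Gauge the on-disk size the rewrite will have to push around:
    SELECT pg_size_pretty(pg_total_relation_size('my_table'));

    -- On the clone, time the actual conversion:
    \timing on
    ALTER TABLE my_table ALTER COLUMN id TYPE bigint;

If the clone runs on a similarly-sized ec2 instance, the elapsed time should
be in the right ballpark for production, give or take cache and load.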