Re: performance while importing a very large data set into database

On Sat, Dec 5, 2009 at 7:16 AM, "Ing. Marcos Luís Ortíz Valmaseda"
<mlortiz@xxxxxx> wrote:
> Ashish Kumar Singh wrote:
>>
>> Hello Everyone,
>>
>>
>> I have a very big database, around 15 million in size, and the dump
>> file is around 12 GB.
>>
>> While importing this dump into the database, I have noticed that initially
>> the query response time is very slow, but it does improve with time.
>>
>> Any suggestions to improve performance after the dump is imported into the
>> database will be highly appreciated!
>>
>>
>>
>>
>> Regards,
>>
>> Ashish
>>
> My suggestions are:
> 1- After the db restore, you can run a manual VACUUM ANALYZE on your
> big tables to remove all dead rows

Well, there should be no dead rows since it's a fresh restore, so just a
plain ANALYZE would be enough.  Note that autovacuum will kick in
eventually and do this for you.
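
As a minimal sketch (assuming a hypothetical big table named "orders"),
that's just:

    -- refresh planner statistics; nothing to vacuum after a fresh restore
    ANALYZE orders;

Running it by hand right after the restore simply means the planner gets
good statistics immediately instead of waiting for autovacuum.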

> 2- Then you can reindex your big tables in any case where you use them.

Again, a freshly loaded db does not need to be reindexed.  The indexes
are fresh and new and clean.

> 3- Then apply a CLUSTER command on the right tables that have these indexes.

Now that's useful, but if you're gonna cluster, do it FIRST, then
analyze the tables.
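
For example (again a sketch, assuming a hypothetical "orders" table with
an index named "orders_customer_id_idx"):

    -- rewrite the table in index order first...
    CLUSTER orders USING orders_customer_id_idx;
    -- ...then gather statistics on the newly rewritten table
    ANALYZE orders;

CLUSTER rewrites the whole table, so statistics gathered before it would
describe the old physical layout.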

-- 
Sent via pgsql-performance mailing list (pgsql-performance@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-performance

