Re: Performance Optimization for Dummies 2 - the SQL

> how did you determine that it is done every 500 rows? this is the
> default autovacuum parameter.

The import program pages through the import table - it is currently set
at 500 rows per page. After each page, I run an ANALYZE.
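
In outline, each pass looks something like this (table and column names
are illustrative):

    -- fetch the next page of 500 rows from the import table
    SELECT * FROM import_raw ORDER BY id LIMIT 500 OFFSET 0;

    -- ...per-row analysis and inserts happen in the application...

    -- refresh planner statistics before the next page
    ANALYZE;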

> if you followed my earlier recommendations, you are aware that
> autovacuum (which also analyzes) is not running during bulk inserts,
> right?

That's intuitively obvious, but I can't do bulk inserts. It's just not
the nature of what we are doing with the data.

> imo, best way to do big data import/conversion is to:
> 1. turn off all extra features, like stats, logs, etc

done
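
Roughly, that meant postgresql.conf settings along these lines
(illustrative; exact parameter names vary by server version):

    autovacuum = off           # no background vacuum/analyze during load
    stats_row_level = off      # no per-row statistics collection
    fsync = off                # risky: only for rebuildable scratch data
    log_statement = 'none'     # suppress per-statement logging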

> 2. use copy interface to load data into scratch tables with probably
> all text fields

done
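
The load step looks more or less like this (table and file names are
illustrative):

    -- scratch table with all-text data columns, filled via COPY
    CREATE TABLE import_raw (
        id      serial,        -- paging key for the import loop
        name    text,
        address text,
        amount  text           -- validated and cast later
    );
    COPY import_raw (name, address, amount) FROM '/path/to/data.txt';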

> 3. analyze (just once)

I think this doesn't apply in our case, because we aren't doing bulk 
inserts.

> 4. use big queries to transform, normalize, etc

This is currently being done programmatically. The nature of what we're
doing is suited to imperative, navigational logic rather than
declarative, set-oriented logic; just the opposite of what SQL likes, I
know! If there's some way to replace thousands of lines of analysis and
decision trees with ultrafast queries - great...
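
If one of our simpler rules could be expressed declaratively, I imagine
it would look something like this (table, columns, and the rule itself
are made up for illustration):

    -- hypothetical set-based transform: trim, validate, and cast in
    -- one pass instead of row-at-a-time program logic
    INSERT INTO contacts (name, balance)
    SELECT trim(name),
           CASE WHEN amount ~ '^[0-9]+([.][0-9]+)?$'
                THEN amount::numeric
                ELSE 0
           END
    FROM import_raw
    WHERE name IS NOT NULL;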

> important feature of analyze is to tell the planner approx. how big
> the tables are.

But the tables grow as the process progresses - would you not want the 
server to re-evaluate its strategy periodically?
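
(What I'm watching for is the planner's row estimate drifting away from
reality as the tables fill; for example:)

    -- the planner's notion of table size, as of the last ANALYZE
    SELECT relname, reltuples, relpages
    FROM pg_class
    WHERE relname = 'import_raw';

    -- compare the plan's estimated row count against the live table
    EXPLAIN SELECT * FROM import_raw;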

Carlo

>
> merlin



