Re: The Last Optimization




You'll have to post your complete schema and the actual queries you are
running that are so slow.  Postgres can easily handle tables of that size,
so there are probably more speed improvements you can make.
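As a starting point, running each slow query under EXPLAIN ANALYZE shows what the planner actually does and where the time goes. The table and column names below are made up for illustration; substitute your own:

```sql
-- Shows the chosen plan plus actual row counts and timings:
EXPLAIN ANALYZE
SELECT *
FROM mean
WHERE customer_id = 42;

-- A sequential scan over millions of rows in the output above usually
-- means the column in the WHERE clause needs an index:
CREATE INDEX mean_customer_id_idx ON mean (customer_id);

-- Refresh planner statistics so the new index gets considered:
ANALYZE mean;
```

If the plan still shows a sequential scan after indexing, the query itself (or the column's data distribution) is usually the problem, which is why the full schema and queries matter.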

Chris

On Fri, 6 Sep 2002, Areski Belaid wrote:

> I have a huge table with 14 fields and a few million rows...
> My PHP/Postgres application is becoming impossible to use.
>
>     Redhat 7.3
>     Dual PIII 900Mhz System
>     2GB RAM
>
> I already did some optimization:
>
>     max_connections = 64
>     shared_buffers = 32000
>     sort_mem = 64336
>     fsync = false
>                         ---
>     echo 128000000 > /proc/sys/kernel/shmmax
>
>     also VACUUM, ANALYZE, and indexes
>
>                         ---
>
>     This optimization was enough at the beginning, but NOT now with a few
> million rows.
>
>     So WHAT CAN I DO ??? USE ORACLE ???
>
>     I'm thinking maybe of splitting my main table into different tables Mean_a,
> Mean_b ... Mean_z ???
>     IF that's the way to go, where can I find docs or help on how to split a
> table ???
>
>     I'm lost !!! ;)
>
>
>
> Areski
>
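For what it's worth, the splitting Areski asks about can be sketched with PostgreSQL's table inheritance. This is only a minimal illustration with a made-up schema, and note that in most cases a proper index solves the problem before splitting is ever needed:

```sql
-- Hypothetical parent table standing in for the real 14-field "mean" table:
CREATE TABLE mean (
    id      integer,
    name    text,
    value   numeric
);

-- Child tables inherit every column of the parent:
CREATE TABLE mean_a () INHERITS (mean);
CREATE TABLE mean_b () INHERITS (mean);

-- The application inserts into the matching child,
-- e.g. by the first letter of "name":
INSERT INTO mean_a (id, name, value) VALUES (1, 'areski', 3.14);

-- A SELECT on the parent scans the parent and all children,
-- so existing queries against "mean" keep working:
SELECT count(*) FROM mean;
```

The catch is that the application has to route inserts (and ideally narrow its selects) to the right child table itself, which is why posting the schema and queries first is the better move.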



