Hi,
Thanks for the reminder. The actual number of records is 100,000.
The table is as follows:
                              Table "my_messages"
    Column     |            Type             |           Modifiers
---------------+-----------------------------+--------------------------------
 midx          | integer                     | not null default
               |                             | nextval('public.my_messages_midx_seq'::text)
 msg_from      | character varying(150)      |
 msg_to        | character varying(150)      |
 msg_content   | text                        |
 msg_status    | character(1)                | default 'N'::bpchar
 created_dtm   | timestamp without time zone | not null default now()
 processed_dtm | timestamp without time zone |
 rpt_generated | character(1)                | default 'N'::bpchar
Indexes:
    "msgstat_pkey" PRIMARY KEY, btree (midx)
    "my_messages_msgstatus_index" btree (msg_status)
Thanks for the help.
From: "Chad Wagner" <chad.wagner@xxxxxxxxx>
To: "carter ck" <carterck32@xxxxxxxxxxx>
CC: pgsql-general@xxxxxxxxxxxxxx
Subject: Re: [GENERAL] Improve Postgres Query Speed
Date: Mon, 15 Jan 2007 19:54:51 -0500
On 1/15/07, carter ck <carterck32@xxxxxxxxxxx> wrote:
I am having a slow performance issue when querying a table that contains
more than 10000 records.
Everything just slows down when executing a query, even though I have
created an index on it.
You didn't really provide much information for anyone to help you. I would
suggest posting the table definition (columns & indexes), the queries you
are running, and the output of "EXPLAIN ANALYZE <your query here>;".
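For example (the query below is hypothetical, since the actual slow query
wasn't posted; substitute the real one), the output to share would come from
something like:

```sql
-- EXPLAIN ANALYZE runs the query and reports the planner's chosen plan
-- together with actual row counts and timings for each plan node.
EXPLAIN ANALYZE
SELECT midx, msg_from, msg_to
FROM my_messages
WHERE msg_status = 'N';
```

Comparing the planner's estimated rows against the actual rows in that
output usually shows whether the index is being used and whether the
statistics are up to date.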
--
Chad
http://www.postgresqlforums.com/