
slow speeds after 2 million rows inserted

Greetings,

I've got a Java application that reads data from a flat file and inserts it into a table. The first 2 million rows (each file contains about 1 million lines) went in pretty fast: less than 40 minutes to insert into the database.

After that, insert speed drops off badly. On the third file, I think I could type the data in faster than the Java application is inserting it.

The table looks like this:

CREATE TABLE data_archive
(
 id serial NOT NULL,
 batchid integer NOT NULL,
 claimid character varying(25) NOT NULL,
 memberid character varying(45) NOT NULL,
 raw_data text NOT NULL,
 status integer DEFAULT 0,
 line_number integer,
 CONSTRAINT data_archive_pkey PRIMARY KEY (id)
)


There is also an index on batchid.

The insert command looks like this:

"INSERT INTO data_archive (batchid, claimid, memberid, raw_data, status, line_number) VALUES ('" + commandBatchID + "', '', '', '" + raw_data + "', '1', '" + myFilter.claimLine + "');";

where the raw_data variable is the line from the file.
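
For context, the surrounding loop boils down to roughly this simplified sketch (not the actual code: the class name, connection details, and the batchId variable are placeholders; the real code pulls the batch id from commandBatchID and the line number from myFilter.claimLine, and has error handling):

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class ArchiveLoader {
    public static void main(String[] args) throws Exception {
        // Load the driver explicitly (needed with pre-JDBC4 drivers).
        Class.forName("org.postgresql.Driver");
        // Connection details are placeholders.
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/mydb", "user", "password");
        Statement stmt = conn.createStatement();
        int batchId = 1; // stands in for commandBatchID

        BufferedReader in = new BufferedReader(new FileReader(args[0]));
        String rawData;
        int lineNumber = 0;
        while ((rawData = in.readLine()) != null) {
            lineNumber++;
            // One statement per line, built by string concatenation.
            // With autocommit on (the JDBC default), every row is its own
            // transaction and its own round trip to the server.
            stmt.executeUpdate("INSERT INTO data_archive (batchid, claimid, "
                    + "memberid, raw_data, status, line_number) VALUES ('"
                    + batchId + "', '', '', '" + rawData + "', '1', '"
                    + lineNumber + "')");
        }
        in.close();
        stmt.close();
        conn.close();
    }
}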

How can I find out what is causing this slowdown, and how do I speed it up?

The database is PostgreSQL 8.2.0 on x86_64-unknown-linux-gnu.

There is nothing else running on this database server (other than the standard Linux background processes); ps ax did not show anything else running, and there are no locks other than the occasional lock taken by the INSERT query.
I have done a VACUUM FULL on this table but not a REINDEX (running now).

Thanks in advance,
James

